A year after the United States bombed its pandemic performance in front of the world, the Delta variant set the stage for a face-saving encore. If the U.S. had learned from its mishandling of the original SARS-CoV-2 virus, it would have been better prepared for the variant that was already ravaging India.
Instead, after a quiet spring, President Joe Biden all but declared victory against SARS-CoV-2. The CDC told vaccinated people they could unmask indoors, pitting two of the most effective interventions against each other. As cases fell, Abbott Laboratories, which makes a rapid COVID-19 test, discarded inventory, canceled contracts, and laid off workers, The New York Times reported. Florida and Georgia scaled back their reporting of COVID-19 data, according to Kaiser Health News. Models failed to predict Delta’s early arrival. The variant then ripped through the U.S.’s half-vaccinated populace and once again pushed hospitals and health-care workers to the brink. Delta’s extreme transmissibility would have challenged any nation, but the U.S. nonetheless set itself up for failure. Delta was an audition for the next pandemic, and one that America flubbed. How can a country hope to stay 10 steps ahead of tomorrow’s viruses when it can’t stay one step ahead of today’s?
America’s frustrating inability to learn from the recent past shouldn’t be surprising to anyone familiar with the history of public health. Almost 20 years ago, the historians of medicine Elizabeth Fee and Theodore Brown lamented that the U.S. had “failed to sustain progress in any coherent manner” in its capacity to handle infectious diseases. With every new pathogen—cholera in the 1830s, HIV in the 1980s—Americans rediscover the weaknesses in the country’s health system, briefly attempt to address the problem, and then “let our interest lapse when the immediate crisis seems to be over,” Fee and Brown wrote. The result is a Sisyphean cycle of panic and neglect that is now in its third century. Progress is always undone; promise, always unfulfilled. Fee died in 2018, the year before SARS-CoV-2 arose. But in documenting America’s past, she foresaw its pandemic present—and its likely future.
More Americans have been killed by the new coronavirus than by the influenza pandemic of 1918, despite a century of intervening medical advancement. The U.S. was ranked first among nations in pandemic preparedness but has among the highest death rates in the industrialized world. It invests more in medical care than any comparable country, but its hospitals have been overwhelmed. It helped develop COVID-19 vaccines at near-miraculous and record-breaking speed, but its vaccination rates plateaued so quickly that it is now 38th in the world. COVID-19 revealed that the U.S., despite many superficial strengths, is alarmingly vulnerable to new diseases—and such diseases are inevitable. As the global population grows, as the climate changes, and as humans push into spaces occupied by wild animals, future pandemics become more likely. We are not guaranteed the luxury of facing just one a century, or even one at a time.
It might seem ridiculous to think about future pandemics now, as the U.S. is consumed by debates over booster shots, reopened schools, and vaccine mandates. Prepare for the next one? Let’s get through this one first! But America must do both together, precisely because of the cycle that Fee and Brown bemoaned. Today’s actions are already writing the opening chapters of the next pandemic’s history.
Internationally, Joe Biden has made several important commitments. At the United Nations General Assembly last week, he called for a new council of national leaders and a new international fund, both focused on infectious threats—forward-looking measures that experts had recommended well before COVID-19.
But domestically, many public-health experts, historians, and legal scholars worry that the U.S. is lapsing into neglect, that the temporary wave of investments isn’t being channeled into the right areas, and that COVID-19 might actually leave the U.S. weaker against whatever emerges next. Donald Trump’s egregious mismanagement made it easy to believe that events would have played out differently with a halfway-competent commander in chief who executed preexisting pandemic plans. But that ignores the many vulnerabilities that would have made the U.S. brittle under any administration. Even without Trump, “we’d still have been in a whole lot of trouble,” Gregg Gonsalves, a global-health activist and an epidemiologist at Yale, told me. “The weaknesses were in the rootstock, not high up in the trees.”
The panic-neglect cycle is not inevitable, but breaking it demands recognition and resistance. “A pandemic is a course correction to the trajectory of civilization,” Alex de Waal, of Tufts University and the author of New Pandemics, Old Politics, told me. “Historical pandemics challenged us to make some fairly fundamental changes to the way in which society is organized.” Just as cholera forced our cities to be rebuilt for sanitation, COVID-19 should make us rethink the way we ventilate our buildings, as my colleague Sarah Zhang argued. But beyond overhauling its physical infrastructure, the U.S. must also address its deep social weaknesses—a health-care system that millions can’t access, a public-health system that’s been rotting for decades, and extreme inequities that leave large swaths of society susceptible to a new virus.
Early last year, some experts suggested to me that America’s COVID-19 failure stemmed from its modern inexperience with infectious disease; having now been tested, it might do better next time. But preparedness doesn’t automatically follow experience, just as disaster doesn’t automatically follow inexperience. “Katrina didn’t happen because Louisiana never had a hurricane before; it happened because of policy choices that led to catastrophe,” Gonsalves said. The arc of history does not automatically bend toward preparedness. It must be bent.
On September 3, the White House announced a new strategy to prepare for future pandemics. Drafted by the Office of Science and Technology Policy and the National Security Council, the plan would cost the U.S. $65 billion over the next seven to 10 years. In return, the country would get new vaccines, medicines, and diagnostic tests; new ways of spotting and tracking threatening pathogens; better protective equipment and replenished stockpiles; sturdier supply chains; and a centralized mission control that would coordinate all the above across agencies. The plan, in rhetoric and tactics, resembles those that were written before COVID-19 and never fully enacted. It seems to suggest all the right things.
But the response from the health experts I’ve talked with has been surprisingly mixed. “It’s underwhelming,” Mike Osterholm, an epidemiologist at the University of Minnesota, told me. “That $65 billion should have been a down payment, not the entire program. It’s a rounding error for our federal budget, and yet our entire existence going forward depends on this.” The pandemic plan compares itself to the Apollo program, but the government spent four times as much, adjusted for inflation, to put astronauts on the Moon. Meanwhile, the COVID-19 pandemic may end up costing the U.S. an estimated $16 trillion.
“I completely agree that it will take more investment,” Eric Lander, OSTP director and Biden’s science adviser, told me; he noted that the published plan is just one element of a broader pandemic-preparedness effort that is being developed. But even the $65 billion that the plan has called for might not fully materialize. Biden originally wanted to ask Congress to immediately invest $30 billion but eventually called for just half that amount, in a compromise with moderate Democrats who sought to slash it even further. The idea of shortchanging pandemic preparedness after the events of 2020 “should be unthinkable,” wrote former CDC Director Tom Frieden and former Senator Tom Daschle in The Hill. But it is already happening.
Others worry about the way the budget is being distributed. About $24 billion has been earmarked for technologies that can create vaccines against a new virus within 100 days. Another $12 billion will go toward new antiviral drugs, and $5 billion toward diagnostic tests. These goals are, individually, sensible enough. But devoting two-thirds of the full budget to them suggests that COVID-19’s lessons haven’t been learned.
America failed to test sufficiently throughout the pandemic even though rigorous tests have long been available. Antiviral drugs played a bit part because they typically provide incremental benefits over basic medical care, and can be prohibitively expensive even when they work. And vaccines were already produced far faster than experts had estimated and were more effective than they had hoped; accelerating that process won’t help if people can’t or won’t get vaccinated, and especially if they equate faster development with nefarious corner-cutting, as many Americans did this year. Every adult in the U.S. has been eligible for vaccines since mid-April; in that time, more Americans have died of COVID-19 per capita than people in Germany, Canada, Rwanda, Vietnam, or more than 130 other countries did in the pre-vaccine era.
“We’re so focused on these high-tech solutions because they appear to be what a high-income country would do,” Alexandra Phelan, an expert on international law and global health policy at Georgetown University, told me. And indeed, the Biden administration has gone all in on vaccines, trading them off against other countermeasures, such as masks and testing, and blaming “the unvaccinated” for America’s ongoing pandemic predicament. The promise of biomedical panaceas is deeply ingrained in the U.S. psyche, but COVID-19 should have shown that medical magic bullets lose their power when deployed in a profoundly unequal society. There are other ways of thinking about preparedness. And there are reasons those ways were lost.
In 1849, after investigating a devastating outbreak of typhus in what is now Poland, the physician Rudolf Virchow wrote, “The answer to the question as to how to prevent outbreaks … is quite simple: education, together with its daughters, freedom and welfare.” Virchow was one of many 19th-century thinkers who correctly understood that epidemics were tied to poverty, overcrowding, squalor, and hazardous working conditions—conditions that inattentive civil servants and aristocrats had done nothing to address. These social problems influenced which communities got sick and which stayed healthy. Diseases exploit society’s cracks, and so “medicine is a social science,” Virchow famously said. Similar insights dawned across the Atlantic, where American physicians and politicians tackled the problem of urban cholera by fixing poor sanitation and dilapidated housing. But as the 19th century gave way to the 20th, this social understanding of disease was ousted by a new paradigm.
When scientists realized that infectious diseases are caused by microscopic organisms, they gained convenient villains. Germ theory’s pioneers, such as Robert Koch, put forward “an extraordinarily powerful vision of the pathogen as an entity that could be vanquished,” de Waal told me. And that vision, created at a time when European powers were carving up other parts of the world, was cloaked in metaphors of imperialism, technocracy, and war. Microbes were enemies that could be conquered through the technological subjugation of nature. “The implication was that if we have just the right weapons, then just as an individual can recover from an illness and be the same again, so too can a society,” de Waal said. “We didn’t have to pay attention to the pesky details of the social world, or see ourselves as part of a continuum that includes the other life-forms or the natural environment.”
Germ theory allowed people to collapse everything about disease into battles between pathogens and patients. Social matters such as inequality, housing, education, race, culture, psychology, and politics became irrelevancies. Ignoring them was noble; it made medicine and science more apolitical and objective. Ignoring them was also easier; instead of staring into the abyss of society’s intractable ills, physicians could simply stare at a bug under a microscope and devise ways of killing it. Somehow, they even convinced themselves that improved health would “ultimately reduce poverty and other social inequities,” wrote Allan Brandt and Martha Gardner in 2000.
This worldview accelerated a growing rift between the fields of medicine (which cares for sick individuals) and public health (which prevents sickness in communities). In the 19th century, these disciplines were overlapping and complementary. In the 20th, they split into distinct professions, served by different academic schools. Medicine, in particular, became concentrated in hospitals, separating physicians from their surrounding communities and further disconnecting them from the social causes of disease. It also tied them to a profit-driven system that saw the preventive work of public health as a financial threat. “Some suggested that if prevention could eliminate all disease, there would be no need for medicine in the future,” Brandt and Gardner wrote.
This was a political conflict as much as an ideological one. In the 1920s, the medical establishment flexed its growing power by lobbying the Republican-controlled Congress and White House to erode public-health services, including school-based nursing, outpatient dispensaries, and centers that provided pre- and postnatal care to mothers and infants. Such services were examples of “socialized medicine,” unnecessary to those who were convinced that diseases could best be addressed by individual doctors treating individual patients. Health care receded from communities and became entrenched in hospitals. Decades later, these changes influenced America’s response to COVID-19. Both the Trump and Biden administrations have described the pandemic in military metaphors. Politicians, physicians, and the public still prioritize biomedical solutions over social ones. Medicine still overpowers public health, which never recovered from being “relegated to a secondary status: less prestigious than clinical medicine [and] less amply financed,” wrote the sociologist Paul Starr. It stayed that way for a century.
During the pandemic, many of the public-health experts who appeared in news reports hailed from wealthy coastal universities, creating a perception of the field as well funded and elite. That perception is false. In the early 1930s, the U.S. was spending just 3.3 cents of every medical dollar on public health, and much of the rest on hospitals, medicines, and private health care. And despite a 90-year span that saw the creation of the CDC, the rise and fall of polio, the emergence of HIV, and relentless calls for more funding, that figure recently stood at … 2.5 cents. Every attempt to boost it eventually receded, and every investment saw an equal and opposite disinvestment. A preparedness fund that was created in 2002 has lost half its budget, accounting for inflation. Zika money was cannibalized from Ebola money. America’s historical modus operandi has been to “give responsibility to the local public-health department but no power, money, or infrastructure to make change,” Ruqaiijah Yearby, a health-law expert at Saint Louis University, told me.
Lisa Macon Harrison, who directs the department that serves Granville and Vance Counties, in North Carolina, told me that to protect her community of 100,000 people from infectious diseases—HIV, sexually transmitted infections, rabies, and more—the state gives her $4,147 a year. That’s roughly one-ninetieth of what she actually needs. She makes up the shortfall through grants and local dollars.
Trifling budgets mean smaller staff, which turns mandatory services into optional ones. Public-health workers have to cope with not just infectious diseases but air and water pollution, food safety, maternal and child health, the opioid crisis, and tobacco control. But with local departments having lost 55,000 jobs since the 2008 recession, many had to pause their usual duties to deal with COVID-19. Even then, they didn’t have staff to do the most basic version of contact tracing—calling people up—let alone the ideal form, wherein community health workers help exposed people find food, services, and places to isolate. When vaccines were authorized, departments had to scale back on testing so that overworked staff could focus on getting shots into arms; even that wasn’t enough, and half of states hired armies of consultants to manage the campaign, The Washington Post reported.
In May, the Biden administration said that it would invest $7.4 billion in recruiting and training public-health workers, creating tens of thousands of jobs. But those new workers would be air-dropped into an infrastructure that is quite literally crumbling. Many public-health departments are housed in buildings that were erected in the 1940s and ’50s, when polio money was abundant; they are now falling apart. “There’s a trash can in the hallway in front of my environmental-health supervisor’s office to catch rain that might come through the ceiling,” Harrison told me. And between their reliance on fax machines and decades-old data systems, “it feels like we’re using a Rubik’s Cube and an abacus to do pandemic response,” Harrison added.
Last year, America’s data systems proved to be utterly inadequate for tracking a rapidly spreading virus. Volunteer efforts such as the COVID Tracking Project (launched by The Atlantic) had to fill in for the CDC. Academics created a wide range of models, some of which were inaccurate and misleading. “For hurricanes, we don’t ask well-intentioned academics to stop their day jobs and tell us where landfall will happen,” the CDC’s Dylan George told me. “We turn to the National Hurricane Center.” Similarly, George hopes that policy makers can eventually turn to the CDC’s newly launched Center for Forecasting and Outbreak Analytics, where he is director of operations. With initial funding of about $200 million, the center aims to accurately track and predict the paths of pathogens, communicate those predictions with nuance, and help leaders make informed decisions quickly.
But public health’s long-standing neglect means that simply making the system fit for purpose is a mammoth undertaking that can’t be accomplished with emergency funds—especially not when those funds go primarily toward biomedical countermeasures. That’s “a welfare scheme for university scientists and big organizations, and it’s not going to trickle down to the West Virginia Department of Health,” Gregg Gonsalves, the health activist and epidemiologist, told me. What the U.S. needs, as several reports have recommended and as some senators have proposed, is a stable and protected stream of money that can’t be diverted to the emergency of the day. That would allow health departments to properly rebuild without constantly fearing the wrecking ball of complacency. Biden’s $7.4 billion bolus is a welcome start—but just a start. And though his new pandemic-preparedness plan commits $6.5 billion toward strengthening the U.S. public-health system over the next decade, it might take $4.5 billion a year to actually do the job.
“Nobody should read that plan as the limit of what needs to be done,” Eric Lander, the president’s science adviser, told me. “I have no disagreement that a major effort and very substantial funding are needed,” and, he noted, the administration’s science and technology advisers will be developing a more comprehensive strategy. “But is pandemic preparedness the lens through which to fix public health?” Lander asked. “I think those issues are bigger—they’re everyday problems, and we need to shine a spotlight on them every day.”
But here is public health’s bind: Though it is so fundamental that it can’t (and arguably shouldn’t) be tied to any one type of emergency, emergencies are also the one force that can provide enough urgency to strengthen a system that, under normal circumstances, is allowed to rot. When a doctor saves a patient, that person is grateful. When an epidemiologist prevents someone from catching a virus, that person never knows. Public health “is invisible if successful, which can make it a target for policy makers,” Ruqaiijah Yearby, the health-law expert, told me. And during this pandemic, the target has widened, as overworked and under-resourced officials face aggressive protests. “Our workforce is doing 15-hour days and rather than being glorified, they’re being vilified and threatened with bodily harm and death,” Harrison told me. According to an ongoing investigation by the Associated Press and Kaiser Health News, the U.S. has lost at least 303 state or local public-health leaders since April 2020, many because of burnout and harassment.
Even though 62 percent of Americans believe that pandemic-related restrictions were worth the cost, Republican legislators in 26 states have passed laws that curtail the possibility of quarantines and mask mandates, as Lauren Weber and Anna Maria Barry-Jester of KHN have reported. Supporters characterize these laws as checks on executive power, but several do the opposite, allowing states to block local officials or schools from making decisions to protect their communities. Come the next pandemic (or the next variant), “there’s a real risk that we are going into the worst of all worlds,” Phelan told me. “We’re removing emergency actions without the preventive care that would allow people to protect their own health.” This would be dangerous for any community, let alone those in the U.S. that are structurally vulnerable to infectious disease in ways that are still being ignored.
Biden’s new pandemic plan contains another telling detail about how the U.S. thinks about preparedness. The parts about vaccines and therapeutics contain several detailed and explicit strategies. The part about vulnerable communities is a single bullet point that calls for strategies to be developed.
This isn’t a new bias. In 2008, Philip Blumenshine and his colleagues argued that America’s flu-pandemic plans overlooked the disproportionate toll that such a disaster would take on socially disadvantaged people. Low-income and minority groups would be more exposed to airborne viruses because they’re more likely to live in crowded housing, use public transportation, and hold low-wage jobs that don’t allow them to work from home or take time off when sick. When exposed, they’d be more susceptible to disease because their baseline health is poorer, and they’re less likely to be vaccinated. With less access to health insurance or primary care, they’d die in greater numbers. These predictions all came to pass during the H1N1 swine-flu pandemic of 2009.
When SARS-CoV-2 arrived a decade later, history repeated itself. The new coronavirus disproportionately infected essential workers, who were forced to risk exposure for the sake of their livelihood; disproportionately killed Pacific Islander, Latino, Indigenous, and Black Americans; and struck people who’d been packed into settings at society’s margins—prisons, nursing homes, meatpacking facilities. “We’ve built a system in which many people are living on the edge, and pandemics prey on those vulnerabilities,” Julia Raifman, a health-policy researcher at Boston University, told me.
Such patterns are not inevitable. “It is very clear, from evidence and history, that robust public-health systems rely on provision of social services,” Eric Reinhart, a political anthropologist and physician at Northwestern University, told me. “That should just be a political given, and it is not. You have Democrats who don’t even say this, let alone Republicans.” America’s ethos of rugged individualism pushes people across the political spectrum to see social vulnerability as a personal failure rather than the consequence of centuries of racist and classist policy, and as a problem for each person to solve on their own rather than a societal responsibility. And America’s biomedical bias fosters the seductive belief that these sorts of social inequities won’t matter if a vaccine can be made quickly enough.
But inequity reduction is not a side quest of pandemic preparedness. It is arguably the central pillar—if not for moral reasons, then for basic epidemiological ones. Infectious diseases can spread from the vulnerable to the privileged. “Our inequality makes me vulnerable,” Mary Bassett, who studies health equity at Harvard, told me. “And that’s not a necessary feature of our lives. It can be changed.”
“To be ready for the next pandemic, we need to make sure that there’s an even footing in our societal structures,” Seema Mohapatra, a health-law expert at Southern Methodist University, in Dallas, told me. That vision of preparedness is closer to what 19th-century thinkers lobbied for, and what the 20th century swept aside. It means shifting the spotlight away from pathogens themselves and onto the living and working conditions that allow pathogens to flourish. It means measuring preparedness not just in terms of syringes, sequencers, and supply chains but also in terms of paid sick leave, safe public housing, eviction moratoriums, decarceration, food assistance, and universal health care. It means accompanying mandates for social distancing and the like with financial assistance for those who might lose work, or free accommodation where exposed people can quarantine from their family. It means rebuilding the health policies that Ronald Reagan began shredding in the 1980s and that later administrations further frayed. It means restoring trust in government and community through public services. “It’s very hard to achieve effective containment when the people you’re working with don’t think you care about them,” Arrianna Marie Planey, a medical geographer at the University of North Carolina at Chapel Hill, told me.
In this light, the American Rescue Plan—the $1.9 trillion economic-stimulus bill that Biden signed in March—is secretly a pandemic-preparedness bill. Beyond specifically funding public health, it also includes unemployment insurance, food-stamp benefits, child tax credits, and other policies that are projected to cut the poverty rate for 2021 by a third, and by even more for Black and Hispanic people. These measures aren’t billed as ways of steeling America against future pandemics—but they are. Also on the horizon is a set of recommendations from the COVID-19 Health Equity Task Force, which Biden established on his first full day in office. “The president has told many of us privately, and said publicly, that equity has to be at the heart of what we do in this pandemic,” Vivek Murthy, the surgeon general, told me.
Some of the American Rescue Plan’s measures are temporary, and their future depends on the $3.5 trillion social-policy bill that Democrats are now struggling to pass, drawing opposition from within their own party. “Health equity requires multiple generations of work, and politicians want outcomes that can be achieved in time to be recognized by an electorate,” Planey told me. That electorate is tiring of the pandemic, and of the lessons it revealed.
Last year, “for a moment, we were able to see the invisible infrastructure of society,” Sarah Willen, an anthropologist at the University of Connecticut who studies Americans’ conceptions of health equity, told me. “But that seismic effect has passed.” Socially privileged people now also enjoy the privilege of immunity, while those with low incomes, food insecurity, eviction risk, and jobs in grocery stores and agricultural settings are disproportionately likely to be unvaccinated. Once, they were deemed “essential”; now they’re treated as obstinate annoyances who stand between vaccinated America and a normal life.
The pull of the normal is strong, and our metaphors accentuate it. We describe the pandemic’s course in terms of “waves,” which crest and then collapse to baseline. We bill COVID-19 as a “crisis”—a word that evokes decisive moments and turning points, “and that, whether you want to or not, indexes itself against normality,” Reinhart told me. “The idea that something new can be born out of it is lost,” because people long to claw their way back to a precrisis state, forgetting that the crisis was itself born of those conditions.
Better ideas might come from communities for whom “normal” was something to survive, not revert to. Many Puerto Ricans, for example, face multiple daily crises, including violence, poverty, power outages, and storms, Mónica Feliú-Mójer, of the nonprofit Ciencia Puerto Rico, told me. “They’re always preparing,” she said, “and they’ve built support networks and mutual-aid systems to take care of each other.” Over the past year, Ciencia PR has given small grants to local leaders to fortify their communities against COVID-19. While some set up testing and vaccination clinics, others organized food deliveries or educational events. One cleaned up a dilapidated children’s park to create a low-risk outdoor space where people could safely reconnect. Such efforts recognize that resisting pandemics is about solidarity as well as science, Feliú-Mójer told me.
The panic-neglect cycle is not inescapable. Some of the people I spoke with expressed hope that the U.S. can break it, just not through the obvious means of temporarily increased biomedical funding. Instead, they placed their faith in grassroots activists who are pushing for fair labor policies, better housing, health-care access, and other issues of social equity. Such people would probably never think of their work as a way of buffering against a pandemic, but it very much is—and against other health problems, natural disasters, and climate change besides. These threats are varied, but they all wreak their effects on the same society. And that society is only as susceptible as it allows itself to be.