US policymakers knew a pandemic was coming: Why they ignored the warnings

FILE PHOTO: A nurse wearing personal protective equipment watches an ambulance driving away outside of Elmhurst Hospital during the ongoing outbreak of the coronavirus disease (COVID-19) in the Queens borough of New York, U.S., April 20, 2020. REUTERS/Lucas Jackson/File Photo

In 2008 the National Intelligence Council (NIC) published Global Trends 2025, of which I was the principal author. The report devoted an entire page (page seventy-five) to the potential emergence of a "global pandemic":

"If a pandemic disease emerges, it probably will occur in an area marked by high population density and close association between humans and animals, such as many areas of China and Southeast Asia…slow public health response would delay the realization that a highly transmissible pathogen had emerged…despite limits imposed on international travel, travelers with mild symptoms, or who were asymptomatic, would carry the disease to other continents….in the worst case tens to hundreds of millions of Americans within the US Homeland would become ill and deaths would mount into the tens of millions." This estimate was clarified in a footnote: "How fast a disease spreads, how many people become sick, how long they stay sick, the mortality, and the symptoms and after-effects will vary according to the specific characteristics of whatever pathogen is responsible for a pandemic. This scenario posits plausible characteristics that fall within a range of possibilities for these variables."

Sound familiar? I take a grim satisfaction in feeling vindicated for the hard work I and many others within the intelligence community (IC) put into providing the best possible advice about the future, good and bad. Too many times we were accused of being too pessimistic. Given the human devastation this coronavirus has caused and will cause, I would love to have been wrong in this case.

The mentions in Global Trends 2025 and Global Trends 2030 were not the only warnings the Intelligence Community provided. The director of national intelligence (DNI), for whom the NIC works, delivers an annual threat assessment. From 2005 until my retirement in 2013, I compiled it from the latest intelligence analysis contributed by my colleagues across the IC.

In January 2009, then-DNI Dennis Blair testified that "the most pressing transnational health challenge for the United States is still the potential for emergence of a severe pandemic, with the primary candidate being a highly lethal influenza virus." This was followed in 2010 with a warning about how severely a pandemic could disrupt the economy.

In 2013, DNI General James Clapper warned that "an easily transmissible, novel respiratory pathogen that kills or incapacitates more than one percent of its victims is among the most disruptive events possible. Such an outbreak would result in a global pandemic." In recent years, DNI Dan Coats issued similar warnings about the dire consequences of a pandemic, in 2017 and 2019. Finally, despite US President Donald J. Trump's assertion that no one could have predicted the coronavirus, by the end of January 2020 a majority of his daily intelligence briefing each morning concerned a possible pandemic.

There are lessons to be drawn from how all these warnings were ignored. Sure, the policymaker can say that there are too many warnings about all sorts of things. In his or her mind, I and others in the foresight business are like the boy in the fable who cries "wolf" so many times it becomes impossible to know which warning is the "true" one. The real problem is that we live in an age of wicked problems, something neither those of us in the strategic foresight business nor those making policy can escape. While I would venture that much has been done since 9/11 to improve the strategic foresight side of the equation, our policymaking process has changed little since the beginning of the Cold War, if even then. Perhaps the long list of threats enumerated each year in the annual threat assessment cannot be processed by any one policymaker, or even by the set of them who sit regularly in the White House Situation Room to deliberate on the direction of the country. But shouldn't we have a process where somebody sits with them? That person must have no bureaucratic axe to grind, other than to remind the others of the threats they are ignoring or the unintended effects of their actions.

Almost a decade ago, Leon Fuerth, former deputy national security advisor in the Clinton administration, set out the crux of the problem in his pathbreaking study on Anticipatory Governance:

"A well-functioning Republic needs time for deliberation, and the US Constitution was designed to make sure that this time would be protected. On the other hand, challenges presenting themselves today are increasingly fast-moving and complex: they involve concurrent interactions among events across multiple dimensions of governance; they have no regard for our customary jurisdictional and bureaucratic boundaries; they cannot be broken apart and solved piece by piece; and rather than stabilizing into permanent solutions, they morph into new problems that have to be continually managed. This pattern profoundly challenges the adaptive capacity of our legacy systems of government, which are essentially modeled on the early industrial period: vertical, hierarchical, segmented, mechanical, and sluggish. Our 19th-century government is simply not built for the nature of 21st-century challenges" (my emphasis).

He lays out a solution that does not require major legislative changes and could be instituted tomorrow, if the various administrations over the years were serious about dealing with this increasing complexity. It is striking that not just Republican administrations but also Democratic ones have ignored his suggestions. Three basic changes, as outlined in the Fuerth study, are needed: "integrating foresight and policy, networking governance, and using feedback for applied learning."

Since foresight is our main concern here, consider Fuerth's argument that "a foresight-generating and horizon-scanning system can help government detect trends and weak signals, visualize alternative futures, and foster better outcomes." Currently, "the central problem is that no mechanism exists for bringing foresight and policymaking into an effective relationship. This problem is partly political, partly cultural, and partly a matter of inadequate systems-design. The political and cultural issues are very difficult to deal with, but mechanisms can be put in place to ensure that foresight and policy come together by design, rather than by chance. These initiatives focus on ways to institutionalize an 'interface' that can integrate foresight into the policy process."

The Global Trends reports are published every four years and the Annual Threat Assessments yearly. Both receive a great deal of policymaker and media attention for a short period. Policy planners at State, Defense, the National Security Council (NSC), and elsewhere use them religiously, but, as Fuerth wrote, there is no institutionalized "'interface' that can integrate foresight into the policy" process.

Unfortunately, the warning about the pandemic illustrates this perfectly. It was given multiple times, not just once in 2008. But it was certainly not uppermost when the "US CDC program in China was cut down to three US experts and a handful of local hires and the Beijing-based EID program was drawn down." According to Deborah Seligsohn, the environment, science, technology, and health counselor at the US Embassy in Beijing from 2003 to 2007: "Had this program been up and running at full strength, we wouldn't have needed to offer to send a team of health experts, as the administration did last month; we would have had people available and familiar with relationships of trust right there in China." She rightly notes how science cooperation with China has been politicized: "Researchers can't study disease unless they have access to it, and diseases emerge in different parts of the world."

It is high time we bring the government's decision-making into the twenty-first century. The knowledge of how to do it is there. What is our excuse as a country for ignoring it?

Mathew J. Burrows is the director of the Atlantic Council’s Foresight, Strategy, and Risks Initiative in the Scowcroft Center for Strategy and Security. Follow him on Twitter @matburrows.
