
New Atlanticist

April 21, 2020

The 5×5—On viral infections online and in the real world

By Simon Handler

This article is part of the monthly 5×5 series by the Cyber Statecraft Initiative, in which five featured experts answer five questions on a common theme, trend, or current event in the world of cyber. Interested in the 5×5 and want to see a particular topic, event, or question covered? Contact Simon Handler with the Cyber Statecraft Initiative at SHandler@atlanticcouncil.org.

In 2008, Microsoft discovered that a computer worm was exploiting a network service vulnerability in Windows operating systems, infecting organizations ranging from the Houston municipal courts to militaries and multinational corporations. Dubbed Conficker, the self-replicating worm became one of the most widespread in history, drawing as many as ten million computers into a globe-spanning botnet. While the worm did more to disrupt than destroy, it kicked off a worldwide response effort known as the Conficker Working Group, aimed at more effective communication between the public and private sectors in countering cybercrime.

Nearly a decade later, a malicious worm known as NotPetya, attributed to Russian hacking group Sandworm Team, exploited another Microsoft network service vulnerability with significantly more devastating consequences. While Ukraine was the initial target of NotPetya, the worm’s paralytic effects impacted global enterprises responsible for maritime trade, pharmaceutical manufacturing, healthcare delivery, and more, with a White House estimate of $10 billion in damages—the most costly known cyberattack in history.

Both incidents illustrate how contagious a worm can be: where it starts may be far from where it ends. Cybersecurity often gets reduced to breaches and hacking, but the world has witnessed multiple pandemics in cyberspace and could learn more about responding to exponential events.

Cyber Statecraft Initiative experts go 5×5 to ask what the cybersecurity community can learn from global public health and the response to the novel coronavirus.

#1 People and computers are not the same, so complete this sentence: The biggest difference in behavior between a computer virus and a viral disease is _________.

Thomas Dullien, CEO, optimyze.cloud Inc.: “Computer networks have different notions of ‘distance’ from the real world; viral diseases can be contained locally because there is a notion of ‘locality.’ A machine in Wisconsin can infect a machine in Wuhan; a human in Wisconsin will not infect a human in Wuhan.”

Jason Healey, nonresident senior fellow, Cyber Statecraft Initiative; senior research scholar, Columbia University’s School of International and Public Affairs: “…people get scared of viral disease and fear for their lives, not their hard drives.”

Beau Woods, cyber safety innovation fellow, Cyber Statecraft Initiative; founder/CEO, Stratigos Security: “…we can update software code in ways we cannot (yet) easily update genetic code.”

Bobbie Stempfley, nonresident senior fellow, Cyber Statecraft Initiative; director, CERT Division at the Software Engineering Institute at Carnegie Mellon University: “…the speed at which a computer virus can propagate. A second but important difference is cost—the cost of recovery and treatment is more transparent in a viral disease than it is for a computer virus.”

Bill Woodcock, executive director, Packet Clearing House: “Polymorphism at scale is a huge distinguishing factor. The quick evolution of viruses is one of their primary distinguishing features, and that depends upon evolutionary pressure, which in turn depends upon random mutation and new generations being tested and succeeding or failing. By contrast, computer code has the potential to change very quickly, but in practice, the multiplier on that change is too low to effectively weed out unsuccessful mutations in timescales that would make them dangerous. The number of processor cycles which would have to be dedicated to achieve that evolution would be an immense cost, and the virtual electronic Petri dish requires development and careful tending, so self-evolving computer viruses aren’t things that just run wild across the internet.”

#2 What can we learn from the response to coronavirus that can be applied to mitigating computer worms or other self-replicating viruses?

Dullien: “Reducing the reproductive number (‘R0’) below one is useful. The trouble is that ‘R0’ in a fast-spreading network worm is only limited by bandwidth; see the trouble with ‘lack of locality.’ This means that reducing R0 to an acceptable level may also mean ‘everything needs to grind to a halt,’ which isn’t desirable either.”
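The threshold Dullien describes can be illustrated with a toy model. The sketch below assumes a constant reproductive number per generation and is not a model of any real worm: the expected number of newly infected machines grows while R0 is above one and collapses once it drops below one.

```python
# Toy model: each generation, every infected machine infects r0 new machines on average.
# Illustrative sketch only, not a model of any specific worm's propagation.
def expected_infections(r0: float, generations: int, initial: float = 1.0) -> list:
    counts = [initial]
    for _ in range(generations):
        counts.append(counts[-1] * r0)
    return counts

print(expected_infections(2.0, 6))  # R0 > 1: 1, 2, 4, 8, ... exponential growth
print(expected_infections(0.5, 6))  # R0 < 1: 1, 0.5, 0.25, ... the outbreak dies out
```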

Healey: “There is a tremendous amount to learn, like the importance of early reporting and coordination, global coordination, and impact on the economy, ‘patching’ your immune system with a vaccine, and more. Fortunately, coronavirus doesn’t seem to be mutating, which we can often see when the authors of a self-replicating virus change the code to defeat defenders (though this is directed evolution).”

Woods: “Isolation/segmentation and cyber hygiene are very effective at stopping the spread.”

Stempfley: “Clear articulation of what is happening, rapid consensus on what to do about it, and simple instructions for how to respond have been some of the most useful tools in responding to coronavirus and are a place to improve in the overall response to computer viruses and other significant cyber events. At a macro level, the discussion about the roles of governments, state and federal, industry, nongovernmental organizations, and individuals provides an interesting parallel, as there is less consensus on roles for responding to a computer virus/worm of any significance.”

Woodcock: “Avoiding monocultures is the big one, and it’s something that used to be better understood. It now appears that a majority of people have some significant degree of resistance to the novel coronavirus, and that occurs because people aren’t a clonal population. Differences between computer operating systems and major applications are minuscule by comparison. There are seven and a half billion genetically unique individuals on earth, while only about four million people share identical genomes with a single other individual (identical twins) and fewer than ten thousand people share identical genomes with two or more other individuals. By contrast, there are 1.3 billion instances of the Android operating system, with only minor differences between them, 300 million Apple users, likewise divided between only a few versions, 280 million Windows users, also divided between only a few versions, and about 500 million other operating system installations divided among a few hundred Unix and RTOS variants. So, on average, any operating system installation is likely identical to at least tens of millions of others. While there are huge benefits to standardization, there are also huge risks, which are not shared with the biological community.”
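Woodcock’s comparison reduces to simple arithmetic. The sketch below reuses only the install-base figures quoted above; the per-platform version counts are illustrative assumptions standing in for his “a few versions” and “a few hundred variants.”

```python
# Back-of-envelope monoculture arithmetic using only the figures from the quote above.
# The count of widely deployed versions per platform is an illustrative assumption.
platforms = {
    # name: (approximate installed base, assumed number of widely deployed versions)
    "Android": (1_300_000_000, 5),
    "Apple": (300_000_000, 5),
    "Windows": (280_000_000, 5),
    "Unix/RTOS variants": (500_000_000, 300),
}

for name, (installs, versions) in platforms.items():
    print(f"{name}: ~{installs // versions:,} near-identical installations per version")

# By contrast, among roughly 7.5 billion humans the largest group sharing a genome
# (identical twins) is just two people, so there is no comparable monoculture.
```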


#3 Did the cybersecurity community’s response to WannaCry and NotPetya in 2017 demonstrate the lessons learned from responding to Conficker and other worms? Why or why not?

Dullien: “…I think we’re not seeing Slammer-style worms for lack of attacker interest, not excess of defender competence.”

Healey: “I’m not convinced, because they were quite different: Conficker caused no real damage but involved an extended back and forth between attacker and defender. Greg Rattray and I wrote on this and compared it to an earlier pandemic scare, H5N1. WannaCry and NotPetya had more in common with the late 90s and early 00s, when the Internet was plagued with major worms: ILOVEYOU, Code Red, Nimda, SQL Slammer.”

Woods: “In some ways, yes. The security research community—independents and those at companies—quickly began analyzing the malware and shared information openly to allow defenders to quickly respond.”

Stempfley: “The real issue isn’t whether the cybersecurity community’s response improved, as it is improving with each event; rather, it is that the linkages and the interconnected nature of the vectors remain poorly understood. WannaCry and NotPetya required great collaboration between the cybersecurity community, manufacturing, and the healthcare community. So many of these interconnections need to be recognized and exercised in order to be prepared for the next worm and for a world where we may be handling several simultaneously.”

Woodcock: “I don’t think we’re seeing rapid evolution in the community’s way of responding to threats. The community muddles through, and other factors, such as the weakness of the target and the aggressiveness of the propagation method, have a much greater influence on the severity of the result.”

#4 What sort of cyber hygiene advice gets shared most widely but is least helpful or even wrong? What is the most helpful, widely shared or not?

Dullien: “‘Don’t click on links’ is the least helpful; it’s like trying to tell a supermarket cashier to recognize which customers may have the flu. Most helpful—simpler is better.”

Healey: “Don’t click on unknown links is cyber social distancing.”

Woods: “Frequent password changes have proven harmful, as they often lead to more predictable passwords and increase organizational cost when people forget the password they just changed. On the other hand, time-based (such as authenticator apps) or token-based (such as Yubikey) multifactor authentication provides higher security, often at a lower cost, and password managers make strong, unique passwords easy.”
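The time-based factor Woods mentions is typically the TOTP scheme (RFC 6238) used by authenticator apps. Below is a minimal sketch of how such a code is derived; the base32 secret is a made-up example value, and real deployments should rely on vetted libraries rather than hand-rolled implementations.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Minimal RFC 6238 TOTP sketch: HMAC-SHA1 over the current 30-second time step."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # number of elapsed time steps
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation (RFC 4226)
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

# Example with a made-up shared secret; a real secret is provisioned by the service.
print(totp("JBSWY3DPEHPK3PXP"))
```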

Stempfley: “The idea that cyber hygiene is simple and inexpensive, that it is the cyber equivalent of washing one’s hands, has created an impression that the lack of a cyber hygiene action is merely a lack of will. It is that in part, but environments are generally complex, and managing the security debt organizations have inherited with their technology baselines is sometimes misrepresented by the term ‘basic cyber hygiene.’ Auto-update features, Center for Internet Security (CIS) benchmarks, and other things that can be done at scale make the most difference.”

Woodcock: “We’ve very recently passed the point where passwords and VPNs are useful tools, yet people’s understandings of best-practices aren’t updating quickly, nor are new tools and infrastructure being released quickly enough to supplant those which no longer function. Witness the excruciatingly slow pace at which DNS-Based Authentication of Named Entities (DANE) is replacing the failed Certificate Authority system, as a prime example.”

#5 Do either the Centers for Disease Control (CDC) or the World Health Organization (WHO) serve as an effective model for coordination on cybersecurity incident response to self-replicating infections?

Healey: “Yes, very much so, but with one issue and two major exceptions. They are a strong model for large-scale coordination by independent experts, but they rely very much on mandatory reporting of disease, which we are far from having. The areas that fit worst start with the fact that they are run by governments, which means, in computer security terms, they will lack the agility, subject matter expertise, or ability to directly adjust the mechanisms of cyberspace the way the private sector can. Also, they rely on what we now know are extremely strict public health laws to impose onerous restrictions, more than can be stomached for cyber issues.”

Woods: “Some people have been calling for a CDC for cybersecurity for many years. While its research, preparedness, and public warning roles seem tempting, other agencies like the Department of Homeland Security (DHS), National Institute of Standards and Technology (NIST), and sector-specific agencies largely already cover many of these capabilities. And the incentives don’t seem to be there to entice companies to self-report incidents, even if all organizations had the technical capability to identify and analyze attacks routinely. There has, however, been some promising work on translating aviation near-miss reporting mechanisms to cybersecurity, though there is still a long way to go with that type of a system as well.”

Stempfley: “Public health models work to a point as an effective allegory for responding to self-replicating infections in the cyber domain. Their ability to engage at the earliest intervention point while simultaneously sharing information to protect others is the nirvana of the information sharing programs that have been attempted over the past five years. Recognized technical experts working in the public interest are a very useful element of a global response to self-replicating infections. Neither the CDC nor the WHO serves as a perfect model; however, both have elements that could be useful.”

Woodcock: “Not really; the situation is very different. Computer viruses are most effectively dealt with through automated responses, whereas humans aren’t subject to automation… We’re not building and releasing counter-viruses in the wild, and doing so would be fantastically stupid; yet one of the most effective (though infrequently used) ways of combating the distribution of viral code is to implement the same exploit (and thus distribute through the same vulnerable population) code which deactivates the original problematic code. Computers aren’t subject to disinformation and poor decision-making (at least not in the sense that people are) whereas that’s now one of the largest influences on humans’ ability to cope with pandemics.”

Simon Handler is the assistant director of the Atlantic Council’s Cyber Statecraft Initiative under the Scowcroft Center for Strategy and Security, focused on the nexus of geopolitics and national security with cyberspace. He is a former special assistant in the United States Senate. Follow him on Twitter @SimonPHandler

The Atlantic Council’s Cyber Statecraft Initiative, under the Digital Forensic Research Lab (DFRLab), works at the nexus of geopolitics and cybersecurity to craft strategies to help shape the conduct of statecraft and to better inform and secure users of technology.
