Big Data and War: Can a Cyberattack Justify an Armed Response?

February 20, 2023 By Jane Kelly, jak4g@virginia.edu

Paul Stephan, a University of Virginia distinguished professor of law and expert in international dispute resolution and comparative law, recently posed an interesting question.

In the blog Lawfare, he writes, “If big data is a resource and therefore a potential target of armed conflict, what kinds of attacks justify an armed response and what are the rules governing such attacks?”

His post arrives at a moment when “[s]urveillance-oriented states, of which China is the foremost example, use big data to guide and bolster monitoring of their own people as well as potential foreign threats,” Stephan wrote. He also pointed to the renewed interest in artificial intelligence, “which uses big data as a means of optimizing the training and algorithm design on which it depends, as a cultural, economic, and social phenomenon.”

These circumstances raised legal questions for Stephan, who is also a senior fellow at UVA’s Miller Center of Public Affairs. Could big data, as its significance grows, be treated like territory, people and property, the more traditional objects of international conflict, including armed conflict? Could a cyberattack on a data center that causes havoc, but no physical damage to the building or the people in it, justify an armed response?

Stephan argues that big data is, in fact, a resource, “and therefore a potential target in an armed conflict.”

UVA Today reached out to Stephan to learn more about how the theft of big data could lead to war.


UVA law professor Paul Stephan, an expert in international dispute resolution and comparative law, argues big data is “a potential target in an armed conflict.”

Q. Can you describe big data for our readers?

A. Big data is what we call enormous sets of data stored and organized so that they can be searched and otherwise used by computer programs. For example, anyone who uses social media contributes to the social media owner’s dataset by interacting with the service; it is the record of these interactions, the data, that allows the media owner to give away its services for “free,” with no direct charge.

Q. In your recent essay, you argue that big data is a resource and therefore a potential target in an armed conflict. Why is that?

A. Big data has at least two economically valuable features. It can be “mined,” or searched, to learn about trends and developments that may not be apparent through other means of observation. Think of internet searches on flu symptoms as an early warning mechanism for an epidemic.

Also, it can be used for the development of artificial intelligence, which is manufactured by “training” algorithms, that is, running them in a directed fashion on data sets. The bigger the data set, the better the training and therefore the better the artificial intelligence.

China and the United States are probably the world leaders in the exploitation of big data for both commercial and public interests, with Europe far behind. Also, many important social systems, like finance, public safety and transport systems, rely on big data to operate.

The more valuable the resource, the more tempting it becomes to target it, and take it down, in the course of an international dispute.

Q. Traditionally, attacks on data or data infrastructure have been met with similar cyber retaliations. What level of intrusion would give rise to the justification of an armed response?

A. The traditional view has been that attacks with direct consequences in the material world justify an armed response, which is sometimes called a kinetic response. Think of taking down airplanes, causing car crashes, or producing infrastructure failures that lead to immediate death and destruction. Most people believe that a state legitimately can invoke its right to self-defense to respond to such actions with armed force.

Q. It sounds like data generally had not been treated as an “object” that can be harmed in an attack in the same way a military ship or outpost might be. Is that changing?

A. The experts who studied these questions on behalf of the North Atlantic Treaty Organization reached that conclusion, but there were dissents at the time, and a few governments have since issued statements indicating they might be open to a looser standard. The United Kingdom, for example, has suggested that an attack on its financial system, even without direct physical destruction of people or property, might justify an armed response if the economic damage is great enough.

Q. Is law regarding armed conflict keeping pace with rapid technology accelerations? Or are technological advances, including artificial intelligence, happening too fast for the law to keep pace?

A. I would restate the question by arguing that traditional forms of lawmaking, such as treaties and statements by international organizations, can’t keep up. States try to fill the gap with their actions and explanations for their actions. This can generate more noise than signal, but attentive behavior by states with the power and capacity to make these choices can point in the direction of some observable standards. Without at least some clarity, there can’t be law, I think.


Q. In your piece for Lawfare, you write, “Surveillance-oriented states, of which China is the foremost example, use big data to guide and bolster monitoring of their own people as well as potential foreign threats.” What do you make of the Chinese spy balloon and three other objects the United States has shot down in recent days?

A. The assumption that many people seem to have made (I have no insider information here, and emphatically learned nothing about this technology during my times in government service) is that the balloon had an intelligence mission, probably gathering signal intelligence more than visual information.

Orbital satellites do this, too, but their orbits are easier to predict, which makes countermeasures easier. Balloons, people say, can operate more unpredictably and thus make countermeasures harder.

Incidentally, one of the arguments that government sources have made, or so the press reports, is that U.S. technical assets were turned on to track the signals coming to and from the balloon, thereby exploiting the balloon as a source of intelligence to the benefit of the United States; hence the delay in taking it down. This seems plausible to me, but again, I have no independent information here.

There is a perpetual conflict between intelligence forces, which like to keep operations alive as long as possible to learn as much as possible, and law enforcement forces, which tend to prefer shutting down an adversary’s bad behavior as quickly as possible.

Q. What inspired you to write your piece on big data and the law of war?

A. The main point of my Lawfare post was to offer rules governing big data during times of conflict as an example of how international law can develop in the absence of traditional methods of lawmaking, such as treaties or authoritative statements by appropriate international bodies.

Q. Can you offer some examples?

A. Two norms that seem to me both benign and reachable are: one, no state sponsorship of ransomware or comparable malign cyber-based activity; and two, no armed responses to attacks on big data that do not entail direct physical injury to people or property.

Media Contact

Jane Kelly

University News Senior Associate, Office of University Communications