- The Colonial Pipeline hack shows how devastating a cyberattack can be
- The hack itself was bad, but the hoarding of gasoline following the attack speaks to a deeper problem
- Disinformation and misinformation are a unique threat vector that we must consider
A ransomware attack recently made headlines when it shut down the Colonial Pipeline for six days and caused panic buying of gasoline across the Eastern Seaboard.
This is only one of many such attacks in recent months. Since the start of the pandemic, Ryuk — a prolific ransomware group — has hit at least 235 healthcare facilities across the U.S., crippling their services for weeks and netting millions of dollars.
These attacks reflect a larger international trend: globally, security threats are becoming more frequent and more complex.
“The landscape of national security has changed significantly in a number of ways, vis-à-vis 30–40 years ago and again 20 or even 15 years ago. And now, the pace of technological and social change is accelerating the rate at which security threats manifest,” says Scott Ruston, a research professor with the Global Security Initiative, a leading national security research laboratory, where he directs the Center on Narrative, Disinformation and Strategic Influence.
Operating out of Arizona State University, the laboratory houses four centers — cybersecurity, artificial intelligence, disinformation, and analytics — to tackle the world’s most complicated security threats.
“One of the things that I think is special about the contemporary moment is the degree to which many different security threats are interwoven,” says Ruston. “And I can think of no more timely or stark an example than the Colonial Pipeline shutdown.”
Ruston explains how misinformation chains exacerbated the effects of the attack:
“We saw the pipeline hack: There were some supply chain disruptions for both jet fuel and automotive fuel, but the disruptions were not consistent; they were not easily planned across the Eastern Seaboard.
However, we saw all kinds of hoarding behavior. In regions where they didn't have actual supply chain challenges, they had these dramatic spikes in demand, anomalies in the market, due to rumors — rumors that there wasn't going to be any fuel, rumors that Armageddon was just around the corner, those kinds of things.
And that's where some of the research that my center does comes in. What we look at is how the information environment is manipulated by adversarial actors to have deleterious effects on society, specific communities, and decision-making within the defense sector.”
Many information consumers participate in these information chains without harmful intent. This is key in distinguishing misinformation from disinformation.
“Disinformation is information that is false, misleading, or inaccurate and that is distributed with the willful intent to deceive or otherwise influence,” says Ruston.
“Whether it's a photo, tweet, press release, news report, or piece of a news report, not only does it have a particular meaning intended by the creator, but each audience member that receives that information is creating a meaning relevant to them,” he adds.
What may start as disinformation via an adversarial actor can become misinformation as it's forwarded through communities by those unaware of its inaccuracy or falseness, or the manipulative intent of the original distributor.
Information wild west
In the past, large-scale information hubs were regulated through editorial oversight to preserve accuracy, as well as public trust and company image. Today, however, there are far more venues for information distribution, many of which are unregulated.
“That free flow of ideas has great social benefits and increases access to information. But the loss of the context that editorial oversight provides means that there's a lot of information circulating out of context,” says Ruston.
“One of the advancements of the modern media landscape is that almost anybody — individual, small group, or business — can have a significant impact with minimal outlay of funds. There are few barriers to entry into the media landscape. But a consequence of that shift from very few information distributors to many is that these new players may not have a reputational or financial stake in the game, as a newspaper, TV station, or other sociocultural institution would. All of those factors go into creating this sort of information wild, wild west.”
The late 1800s saw a similar period of social evolution as yellow journalism reached its height. Proliferating information outlets competed and attracted readership through sensationalist reporting, until sociopolitical backlash and public distrust prompted the adoption of new journalistic standards.
Ruston states that industry and government are responding similarly in today’s accelerating information cycle:
“I think we're seeing that again now; it's just that we're in the early stages. You see different entities try and take a crack at it: Twitter with its deplatforming activities and Facebook and other platforms, as they leverage their terms of service to impose a degree of what they deem to be ethical behavior on their platforms.
And you can see them wrestling with it because they take an action, and then there's a whole bunch of reconsideration of those actions.
At the same time, we've got Congress considering revisions to all the regulations that govern communications of various sorts. To date, the internet hasn't been perceived as a common or public good, but that might be changing; there’s debate in Congress.”
Disinformation and destabilization during the pandemic
The impact of information on national power has only increased over the years as it becomes easier to manipulate and circulate. Some subjects, particularly highly politicized ones, are especially ripe for disinformation, explains Ruston:
“The COVID pandemic gave us a very apparent topic around which disinformation could circulate. And because it ended up being highly politicized — and that’s true globally, not just in the United States — there was potential for political, economic, and geostrategic benefit for certain actors and groups.
So, take a nation-state like Russia. It is to Russia's benefit anytime Europe or the United States is destabilized. And so amplifying stories about the debate over the proper response to the COVID pandemic is in Russia's interests. And there's evidence that their state-sponsored media outlets did a lot of that.
But the main point is that there are many groups for whom these highly politicized topics provide opportunities to curry favor or win adherents, but also potentially to sow discord or to distract from other activities: nation-states seeking to disrupt the balance of power, non-state actors, activist groups, political operatives, and terrorist groups, among many others.
And then the situation is just all the more ripe, because some communities were able to spend more time at home and in isolation, so you've got more people participating in the less-regulated, less ethically bounded forms of information sharing. We didn't see a huge spike in people watching local news during the course of the pandemic; we saw a huge spike in people being more participatory on social media.”
Disinformation as a tool has only become more powerful throughout the 21st century. However, propaganda is nothing new; the speed and reach of digital communication simply make it easier to spread.
“[Public perception] is a lower-cost target to go after than the ability to interdict or counter the armed forces of an adversary,” says Ruston. “This low-cost, remote measure provides adversarial actors new opportunities to disrupt nations and communities that may be physically secure. Anticipating and mitigating the effects of these campaigns will require more research. There's this rapidly changing nuance and contour to what security means, and the information space is a big part of that.”