File under ‘engineering chaos’

When FBI Director Christopher Wray recently testified about ongoing Russian interference aimed at denigrating former Vice President Joe Biden, he rattled off the areas where the Kremlin was active.

“Social media, use of proxies, state media, online journals, et cetera,” he said.

Social media, it turns out, is only the first of four avenues of influence he mentioned in what he flagged as an incomplete list ("et cetera"). The other three:

Proxies: individuals and their various outlets and platforms – people who promote ideas, arguments and images with the intellectual agility to adapt their arguments to changing situations.

Social media visualisation: who thinks this way? (cc Martin Grandjean)

State media, which can, if nothing else, flavor what is found via online search engines. Unless search results end at geographical borders – which they don’t – this has implications for what American voters see within the US.

State media, more importantly, can inject ideas – even false or exaggerated ones – into global debate.

Online journals: ditto. These can serve up fresh arguments against Biden and against the integrity of US voting – arguments that address the reasoning mind, not simply the social media feeds of people in the US.

Serving up rhetorical fare that blends into the wider conversation about a political candidate – and, in fact, bends the debate ever so slightly – underscores that the realm of competition is the information itself, not simply the engineered networks.

Too bad that since the Russian cyber invasion of 2016, in a reflection of the culture of Silicon Valley and cybersecurity, so many in the West have fallen head over heels for a networked approach to fighting disinformation and distortion, rather than an information approach.

Photograph by György Kepes

Granted, an information approach requires grappling with bigger questions about what qualifies as truth in a democracy.

But still, it’s only through a systematic approach to information, one that addresses domestic and foreign threats simultaneously, that the public can hope to prevail against domestic disinformation and foreign-born lies.

Given what has happened since 2016, of course, blocking malign actors is important and necessary. 

Yet the fact that so much of the US public has been seduced into thinking that Biden suffers from cognitive decline shows that the power of disinformation and influence campaigns lies not so much in the engineered reality of specific tweets from the Internet Research Agency. Rather, the distortion happens in the realm of understanding – in what information is accepted into the greater debate.

Western anti-disinfo crowds would be unwise to see this as a matter of networks that can simply be sleuthed out, uncovered, removed and then revealed. Yet since 2016, what we have gotten is a whole new genre of study about misinformation on social media networks.

Unfortunately, putting too much attention there turns us away from the broader battle – as the Kremlin's efforts so far in 2020 show.