For some time, I have wondered why the infosec community hadn’t taken up the challenge of defending democratic debate.
From a technological perspective, you can’t imagine a more complex task: how to preserve the openness of democratic debate while holding malign influencers at bay. For professionals, solving difficult puzzles brings great satisfaction.
So it is with some pleasure that I learn at least one prominent information security mind has turned his attention to the topic. Bruce Schneier, a respected thinker in the area, and George Washington University political scientist Henry Farrell have written a paper that seeks a broad framework for understanding how to defend democracy in the time of internet connectivity.
They divide political knowledge into two categories, common political knowledge, or “the shared set of social beliefs about how the system works, who the actors are, and so on, which helps to order politics” and contested political knowledge, “the knowledge that is contestable, where people may disagree.”
Democratic and authoritarian nations view these two types of information very differently. Democratic countries offer certainty about the political process – common knowledge – but uncertainty about outcomes: who should lead, or what the issues are. Authoritarian nations, by contrast, promote certainty about contested knowledge – who the leader is, what the national priorities are – but murkiness about areas of common knowledge: the integrity and availability of justice, elections, and accountability.
For years, authoritarian nations have felt besieged by the common political knowledge promoted online from democracies. But they learned how to adapt, using confusion and distraction to tamp down on such knowledge flowing their way. In 2016, Russia facilitated a common-knowledge attack on the US by attacking the shared knowledge of the system itself.
Separately, Schneier has written: “Libertarians often argue that the best antidote to bad speech is more speech. What Vladimir Putin discovered was that the best antidote to more speech was bad speech.”
The paper highlights how presumptions around the freedom agenda of the internet, promoted by the West since the 1990s, have posed a threat to the power of the authoritarian East. Likewise, the ability of authoritarian nations to push “contested” political information into democracies allows them to attack the common knowledge necessary for democracy to work.
George Soros’ role in international politics would be a vivid example of the divide in political knowledge and how it is seen depending on the government type.
For years, the billionaire financier funded democratic reform think-tanks through the post-Soviet space. From the Western democratic view, Soros’ goal was not to challenge contested political knowledge as much as spread common political knowledge about democratic reform.
For the powerful in Eastern Europe however, the promotion of common political knowledge was increasingly viewed as a challenge to the monopoly on contested political knowledge in place since pre-internet times.
Looked at against the backdrop of the Color revolutions and the US invasion of Iraq, Soros’ organizations appeared dead-set on regime change in Eastern Europe.
This helps explain how Soros himself has become a Janus-faced symbol of this East-West information tug-of-war.
The Schneier-Farrell paper focuses on knowledge rather than information – a useful distinction because ultimately, it’s knowledge that drives outcomes and actions. Information is simply the mechanics of sharing that knowledge. Knowledge shapes perceptions, and thinking about it in its totality, rather than the engineered level of bot networks, is essential for understanding the struggle.
But the common-knowledge attack model needs refinement. Already people have asked where something like neoliberal consensus would fit in the world of contested/common knowledge. That’s a very good question. At some point, contested knowledge would seem to migrate to common knowledge – and vice-versa.
Government-run healthcare in the US is controversial, contested. In Australia, it is common political knowledge. If the US public further embraces government-directed healthcare, does it eventually move into the common knowledge category?
Nevertheless, breaking political knowledge into these two broad streams has value in part because it offers a pattern for the human mind. It can, for example, offer an explanation for what is happening with the Trump-Russia probe – as the suspects, including Trump, fight to bamboozle the public faster than the forces of law and order can prosecute the case.
In this world, common political knowledge (the processes of independent courts), is being damaged by the willful misapplication of contested political knowledge (claims the case is a “witch hunt”, driven by the “Deep State”, creating “McCarthyism”) to delegitimize the process – presumably with the hope of strangling it.
A challenge for citizens of democracies is the ability to make sense of the information chaos unspooling around them. At the same time, conflicting realities of open democracy and authoritarian states are increasingly colliding online.
Schneier advocates for what he calls a “cybersecurity” approach to defending democracy. Importantly, he describes “democracy” as “an information system.”
Calling a democracy an information system is important for two reasons:
It more accurately describes our information environment today (an immense network).
Calling it a “system” is also valuable because it reaches right back to the early days of cybernetics to understand the challenge we have today.
In doing so, it sidesteps the more partisan linkages with the issue of influence and interference campaigns today.
It also sidesteps the notion of war – which summons imagery and ideas that, while appropriate in some cases, don’t suit the normal functioning of a democracy.
As I testified to parliament, one of the goals of any democracy in this new information reality should be a relaxed decision-making environment.
The sense of permanent emergency made possible by the internet and social media is creating a climate in which politicians and parties are increasingly making unforced errors. (Morrison)
Describing democracy as an information system also breaks the debate out of a false framing about free speech.
We live in a time when there is no scarcity of the published or broadcast word. That makes it difficult to claim people are being “censored” in the traditional sense. In fact, the concept of “free speech” has become a shield for chaos from fringe-right groups.
Recall that the 2017 Unite the Right rally in Charlottesville, Virginia, was billed as an exercise in “free speech” – highly debatable, yet even using the term confuses citizens of liberal democracies rightfully raised to respect free expression.
Visualizing political knowledge along this axis – with the types of knowledge depending on the type of government – hints at the long struggle ahead. But it also offers some broad guardrails that can possibly be internalized.
I am a firm believer that the best defense against an information attack is to have a public that knows what it knows (political facts), and knows why it knows it (how these facts support the greater political order).
The common political knowledge view helps nudge society in that direction.
There is a quality of universalization going on. That is needed to counter the mounting – some would say paralyzing – information complexity we all face.
There is one other noteworthy feature of the paper: while Farrell has done interesting work on trust in politics, Schneier ranks high in the tribe of the tech world.
Like any other social group, the tech world has its own culture and customs. Since the early 1990s, its libertarian streak has only grown stronger. Many in the community have grown up with a fictional vision of themselves as individuals on a matrix, ignorant of the real-world need of society to protect itself.
That all came home in a jarring way with the 2016 election.
Whether institutions adopt the common/contested political knowledge understanding to help them defend themselves is unclear. However, if Schneier (and other respected infosec figures) make problem solving in this area cool, it will be good for the overall defense of democracy.
For too long, there has been a tendency in the technolibertarian sector to eschew the wider community view (including people in government and business) in favor of an individual one.
Now the challenges to democracy are so great that if these groups don’t work together (tech complementing government and business), the entire – ahem – information system is at risk.
And while the common/contested knowledge dichotomy may work for today, my sense is that those who seek to undermine democracy online will eventually adapt, promoting ideas that are hard to classify as “common” or “contested.”
Despite all the talk of bots, trolls, deep fakes and AI, deception is ultimately a human business.
However, seeing someone with infosec credibility move into this broader, hybrid space will hopefully encourage others.