America mainstreaming disinformation against itself (US2020 Disinformation newsletter, ed 1)

I'm (re)launching my newsletter to focus on disinformation during the 2020 US election. It's also part of a wider experiment in integrating Zettelkasten-style idea and knowledge management into my personal content strategy, hosted on MyHub.ai.

This first edition sets the scene by summarising the content tagged #disinformation on my Hub since 2015 that is particularly relevant to this year’s election.

Seeing 2020 from 2016

I started by revisiting predictions from the past. 

The oldest article mentioned in my Zettelkasten Overview was published just after the 2016 election, and predicted 5 Big Tech Trends that would drive the 2020 elections: social media, AI, IoT, deepfakes and micropayments.

Together, apparently, these would create hyper-personalized political campaigns built by mining your social media and Internet of Things (IoT) sensor data; AIs chatting to you like your closest friend; deepfaked political candidates and official avatars; and micropayments for political campaigns.

a photorealistic avatar of the presidential candidate on the bus stop advertisement turns to me and says: “Hi Peter, I’m running for president. I know you have two five-year-old boys going to kindergarten at XYZ school. Do you know that my policy means that we’ll be cutting tuition in half for you? …I also noticed that you care a lot about science, technology, and space exploration — I do too, and I’m planning on increasing NASA’s budget by 20% next year. Let’s go to Mars! I’d really appreciate your vote. Every vote and every dollar counts. Do you mind flicking me a $1 sticker to show your support?”
- 5 Big Tech Trends That Will Make This Election Look Tame (November 2016)

With the benefit of hindsight

While hyper-personalizing political campaigns by mining social media data was not new in 2016, panicking about Cambridge Analytica was. Since then we’ve learnt that the reality in 2016 was more mixed, but they were probably right to predict that psychometrics would be huge this year.

Like so many futurists, however, they overestimated how quickly new technology would change politics.

Wait, where’s the disinfo?

With three of the five technologies concerning micro-targeting, only one concerning content, and disinformation barely mentioned, why include this article in this newsletter?

While disinformation itself is old, today’s disinformation owes everything to technology. Just as deepfakes (a content form) couldn’t exist without AI, the lies peddled via online advertising wouldn’t exist without microtargeting technology, enabled by wholesale abuse of people’s private data (cf Surveillance capitalism). Content and technology are intertwined: where new technology enables, new forms of disinformation follow. 

The 2016 article took a “technology push” approach to forecasting the adoption of new technologies into the political process. If that approach didn’t work, no one should be surprised. Better to ask: Cui bono? Who can benefit from this technology, and how will they use it?

Also, where will they be from? It won’t always be the obvious places, like the IRA in St. Petersburg. Think Macedonia, the Philippines… and down your street.

America mainstreams disinformation against itself

Why tactics developed to cope with 2016 are outdated now.

The other six articles, all published in 2020, identify some key trends which emerged over the past four years. Together they show how America has mainstreamed disinformation against itself.

The Comms Industry clambers aboard

Buzzfeed News points out how the work pioneered by Macedonian teenagers and Russian disinformation experts has been mainstreamed into the political communications industry, reinforced by deepfake text:

If disinformation in 2016 was characterized by Macedonian spammers and Russian trolls… 2020 is shaping up to be the year communications pros for hire provide sophisticated online propaganda operations… to spread lies and manipulate online discourse both manually and by overwhelming the internet with torrents of AI-generated text…
- Disinformation For Hire: How A New Breed Of PR Firms Is Selling Lies Online (January 2020)

Domestic political groups do the same

The NYTimes agrees, but points out that American partisan groups also copied the 2016 playbook. This creates two massive headaches for Big Tech. To start with, it’s hard to tell if divisive content online is posted by a profit-driven Macedonian clickfarm, a Russian/Iranian/Chinese operation, a hyperpartisan American useful idiot, or an American being paid by a Russian/Iranian/Chinese operation. 

And each time they suppress something an American wrote, there’s an outcry. In other words, having prepared for the last war, Tech Giants are:

struggling to handle the new challenges… The most divisive content this year may … [come] from American politicians… The future of disinformation is going to be domestic
- Tech Giants Prepared for 2016-Style Meddling. But the Threat Has Changed (March 2020)

The Platforms fail to react

While the President, without any sense of irony, rants on Twitter about Twitter, when it comes to Facebook one of his friends will pick up the phone. I’ve Hubbed several articles about the damage done by Facebook’s desire to keep the White House ‘on side’. The first concerned a senior Facebook engineer who (internally) showed “Facebook giving preferential treatment to prominent conservative accounts”. They fired him. Why?

Cui bono. Conservative pages spend big on Facebook, and so get a Facebook ‘partner manager’ to keep them spending. When one of their posts is labelled problematic by a Facebook factchecking partner, the page owners don’t follow Facebook policy, which is to take it up with the factchecker. Instead, they call “their guy at Facebook”. You can guess the rest:

partner managers … appear to have sought preferential treatment for right-wing publishers … citing ad volume as a reason … Facebook itself will quietly remove a fact-check applied by one of its partners …
- Facebook Fired An Employee Who Collected Evidence Of Right-Wing Pages Getting Preferential Treatment (August 2020)

Where are we now?

Back in March, The Atlantic pulled the above threads and more together into an apocalyptic vision of this year’s election, characterised by:

“censorship through noise… coordinated bot attacks, Potemkin local-news sites, micro-targeted fearmongering, and anonymous mass texting. With Trump planning to spend more than $1 billion… this could be the most extensive disinformation campaign in U.S. history”
- The Billion-Dollar Disinformation Campaign to Reelect the President (March 2020)

The best part of this article, however, is when it explores the true damage, done off-screen in people’s heads. Disinformation, censorship through noise and related political tactics lead many people to “believe everything and nothing, think that everything was possible and that nothing was true” — to no longer care about what is a fact and what is fiction, as long as it sounds good.

And among those who are alert to the risks:

“the prevalence of propaganda can curdle into paranoia… Eventually, the fear of covert propaganda inflicts as much damage as the propaganda itself. Once you internalize the possibility that you’re being manipulated by some hidden hand, nothing can be trusted.”

Fighting back

But don’t underestimate Americans’ capacity to fight for what they believe in. I’ll close with the first in a series of Lessons from StandUp Republic:

“The rage-engage cycle is a key part of how malign narratives gain traction on social media… [Trump] tests and revises purposefully inflammatory content to “improve his numbers”… rage is good business sense … [so] Leave content that is toxic alone… Giving people the “why” … helps us question not only that one piece of content, but future content.”
- What Lessons Haven’t We Learned Since 2016? Lesson 1: RAGE

The answer, of course, is that barely anyone has learnt any lessons since 2016: journalism is only now beginning to grapple with the rage-engage challenge, particularly when it is led by the White House.

They’ve published 5 more, several of which will almost certainly end up on my Hub and be factored into the constantly evolving Zettelkasten Overview.

So get the RSS feed of relevant resources as I curate them, and/or subscribe for regular newsletter summaries in your Inbox, and/or browse all editions, and/or check out my constantly evolving notes in the Overview. More: Don’t just Build a Second Brain: share (part of) it.

And if you’d like something similar, get yourself a Hub.

Cheers,

Mathew

