Backfire effect and Brexit (Top3ics, May 2017)

I’ve been meaning to blog about the ‘backfire effect’ cognitive bias since first coming across it last December.

(update: this has been reworked into a (better) post, following the process set out in my personal content strategy)

It went to the top of my ToBlog list thanks to a little serendipity last week:

  • last Sunday I saw a Facebook post of Sharon Spooner ripping into Michael Gove over the lies told by the Leave campaign
  • the next morning, as I got on my exercise bike, I decided it was time to check out the You Are Not So Smart (YANSS) podcast, where I found The neuroscience of changing your mind, the first of three episodes dedicated to the backfire effect.

They’re linked tangentially, so you don’t really need to watch Sharon’s “rant” (as the Express put it) for this post. But it’s well worth it, so enjoy:

What linked this (at least in my mind) to the podcast and the other resources tagged backfire effect here? The comments:

[image: the comments]

I still strongly recommend all three 45-65 minute YANSS episodes, however, as they provide much more detail on the why and the how. They manage this by featuring the scientists who discovered and are now studying the phenomenon, explaining in their own words how they carried out the research and what they think it means. It’s brilliant science journalism and storytelling.

But if you don’t do either cartoons or podcasts, check out the Brainpickings article that first brought this cognitive bias to my attention as I read almost 50 articles on Fake News:

When someone tries to correct you… it backfires and strengthens those misconceptions … dealing with the cognitive dissonance produced by conflicting evidence, we actually end up building new memories and new neural connections that further strengthen our original convictions…  believers see contradictory evidence as part of the conspiracy and dismiss lack of confirming evidence as part of the cover-up
-  The Backfire Effect: The Psychology of Why We Have a Hard Time Changing Our Minds

Soul searching

If I didn’t follow up on the Brainpickings piece straight away, it was partly because of Poynter’s Fact-checking doesn’t ‘backfire,’ which reported new research suggesting that the backfire effect may be “a very rare phenomenon”, and that

“… by and large, citizens heed factual information, even when such information challenges their partisan and ideological commitments.”

It’s noteworthy, however, that Poynter hosts the International Fact-Checking Network, which developed the code of conduct underpinning Facebook’s fact-checking efforts. The backfire effect is in fact a very well understood phenomenon, as the other resources on my Hub show.

Almost all of them, however, are better at diagnosis than treatment, and find that the solutions we have all put our faith in are insufficient - or worse. For example:

- some of them dispute the finding that fact-checking doesn’t trigger a mental backfire:

the New York Times… posted an article explicitly denying alternative narratives of the Orlando shooting event. This denial was then cited several times by those promoting those narratives — as even more evidence for their theory.
-  Information Wars: A Window into the Alternative Media Ecosystem

- others question media literacy:

Media literacy asks people to raise questions and be wary of information that they’re receiving. People are. Unfortunately, that’s exactly why we’re talking past one another
- Did Media Literacy Backfire?

- or even Marching for Science (although in fairness it does offer a modest solution too):

there is a genuine risk that the March for Science will be widely regarded as a manifestation of the great urban-rural divide that helped elect Trump…  If we truly want to endorse the idea of science, let’s break up into groups and fan out across America: let us talk quietly to people from Alabama to Maine and Alaska about evolution and climate change…
- A modest proposal for the march for science

Looking for solutions

Unsurprisingly, psychology is where we need to start. My hunch, however, is that practical solutions will come from the science communication community, which has been tackling this for longer than most.

Some of the scientists interviewed in the second podcast, for example, studied the US Centers for Disease Control’s highly professional and wholly unsuccessful efforts to convince people that vaccines were safe.

“when they confronted anti-vaxxers with a variety of facts aimed at debunking myths concerning a connection between the childhood MMR vaccine and autism… they were successful at softening those subjects’ beliefs in those misconceptions, yet those same people later reported that they were even less likely to vaccinate their children than subjects who received no debunking information at all. The corrections backfired.”
-  How motivated skepticism strengthens incorrect beliefs (episode 2)

And then there’s the psychologist who says he can ‘inoculate’ people against misinformation, the latest of many interesting pieces from The Conversation on science communication.

And as I write this I am still absorbing podcast episode 3, which looks into:

  • what happens when you push people’s beliefs to a tipping point, which turns out to be 30% (to understand that number, listen to the podcast, read the original research, or wait for my next post)
  • the Debunking Handbook (pdf), originally written for science communicators.

Hopefully I’ll find a way of relating all this to Brexit and any other EU issues where:

“the facts are on your side, yet the people who need to hear them are dead set on keeping belief-threatening ideas out of their heads”
-  How to fight back against the backfire effect (episode 3)

I’ll also be revisiting resources tagged identity (politics), like Why each side of the partisan divide thinks the other is living in an alternate reality and, of course, filter bubbles, which keep many people’s exposure to core-challenging ideas well under 30%.

See also: besides backfire effect, identity and filter bubbles, there are resources tagged psychology, factchecking, cognitive and science communication.

A disclaimer of sorts: I need to make absolutely sure that my post is not construed as a criticism of Sharon. Not because she was a client, or because her partner has been my client twice, but simply because she’s a lovely person whom I admire a lot, and with whom I agree totally. At least when it comes to Michael Gove.

I hope you found what I found interesting: if so, let your friends know, get Top3ics by email, and/or get the best links via my Messenger curatorbot.

