What "The Filter Bubble" means for the Brussels Bubble

Imported from the Blogactiv.eu blogging platform, closed without warning in 2021. Links, images and embeds not guaranteed, and comments not displayed.

"The Filter Bubble", by MoveOn.org board president Eli Pariser, shows that the forces creating the Brussels Bubble are about to be reinforced by technology, operated invisibly - and with impunity - by a handful of companies.

When I launched Blogactiv in 2007 I had some interesting experiences engaging with eurosceptics of the more rabid variety. As I wrote a few months later, I learnt to appreciate:

"...the power of groupthink. Put similarly-minded people in the same environment (gated suburb, online community, religious school, whatever) for long enough, and watch extremism rise as all members reinforce each other’s worldview and raise the volume higher and higher to be heard. ... I don’t think for a moment that eurosceptics are the only ones prone to it. Go into any cafe within shouting distance of the Berlaymont here in Brussels, after all, and you’ll find yourself in a large group of similarly-minded people, most of whom have been in the same environment for a very long time…" more

I resolved then to explore groupthink and one of its causes - online Echo Chambers - via this blog, but haven't really done so properly. While almost every post about the Brussels Bubble touches on groupthink (the Bubble is partly constructed from the mutual incomprehension of people inside and outside EU affairs), I haven't really dug into groupthink psychology.

Groupthink in action

Partly because it's scary. Back in 2008, one of the eurosceptics I mentioned in my first post - Richard North - provided cogent, intelligent criticism of BlogActiv, and had recently co-written The Great Deception, "one of the best eurosceptic histories of the EU available", according to @litterbasket.

Two years later, he was riffing off a murderous rampage by a British nutcase called Derrick Bird, who killed 12 people in a shooting spree. Two days after the event, North could be found online, using this terrible tragedy as a spur to suggest that slaughtering officials was justified, and posting links to bomb-making equipment. From publishing an intelligent book to inciting murder - a long trip in two years. Is this groupthink in action?

Before the internet you needed to get people into the same room to get the Echo Chamber effect required for groupthink to take hold - every nutjob US cult, after all, needs its retreat in the wilds of Montana. When I was visiting EUReferendum 4 years ago, rebutting some of the loonier assumptions made there about BlogActiv, I realised that Echo Chambers are now one mouseclick away. Anyone who spends long enough in the same online space populated only by people with the same views will probably find those views steadily become more extreme over time. What the "Filter Bubble" shows is that this effect will now spread to the entire Web.

Enter the Filter Bubble

The Filter Bubble is an excellent book - I recommend you read it. I won't review it here for reasons of space, so go read some reviews, and/or watch this short TED talk by Eli Pariser, its author:

Now the above 10 minutes barely scratches the surface of what's going on here, but the easiest way of looking at it is this:

1) We've all become used to seeing online ads on Google that match the search terms we just typed in.

2) And you've probably noticed more recently that after you search for something, all sorts of websites suddenly start displaying ads (and not just Google Ads) relevant to your search. The website you're viewing knows what you've been looking for, and is serving you relevant, personalised ads.

3) You're probably also aware that this is driven by a lot more data than your latest search terms. In fact, a small handful of companies have massive 'profile data' on you: the sites you visit, what you write online, what you search for, who your Facebook friends are, what you read, where you were educated, and much, much more.

They hold much more data than any government has ... which is why governments have made it easy for themselves to take a look:

"Google has received 4,601 user data requests from the US Government over the most recent six-month period and has complied with 94% of those requests..." more

4) These companies sell websites this data so the websites can personalise their ads, increasing clickthrough rates and hence revenue. So far, so OK, right? Why not have relevant, personalised ads? Beats the generic crap on TV.

But what happens when it's not just the ads that are personalised? What happens when sites use your profile to present the content - the news articles, op-ed pieces, etc. - they think you are most interested in? Quite a lot happens, actually. You're in a filter bubble, seeing a Web tailored to your interests - or what the profile data thinks are your interests. And you are impoverished as a result.
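The mechanism is simple enough to sketch. Here's a deliberately naive toy model (not any real platform's algorithm - every name, topic and score below is hypothetical) of what happens when a front page is ranked purely by predicted engagement against an inferred interest profile:

```python
# Toy sketch of profile-driven personalisation. All data is invented
# for illustration; real systems use far richer profiles and models.

def engagement_score(story_topics, profile):
    """Sum the user's inferred interest in each of the story's topics."""
    return sum(profile.get(topic, 0.0) for topic in story_topics)

def personalised_front_page(stories, profile, top_n=2):
    """Keep only the top-N stories by predicted engagement - the rest
    are silently filtered out, and the reader never sees them."""
    ranked = sorted(stories,
                    key=lambda s: engagement_score(s["topics"], profile),
                    reverse=True)
    return [s["headline"] for s in ranked[:top_n]]

# An interest profile inferred from past clicks, searches and Likes.
profile = {"pets": 0.9, "tech": 0.7, "africa": 0.05}

stories = [
    {"headline": "Squirrel dies in local front yard", "topics": ["pets"]},
    {"headline": "New phone released", "topics": ["tech"]},
    {"headline": "Famine worsens in the Horn of Africa", "topics": ["africa"]},
]

print(personalised_front_page(stories, profile))
```

Note what the code never does: it never asks whether a story is important, only whether it matches what you already engaged with. The famine story scores lowest and simply vanishes - no editor decided that, and no reader can see it happening.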

Now the Echo Chamber comes to you (and you can't even see it)

There is no Front Page - there is only your Front Page. Human editors have been replaced - as gatekeepers and curators of what's interesting, important and relevant - by algorithms. Now they decide what you see ... and they don't have the same public service ethos written into their code that editors (theoretically) had trained into them through journalistic culture. The algorithms' only priorities are financial.

And remember, not only can you not see the machines or affect how they work, you can't see the Web they're not showing you. So say goodbye to news and views which challenge you to think differently, or indeed content on any subject which Mark Zuckerberg thinks won't be relevant enough for you. You remember the Facebook founder, whose definition of privacy is pretty elastic, and whose definition of relevance can be encompassed by this quote:

"A squirrel dying in your front yard may be more relevant to your interests right now than people dying in Africa."

So say goodbye to news about difficult, long-term, complex stories - more people, after all, will Like™ stories which are likable, not tough ones like famine and war. And some of those people will be in your social graph, so that's all you'll see as personalisation rolls out across the Web like a suffocating blanket. After all, Facebook didn't roll out an 'Important' button.

I'm just scratching the surface, but you get the drift. In the little world of EU communications, eurosceptics will see only content critical of the EU, and those within the Brussels Bubble crowd won't see anything to disturb their worldview, either. You can argue, of course, that this is already the case, but it's disheartening to think that Web personalisation will make things even worse.

But personalisation is a much bigger problem. Greater political and social polarisation is in no one's interests - the last thing this planet needs is for the world's first global medium to fragment into 7 billion echo chambers, each with a population of 1.

