Filter Bubbles

By Sam Lihou

The term filter bubble was coined by Eli Pariser - he has an excellent book of the same name, as well as a TED talk, which go into a lot of detail - but in essence, a filter bubble is an effect of the AI algorithms that personalise the content in your news and social media feeds. This creates bubbles of interest that, whilst comforting and familiar, may well close you off from other news and experiences you should be having.

Backing up a little, it’s important to note that AI algorithms exist to tackle a legitimate problem - it’s just that the organisations using them might not be the best people to solve it. 

The problem is, there’s simply no way to read all the news all the time. From world news to social media, from viral YouTube videos to Netflix hits, there’s enough happening each day on the web to fill multiple lifetimes. So instead, algorithms edit, prioritise and order all that information into something you can digest. In the days when newspapers were the primary source of information, journalists did this for us - now algorithms filter what we see.

Machine learning algorithms are especially good at finding patterns and correlations across massive data sets. By detecting common characteristics in the content you interact with the most, news publishers and social media companies can identify what’s likely to interest you again. They can then populate your news feed with content they’re pretty sure you’ll click, like and retweet all over again - and monetise your attention by showing you the occasional ad along the way.
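To make that mechanism a little more concrete, here’s a deliberately simplified Python sketch - not any real platform’s code - of content-based personalisation: build an “interest profile” from the articles a user has clicked, then rank new candidates by how closely they resemble it.

```python
# A deliberately simplified sketch (hypothetical, not any platform's real code)
# of content-based personalisation: build a bag-of-words "interest profile"
# from articles the user has clicked, then rank new candidates by cosine
# similarity to that profile.

from collections import Counter
from math import sqrt


def bag_of_words(text: str) -> Counter:
    """Crude tokeniser: lowercase words, counted."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse word-count vectors."""
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


# Articles the user engaged with previously (assumed history).
clicked = [
    "striker scores late winner in cup final",
    "manager questions referee after derby defeat",
]

# Candidate articles for today's feed.
candidates = [
    "club confirms record transfer fee for young striker",
    "parliament debates new climate legislation",
    "referee speaks out about abuse from fans",
]

# One interest profile built from everything the user clicked.
profile = Counter()
for text in clicked:
    profile += bag_of_words(text)

# Rank candidates: the more they resemble past clicks, the higher they sit.
for article in sorted(candidates, key=lambda t: cosine(profile, bag_of_words(t)), reverse=True):
    print(f"{cosine(profile, bag_of_words(article)):.2f}  {article}")
```

Run it and the football stories float to the top while the unrelated story sinks to the bottom - which is exactly how a bubble starts to form.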

This might seem like a win-win - the user gets to see relevant content, and the company running the algorithm retains users and gets paid for ads. However, it’s not quite as simple as that.

We’ve started to see that AI rewards clickbait articles with sensational or controversial titles, because users are more likely to click on them even if the article itself is junk or not even true. The impact ranges from the misleading to the downright dangerous, with recent US elections being a prime example of how badly this can go awry. Factor in that an awful lot of people read only the headline before hitting share, and you can see how clickbait spreads like wildfire and leaves huge numbers of people misinformed.
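As a toy illustration (the headlines and numbers below are invented), here’s what happens when a feed is ranked purely on predicted clicks, compared with blending in some kind of accuracy or quality signal:

```python
# A toy sketch of why optimising purely for clicks favours clickbait: if the
# ranking score is just a predicted click-through rate, a sensational but
# dubious headline outranks an accurate one. All numbers are invented.

articles = [
    {"title": "You won't BELIEVE what this politician just did",
     "predicted_ctr": 0.12, "accuracy": 0.2},
    {"title": "Parliament passes budget after lengthy debate",
     "predicted_ctr": 0.03, "accuracy": 0.95},
]

# Engagement-only ranking: the clickbait wins.
by_clicks = sorted(articles, key=lambda a: a["predicted_ctr"], reverse=True)

# One possible counterweight: blend in an (assumed) accuracy or quality signal.
by_blend = sorted(articles, key=lambda a: a["predicted_ctr"] * a["accuracy"], reverse=True)

print("Ranked by clicks alone:", [a["title"] for a in by_clicks])
print("Ranked with accuracy mixed in:", [a["title"] for a in by_blend])
```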

Our social networks comprise our friends, families and colleagues - people who very likely hold similar views to our own. That means your existing bias is constantly confirmed, because everyone you’re exposed to on social media approves of, celebrates and shares the same view. Our biases become amplified, we grow less aware of and less tolerant towards opposing views, and we become more vulnerable to clickbait titles pushing false facts and fake news.

So, as designers and makers of the web, we’ve got a responsibility to try to do things better. 

The concept of a newsfeed has been around long enough to become familiar, but we’re only just realising that absent-mindedly scrolling past hundreds of pieces of biased content isn’t particularly good for us.

The infinite scroll is one of the core mechanics of this behaviour. Perhaps designers of these feeds should introduce a warning after ten minutes of scrolling that it might be time to go outside, or suggest reading the same stories from another point of view. Rather than sorting solely by relevance, we could offer options such as “important”, “opposing view” or “challenge me”.
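Here’s a hypothetical sketch of what that sort-by option could look like in code - the article fields and scores are made up for illustration, with stance measured relative to the reader’s usual leaning:

```python
# A hypothetical sketch of the "sort by" idea above. The article fields and
# scores are invented; stance is measured relative to the reader's usual
# leaning, so positive means "agrees with me" and negative means "opposes me".

from dataclasses import dataclass


@dataclass
class Article:
    title: str
    relevance: float  # how closely it matches past clicks, 0.0 - 1.0
    stance: float     # -1.0 (opposes the reader) .. +1.0 (agrees with the reader)


FEED = [
    Article("Why our side is right about the new policy", relevance=0.9, stance=0.8),
    Article("The case against the new policy, explained", relevance=0.4, stance=-0.7),
    Article("What both camps get wrong about the policy", relevance=0.5, stance=0.0),
]


def sort_feed(articles: list[Article], mode: str = "relevance") -> list[Article]:
    """Order the feed according to the reader's chosen mode."""
    keys = {
        "relevance": lambda a: a.relevance,                           # the familiar default
        "opposing view": lambda a: -a.stance,                         # furthest from the reader first
        "challenge me": lambda a: abs(a.stance) * (1 - a.relevance),  # opinionated *and* unfamiliar
    }
    return sorted(articles, key=keys[mode], reverse=True)


for article in sort_feed(FEED, mode="opposing view"):
    print(article.title)
```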

Some experts suggest we need to fight fire with fire - for example, the ML-driven browser extension Nobias. It uses an AI algorithm to tag news sites and headlines with a political leaning, making it easier for people to discover their own biases.
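As a very rough illustration - and emphatically not how Nobias actually works - even a crude word-list approach shows the basic idea of tagging a headline with a leaning; a real system would use a trained classifier over far richer signals:

```python
# A very rough illustration - emphatically NOT how Nobias works - of tagging a
# headline with a political leaning. Here it's just two hand-made word lists;
# a real system would use a trained classifier over far richer signals.

LEFT_TERMS = {"progressive", "inequality", "climate", "union"}
RIGHT_TERMS = {"conservative", "deregulation", "tradition", "border"}


def tag_leaning(headline: str) -> str:
    """Return a crude leaning label based on which word list matches more."""
    words = set(headline.lower().split())
    left, right = len(words & LEFT_TERMS), len(words & RIGHT_TERMS)
    if left > right:
        return "leans left"
    if right > left:
        return "leans right"
    return "no clear lean"


for headline in (
    "Union leaders demand action on inequality",
    "Calls grow for deregulation and tighter border controls",
    "Local bakery wins national award",
):
    print(f"{tag_leaning(headline):>14}  |  {headline}")
```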

In the end, it’s up to each of us to break out of our bubble: to listen to opposing views, find common ground and form our own opinions outside the echo chamber.

Further reading 

Eli Pariser: Beware online “filter bubbles” | TED Talk

Artificial intelligence created filter bubbles. Now it’s helping to fight it. – TechTalks 

Nobias - Chrome Web Store

 
