Farzaneh Azadasl, Cristina M. Gonzalez, Anna Kary, Ghufran Muhibi, Elena Sánchez Nicolás, Theodoros Papachristou, Emilija Tamosiune, Fazil Tercan, Qiao Zhou
European integration is at a crossroads. With the EU election just weeks away and Brexit on the horizon, polarization on social media and personalized digital algorithms threaten the cohesiveness of the European project.
The EU is well aware of the danger, demanding action on election advertising transparency from platforms, for example. But that isn’t enough. We, the citizens, need to be better informed about the power of algorithms, which shape our digital worlds and galvanize us around issues that can threaten our democracies. As EU competition commissioner Margrethe Vestager recently told The Atlantic, we are experiencing a “slow tsunami that is changing us, without being able to really fend for ourselves or give direction to our society.”
We’ll tell you some ways we can “fend for ourselves,” but first let’s talk about why this is so important.
What do Filter Bubbles Have to do With Democracy?
Filter bubbles do two things: contain and magnify.
Non-transparent algorithms choose the ads you see, the search results you are shown, and the related content suggested to you, all based on your history of online behavior. In effect, algorithms trap you in an information “bubble.”
It’s a self-perpetuating cycle: the more you like certain things, the more similar content you are shown, leading to an exaggeration of certain ideas or behaviors. Meanwhile, you are not exposed to differing opinions; some refer to this as “tunnel vision.” The result is societal fragmentation – everybody’s “bubble” is different – and possibly polarization, because the bubble amplifies our own understanding of reality. How is Europe expected to become more integrated if its citizens are becoming ever more fragmented online?
In short: algorithms not only dictate what we see, based on past behavior, but also shape our future behavior, including how we vote. The problem is that if we are not presented with all available information, choices, and opinions, we are not truly informed, autonomous voters.
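The reinforcement loop described above can be sketched as a toy simulation. This is an illustrative model only, not any real platform’s algorithm: the two topics, the starting click counts, and the “rich-get-richer” recommendation rule are all invented for the example.

```python
import random

random.seed(42)

# Two topics, A and B; the user starts with only a mild lean toward A.
clicks = {"A": 6, "B": 4}

for _ in range(1000):
    # The "platform" recommends each topic with probability proportional
    # to the SQUARE of its past clicks -- a rich-get-richer rule that
    # stands in for engagement-driven personalization.
    weight_a = clicks["A"] ** 2
    weight_b = clicks["B"] ** 2
    shown = "A" if random.random() < weight_a / (weight_a + weight_b) else "B"
    clicks[shown] += 1  # the user engages with whatever is shown

share_a = clicks["A"] / sum(clicks.values())
print(f"Topic A's share of the feed after 1000 rounds: {share_a:.0%}")
```

Run it and a 60/40 starting preference locks in to near-total dominance of one topic: each recommendation makes the next recommendation of the same topic more likely, which is the bubble mechanism in miniature.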
What is the EU Doing about This?
More than ever, the EU faces increasing “polarization by design,” making filter bubble effects likely to influence the upcoming election. “Polarization by design,” as identified by the European Parliament, broadly covers the mechanisms that pose a threat to democracy. This includes filter bubbles and everything that reinforces them (think the 2016 U.S. elections, Russian bots, “fake news”).
That’s why in October 2018, the European Commission urged national governments to establish a pan-European network to monitor electoral campaigning online. The Commission also published an “EU-wide code of practice on disinformation” to encourage online platforms to limit the “micro-targeting” of voters and increase political advertising transparency. Google, for instance, claims to have improved its algorithms to minimize the impact of filter bubbles, and Facebook recently introduced verification tools for political advertisements running in the EU, clearly labeling who is paying for the ad.
Whether these measures are sufficient to burst existing bubbles is debatable, and the question feeds an ongoing, open discussion about whether online platforms should be held accountable for (personalized) content.
What Can We, the Citizens, Do about This?
To be fair, algorithms are not necessarily created for nefarious, bubble-trapping purposes. Considering the immense amount of information produced daily, algorithms help us sort out what’s important, creating what’s known as our “information diet.” Because this diet can influence our political views, we should keep its possible biases in mind. On Facebook, for example, people may see a political advertisement with a lot of positive feedback, but few will realize that the feedback may come from automated bots rather than real people.
It is worth raising additional concerns. How many Europeans fact-check the news they are exposed to? How many access a variety of sources or search engines beyond Google to get different political viewpoints? Little evidence exists on the degree to which internet users are knowledgeable about, or even aware of, these complications.
Therefore, we must increase societal awareness that we live in a filter bubble world. We suggest people start by watching the TED Talk by Eli Pariser, who coined the phrase “filter bubble” in 2011. It took years for his observations to resonate with the broader public, but awareness is growing. “It was only two years ago that people started saying it was really a problem,” Google’s country director for Belgium, Thierry Geerts, told us in an interview. “I think that every problem that we are really aware of and that is highlighted, we can work on, there is a better chance that we can control it.”
So, how can we control it?
We need to demand more transparency from businesses and tech companies in order to understand how algorithms are constructed and managed. This can be done by direct citizen lobbying, through organizations like The Good Lobby, or demanding that national and European leaders use their authority to steer this issue.
We must empower citizens and strengthen their capacity for critical thinking about the role of technology in society; this is fundamental for our democracies, especially during an election campaign. The Facebook Political Ad Collector is an example of a tool that helps users reflect on how political advertisers target citizens through the ads displayed in their Facebook News Feeds. The Algorithms Exposed (ALEX) project enables citizens to monitor their social media habits and volunteer their data to scientific projects that strive to break us out of our bubbles.
The lack of awareness, together with the lack of transparency of personalized algorithms, undermines citizens’ capacity for critical reflection.
And finally, we must actively work to vary our “information diet.” Here are three suggestions on how you can start.
- Reading more moderate views and listening to different opinions outside of our comfort zones can help mitigate filter bubble effects little by little.
- Deliberately adding alternative opinions to our social media channels and newsletter subscriptions can broaden our daily news intake.
- Downloading applications like “Read Across the Aisle” can also help, as they expose readers to a variety of news sources for their information.
But, a word of caution: Eli Pariser warns about diving directly into content that is on the complete opposite side of the spectrum. He encourages, rather, seeking out “bridge figures” with moderately different views.
But for the long term, the true antidote to filter bubbles is ensuring good education systems throughout the EU. This certainly includes EU-wide implementation of media literacy programs – understanding how news is produced, how to detect fake news, and other competencies that foster savvy news consumers – and extends to general critical thinking practices, fact-checking habits, knowledge of history, and other crucial skills. All member states should invest in these skills to better prepare their citizens to be active and informed voters. Our democracies depend on it.
Future European voters and elections may not face our current challenges if we better educate people about the societal impact of our digital worlds. “We are building this information society, but we don’t inform our children how to use this,” said Google’s Thierry Geerts. “We help them to cross the streets, but we don’t have any education on how to cross the internet.”