Social media, algorithmic engagement and their role in societal division

Checking social media with a smartphone. PHOTO: GETTY IMAGES
Max Williamson contemplates the perils of algorithmic curation.

Imagine a reality in which everyone around you always agrees. Every conversation reinforces your existing views, and opposing voices seem not to exist.

Welcome to the realm of social media, governed not by elected officials but by algorithms.

We often open Instagram or TikTok to pass a moment, yet these platforms can shape our world view one scroll at a time. Many users do not realise how much control these invisible gatekeepers have over their perceptions.

A feed often predicts what we want to see, sometimes before we know it ourselves. For example, after a casual chat with a friend about travelling to Kyrgyzstan, and after sending him a short video on Instagram, my feed was suddenly flooded with Kyrgyz cuisine, culture and travel vlogs. This is consistent with how Instagram's algorithms use recent interactions: every like, share, comment or view feeds back into the system, refining its suggestions so that the content presented stays relevant.
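
To make that feedback mechanism concrete, here is a minimal, purely illustrative sketch of engagement-weighted ranking. The weights, function names and topic labels are hypothetical assumptions for this toy example, not Instagram's actual system.

```python
from collections import defaultdict

# Hypothetical interaction weights: how strongly each signal counts
# as evidence of interest. Real platforms tune such weights constantly.
SIGNAL_WEIGHTS = {"like": 1.0, "comment": 2.0, "share": 3.0, "watch": 0.5}

def update_interest(profile, topic, interaction):
    """Nudge the user's inferred interest in a topic after an interaction."""
    profile[topic] += SIGNAL_WEIGHTS.get(interaction, 0.0)

def rank_feed(profile, candidates):
    """Order candidate posts by the user's inferred topical interest."""
    return sorted(candidates, key=lambda post: profile[post["topic"]],
                  reverse=True)

profile = defaultdict(float)                            # no inferred interests yet
update_interest(profile, "kyrgyzstan_travel", "share")  # one shared video...
feed = rank_feed(profile, [
    {"topic": "kyrgyzstan_travel", "title": "Kyrgyz cuisine tour"},
    {"topic": "local_news", "title": "Council meeting recap"},
])
print([post["title"] for post in feed])  # ...and travel content now ranks first
```

One shared video is enough to reorder the whole feed; nothing in the loop ever asks whether the promoted content is accurate or balanced.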

Such an experience feels convenient, but it is also unnerving. Platforms like Instagram create filter bubbles: feeds engineered to appeal to the user while shielding them from alternative or dissenting perspectives.

The primary aim of these algorithms is to keep users engaged, extend their time on the platform and boost advertising profits. They are not designed for accuracy, fairness or democratic discussion. Instead, they promote content that keeps you emotionally invested, often by stirring outrage, moral resentment or an "us" versus "them" mentality.

Emotionally charged content taps into deep psychological instincts, making us more inclined to react, comment and share. Researchers refer to this type of content as PRIME: prestigious, in-group, moral and emotional. These attributes elicit strong reactions, and once you engage, the algorithm serves you more of the same.

Over time, this produces personalised echo chambers: self-reinforcing environments where beliefs are echoed back, opposing views are filtered out and extreme content becomes more prominent. These algorithmic echo chambers polarise, isolate and distort our understanding of the world. They are not static feeds but dynamic systems tuned to maximise engagement.
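
That self-reinforcing loop can be shown with a toy simulation, again under purely hypothetical numbers: if a user is even slightly more likely to engage with one viewpoint, and each engagement tilts the feed a little further towards it, an initially balanced feed collapses into a one-sided one.

```python
import random

random.seed(1)  # reproducible toy run

# Two viewpoints, A and B. The user starts mildly partial to A;
# every figure here is an assumption, not measured behaviour.
engage_prob = {"A": 0.6, "B": 0.4}   # chance the user engages with each topic
feed_share = {"A": 0.5, "B": 0.5}    # the feed starts perfectly balanced

for _ in range(200):
    # The platform shows a post in proportion to current feed shares.
    topic = random.choices(["A", "B"],
                           weights=[feed_share["A"], feed_share["B"]])[0]
    if random.random() < engage_prob[topic]:
        # Each engagement tilts the feed slightly further toward that topic.
        feed_share[topic] += 0.05
        total = sum(feed_share.values())
        feed_share = {k: v / total for k, v in feed_share.items()}

print({k: round(v, 2) for k, v in feed_share.items()})
# A mild initial preference typically ends in a heavily one-sided feed.
```

The point of the sketch is that no malicious intent is required: a small behavioural bias plus an engagement-maximising loop is enough to hollow out diversity.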

You might think you control your social media feed. In reality, the platform quietly learns about you with every interaction, feeding you content calculated to keep you scrolling. Those caught in these cycles experience intensified emotions that spill over from online interactions into real-world behaviour, deepening social divisions.

Traditionally, media outlets offered diverse viewpoints and promoted balanced public discourse. They were never free from bias, and many were openly aligned to political parties or ideologies, but editorial judgement served as a filter, applying professional standards and journalistic ethics to what was published.

That process was not perfect, but it was intentional: flawed, yet meant to inform the public across ideological lines.

Today, the digital media landscape has shifted this dynamic, replacing editorial curation with algorithmic personalisation. Individuals are now sorted into distinct digital groups, each fed its own tailored stream of content.

Polarisation is not a product of social media alone; echo chambers and biased reporting have always existed. However, unlike traditional media, algorithms amplify biases on a larger scale, with unprecedented speed and precision. They adapt in real time, catering to individual preferences and reinforcing confirmation bias. Consequently, polarisation today is swift, widespread, and intense, fuelling a stark societal divide.

A 2022 survey of people across 19 countries found that 84% believed access to the internet and social media had made people easier to manipulate with false information and rumours.

In addition, 55% of Americans were found to rely heavily on social media for news and political information. That is, more than half of the population turns to a system built not for truth but for engagement. Unlike traditional media, social media rewards content that confirms pre-existing viewpoints.

Addressing algorithmic echo chambers requires a novel approach. Social media companies should be held accountable for the content they amplify and for its societal impacts. Transparency about how algorithms are built, combined with responsible content-moderation policies, can help mitigate misinformation and polarisation. Public awareness campaigns that explain how social media algorithms shape users' experiences can also empower people to seek out diverse sources of information. Such education could extend to democratic elections, which are increasingly compromised by social media's influence.

Policymakers in liberal democracies must recognise the political power of social media platforms and consider stronger regulation to ensure fairness. Citizens should remember that there is no state control of the media in New Zealand, and that governments are made up of democratically elected officials who can be voted out. So the question is: do you want your information environment shaped by someone who could be removed from power in three years, or by someone like Mark Zuckerberg, Facebook's chief executive for 21 years, who has little incentive to build a fair system so long as the platform keeps people interacting?

If we do not ask the hard question of who shapes our online world and why, we risk losing control over what we believe, how we vote and even how we treat one another. Deliberate, informed action is needed to reclaim control over the online narratives that shape our society.

Only then can we ensure that our digital future enhances democratic values rather than erodes them.

— Max Williamson is a master’s student in international studies at the University of Otago.