
Social media, algorithmic engagement and their role in societal division
Max Williamson contemplates the perils of algorithmic curation.
Imagine a reality in which everyone around you always agrees. Every conversation you have reinforces your existing views, and no opposing views seem to exist.
Welcome: you are stepping into the realm of social media, governed not by elected officials but by algorithms.
We often open Instagram or TikTok to pass a moment, yet these platforms can shape our world view one scroll at a time. Many users do not realise the control that these invisible gatekeepers have over their perceptions.
A stream of content, or feed, often predicts what we want to see, sometimes before we know it ourselves. For example, after a casual chat with a friend about travelling to Kyrgyzstan and sending him a short video on Instagram, my feed was suddenly flooded with Kyrgyz cuisine, culture and travel vlogs. This is consistent with how Instagram's algorithms use recent user interactions: whenever a user likes, shares, comments or otherwise engages with a video, the algorithm refines its content suggestions to keep what it presents relevant.
Such an experience feels convenient but also unnerving. Using a platform like Instagram creates filter bubbles that are designed to appeal to the user and shield them from alternative or dissenting perspectives.
The primary aim of these algorithms is to keep users engaged, extend their time on the platform and boost advertising profits. They do not prioritise accuracy, fairness or democratic discussion. Instead, they highlight content that keeps you emotionally invested, often stirring outrage, moral resentment or an "us" versus "them" mentality.
Emotionally charged content taps deep psychological instincts, making us more inclined to react, comment and share. Researchers refer to this type of content as PRIME: prestigious, in-group, moral and emotional. These attributes elicit strong emotional reactions, and once you engage with such content, the algorithm serves you more of it.
Over time, this leads to personalised echo chambers: self-reinforcing environments where beliefs are echoed, opposing views are filtered out, and extreme content becomes more prominent. These algorithmic echo chambers polarise, isolate, and distort our understanding of the world. These are dynamic systems designed to manipulate and maximise engagement.
You might think you control your social media feed. In reality, the platform quietly learns about you with every interaction, feeding you content designed to keep you scrolling. Those caught in these cycles experience intensified emotions that affect not only online interactions but also real-world behaviour, deepening social divisions.
Traditionally, media outlets have offered diverse viewpoints and promoted balanced public discourse. They were never free from bias, and many were openly aligned with political parties or ideologies, but editorial judgement acted as a filter, applying professional standards and journalistic ethics to what was published. The process was imperfect, but it was intentional, and it was meant to inform the public across ideological lines.
Today, the digital media landscape has shifted this dynamic, replacing editorial curation with algorithmic personalisation. Individuals are now segregated into distinct digital groups, each with unique characteristics.
Polarisation is not a product of social media alone; echo chambers and biased reporting have always existed. However, unlike traditional media, algorithms amplify biases on a larger scale, with unprecedented speed and precision. They adapt in real time, catering to individual preferences and reinforcing confirmation bias. Consequently, polarisation today is swift, widespread, and intense, fuelling a stark societal divide.
When researchers surveyed people across 19 countries in 2022, 84% said that access to the internet and social media had made people easier to manipulate with false information.
In addition, 55% of Americans were found to rely heavily on social media for news and political information. That is, more than half the population turned to a system built not for truth but for engagement, one that, unlike traditional media, rewards content confirming pre-existing viewpoints.
Addressing algorithmic echo chambers requires a novel approach. Social media companies should be held accountable for what their algorithms amplify and for the societal impacts that follow. Transparency about how algorithms are designed, combined with responsible content-moderation policies, can help mitigate misinformation and polarisation. Public awareness campaigns that teach users how these algorithms shape their experience can also empower them to seek out diverse sources of information, and such education matters all the more because democratic elections are being compromised by social media's influence.
Policymakers in liberal democracies must recognise the political power of social media platforms and consider stronger regulations to ensure fairness. In New Zealand there is no state control of the media, and governments are democratically elected and can be voted out. So the question is: would you rather have someone who can be removed from power in three years overseeing social media platforms, or someone like Mark Zuckerberg, Facebook's chief executive for 21 years, who has little incentive to create a fair system so long as engagement keeps climbing?
If we do not ask the hard question of who shapes our online world, and why, we risk losing control over what we believe, how we vote and even how we treat one another. We need deliberate, informed action to reclaim control over the online narratives that seek to shape our society.
Only then can we ensure that our digital future enhances democratic values rather than erodes them.
— Max Williamson is a master's student in international studies at the University of Otago.
