Tonight: Panorama Delves Into Teenagers’ Dangerous Algorithms

By Olivia Emily


The episode is streaming now


As people across the globe disconnect in search of a simpler life and some parents heavily restrict their children’s internet access, one key question is thrown into sharp relief: can we live without our phones? This is exactly what BBC Panorama will delve into tonight, putting two families to the test to find out what happens to a phone-free household over the course of a week – and uncovering something dark about algorithms along the way. Here’s what to expect.

BBC Panorama: Can We Live Without Our Phones?

Can We Live Without Our Phones? introduces us to a group of phone-addicted teenagers – aka normal teenagers. We follow their phoneless journey over the course of a week, along with that of their families.

Screen time has long been feared, from old wives’ tales of TV screens giving children square eyes to ‘iPad kid’ becoming the 21st century’s harshest parenting insult. But it’s not screen time alone that’s harmful. As Marianna Spring speaks to parents, teenagers and social media insiders, one thing becomes abundantly clear: algorithms are at the heart of what we all find most troubling on social media.

What Is An Algorithm?

In the context of social media, an algorithm is an automated process that determines which content is shown to each user, based on a range of signals, with the aim of serving the most relevant, engaging posts. Generally, algorithms have replaced the chronological timelines we used to see on Facebook and Instagram, which originally only showcased content from people we were friends with or had chosen to follow. Embedded in TikTok’s brand, for instance, is the algorithmically curated ‘For You’ page, featuring content selected from across the platform, mostly from accounts you do not follow.

When we use social media, our behaviour is constantly analysed: what we like, search for, comment on and share, and even which posts we linger on long enough to read or watch in full. The algorithm uses this data to learn our habits, then feeds us new content it predicts we will like. This is also how adverts are targeted at the most receptive audience.
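To make that concrete, here is a minimal, purely illustrative sketch of how behaviour signals might be turned into an interest profile. The signal names and weights are invented for illustration – platforms do not publish how they actually weight these signals.

```python
from collections import defaultdict

# Hypothetical weights: how strongly each kind of behaviour signals interest.
# Real platforms do not publish these values; this is purely illustrative.
SIGNAL_WEIGHTS = {
    "like": 1.0,
    "share": 1.5,
    "comment": 1.2,
    "search": 0.8,
    "dwell": 0.5,   # reading the whole post or watching the whole video
}

def update_interest_profile(profile, events):
    """Accumulate a per-topic interest score from a user's behaviour events.

    `events` is a list of (signal, topic) pairs, e.g. ("like", "football").
    """
    for signal, topic in events:
        profile[topic] += SIGNAL_WEIGHTS.get(signal, 0.0)
    return profile

profile = update_interest_profile(
    defaultdict(float),
    [("like", "football"), ("dwell", "football"), ("search", "makeup")],
)
print(dict(profile))  # {'football': 1.5, 'makeup': 0.8}
```

The more a user interacts with a topic, the higher that topic scores – which is why a feed can drift so quickly towards whatever a teenager lingers on.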

Algorithms prioritise content with higher engagement rates, and often also factor in how recently a post was published and the relationship between the user and the creator. Generally, algorithms are smart enough to deliver content that aligns with the user’s interests, but it’s common to talk about ‘training your algorithm’ – that is, deliberately (and perhaps more explicitly than usual) engaging with content you like to make sure you see more of it.
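Again purely as an illustration, the toy ranking function below combines those factors – an interest match, an engagement rate, recency, and a boost for creators the user already follows. The weights and field names are assumptions for the sketch, not how any real platform scores posts.

```python
import math
import time

def score_post(post, profile, now=None):
    """Toy ranking score: interest match x engagement rate, decayed by age,
    boosted if the viewer follows the creator. Weights are invented for
    illustration; real feed-ranking systems are far more complex.
    """
    now = now or time.time()
    interest = profile.get(post["topic"], 0.0)
    engagement = post["likes"] / max(post["views"], 1)   # engagement rate
    hours_old = (now - post["posted_at"]) / 3600
    recency = math.exp(-hours_old / 24)                  # older posts fade
    relationship = 1.5 if post["creator_followed"] else 1.0
    return (1 + interest) * engagement * recency * relationship

posts = [
    {"topic": "football", "likes": 900, "views": 10_000,
     "posted_at": time.time() - 2 * 3600, "creator_followed": False},
    {"topic": "makeup", "likes": 300, "views": 2_000,
     "posted_at": time.time() - 30 * 3600, "creator_followed": True},
]
feed = sorted(posts, key=lambda p: score_post(p, {"football": 1.5}), reverse=True)
print([p["topic"] for p in feed])
```

Even in this crude version, a recent post on a topic the user has shown interest in beats an older post from someone they follow – a simplified picture of why feeds keep serving more of the same.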

That all said, algorithms can pose some dangers. For one, they create echo chambers, which can contribute to societal polarisation and worsen so-called ‘culture wars’. They are also criticised for perpetuating biases, and many people are uncomfortable with the amount of personal data they collect. Likewise, algorithms don’t always feed you appropriate content…

Why Are Algorithms Harmful?

In the Panorama episode, we meet Kai, an 18-year-old who is recommended violent and misogynistic content online, all thanks to his algorithm. Looking at his phone, Marianna describes the content as ‘gratuitous violence’, with Kai sharing that he sees ‘all sorts […] video after video. It could be a dog, and then someone getting run over or a car crash or something like that. Out of nowhere. Spontaneous. […] It kind of just stains your brain. And it’s all you think about for the rest of the day.’

Kai is just one of millions of UK children being fed content by algorithms, many of which operate as ‘black boxes’, meaning social media companies do not share their inner workings. It’s this lack of transparency that has made conversations with social media platforms so difficult – and frustrating. With tech companies doing very little to protect children from rampant bullying and devastating cases of teenage suicide, there are growing calls from parents across the UK for social media to be banned for those under the age of 16, or even 18. For example, Esther Ghey – the mother of teenager Brianna Ghey, who was sadly murdered in 2023 – supports a total ban on social media access for under-16s, believing her daughter would still be alive if her killers had not had access to violent content online. Meanwhile, a recent study commissioned by Tesco Mobile found that, of 800 parents of children aged eight to 16, 65 percent say keeping their child safe online is their main concern – ahead of bullying.

In the Panorama episode, Kai also shares how misogynistic content has crept into the algorithms of his friends. But Kai’s algorithm looks very different to that of his female peers. Marianna questions why this might be, speaking to ex-TikTok employee Andrew Kaung. ‘If you’re a female, a 16-year-old who lives in London, they’re usually interested in pop singers, songs, makeup… Whereas it’s very different for teenage boys,’ Andrew shares. At TikTok, Andrew worked on a team looking into the safety of children on the app, and investigated what content 16-year-olds specifically were being shown. ‘Stabbing, knifing kind of content,’ he says, referring to what teenage boys are typically shown. ‘Sometimes sexual content. And, of course, mostly misogynistic, controversial hate content as well.’ Following his research, Andrew recommended TikTok hire more content moderators to remove content that breaks the rules of the platform. However, TikTok said this would be too expensive.

In a statement, TikTok asserted that gender is not a consideration in its recommendation algorithm, and that it expects to invest more than $2 billion in safety this year, with a whopping 40,000 content moderators already working for the platform.

WATCH

Panorama: Can We Live Without Our Phones? will air tonight at 8pm on BBC One. The episode is available to stream now on BBC iPlayer. bbc.co.uk