Deliberately and knowingly causing harm to children – as a business model.
It’s not a good look for the social media giants Meta and Google, which are currently on trial in the US over claims that they intentionally created ‘addiction machines’ to maximise their profits – ignoring warnings about the risks to young people’s wellbeing.
The case is the first of dozens that have been filed against Meta in the US, holding the company responsible for a crisis in youth mental health.
The pressure is stacking up, and it’s about time. In Europe, France, Denmark, Greece, Austria, and Portugal are moving towards a ban on social media access for children and young teenagers, following Australia’s landmark ban in December. Other countries, including the UK, are consulting on similar age restrictions.
Spain is going further: its proposals include holding platform executives legally liable for failing to remove illegal content, and making it a criminal offence to manipulate algorithms to amplify harmful posts.
Prime Minister Pedro Sanchez said: “We must speak clearly about the management of platforms such as X and the vision of figures like Elon Musk.
“Freedom of expression is not the right of a magnate to buy public debate and manipulate it through algorithms that amplify confusion.
“Spain will move forward with legislation under which technology executives will assume direct criminal liability for serious violations on their platforms. The era of hiding behind servers is over.”
Meanwhile, large numbers of X users have closed their accounts, after Elon Musk’s bizarre decision to incorporate non-consensual porn into his platform’s offerings. A new YouGov survey across Germany, France, Spain, Italy, and Poland has found that 47% of Europeans would back banning X from the EU if it continues to breach EU rules.
It’s all about the intention
Social media companies have always ducked responsibility for user-generated content (UGC).
But recent events have shifted the focus to what the owners of the platforms control themselves, and the decisions that they – willingly and knowingly – inflict upon their users.
The issues that have drawn public attention in recent weeks are a very small part of the picture, but could this be the crack that lets the light in?
When you recognise that the immense harm being done is not incidental but intentional, you can widen the focus and see the many other social media harms that can be traced straight back to the owners who control the platforms.
For instance, amplifying posts pushing extremist political views, inciting violent hatred and shutting down rational discussion.
Flooding the zone with AI-generated images and clickbait disinformation, as a deliberate distraction from what people can see around them in their real lives.
Deliberately suppressing or misrepresenting the facts about the climate emergency, depriving people of the information they need to demand action.
Using “cyborg” accounts, where a human controls the account but automation handles posting, liking, following, or replies. These are often used to exaggerate a political groundswell or to boost specific individuals. For example, they were used to amplify right-wing extremist Nick Fuentes on X to the point that mainstream media treated him as a serious figure.
There’s an agenda here, and it goes way beyond the damage to young people’s minds.
Age restrictions are not the answer
Media Revolution welcomes the high-profile debate about mental damage caused by social media platforms – and the dubious motives of their owners. It’s long overdue.
But one thing is generally missing from the conversation: the fact that there are alternatives to X, Meta, Instagram, TikTok and the other big tech companies.
Instead of simply putting an age limit on social media access – which helps one group of users, but does nothing to protect anyone outside that age group – we can also take part in tech walkouts and guide people towards alternative platforms such as Mo-Me, which is not owned by any individual organisation and isn’t built to prioritise addictive algorithms.
News feeds on Mo-Me are provided by independent, regulated media organisations, and users choose which accounts and channels to follow. There are no adverts, and no profit-driven ‘pushing’ of content; it’s a safe space for people of any age who want to connect with others, receiving and sharing news and information.
And if we’re talking about news…
While social media is under scrutiny, this is an opportunity for governments to look at some of the most prominent and prolific sources of online disinformation: their own national newspapers and TV channels.
At the moment, some of these newspapers and their social media channels are not only complicit in spreading hate speech and disinformation – they’re responsible. If we’re going to talk about harms, and examine the motives of big-tech platform owners, we also need to focus on the powerful media magnates who push the same agenda.
Let’s build on this momentum
The US lawsuits piling up against the social media platforms, the moves towards age-restricted access, and the backlash against Elon Musk’s Grokipedia and X have highlighted some of the issues that are central to Media Revolution’s work.
This is a chance for Media Revolution to continue growing – in momentum, volunteers, allies and supporters. To show the bigger picture and build public awareness of what a Media Revolution – a collaboration of responses to the situation – can truly achieve.
WHAT NEXT?
Be part of a Media Revolution. Join for an upcoming event:
Tuesday 24th Feb – Next News Clubs Zoom session
Thursday 26th Feb – RSA – Disinformation & Democracy
Join our Signal group – or sign up to the newsletter