Content warning: this report contains examples of non-graphic but disturbing posts freely available to teens.
The Molly Rose Foundation (MRF) has released important new research showing how social media algorithms continue to expose young people to harmful content. Almost eight years on from the death of Molly Russell, the findings make clear that little has changed in the way platforms recommend content to teenagers.
Accounts created as 15-year-old girls on TikTok and Instagram were quickly and repeatedly pushed towards harmful material about suicide, self-harm, and depression. Alongside this content sat adverts from some of the UK’s most recognisable brands, including fast food companies, fashion retailers, and universities. This means advertising revenue is still incentivising platforms to distribute unsafe content at scale, with devastating consequences for young people.
The report findings were covered in Campaign and The Media Leader.
Harriet Kingaby, co-founder of the Conscious Advertising Network, said:
“This report exposes the shocking harmful content young people are exposed to on social media platforms, and highlights the role of advertising in incentivising its distribution. Platforms are profiting off this content, and advertisers are often unknowingly helping to fund it. The need for transparency could not be clearer: advertisers need to take control of their media supply chains. Advertisers cannot shy away from the role they play in ensuring young people remain safe online.”
The report, Pervasive-by-design, calls on advertisers to take a stand. It urges brands to demand greater transparency from social media platforms and to commit to safeguarding young people by aligning with CAN’s Children’s Rights and Wellbeing Guide and Guiding Principles.
Andy Burrows, Chief Executive of Molly Rose Foundation, said:
“Our research found algorithms continue to bombard teenagers with shocking levels of harmful content at an industrial scale, and interspersed with this are adverts from leading brands. It is in no one’s interest for children to be seeing this damaging material, least of all advertisers who will have to deal with the reputational risks that go with it. Advertisers have a duty to pressure tech companies to clean up their platforms, and we’ll be working with companies moving forward to help them achieve this.”
At a time when online safety is under growing public and regulatory scrutiny, advertisers have the opportunity to be part of the solution. By acting responsibly, brands can protect young audiences, rebuild trust, and help shape a safer digital environment.
You can read the full research, Pervasive-by-design, from the Molly Rose Foundation here.