Molly Vs THE MACHINES, a new Channel 4 film produced by Marc Silver and co-written by Harvard professor Shoshana Zuboff, tells the story of Ian Russell’s fight for online safety following his daughter Molly Russell’s tragic death in 2017. The documentary interrogates how digital systems designed for profit shape emotional life and behaviour.
The Reality of Algorithmic Feeds
The film reveals how TikTok and Instagram’s recommendation algorithms relentlessly feed teens suicide-related, self-harm and depressive content, creating a toxic environment that mirrors the one that contributed to Molly’s death. This narrative is supported by recent research from the Molly Rose Foundation and The Bright Initiative by Bright Data, titled ‘Pervasive-by-design’:
- Prevalence of Harm: Almost all recommended videos watched on Instagram Reels (97%) and TikTok (96%) were found to be harmful.
- Monetisation of Harm: Eight years after Molly’s death, 1 in 10 pieces of self-harm and suicide content remains monetised across these platforms.
- Ad Adjacency: Advertising appears adjacent to harmful material once in every 9.5 consecutive “For You” page posts on TikTok and once in every 10 Reels on Instagram. Major brands have been implicated.
Algorithms are optimised for engagement, not safety, and ad delivery systems fail to filter out harmful adjacency. As CAN’s Co-Founder Jake Dubbins states: “Today’s platforms are engineered to maximise attention at any cost. As long as harmful content is profitable and can be monetised by advertising, it will be amplified.”
A Roadmap for a Better Online Future
The Molly Rose Foundation has released its ‘roadmap for a better online future’, an urgent call for change. “2026 is the year we must act – decisively, boldly, and with the courage to deliver meaningful and comprehensive change,” the document begins.
The five-point plan to transform children’s safety and wellbeing online involves:
- Fix and strengthen the Online Safety Act to shift incentives on regulated firms, focus on harm reduction, and deliver a regulatory approach better targeted to the size and financial position of the market.
- Extend the Act to actively promote and protect children’s wellbeing. Digital products must be built to be age-appropriate, high quality and nourishing by design.
- Require new levels of transparency, accountability and candour, obliging large platforms and senior managers to proactively disclose information, with new transparency arrangements for corporate advertisers and supply chain disclosures.
- Adopt a ‘polluter pays’ and whole-stack approach to harm reduction to pump-prime independent research into online harms. The scope of the Act should be extended to cover app stores, operating systems and parental controls.
- Invest boldly in critical digital and media literacy education to inoculate children against the worst effects of online harm and equip young people with the critical skills they will need to flourish in our future AI and digital economy.
The Role of Advertisers
Point three of the roadmap is particularly vital for the advertising industry. The document argues that transparency is a powerful lever for change and calls for a “new deal” for advertisers who unwittingly monetise social media harm. Advertisers are urged to:
- Demand Impression-Level Transparency: Require platforms to disclose exactly where ads appear and what content they sit next to in newsfeeds, stories, and reels.
- Insist on Safety-by-Design: Make harm prevention a requirement in procurement, ensuring it is baked into the product rather than retrofitted.
- Review Media Buying Practices: Ensure alignment with brand values and social responsibility.
- Adopt Guiding Principles: Review and implement the CAN Children’s Rights and Wellbeing Guide.
Writing in Campaign, Channel 4’s Chief Commercial Officer, Rak Patel, shares why advertisers should watch Molly Vs THE MACHINES:
“If you have missed Molly Vs THE MACHINES, I urge you to watch it. It’s one of the most moving, and important, pieces of television you will see. And one that speaks directly to a growing sense in our industry that there needs to be a rebalancing between the highly regulated environments of mainstream media and those of the social platforms.
The film clearly sets out how decisions about what was suitable for a child to view on social media were being made remotely in Silicon Valley and examines whether those decisions put corporate growth above duty of care. In short, it investigates the consequences of putting profit before people in a digital world without effective oversight.
[…] Advertising choices aren’t neutral: where budgets flow can reinforce either regulated, accountable ecosystems or incentivise models that reward harmful content. Responsibility and effectiveness can align.”
What You Can Do
- Watch the Film: Learn more about Molly Vs THE MACHINES and its impact.
- Read the Roadmap: Access the full five-point plan from the Molly Rose Foundation.
- Join the Conversation: Step 4 of our Guiding Principles (Conscious Questions) encourages you to discuss how these recommendations can be applied immediately.
If you’re looking to embed transparency and trust into your media plans, explore CAN’s Guiding Principles and Guides to see how your organisation can start putting responsible advertising into practice. Please direct any questions or interest in CAN’s free membership to hello@consciousadnetwork.org.