
Life Beyond Molly: Ian Russell Discusses Big Tech, the Tragic Loss of His Daughter, and the Ineffectiveness of a Social Media Ban

Ian Russell says his life is divided into two distinct parts: before and after 20 November 2017. That day marked the loss of his youngest daughter, Molly, who took her own life after struggles with depression exacerbated by harmful online content. “Our life before Molly’s passing was typical—completely unremarkable,” he shares. Ian was a television producer and director, living with his wife and three daughters in a cozy London suburb, where the children attended local schools. “We resided in a semi-detached house, and lived an unextraordinary life.” Just days before her death, the family celebrated the birthdays of all three daughters, with Molly’s 15th birthday approaching. “I vividly recall being in the kitchen, surrounded by family and friends, feeling utterly blissful. It was a moment of pure joy,” he reminisces. “That was on a Saturday night, and by Tuesday morning, everything had turned upside down.”

The second chapter of Ian’s life has been marked not only by deep sorrow and trauma but also by a resolute pledge to uncover and reveal the truths surrounding the online content that played a role in Molly’s death, and by campaigns to safeguard others from similar dangers online. He didn’t anticipate how long this work would take. Obtaining essential information from social media companies to conclude the inquest into Molly’s death took nearly five years. The inquest ultimately determined that she died “from an act of self-harm while suffering from depression and the negative effects of online content.” The Molly Rose Foundation, which Ian established, now provides support, conducts research, and raises awareness of online harms, and Ian has become a prominent voice advocating for reform.

He has been busier than ever of late. We meet at a hotel in London just hours before the House of Lords votes on an amendment to the children’s wellbeing and schools bill that would prohibit access to social media for children under 16. The amendment passed, as anticipated, by 261 votes to 150. Keir Starmer’s government had leaned towards a consultation, preferring to wait and see how Australia’s pioneering ban played out. Nevertheless, there appeared to be considerable support in the UK for such a ban, spanning multiple political parties: endorsements came from Conservatives including Kemi Badenoch, more than 60 Labour MPs, bereaved families of children affected by social media, and celebrities and campaign organizations. A recent YouGov poll found that 74% of UK adults supported such a proposal.

Despite this wave of public support, Russell stands opposed to the ban. Just last Sunday, he lent his signature to a collective statement alongside organizations like the NSPCC and Full Fact, asserting that “blanket bans on social media would not yield the enhancements in children’s safety and wellness that are urgently required.”

Ian Russell, photographed at the Zetter Clerkenwell. Photograph: Linda Nylind/The Guardian

At this pivotal moment, with online harms and social media companies under intense scrutiny, a quiet divide has emerged. Anyone grappling with this issue has likely weighed their stance carefully, and Ian is no exception. “As an individual, I need to gather information and process it before arriving at a conclusion. I often find myself in meetings absorbing what others are communicating, struggling to contribute until I’ve formed a coherent opinion,” he says. But on this issue he stands firm, however difficult some might find his position to understand. “We risk moving too hastily, seeking rapid solutions,” he cautions. “If quick fixes truly existed, we would have recognized and implemented them by now.”

The core arguments made by Russell and other critics of a social media ban resonate with anyone familiar with the ongoing discourse: children may seek out more dangerous alternatives, find ways to evade age restrictions, face a “cliff edge” upon turning 16, and specific demographics—such as LGBTQ+ and neurodiverse youths—could lose access to vital online support networks. Russell’s stance also encompasses the provisions of the Online Safety Act, which has just begun to fulfill its intended objectives. The legislation, enacted in 2023, mandates online platforms to implement robust age verification processes and prevent harmful content from reaching children. It further empowers the government and Ofcom, the independent media regulator, to impose fines or remove platforms that fail to adhere to these regulations. Ian’s advocacy played a significant role in propelling the enactment of the Online Safety Act following Molly’s tragic death.

Although Russell acknowledges the progress has been sluggish, he remarks, “It took five years of parliamentary discussions to enact the Online Safety Act. Ofcom’s implementation process has extended over more than two years; while it’s frustratingly slow, we’ve finally reached a point where any platform that serves children in the UK must prioritize their safety to operate here.”

One recent controversy surrounding Elon Musk’s X and Grok serves as a pointed illustration of this ongoing struggle. The introduction of Grok AI tools capable of manipulating images of women and children, including generating deepfake child sexual abuse imagery, elicited widespread horror, Ian’s among it. “I find it incredibly hard to comprehend how X and Elon Musk perceived this as acceptable,” he states emphatically. “It’s heinous, wrong, and utterly disgraceful. But what actions were taken in the UK following this revelation?” After the controversy emerged, Ofcom launched a formal investigation into X, armed with the authority granted by the Online Safety Act. Both Starmer and technology secretary Liz Kendall expressed support for any punitive measures that Ofcom deemed necessary against X. Shortly thereafter, Musk reversed course, removing the controversial software from his platform.

Molly Russell at 11. Photograph: The Russell family

Russell contends that the Online Safety Act achieved what a blanket ban on social media could not. “If a platform operates in a profoundly unsafe manner, it has no business being in this country. However, imposing an age restriction removes accountability from these platforms. In actuality, this could hinder the effectiveness of the Online Safety Act,” he argues.

While he acknowledges that the legislation has its shortcomings—some users still posted inappropriate content on X even after Musk’s reversal, and Ofcom took weeks to respond—Russell believes the act offers a foundation for improvement. “No legislation is perfect from the outset. We need to recognize that technology advances rapidly, necessitating continual modifications and updates. It’s essential we remain ahead of the tech developments.”

An often-cited argument for a social media ban on under-16s draws on the analogy of alcohol regulation: just as society prohibits alcohol sales to minors because of its potential harm, should there not be a similar barrier for social media? Russell supports regulation but emphasizes practicality and appropriateness, countering with a different analogy: cars and road safety. “Indeed, casualties will occur, and while it’s tragic, we accept this reality. We don’t institute a ban to protect under-16s from riding in cars. Instead, we emphasize appropriate safety measures, such as car seats for younger children and seatbelts for everyone.”

While it’s true that 16-year-olds cannot drive, he clarifies, “I am not advocating for unrestricted access to platforms promoting irresponsible behavior. What I propose is allowing 16-year-olds access to platforms deemed safe for them.”

In his perspective, age classifications should be determined on a case-by-case basis across various platforms. “For a platform proven to be safe and beneficial, the age threshold could be 13, while others may require ratings of 16 or even 18. By doing so, we incentivize developers of platforms targeting younger users to create safer environments.”

Critics may argue that social media companies have already had ample opportunity to implement these safety measures. Yet they have frequently chosen to downplay evidence of the dangers they pose, often scaling back internal safeguards. A case in point is Meta CEO Mark Zuckerberg, who apologized at a January 2024 Senate hearing for the negative impact of social media on young users, but a year later decided to remove fact-checking efforts, despite acknowledging that less harmful content would be detected as a result.

Ian Russell outside Barnet coroner’s court after the inquest into Molly’s death, September 2022. Photograph: Joshua Bratt/PA

Russell is cautious about placing trust in these platforms. “Judging a social media platform solely by its words rather than its actions is unwise,” he asserts. He argues that the Online Safety Act provides a more effective means of ensuring accountability. “With a competent regulatory body and stringent regulations in place, we can hold platforms accountable for their failings. They should be tasked with rectifying issues within a set timeframe or face consequences for non-compliance.”

His skepticism stems from personal experience. After Molly’s death, he and his family were left in a state of shock and confusion until they scrutinized her online activity. “Looking through her accounts, we painfully discovered the disturbing content that had been relentlessly fed to her by these platforms.” The algorithms had led her to an alarming array of graphic visuals and videos related to suicide and self-harm, often accompanied by disheartening statements like “Fat. Ugly. Worthless. Suicidal.” Even a psychiatrist involved in the inquest was disturbed after reviewing the content. “Discovering this harrowing and nihilistic material that she was continually exposed to, which ultimately drove her to believe that ending her life was the only option, was horrifying,” Ian reflects.

When they first reported their findings to Instagram, they had hoped for a constructive response. “We anticipated a response expressing gratitude for flagging such harmful content. Instead, we received replies indicating that the content in question did not violate their community standards.”

Understanding Molly’s online engagement turned out to be a tortuous journey. Cooperation from tech companies was notably lacking, but the senior coroner at the inquest, Andrew Walker, displayed unwavering determination. Pinterest, which had directed harmful content toward Molly, complied. Twitter (now X) allowed Russell to download his daughter’s account data, which, devoid of broader context, yielded limited insights. Initially, Meta provided enormous amounts of data that proved unmanageable. After further requests and ultimatums, five years after Molly’s death, Meta finally located an additional substantial batch of evidence. “This ultimately revealed some of the most damaging content Molly encountered,” he recalls. In the final six months of her life, Molly had been exposed to over 2,100 instances of harmful material on Instagram; only 12 days were free from such interactions.

A documentary titled Molly vs The Machines is set to premiere next month, intertwining verbatim recreations of moments from the inquest—such as instances where Meta executives questioned the harmfulness of the content Molly viewed—with interviews featuring Russell and her friends. It aims to offer a broader commentary on the negative implications of big tech and the phenomenon of surveillance capitalism: endless scrolling, engagement-maximizing algorithms, misinformation, and polarization—a reality that is collectively impacting all users, not just teenagers. “There exists a dual layer of online harm—one at the global level and another, personal one that Molly bore the brunt of. These elements are deeply interconnected, and we collectively must endeavor to address this complex issue,” he emphasizes.

Russell in Molly vs The Machines. Photograph: Publicity image

As the discourse surrounding this topic becomes increasingly polarized, it reflects the tendency of social media platforms to amplify extremes. The politicization of this issue adds another layer of complication, with the Lords amendment—driven by Conservatives—emerging as a setback for Starmer’s government. Lady Kidron criticized Starmer’s approach, calling it “the very epitome of party over country.”

However, Russell advocates for a middle path. “It’s easy for factions to emerge and define the narrative as pro-ban or anti-ban, pitting us against each other. Yet in my case, I always engage with differing perspectives. The dichotomy shouldn’t lie in a pro or anti ban stance; rather, we should align against the indifference of the technology companies to child safety, which is where the critical division truly lies.”

On a more personal note, Russell finds himself reflecting on the possibility of ever feeling that his work is complete. “My ultimate desire is to return to an ordinary life and cherish the memories of Molly. Unfortunately, until we manage to tackle this global issue, I don’t foresee that happening. I aim to maintain an ordinary life, preserving a semblance of normalcy in this second chapter following Molly’s passing.” His grief and advocacy have become intertwined, impacting his daily life. “There isn’t a day that passes without thoughts of her. Some days her memory energizes me, providing comfort; on other days, it can feel paralyzing, making it difficult to move forward. Thankfully, the harsher days are diminishing with time.”

Molly vs The Machines will be in cinemas nationwide on 1 March and will later air on Channel 4

