‘Darkest of all worlds’: How Molly Russell fell into a social media whirlwind of despair

On the evening of November 20, 2017, Molly Russell and her family had dinner together and then sat down to watch an episode of I’m a Celebrity…Get Me Out of Here!.

A family meal and then watching a popular TV show: a typical scene for millions of families across the UK. As Molly’s mother, Janet, told police, “Everyone’s behavior was normal” at dinnertime.

The next day, around 7 a.m., Janet went to Molly’s room and found her daughter’s body.

Molly, 14, from Harrow, north-west London, had taken her own life after, unbeknownst to her family, falling into a whirlwind of despair on social media. Some of the content she viewed in the last year of her life was unrecognizable from primetime family television.

It was, as Molly’s father Ian said during the inquest into his daughter’s death, “just the darkest of worlds.”

“It’s a world I don’t recognize. It’s a ghetto of the online world that once you fall into it, the algorithm means you can’t escape it and it keeps recommending more content. You cannot escape it.”

On Friday, the senior coroner at North London Coroner’s Court ruled after a two-week hearing that Molly died from an act of self-harm while suffering from depression and “the negative effects of online content”.

In many ways, Molly had the interests and hobbies of a typical teenager: the musical Hamilton, the rock band 5 Seconds of Summer, the starring role in her school show. Ian Russell highlighted this part of Molly’s life with a moving tribute at the start of the inquest at North London Coroner’s Court, speaking of a “positive, happy and bright young woman who was indeed destined to do good”.

Ian Russell arrives at North London Coroner’s Court in Barnet on the first day of the inquest into his daughter’s death. Photograph: Kirsty O’Connor/PA

He said: “It’s too easy to forget the person she really was: someone full of love, hope and happiness, a young person full of promise, opportunity and potential.”

But Russell said the family noticed a change in Molly’s behavior over the last 12 months of her life. She had become “more withdrawn” and spent more time alone in her room, her father said, but was still “happily” contributing to family life. The Russells attributed her behavior to “normal teenage mood swings”.

In September 2017, Russell told his daughter the family were worried about her, but she described her behavior as “just a phase I’m going through”. Indeed, Russell said Molly appeared to be in “good spirits” during the last two months of her life.

Some of Molly’s social media activity – music, fashion, jewelry, Harry Potter – reflected the interests of the positive and bright person her father described.

But the darker side of Molly’s online life overwhelmed her. Of the 16,300 posts Molly saved, liked or shared on Instagram in the six months before her death, 2,100 related to suicide, self-harm and depression. She last used her iPhone to access Instagram on the day she died, at 12:45 a.m. Two minutes earlier, she had posted an image to the platform that carried a depression-related slogan.

It was on Instagram – the photo and video sharing app – that Molly saw some of the most disturbing content, including montages of graphic video clips related to suicide, depression and self-harm, set to music. Some videos contained scenes from film and television, including 13 Reasons Why, an American drama about teenage suicide whose episodes were rated 15 or 18 in the UK. In total, Molly watched 138 videos containing suicide and self-harm content, sometimes viewing them in batches, including one session on November 11.

A consultant child psychiatrist told the hearing that he had not slept well for weeks after viewing Instagram content seen by Molly just before her death.

As the court sifted through the six months of Instagram content, it was shown a succession of images and clips containing slogans related to suicide and depression, or graphic images of self-harm and suicide. Some content, such as the music videos, was played more than once in court, giving those present an idea of how Ian Russell felt when he said the “relentless” nature of the content had a “deeply negative impact on my mental health”.

The court was told Molly left behind a note that quoted a depressive Instagram post she had viewed, while a separate note drafted on her phone quoted one of the video montages. Oliver Sanders KC, representing the Russell family, said Instagram was “literally giving Molly ideas”.

Elizabeth Lagone, head of health and wellbeing policy at Meta, the owner of Instagram and Facebook, was ordered by the coroner to fly over from the United States to testify, and was taken through numerous posts and videos by Sanders. She defended the suitability of some posts, saying they were “safe” for children because they represented an attempt to raise awareness of a user’s mental state and share their feelings. Sanders questioned whether a 14-year-old could be expected to tell the difference between a post raising awareness of self-harm and one encouraging it.

Meta’s head of health and wellbeing policy, Elizabeth Lagone, arrives at North London Coroner’s Court. Photograph: Beresford Hodge/PA

Some content was clearly indefensible, even under Instagram’s 2017 guidelines, and Lagone apologized that Molly had viewed content that should have been removed from the platform because it glorified or encouraged suicide and self-harm.

But the content that Lagone sought to defend – as, for example, “the expression of someone’s feelings” – elicited expressions of exasperation from Sanders. He asked how posts with slogans such as “I don’t want to do this anymore” could be appropriate for a 14-year-old.

Raising his voice at one point, he said Instagram chooses to put content “in the rooms of depressed children”, adding: “You have no right. You are not their parent. You are just a business in America.” Instagram has a minimum age limit of 13, although Molly was 12 when she created her account.

Images on Pinterest were also disturbing. The inquest heard that Molly had used the platform, where users collect images on digital bulletin boards, and had searched for posts under terms such as “depressing qoutes deep” [sic] and “suicially quests” [sic].

One board in particular, which Molly titled “Nothing to Fear…”, contained 469 images, some of which related to self-harm and suicide. Others related to anxiety and depression, while it emerged that Pinterest had sent Molly content recommendation emails with titles such as “10 Depression Pins You Might Like”.

Jud Hoffman, head of community operations at Pinterest, told the inquest he “deeply regrets” what Molly saw, and that the platform was not safe at the time.

Hoffman also said the platform is still “not perfect” and that content violating its policies “probably still exists” on it. Internet safety campaigners, including the Russell family, say the same applies to other platforms.

Jud Hoffman, Global Head of Community Operations at Pinterest. Photograph: James Manning/PA

The court also heard that Molly had a Twitter account, which she used to contact Salice Rose, an influencer who discussed her experience of depression online, in a bid to get help. Ian Russell described it as “a call into the void” and said it was a “danger” for people like Molly to seek support from well-meaning influencers who could not offer specialist help.

He also checked Molly’s YouTube account after her death and found a “high number of disturbing posts” relating to anxiety, depression, self-harm and suicide.

Throughout the hearing, the senior coroner, Andrew Walker, spoke about potential changes to how social media platforms operate in relation to child users. Change has already arrived with the age-appropriate design code, which prevents websites and apps from misusing children’s data, while the forthcoming online safety bill will impose a duty of care on tech companies to protect children from harmful content.

In a pen portrait of his daughter read to the inquest, Ian Russell said he wanted to deliver a message of hope alongside the loss: that a tragedy which unfolded against the backdrop of poorly regulated social media platforms should not be repeated.

“Just as Molly would have wanted, it’s important to seek to learn all we can and then take whatever action is necessary to prevent such a young life from being wasted again.”

In the UK and Ireland, Samaritans can be contacted via samaritans.ie. In the United States, the National Suicide Prevention Lifeline is at 800-273-8255, or chat for help. You can also text HOME to 741741 to connect with a Crisis Text Line counsellor. In Australia, the Lifeline crisis helpline is 13 11 14. Other international helplines can be found at befrienders.org
