Prince William has called for improved online safety for children after a coroner ruled social media contributed to the death of 14-year-old Molly Russell.
The Prince of Wales said: “No parent should ever have to endure what Ian Russell and his family have been through. They have been so incredibly brave. Online safety for our children and young people needs to be a prerequisite, not an afterthought.”
The schoolgirl from Harrow, northwest London, was found dead in her bedroom after viewing content related to suicide, depression and anxiety online.
Executives from Pinterest and Instagram’s parent company Meta, which also owns Facebook, were forced to attend her inquest in person.
Andrew Walker, the coroner, said he did not “think it would be safe” to give suicide as Molly’s cause of death, instead recording that she died from an act of self-harm.
Giving his findings on Friday, he said: “Molly was at a transition period in her young life which made certain elements of communication difficult.”
She was “exposed to material that may have influenced her in a negative way and, in addition, what had started as a depression had become a more serious depressive illness”, he told North London Coroner’s Court.
Mr Walker also said the “particularly graphic” content she saw “romanticised acts of self-harm”, “normalised her condition” and focused on a “limited and irrational view without any counterbalance of normality”.
Since Molly’s death in November 2017, her father Ian Russell has campaigned for better protections against potentially dangerous social media algorithms.
Mr Russell said after the inquest: “We have heard a senior Meta executive describe this deadly stream of content the platform’s algorithms pushed to Molly as ‘safe’ and not contravening the platform’s policies.
“If this demented trail of life-sucking content was safe, my daughter Molly would probably still be alive and instead of being a bereaved family of four, there would be five of us looking forward to a life full of purpose and promise that lay ahead for our adorable Molly.
“It’s time the toxic corporate culture at the heart of the world’s biggest social media platform changed.”
Dame Rachel de Souza, the children’s commissioner for England, told Sky News social media giants should remove such harmful content from their platforms following the inquest’s findings.
She added it is “despicable” the companies put “profits ahead of children’s safety”.
Dame Rachel said: “There was talk at one point of actually fining named executives, the people in charge, and imprisoning them. I think it can’t go far enough.
“Why can these companies not take this stuff down now?
“I meet leaders of the six big tech companies every six months. They agree to meet me and I constantly ask them ‘how many children have you got online? Are you taking this material down?’.
“They try to evade my questions. They are not doing enough. I honestly think we are seeing mealy-mouthed responses.
“As I said before, they need to get a moral compass and sort this out now, they can do it.”
Frances Haugen, a former Facebook employee who became a whistleblower after leaving the company, told Sky News the social media giant should be looking at deaths like Molly’s as preventable.
She said: “What’s so shocking about the case of Molly is that it’s not unique.
“The reality is these platforms can take a child from an interest like healthy eating, and just by the nature of how the algorithms are designed, push them all the time towards more extreme content.”
Mrs Haugen left the social media giant in May 2021 and took with her thousands of internal documents, which triggered a series of allegations – including that Facebook knew its products were damaging teenagers’ mental health.
She continued: “I can imagine what’s happening in Facebook now. There is a deep mythology inside of Facebook that the good they produce outweighs the bad, that there may be a few tragic cases, but there is so much value in connecting people that they can sleep well tonight.
“Facebook should look at these deaths and treat them as preventable. I wish they would take some responsibility and act.”
Judson Hoffman, a senior executive at Pinterest, apologised for some of the content Molly saw, admitting that when she used the site in 2017 it was “not safe”.
He said the platform now uses artificial intelligence to remove such content.
Elizabeth Lagone, head of health and wellbeing policy at Meta, which owns Facebook, Instagram and WhatsApp, told the coroner some of the content Molly had viewed in the run-up to her death was “safe”, while her family argued the material promoted suicide.
In a heated exchange, she said the issue of removing suicidal or self-harm content was “nuanced and complicated” and that it is “important to give people that voice” if they experience those feelings.
Anyone feeling emotionally distressed or suicidal can call Samaritans for help on 116 123 or email jo@samaritans.org. Alternatively, letters can be mailed to: Freepost SAMARITANS LETTERS.