Molly Russell: Tech firms still failing after teenager’s death, says father

  • By Angus Crawford
  • BBC News

Image caption: Molly Russell was 14 when she died in 2017 (Image source: Russell family)

Social media companies are still pushing “harmful content to literally millions of young people”, Ian Russell, the father of Molly Russell, has said.

He said he was horrified by the scale of the issue and that “little has changed” since Molly took her own life aged 14. He fears more young lives could be lost.

New research from the Molly Rose Foundation shows young users can still access suicide and self-harm content.

Social media platforms say they are working hard to keep teenagers safe.

The three platforms examined in the research by the foundation set up in Molly’s name – TikTok, Instagram and Pinterest – said they had created new tools to limit access to harmful material.

Molly, who took her own life after being exposed to a stream of dark, depressing content on Pinterest and Instagram, would have turned 21 this week.

An inquest last year concluded she ended her life while suffering from depression and the negative effects of online content.

A researcher from the foundation evaluated more than 1,000 individual posts and videos, identified by searching 15 hashtags that are associated with harmful material and that Molly was known to have engaged with.

Data experts Bright Initiative helped analyse the posts and videos, which were published from 2018 to October this year.

On Instagram, the research found that almost half of the posts viewed “displayed hopelessness, feelings of misery and highly depressive themes”.

On TikTok, it found that half of the posts examined containing “harmful content” had been viewed more than a million times.

And, on Pinterest, the researcher was actively recommended multiple pictures of “people standing on cliff tops, drowning, stylised images of people in freefall through the air”.

Image caption: Ian Russell claims there has been a “systemic failure”

Online safety campaigner Mr Russell said “six years after Molly died, this must now be seen as a fundamental systemic failure that will continue to cost young lives”.

Meta, which owns Instagram, said it had been working hard with experts and had “built more than 30 tools to support teens and families, including our sensitive content control, which limits the type of content teens are recommended”.

A Pinterest spokesperson said it was “committed to creating a safe platform for everyone” and constantly updated its policies and enforcement practices around self-harm content, “including blocking sensitive search terms and evolving our machine learning models so that this content is detected and removed as quickly as possible”.

A TikTok spokesperson said “content that promotes self-harm or suicide is prohibited” on the site, adding: “As the report highlights, we strictly enforce these rules by removing 98% of suicide content before it is reported to us.”

It said it provides “access to the Samaritans right from our app for anyone who may need support” and invests in “ways to diversify recommendations” and “block harmful search terms”.

The research acknowledged that the platforms had made some limited efforts to improve safety.

After Molly’s death, Instagram announced a series of changes which the report says “had some welcome targeted impact”.

TikTok, it said, “appears to enforce its community standards more effectively than some other platforms”. And “some improvements had been made” by Pinterest.

But overall the report identifies problems on all three platforms:

  • A failure to adequately tackle harmful material and the way it is recommended
  • A design that increases exposure to negative content through, for example, hashtag suggestions
  • Algorithms that actively spread harmful content
  • Community standards that are too narrow

Prof Louis Appleby, a government adviser on suicide prevention and professor of psychiatry at the University of Manchester, said of the research: “We’ve moved on in how we view the online world.

“We are in a new era of social responsibility and tech companies need to do more about their images and algorithms.”

Technology Secretary Michelle Donelan said the Online Safety Act, which became law last month, should address these kinds of problems, with measures to “protect both adults and children” from problematic content.

She said: “It is despicable and indefensible that social media companies are still turning a blind eye to the scale of horrendous suicide and self-harm content on their platforms.”

Regulator Ofcom is currently drawing up codes of practice which it expects tech companies to abide by and which will be enforceable by law.

Ms Donelan said she plans to meet the companies soon to tell them they “must not wait and instead should act now” to make sure “we don’t see more tragic stories such as Molly’s”.
