Instagram warns parents if teenagers ‘repeatedly’ search for self-harm terms

Instagram will warn parents if their child searches for suicide or self-harm terms, the company behind the social media platform has announced.

Adults will receive a warning if a child using a Teen Account repeatedly searches for terms related to suicide or self-harm over a short period of time, Meta said. Adults must opt in to supervision settings to receive the alerts.

Meta added that the alerts build on Instagram’s existing policies that block searches for harmful content and redirect users to support resources, and that they will begin rolling out next week to users in Australia, the United Kingdom, the United States and Canada.

It comes as several countries, including the UK, France and Spain, have indicated they are considering following Australia’s example and introducing their own social media bans for young people.

Alerts will be sent to parents via email, text message and WhatsApp, as well as in-app notifications, Meta said. (Supplied: Meta)

“Constantly follow the evolution of language”

Meta said the alerts sent to supervising adults will come with resources to help them broach conversations with their children.

“These warnings build on existing efforts to protect teens from potentially harmful content on Instagram,” the platform said in a statement.

Mia Bannister lost her son Ollie to suicide at the age of 14 after he suffered online bullying and an eating disorder, which she says was caused by social media algorithms.

Ms Bannister said she had “mixed feelings” about the announcement.

“On the one hand, any measure that increases parental awareness is preferable to silence,” she said.

“On the other hand, this is a reactive measure by Meta, not a preventive one,” she said.

“Warnings after ‘several searches in a short period of time’ may be too late for some families,” she said, calling for “meaningful age enforcement and structural changes to platform design”.

“We need prevention as well as notification.”

Lisa Given, professor of information science at RMIT University, told ABC News Channel that the idea had a number of loopholes, including the language young people use on social media when discussing harmful content.

“They would need to constantly follow the evolution of language in teenagers,” she said.

“Applying a few filters to common words is not enough.

“In this case, they’re putting a lot of the burden on the parents.”

Caroline Thayne, national clinical advisor for youth mental health organization Headspace, said the announcement was an important reminder for families to keep the conversation going.

“This is an opportunity for families to talk to young people about what they are looking for, reassure them that they will not be judged, and remind them that if they are in distress or thinking about self-harm, they can seek help from a trusted adult before turning to search engines,” she said.

Meta under scrutiny over its platforms

Meta is facing multiple legal battles over its social media network, particularly in the United States.

“They’re definitely on the defensive,” Professor Given said, adding: “My concern is that they’re not dealing with harmful content.”

In Los Angeles in early February, Meta CEO Mark Zuckerberg testified for the first time before a jury about Instagram’s impact on the mental health of young users.

The lawsuit was brought by a California woman who accuses Meta and Google of intentionally designing platforms that were addictive and negatively impacted her mental health as a child.

Zuckerberg said Meta previously had goals related to the amount of time users spent in the app, but has since changed its approach.

He also pointed to a National Academy of Sciences finding that research has not shown social media causes changes in children’s mental health.

Mark Zuckerberg appears in court in Los Angeles. (AP: Ryan Sun)

Instagram head Adam Mosseri testified last week that he was unaware of a recent internal Meta study that found no link between parental supervision and teens being more careful about their social media use.

Meta researchers found that teens who reported that Instagram regularly made them feel bad about their bodies viewed significantly more “eating disorder-adjacent content” than those who didn’t, Reuters reported in October.

In 2024, Zuckerberg apologized at a U.S. Senate hearing to families whose children had been harmed by social media.

After being pressed to apologize directly, Zuckerberg said: “I’m sorry for everything you’ve been through. It’s terrible. No one should have to go through what your family went through.”
