Instagram will begin notifying parents if their teen repeatedly searches for terms related to suicide or self-harm within a short period of time.
The alerts will apply to teens enrolled in the platform’s parental supervision tools. Parents will receive notifications by email, text message or WhatsApp, as well as through an in-app alert.
Parents and teens enrolled in supervision will be notified next week about the change. The alerts will roll out in the United States, the United Kingdom, Australia and Canada, with other regions to follow later this year.
The notification will state that the teen has repeatedly attempted to search for terms associated with suicide or self-harm, and will include expert resources to help guide conversations.
The attempted searches that could trigger an alert include phrases promoting suicide or self-harm, phrases suggesting a teen wants to harm themselves, and terms such as ‘suicide’ or ‘self-harm.’
According to the platform’s official statement, most teens do not search for such content, and its policy is to block these searches and direct users to resources and helplines. The alerts are intended to inform parents when repeated attempts occur.
The platform analysed search behaviour and consulted experts from its Suicide and Self-Harm Advisory Group to determine the threshold for triggering notifications. The system requires multiple searches within a short period while aiming to avoid excessive notifications.
Dr. Sameer Hinduja, Co-Director, Cyberbullying Research Center, said, “When a young person searches about suicide or self-harm, empowering a parent to step in can be extremely important. The fact that Meta has now built this in is a meaningful step forward and is the kind of change that child safety experts have been pushing for.”
Vicki Shotbolt, CEO, Parent Zone, added, “It’s vital that parents have the information they need to support their teens. This is a really important step that should help give parents greater peace of mind – if their teen is actively trying to look for this type of harmful content on Instagram, they’ll know about it.”
Content in which users discuss their own struggles is allowed but is hidden from teens, even if posted by accounts they follow. The platform alerts emergency services when it becomes aware of someone at imminent risk of physical harm.
The platform is also developing similar parental alerts for its AI features, which would notify parents if a teen attempts to engage in conversations related to suicide or self-harm with its AI tools. Further details are expected in the coming months.