San Francisco – In a bid to reduce minors’ exposure to self-harm, provocative and disturbing content on its platform, Instagram has launched a “sensitivity screen” feature that blurs questionable pictures and video thumbnails on the app until the viewer opts in.
The new feature, which has already reached users in India, blocks images of cutting and self-harm that could pop up in search, recommendations or hashtags and put minors at risk of physical harm, Vogue.co.uk reported on Wednesday.
Adam Mosseri, Head of Instagram, announced the rollout of “sensitivity screens” in an op-ed for The Telegraph, expressing grief over the suicide of British teenager Molly Russell, whose parents blamed the photo-sharing app for exposing their daughter to self-harm and suicide-related content.
“We are not yet where we need to be on issues of suicide and self-harm. We need to do everything we can to keep the most vulnerable people who use our platform safe,” Mosseri wrote.
The announcement comes after UK Health Secretary Matt Hancock warned Instagram’s owner Facebook to improve protections for young people on its apps or face legal action.
“We already offer help and resources to people who search for such hashtags, but we are working on more ways to help,” Vogue.co.uk quoted Mosseri as writing in the op-ed.
The company says it has been working with engineers and trained content reviewers to make self-harm images harder to find on the app.
“At Instagram, nothing is more important to us than the safety of the people in our community and we do not allow posts that promote or encourage suicide or self-harm. We rely heavily on our community to report this content and remove it,” Mosseri added.
In September 2018, Instagram introduced a “prompt” feature aimed at curbing drug abuse and substance sales on the platform.
The prompt’s “get support” option offers three further options that direct people seeking help with drug-abuse issues to recovery and treatment organisations. (IANS)