
Instagram to Add Sensitivity Screen

Image via Wccftech

Instagram just announced the launch of a sensitivity screen to shield its young users from harmful images.

Suicide rates among Americans aged 15 to 24 are increasing. In fact, suicide is the third leading cause of death among teens and young adults in the country.

Despite these rising numbers, Instagram says its decision was prompted by the death of a fourteen-year-old girl.

Prior to taking her own life in 2017, young Molly Russell reportedly viewed various images and videos depicting self-harm on Instagram.

Unsurprisingly, the girl’s family partly blamed the social media platform for the death of their teenage daughter.

The surprising part is the measures that the Facebook-owned app is taking to prevent access to graphic images of self-harm and suicide.

Instagram's new head, Adam Mosseri, promised to put up a sensitivity screen to hide all self-harm content from view. To access such images, users must knowingly consent and view them at their own risk.

Granted, the promise came a week after UK Health Secretary Matt Hancock's ultimatum to Facebook. But that doesn't make the social media platform's effort any less commendable.

A sensitivity screen is nothing new – at least not for Tumblr users. As with Tumblr's screens over mature content, a single tap or click is all it takes to gain access.

But while Tumblr's implementation remains inconsistent and somewhat unreliable, Instagram's sensitivity screen will need to be far more dependable. The social media platform can't afford even the slightest slip through the cracks.

In an effort not to become a tool for self-harm, Instagram has already rolled out a couple of features.

For example, suicide prevention tools are available to users who need them. The Facebook-owned platform also removed content depicting cutting or other forms of self-mutilation from its hashtags and search results.

As noble as the effort may seem, Instagram is merely trying to reduce access. The self-harm images and videos would still remain on the platform.

This raises the question: why won't the social media platform just delete self-harm images outright?

While this seems like the obvious solution, it's not as simple as it appears – at least, not according to the company's advisers. While Instagram wants to discourage self-harm, the platform also wants to be a haven for individuals whose healing process involves sharing their stories.

The platform will provide better support for people who post images that indicate a struggle with self-harm or suicide, said Mosseri.

Read More: Zuckerberg Confirms Integration of Facebook, Whatsapp & Instagram

Sumbo Bello

Sumbo Bello is a creative writer who enjoys creating data-driven content for news sites. In his spare time, he plays basketball and listens to Coldplay.
