It seems that Instagram, too, is conscious of its role. In a Telegraph op-ed, chief executive Adam Mosseri has spoken about features the platform will implement to protect its users from images of self-harm. The system will essentially be an extension of the app’s ‘sensitivity screens’, which were introduced in March 2017. The feature adds trigger warnings to certain images, blurring them and telling users that the photo contains sensitive content.
A Half-Measure?
If they still wish to view the content, users can tap ‘see photo’ and continue. In this way, self-harm content won’t be banned outright; it will simply require additional input to view. Instagram says it reached this decision after consulting experts on recovery.

“I have been deeply moved by the tragic stories that have come to light this past month of Molly Russell and other families affected by suicide and self-harm,” said Mosseri. “Even as a parent, I cannot begin to imagine what these families are going through and my thoughts go out to them.” He added that engineers are “working around the clock” to make such content harder to find.

However, Instagram’s refusal to ban such content outright is unlikely to sit well with some. Sensitivity screens will do little to stop those who actively seek out such images, though they may cause them to think twice. Mosseri speaks of the need to reduce harm while allowing support communities to flourish. Though Instagram won’t ban the images, it will remove them from search, hashtags, and account suggestions, and it is also looking at offering more resources to those who search self-harm hashtags. In the future, the company hopes to connect users with organisations like Samaritans and Papyrus for professional support.

This week, Mosseri will have to answer to UK health secretary Matt Hancock, who has been outspoken about the issues the platform faces.