University of Washington
Need help now?  Call the National Suicide Prevention Lifeline at 1.800.273.8255
Forefront Insight Blog: Personal & professional perspectives on news and policy related to mental health and suicide prevention

Why social media needs stricter rules on self-harm posts


CC photo by Carissa Rogers

Instagram has for years battled with users over what content involving self-harm they can and cannot post. In 2012, Instagram changed its guidelines so that users could express themselves with images of self-harm as long as the images appeared in the context of recovery or awareness, and on the condition that the post carried a warning label. If a post showed any sign of promoting or glorifying self-harm, the user's account would be shut down. Although Instagram has taken steps in the right direction on mental health and suicide prevention, its decision to allow self-harm content so long as it does not promote self-harm has proven less effective than intended.

In the Time article "Teen Depression and Anxiety: Why the Kids Are Not Alright," author Susanna Schrobsdorff investigated the factors behind depression and anxiety in 21st-century teens. The article notes that today's teens "have a reputation for being more fragile, less resilient and more overwhelmed than their parents were when they were growing up." What sets 21st-century teens apart from prior generations is that they are fully surrounded by technology and social media.

According to the Time article, the line between the offline and online worlds is blurred for today's teens. This is why it is immensely important for social media sites like Instagram to have policies in place to ensure that users are not promoting or glorifying self-harm: not only do more teens than ever suffer from depression and anxiety, but teens also lack a full understanding of the difference between online and offline life.

While Instagram does have guidelines in place, the Social Media and Adolescent Health Research Team (SMAHRT) within Seattle Children's Research Institute studied how well Instagram filtered out non-suicidal self-injury (NSSI) content. In the study, SMAHRT assessed whether posts containing NSSI content carried the warning labels that Instagram's policies had promised. Of the 18 hashtags SMAHRT examined, only six showed a warning label. This suggests that while Instagram is on a path toward improving the mental health of its users, the site still has a long road ahead.

One of SMAHRT's points was based on another study, which found that "online NSSI content was a trigger for offline NSSI behaviors." This is problematic, and it is a direct reason why social media sites need stricter policies regarding self-harm. Some social media sites, like Twitter, have no regulations against explicit content, so their users can post or consume anything they wish. Users struggling with depression and anxiety who can look up explicit content related to self-harm do not have safe online worlds; and given Schrobsdorff's point about online and offline worlds blurring for today's teens, it is concerning for either of their worlds to be unsafe, since the two are so closely connected.

Posted by Sydney Sifferman
Sydney Sifferman is a University of Washington alumna and former communications intern at Forefront.