Suicide prevention efforts increased during the coronavirus pandemic as it took a toll on people's mental health. Moreover, restrictions and lockdowns limited contact among friends and family, which increased the risk of self-harm and suicide among people living with anxiety or depression.
Recently, Instagram introduced a new technology in Europe and the UK that can detect posts related to suicide and self-harm, in both text and images. The platform will limit the visibility of such posts to other users and remove them entirely if they violate its rules.
The new technology is powered by artificial intelligence, and Facebook, Instagram's parent company, applauded the initiative as an essential step. Instagram's latest algorithm will help in suicide prevention by referring harmful posts to human moderators, who will investigate them further and take action accordingly. They can also contact emergency services to help the user or connect the user with support organizations.
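The flow described above, where an automated system flags risky content and escalates it to human reviewers, can be illustrated with a minimal sketch. Everything here is hypothetical: the names, the keyword-based scorer (a stand-in for a trained classifier), and the threshold are illustrative assumptions, not Instagram's actual system.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    post_id: int
    text: str

@dataclass
class ModerationQueue:
    """Queue of posts referred to human moderators (hypothetical)."""
    referred: list = field(default_factory=list)

    def refer(self, post, score):
        # A human moderator would investigate referred posts and could
        # contact emergency services or support organizations.
        self.referred.append((post.post_id, score))

# Illustrative stand-in for an ML classifier's vocabulary
RISK_KEYWORDS = {"self-harm", "suicide"}

def risk_score(post):
    """Toy scorer: fraction of risk keywords present in the post text."""
    text = post.text.lower()
    hits = sum(1 for kw in RISK_KEYWORDS if kw in text)
    return hits / len(RISK_KEYWORDS)

def triage(posts, queue, threshold=0.5):
    """Limit visibility of risky posts and refer them for human review."""
    hidden = []
    for post in posts:
        score = risk_score(post)
        if score >= threshold:
            hidden.append(post.post_id)   # reduced visibility
            queue.refer(post, score)      # escalate to moderators
    return hidden
```

In a real deployment the keyword scorer would be replaced by a trained image-and-text model, but the two-stage shape (automated scoring, then human review) matches the process the article describes.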
Instagram representatives said that this referral program will roll out later in Europe and the UK, as the company still needs to ensure that users' privacy and data are protected in accordance with regulations. However, they plan to implement the referral program as the next step of this initiative.
Instagram's director of public policy, Tara Hopkins, said that in Europe the company can currently use only human reviews alongside some of the technology. Hence, it works only when a person reports a harmful post to Instagram.
Over the past years, many users have criticized the popular social media sites Instagram and Facebook for not regulating posts for suicide prevention, complaining that self-harm and suicide-related content remained visible on these platforms.
Social media plays an important role in improving or worsening a person's mental health, and several studies have shown the negative impact of these platforms. Hence, a regulation program or technology is essential for suicide prevention and for limiting engagement with such posts.
In several past cases of suicide or self-harm, the victims had hinted at their intentions on internet platforms. The new algorithm can help detect such posts and flag a user's potential for self-harm.
Although these platforms carry risks, they can also work for the betterment of mental health. People can interact with others going through the same situation and express how they are feeling, which can help reduce cases of self-harm by removing the stigma from the topic.
Experts suggest that talking openly about such topics can also help in suicide prevention. A strong stigma around these topics exists in society, and it can only dissolve when people discuss them openly. Meanwhile, the new technology can help limit the exposure of such posts to people who are more vulnerable to self-harm or who may be triggered by such content.
Instagram is introducing this new technology to make the platform safer and more suitable for people suffering from mental health problems, especially since the use of such platforms increased immensely during the pandemic due to restrictions and lockdowns.