The breakdown of stigma around mental health has, in part, been due to the direct influence of YouTube and its creators. You only have to look at the work of creators like Zoe Sugg, who has helped raise awareness of anxiety and later worked with the mental health charity Mind, to see that this is the case.
Thanks to the platform, potentially hundreds of thousands of people of all ages, races and genders have found comfort in knowing there are others out there going through the same thing they are.
However, amongst the panic attack vlogs and the coping mechanism videos, there’s a side to mental health content that paints a much darker picture – and which could be putting you or your friends at risk of self-harm or even suicide.
How Much Dangerous Mental Health Content Is On YouTube?
We did a quick search on YouTube for pro-eating-disorder content and found thousands of videos encouraging people to starve themselves. We also found thousands of videos showing viewers how to self-harm and how to hide it from their friends and family. And the worst thing? We were able to search for this content with no warning about what we were about to see, and no alert offering a helpline to call if a person is in danger.
Infamously, Tumblr came under fire several years ago due to its growing and dangerous communities, which would encourage starvation and suicide amongst other harmful practices. Using certain terms, users were able to search the platform and freely encourage one another to get thinner and thinner, pushing those at risk of an eating disorder further into illness.
In 2012, Tumblr took simple steps to tackle these morbid communities and try to prevent those at risk from suffering even more. When a person searches for potentially harmful terms, a PSA appears, making the user think twice and offering outlets of support.
So it begs the question: how are we able to search for similar terms around suicidal feelings, self-harm and eating disorders on YouTube without any outlet of support being offered? You could argue the case of free will and freedom of speech – surely we’re able to search for and access whatever we want? But those suffering from mental health problems are often not thinking clearly and are in a place of despair.
Facebook has recently added 3,000 moderators to its team to help stop people from streaming suicide attempts via Facebook Live and to address other issues with the platform…
YouTube can only do so much and the same goes for all social networking sites. In reality, a company would have to employ thousands upon thousands of people to moderate every video that goes up on the site, every comment left and every piece of content people create.
You’ll recall the recent upset YouTube caused by hiding LGBTQ+ content under its ‘Restricted Mode’ and, as a result, the platform had to hold its hands up and apologise. To counteract LGBTQ+ content being restricted, YouTube “reversed some of [its] restrictions” and admitted it had a moderation problem.
However, surely the health risk to potentially hundreds of thousands of viewers is massively important? And while we’re certainly not tech experts, would it really take much to advertise a mental health charity, or to add a warning about triggering, upsetting content, when a person searches for particular terms?
The most worrying thing about our discovery is the question of whether it’s too late: whether people have already been so influenced by this readily available content that they’ve suffered as a result.