YouTube Has Lost Control Of The Platform And Their Logan Paul Statement Is Proof

10 January 2018, 14:01

Logan Paul. Picture: Instagram // Twitter

By Josh Lee

YouTube pushed blame onto Logan in the wake of the "suicide forest" controversy. But what about their failings?

After nine days of silence, YouTube finally addressed the growing controversy surrounding Logan Paul's "suicide forest" vlog. Although YouTube publicly condemned Logan Paul and his vlog, which featured footage of a suicide victim's body, they failed to take any responsibility for allowing the video - which had an image of the body in its thumbnail - to appear on the site in the first place, to be monetised (despite a clear violation of community standards), and then to trend worldwide. And to top things off, they made the completely tone-deaf claim that they had "acted accordingly."

This is a multi-billion-dollar global organisation that most likely employs some of the best public relations staff in the world. So why didn't YouTube call themselves out? The answer, I fear, is that YouTube have in some ways lost control of their own platform.

How algorithms helped Logan's suicide forest vlog trend

YouTube relies on algorithms for all sorts of things, from recommended videos and predictive searches to populating their Trending page. So when it's said that YouTube "allowed" Logan's video to trend, what's really meant is that their algorithms allowed it, not that there was a conscious decision to make the vlog trend.

But the fact that Logan's video was able to trend automatically is even more worrying than if it had been a deliberate choice. After all, if YouTube is reliant on algorithms, and algorithms allowed this to happen, how can the community ever be assured that this sort of thing won't happen again?

The truth is, they can't. Artificial intelligence can do a lot of things, but it lacks human nuance. And with 300 hours of video content uploaded every minute, it would clearly be impossible to apply human judgement to every piece of content. A video-sharing platform without adequate checks and balances is a dangerous thing.

There have been pretty big clues pointing to this for a while

Throughout 2017, stories of non-adult LGBTQ+ content being automatically demonetised or age-restricted were prevalent. In June 2017, YouTube CEO Susan Wojcicki accepted that LGBTQ+ content had been unfairly restricted by, you guessed it, algorithms. Announcing an update to their policies to allow non-adult LGBTQ+ videos in Restricted Mode, Wojcicki said, "it [the algorithms that control what goes into Restricted Mode] still won't work perfectly but over time our systems will get better." But despite her assurances, to this day non-adult LGBTQ+ content continues to be automatically restricted.

This issue should have been a major warning sign to YouTube that their algorithms weren't completely fit for purpose. Perhaps because it affected a marginalised community whose videos tend to get fewer views than Logan Paul's, they didn't take it as the wake-up call that it very clearly was. Now that an algorithm failure has contributed to such a huge controversy, YouTube must accept their share of the blame.

The Trending algorithm issue wasn't the only red flag

On January 2, a member of YouTube's Trusted Flagger Programme claimed that Logan's "suicide forest" vlog was reported to YouTube and manually reviewed. Despite both humans and machines checking the video - which, to reiterate, had an image of the victim's body in its thumbnail - it was deemed acceptable for all ages and suitable for monetisation. Not only was there a huge machine failure, but also a catastrophic error of human judgement.

It's no wonder LGBTQ+ YouTubers were furious at Logan Paul, when their perfectly innocuous content was restricted on the same platform where potentially traumatising content wasn't. But this isn't really all Logan's fault. Sure, he decided to film, edit and publish that video. But without several helping hands at YouTube, it would probably never have seen the light of day.

Maybe that's why YouTube were happy to threaten Logan with "further consequences" while keeping quiet about the hand they had in creating this mess. After all, if YouTube were to admit that they can no longer control the content on the site, what sort of message would that send to creators, fans, advertisers and parents of young viewers? It's far easier to deflect blame onto the creator than it is to acknowledge the platform's systemic issues.

After last year's adpocalypse, perhaps an admission of failure is too commercially dangerous. But without publicly acknowledging their own failures, YouTube can't even begin to rebuild trust between the platform and its lifeblood - the creators and fans.