This App Got Called Out For Being Racist And People Are Seriously P*ssed

25 April 2017, 11:51 | Updated: 6 November 2017, 09:40

We the Unicorns

By Hollie-Anne Brooks

Faceapp have since responded to the shocking evidence.

Faceapp is the latest time-wasting trend to take over our lives, and we've spent the past few days being a mix of horrified and amused as the app transforms our faces. If you've missed what the fuss is all about, Faceapp is a filter-based app (like Snapchat) which uses neural networks to transform your onscreen looks. The app basically has the ability to make you look like the opposite sex, look younger or look older.

YouTubers including Mort3mer and JackSepticEye recently used the app.

Of course, we've all been having a lot of fun with Faceapp but it looks like that's about to end.

The company behind the app has been accused of racial discrimination in how its filters perform.

Faceapp has a "hot" filter which apparently makes you look sexy (not that beauty is subjective or anything). However, it appears the app considers "hot" to mean lighter skin and Western facial features, and the company is under fire for lightening people's skin tone. Here are some pretty awful examples:

As well as lightening skin, the app also removes glasses with its "hot" filter and gives users exclusively Western features.


When doing our research, we found out that Faceapp is owned by Wireless Lab, a Russian tech company.

Against the backdrop of ongoing political events, Russian tech companies have been at the centre of several discrimination controversies lately, including Russian-owned LiveJournal making its users essentially agree not to discuss LGBTQ+ matters.

Additionally, Faceapp isn't the first company to land itself in hot water over its use of filters. Last year Snapchat was called out over a "Bob Marley" filter which gave users a much darker skin tone. Snapchat's flower crown filter has also been criticised for visibly lightening skin.

Bob Marley Snapchat filter


Faceapp has since responded with a statement, attributing the problem to bias in the training data behind its neural network.

"We are deeply sorry for this unquestionably serious issue," the statement reads. "It is an unfortunate side-effect of the underlying neural network caused by the training set bias, not intended behaviour."

While you're here, be sure to subscribe to our YouTube channel and check out the latest edition of YouTuber News.