For the first time, Facebook has published figures showing the volume of spam, hate speech and other objectionable content it removed in the first quarter of 2018. The social network has come under increasing pressure from politicians, activists, and academics to curb the spread of fake news and provocative content. Facebook has now revealed the measures it has taken against posts that violate its standards.
The content falls into six categories:
- spam
- graphic violence
- adult nudity and sexual activity
- terrorist propaganda
- hate speech
- fake accounts
Spam accounted for 97% of all content removed from Facebook between January and March 2018, amounting to 837 million posts. A further 21 million posts involving adult nudity were deleted, 96% of which were found and flagged by Facebook's technology before being reported by users.
Facebook has admitted that its technology is still not fully effective at identifying hate speech. Even so, 2.5 million hate speech posts were removed from the social network in the first quarter of 2018, 38% of which were flagged by technology. Disabling fake accounts is also critical to flagging and eliminating spam: over the same period, 583 million fake accounts were disabled, the majority taken down within minutes of registration.
For graphic violence, Facebook took down or applied warning labels to about 3.5 million pieces of content in the first quarter of 2018, 86% of which was identified by its technology before being reported.
Mark Zuckerberg said: “We have a lot of work still to do to prevent abuse. We’re up against sophisticated adversaries who continually change tactics to circumvent our controls, which means we must continuously build and adapt our efforts. It’s why we’re investing heavily in more people and better technology to make Facebook safer for everyone.”