Sexually explicit material, terrorist propaganda, hate comments: social networks are now deleting grossly offensive content from their platforms, sometimes thousands of times a day.
Software now plays the lead role when YouTube removes videos from its platform. Of the 8.3 million clips the company deleted in the final quarter of 2017, machines detected a good 80 percent. Around three-quarters of these 6.7 million videos were removed before any user saw them, the Google video platform emphasized in a blog post.
YouTube deletions: who makes the decision?
The YouTube software uses a database of known problematic videos, which are blocked as soon as someone tries to upload them again. At the same time, it increasingly analyzes the content of videos themselves in order to locate problematic clips and mark them for examination. The final decision, however, is still largely made by Google employees.
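The blocklist step described above can be sketched in a few lines of Python. Everything here is a simplified illustration: the function names and the exact SHA-256 match are assumptions, since real platforms rely on proprietary perceptual fingerprints that survive re-encoding, not plain file hashes.

```python
import hashlib

# Hypothetical blocklist of fingerprints of previously removed videos.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"previously-removed-clip").hexdigest(),
}

def fingerprint(video_bytes: bytes) -> str:
    """Return a fingerprint for an uploaded file (illustrative only)."""
    return hashlib.sha256(video_bytes).hexdigest()

def check_upload(video_bytes: bytes) -> str:
    """Block exact re-uploads of known videos; queue the rest for analysis."""
    if fingerprint(video_bytes) in KNOWN_BAD_HASHES:
        return "blocked"          # known problematic video, stopped at upload
    return "queued_for_analysis"  # content analysis may still flag it later
```

A re-upload of a known clip is stopped immediately, while anything new passes through to the (here omitted) content-analysis stage, mirroring the two-step process described above.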
Of the videos reported by people, 30 percent were flagged as sexually explicit and 27 percent as spam or misleading content. Terrorist propaganda accounted for around 500,000 reports (two percent of the total). India, the USA, Brazil, and Russia were the countries with the most reports.
Facebook: hundreds of thousands of posts deleted
Facebook claims to be cracking down harder than ever on extremist content on its platform: in the first quarter of 2018, the world’s largest online network removed hundreds of thousands of pieces of content related to terrorist organizations. A total of 1.9 million posts were removed or provided with warnings – twice as many as in the previous quarter, Facebook said. Almost 99 percent of this content was detected not through user reports but by automated software and the company’s own reviewers. On average, such posts were available for less than a minute on the platform, it said. Detection is handled by a team specializing in terrorist content; according to Facebook, the group has grown from 150 to 200 employees since June 2017, with more to come.
At the same time, the online service published its definition of terrorism: “Any non-governmental organization that intentionally commits violence against persons or property in order to intimidate a civilian population, government or international organization in order to achieve a political, religious or ideological goal.” Governments, on the other hand, may under this legal understanding lawfully use force in certain circumstances.
Facebook and YouTube: Why the deletions?
YouTube intends to publish its video-deletion statistics regularly in the future. The Google video platform, Facebook, and other online companies have repeatedly been accused of, among other things, not removing extremist content fast enough. In early March 2018, the European Commission issued new guidelines on the removal of terrorist and other illegal content from online platforms in order to increase the pressure on such companies.