Google-owned YouTube took down 8.3 million videos in the last three months of 2017, with machines doing most of the work in cleaning up the video-sharing platform.
The announcement comes alongside the launch of the Reporting History dashboard, which will allow YouTube users to see the status of videos that they have flagged.
YouTube Takes Down Over 8 Million Videos, Thanks To Machines
In an official blog post, YouTube revealed that it removed more than 8 million videos from the platform from October to December of last year. Most of the videos were either spam or attempts at uploading adult content to the website.
YouTube, however, attributed the numbers mostly to machines, which the company said have helped it flag and remove millions of inappropriate videos before users were able to watch them. YouTube said that 6.7 million videos were first flagged by machines instead of humans, and 76 percent of those videos were removed before they logged a single view.
It appears that YouTube is successfully training machines to flag objectionable videos. With millions of videos uploaded to the platform every quarter, machine learning looks like the only way for YouTube to keep up with the volume of content. The blog post added that YouTube's investment in machine learning is paying off both for high-risk, low-volume content, such as violent extremism, and for high-volume areas such as spam.
YouTube, however, is not taking humans out of the equation. Human review teams still support the work that machines do in flagging videos that may violate the website's policies, and reports from users remain an important part of the process.
The Reporting History dashboard will allow users to monitor the YouTube videos that they manually flag as possibly inappropriate. Through the dashboard, users will see the review status of their flagged videos and whether they are still available on the platform.
"We are committed to making sure that YouTube remains a vibrant community with strong systems to remove violative content and we look forward to providing you with more information on how those systems are performing and improving over time," YouTube wrote to end the blog post.
New YouTube Updates
The announcement of the results of YouTube's investment in machine learning for flagging inappropriate content comes after several other recent updates to the video-sharing platform.
YouTube recently said that it will release a version of the YouTube Kids app that relies on human curation rather than algorithms, which the company said will help parents and children avoid graphic content.
Last month, YouTube also revealed plans to add links to Wikipedia and other sources to conspiracy theory videos to help users decide for themselves whether to believe such content.