In its first quarterly Community Guidelines Enforcement report, YouTube says it has removed more than 8 million videos, 80% of them flagged by machine learning.

YouTube is trying to restore its image with a major spring cleaning of violent content. To that end, the group led by Susan Wojcicki is playing the transparency card, publishing its first quarterly report on the enforcement of its community rules (Community Guidelines Enforcement).

The transparency dashboard

The video-sharing platform is also launching a dashboard that lets users track the status of the videos they have flagged on YouTube for removal.


“This regular update will help us show the progress we are making in removing illicit content from our platform,” the company said in a blog post. “By the end of the year, we plan to refine our reporting systems and add additional data, including data on comments, speed of removal, and policy removal reasons.”

The report states that 8.28 million videos were removed in the last quarter of 2017 for violating the platform’s guidelines on acceptable content. For example, 8% of these removed videos were flagged for violent extremist content and were taken down before reaching 10 views. The platform states that “we have hired full-time specialists in violent extremism, counter-terrorism and human rights, and we have expanded our regional expert teams.”

Most of the videos flagged by users, however, were reported for sexual content (30.1%) or spam (26.4%).

Machine learning to the rescue

It should be noted that the vast majority of these videos were detected by machines.

Indeed, 6.7 million of them were first flagged by the machine-learning systems deployed in June 2017, representing 80% of the deleted videos.

“Our investment in machine learning to accelerate deletions is paying off in high-risk, low-volume areas such as violent extremism, and in high-traffic areas such as spam.”

Responsiveness highlighted by YouTube

As if to respond to criticism, YouTube also highlights its effectiveness and speed in erasing inappropriate content: of the 6.7 million videos first flagged by machines, 76% were removed before receiving a single view.