YouTube deletes more than 51 million videos used in scams

Between January and March of this year alone, 51 million videos and 1.97 million YouTube channels were taken down for violations of the platform's terms of use. The report published by Google illustrates the reach of the network's automated moderation systems, while also showing the scale of the problem of publications being used, above all, to run scams.

  • NASA and SpaceX broadcasts were used to promote Bitcoin scams
  • Brazil already has more than three million victims of WhatsApp cloning
  • More ransomware victims are paying to rescue their data

Attempts in this category, involving financial fraud, links to malicious content, spam and other instances of the kind, accounted for more than 87% of removals in the first quarter of 2020, equivalent to more than 44.8 million videos. Second was sexual or nudity content, which represented 8% of removals, with clips that put child safety at risk in third, at 2.6%. Other types of material, such as those promoting hate speech, cyberbullying or harassment, or in which users impersonate third parties, accounted for less than 1% of automatic blocks combined.

When it comes to manual removals based on user complaints, however, the work is a bit slower. According to YouTube, 399,400 videos were taken down by its team of human moderators, who independently review publications flagged either by user complaints or by what YouTube calls individual flaggers, contributors whose reports carry more weight. The transparency report also shows a total of 7,700 removals after complaints from NGOs and only 38 in response to government requests.

Another piece of good news is that almost half of all videos removed by YouTube had no views, meaning their scam attempts or other violations reached no one. That total is 49.9%, while another 27.4% did not exceed 10 views. YouTube did not elaborate on the reach of the remaining 22.7% of clips that exceeded that mark, leaving no indication of the potential reach of scams and other violations.

In the field of manual removals, things change a little. Scam attempts remain at the top, at 37%, but content unsafe for children appears in second place, at 24.3%, followed by sexual or nudity content (14.3%) and violence (11.4%). The divergence in the numbers comes down to the fact that, especially where children are involved, it is harder for an algorithm to identify material of this type automatically.

Brazil appears in third place on the list of countries most affected by the publication of irregular material, with 484,500 videos removed, indicating that many of them may also appear in Portuguese given the service's popularity in the country. Brazil trails only the United States (1 million) and India (826,600), which lead the ranking, and sits ahead of Indonesia (442,100) and South Korea (262,000), which round out the top five for violations and automatic content removals.

Automatic content removal also happens in the comments, with YouTube reporting a total of 693.5 million posts automatically taken down. Again, the majority are scam attempts, at 67.9% of cases, followed by issues related to child safety (13.9%), harassment and bullying (11.8%) and, finally, hate speech or abusive content (6.3%).

In the comments, too, the algorithm does most of the work, with 99.6% of detections happening automatically. According to YouTube, the total number of removed comments covers only content posted on videos that remain live, not counting comments that disappeared from the platform when entire channels or posts were removed.