YouTube Takes Down Over 58 Million Videos In Three Months of 2018

The video-streaming platform has shared its quarterly report, highlighting that tens of millions of videos were removed.

S Aadeetya
Tech News
YouTube in 2018 has made its way to entry-level devices like JioPhone in India. (Photo: The Quint)


In a bid to suppress objectionable content on its platform, popular video-streaming platform YouTube said on Friday that it took down over 58 million videos in the third quarter of 2018.

The Alphabet-owned platform, which clocks over 1.8 billion users globally every month (as of May 2018), also reiterated that government requests for content takedowns will be acted on within around 36 hours, according to a Reuters report.

Extremist content has reportedly been freely available on YouTube, in some cases instigating communal violence, which hasn’t gone down well with governments around the world that are seeking better cooperation from Alphabet’s video platform.

During September, 90 percent of the nearly 10,400 videos removed for violent extremism, and of the 279,600 videos removed for child-safety issues, received fewer than 10 views, YouTube said, according to the Reuters report.

YouTube removed about 1.67 million channels and all of the 50.2 million videos that were available from them.

Nearly 80 percent of the channel takedowns were related to spam uploads, YouTube said, while about 13 percent concerned nudity and 4.5 percent child safety. YouTube also said users post billions of comments each quarter.

Between July and September 2018, YouTube removed over 7.8 million individual videos for violating its community guidelines; together with the 50.2 million videos taken down along with their channels, that accounts for the roughly 58 million total. About 81 percent of these videos were flagged by machines, and of those, 74.5 percent had never received a single view.

The report further highlights that most of the content removed during this period was spam, which is quickly identified by YouTube’s automated detection tools.

But automated detection is largely confined to categories such as spam, extremism and nudity. For content that is harder for the automated tools to decipher, YouTube banks on user-submitted complaints.

YouTube app on an Android Tablet (Photo: iStock)

Such content usually gets flagged only after it has already been widely consumed. To fix this, YouTube has over 10,000 moderators who keep a close eye on user reports of objectionable content, and the company is hopeful of sanitising the platform with help from its users.

We’ve seen our comment ecosystem actually grow, not shrink. Daily users are 11 percent more likely to be commenters this year than they were last year.
YouTube statement

The more users YouTube gets on board to comment and share feedback, the better the quality of content available on the platform is likely to be. It’s still early days, but one hopes these measures will lead to positive results in the coming years.

(With additional inputs from Reuters)


