YouTube enforces Community Guidelines with faster removals and a crackdown on violative comments

Social Samosa

YouTube has always used a mix of human reviewers and technology to address violative content on the platform, and in 2017 they started applying more advanced machine learning technology to flag content for review by their teams.

This combination of smart detection technology and highly-trained human reviewers has enabled them to consistently enforce their policies with increasing speed.

They are committed to tackling the challenge of quickly removing content that violates their Community Guidelines and reporting on the progress. That’s why in April they launched a quarterly YouTube Community Guidelines Enforcement Report. As part of this ongoing commitment to transparency, today they’re expanding the report to include additional data like channel removals, the number of comments removed, and the policy reason why a video or channel was removed.

Focus on removing violative content before it is viewed

They previously shared how technology is helping their human review teams remove content with speed and volume that could not be achieved with people alone. Finding all violative content on YouTube is an immense challenge, but they see this as one of their core responsibilities and are focused on continuously working towards removing the content before it is widely viewed.

· From July to September 2018, they removed 7.8 million videos

· 81% of these videos were first detected by machines

· Of those detected by machines, 74.5% had never received a single view
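Taken together, the figures above imply how many removals happened before anyone watched the video. A quick back-of-the-envelope calculation (approximate, since the report's percentages are rounded):

```python
# Figures from YouTube's Jul-Sep 2018 enforcement report
total_removed = 7_800_000                 # videos removed in the quarter
machine_flagged = 0.81 * total_removed    # 81% first detected by machines
zero_views = 0.745 * machine_flagged      # 74.5% of those had no views

print(f"Machine-flagged removals: ~{machine_flagged:,.0f}")   # ~6,318,000
print(f"Removed before a single view: ~{zero_views:,.0f}")    # ~4,706,910
```

In other words, roughly 4.7 million of the 7.8 million removed videos were taken down before they received a single view.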

When they detect a video that violates their Guidelines, they remove the video and apply a strike to the channel. They terminate entire channels if they are dedicated to posting content prohibited by their Community Guidelines or contain a single egregious violation, like child sexual exploitation.

The vast majority of attempted abuse comes from bad actors trying to upload spam or adult content: over 90% of the channels and over 80% of the videos that they removed in September 2018 were removed for violating their policies on spam or adult content.

Looking specifically at the most egregious, but low-volume areas, like violent extremism and child safety, their significant investment in fighting this type of content is having an impact:

Well over 90% of the videos uploaded in September 2018 and removed for Violent Extremism or Child Safety had fewer than 10 views.

Each quarter they may see these numbers fluctuate, especially when their teams tighten their policies or enforcement on a certain category to remove more content. For example, over the last year they’ve strengthened child safety enforcement, regularly consulting with experts to make sure their policies capture a broad range of content that may be harmful to children, including things like minors fighting or engaging in potentially dangerous dares. Accordingly, they saw that 10.2% of video removals were for child safety, while Child Sexual Abuse Material (CSAM) represents a fraction of a percent of the content they remove.


Making comments safer

As with videos, they use a combination of smart detection technology and human reviewers to flag, review, and remove spam, hate speech, and other abuse in comments.

They’ve also built tools that allow creators to moderate comments on their videos. For example, creators can choose to hold all comments for review or to automatically hold comments that have links or may contain offensive content. Over one million creators now use these tools to moderate their channel’s comments.

They’ve also been increasing their enforcement against violative comments:

From July to September of 2018, their teams removed over 224 million comments for violating their Community Guidelines.

The majority of these removals were for spam, and the total represents a fraction of the billions of comments posted on YouTube each quarter.

As they have removed more comments, they’ve seen their comment ecosystem actually grow, not shrink. Daily users are 11% more likely to be commenters than they were last year.

They are committed to making sure that YouTube remains a vibrant community, where creativity flourishes, independent creators make their living, and people connect worldwide over shared passions and interests. That means they will be unwavering in their fight against bad actors on their platform and in their efforts to remove egregious content before it is viewed. They know there is more work to do, and they are continuing to invest in people and technology to remove violative content quickly. They look forward to providing more updates.

You can download the entire report here.
