YouTube is updating its policies and partnering with lawmakers and civil society to remove violative and extremist content online.
In 2018, the platform says it made more than 30 policy updates. Hate speech and harmful content are among the most complex subjects the platform has to address.
YouTube is consulting experts on subjects such as violent extremism, supremacism, civil rights, and free speech.
In a recent interview with CNN, Sundar Pichai, CEO of Google, commented on the subject: “It is a challenging problem.” He added, “It’s one of those things in which, let’s say, we are getting it right over 99% of the time. You’ll still be able to find examples. Our goal is to take that to a very, very small percentage, well below 1%.”
In 2017, YouTube took a firmer stance on videos with supremacist content, limiting recommendations and disabling features such as comments and the ability to share the video. This step reduced views of these videos by 80% on average.
The platform will now also prohibit videos alleging that a group is superior in order to justify discrimination, segregation, or exclusion based on attributes such as age, gender, race, caste, religion, sexual orientation, or veteran status.
Additionally, recommendations of borderline content and harmful misinformation, such as videos promoting a phony miracle cure for a serious illness, will be reduced.
If a user is watching a video that comes close to violating these policies, they will see more videos from authoritative sources (such as top news channels) in the “watch next” panel.
Moreover, channels that repeatedly violate the hate speech policies will be suspended from the YouTube Partner Program, meaning they will lose the ability to run ads on their channel or use other monetization features.