Facebook recruits 3,000 to its team of human moderators for objectionable content

After multiple self-harm and other objectionable videos surfaced on the platform, Facebook is adding 3,000 human moderators to review videos.

Mohammad Kanchwala

After multiple suicide videos broadcast on Facebook Live recently failed to be taken down, the company is adding 3,000 human moderators and reviewers to the team responsible for screening objectionable videos on the platform.

Close to 4,500 people are already part of this team, but that has proven insufficient as the number of videos on Facebook has skyrocketed in recent times. In a post from his official account, Mark Zuckerberg expressed sorrow over the recent spurt of incidents in which Facebook users broadcast self-harm videos through Facebook Live, and stressed the need to build a safer community by responding to such situations sooner.

He went on to disclose that, in order to rein in this misuse of the platform, the company would add 3,000 people to its existing team of 4,500 human moderators and reviewers to work through the millions of reports Facebook receives every week, and to speed up the removal of posts that violate Facebook's guidelines, such as hate speech and child exploitation.

“And we'll keep working with local community groups and law enforcement who are in the best position to help someone if they need it -- either because they're about to harm themselves, or because they're in danger from someone else,” the post adds.

Zuckerberg acknowledged the need to develop new tools to combat threats that endanger Facebook’s image as a safe place for friends and family to connect with each other.

“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if someone needs help. As these become available they should help make our community safer.”

Zuckerberg also shared an incident in which Facebook reached out to law enforcement and helped prevent a user who was considering suicide on a Live stream, while acknowledging that in most cases Facebook has been unsuccessful in preventing such incidents.

“No one should be in this situation in the first place, but if they are, then we should build a safe community that gets them the help they need,” Zuckerberg said, concluding the post.

The interesting part about Facebook deciding to bring back human moderators is that it had done away with its team of human curators in the past, following accusations of bias in how Trending Topics were highlighted on the platform.

The AI that Facebook entrusted with these responsibilities stumbled more than once, causing embarrassment for the company, but it appears Facebook has come to accept that, at times, human judgement is the best available option. At least for the next few years.
