Facebook's fake news battle will rest on a new three-pronged approach


Facebook has revealed plans to roll out a three-pronged approach in a fresh bid to tackle its ongoing fake news problem. Product Manager Tessa Lyons shares insights into the Facebook fake news battle plan, announced today as part of the Hard Questions series.

Facebook's strategy to combat fake news and stop misinformation is divided into three parts, as per the blog post by Tessa Lyons:

1. Remove accounts and content that violate our Community Standards or ad policies

2. Reduce the distribution of false news and inauthentic content like clickbait

3. Inform people by giving them more context on the posts they see

Frequent violators will be punished with temporary or permanent suspension, which could significantly arrest the spread of fake news stories.

Here’s how each prong of the new three-pronged strategy for tackling fake news works.

1. Remove accounts and content that violate our Community Standards or ad policies

Although the company says that false news does not violate Facebook’s Community Standards, it often violates policies in other categories, such as spam, hate speech or fake accounts, and content in those categories is removed.

For example, if the company finds a Facebook Page that pretends to be run by Americans but is actually operated out of Macedonia, that violates Facebook’s requirement that people use their real identities and not impersonate others. “So we’ll take down that whole Page, immediately eliminating any posts they made that might have been false,” Lyons explains.

Over the past year, Facebook claims to have learned more about how networks of bad actors work together to spread misinformation, and says it has created a new policy to tackle coordinated inauthentic activity.

“We’re also using machine learning to help our teams detect fraud and enforce our policies against spam. We now block millions of fake accounts every day when they try to register,” she elaborates.

[Video: “Authenticity and Quality in News Feed,” posted by Facebook for Developers on Tuesday, 1 May 2018 — more on Facebook’s efforts to improve experiences on the platform and reduce the spread of false news.]

2. Reducing the spread of false news and inauthentic content

A lot of the misinformation that spreads on Facebook is financially motivated, much like email spam in the 90s. If spammers can get enough people to click on fake stories and visit their sites, they’ll make money off the ads they show.

By making these scams unprofitable, Facebook says it removes the incentive to spread false news on the platform. So it is identifying spammers’ common tactics and reducing the distribution of those kinds of stories in News Feed. Facebook has started penalizing clickbait, links shared disproportionately by spammers, and links to low-quality web pages, also known as “ad farms.”
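The demotion step described above can be pictured as multiplying a post’s base ranking score by a penalty for each spam-style signal it triggers. The signal names and multipliers below are illustrative assumptions; Facebook’s real ranking system and its penalty values are not public.

```python
# Sketch of demoting spam-style links in a News Feed ranking pass.
# Penalty multipliers are invented for illustration only.

DEMOTIONS = {
    "clickbait": 0.5,     # sensationalist headline patterns
    "ad_farm": 0.3,       # link leads to a low-quality, ad-heavy page
    "spam_shared": 0.5,   # link shared disproportionately by known spammers
}

def rank_score(base_score, signals):
    """Multiply the base relevance score by the penalty for each matched signal."""
    score = base_score
    for signal in signals:
        score *= DEMOTIONS.get(signal, 1.0)  # unknown signals leave the score unchanged
    return score
```

Because penalties compound, a link that is both clickbait and an ad farm ends up far lower in the feed than one triggering a single signal, which matches the stated goal of making the scam uneconomical rather than banning the content outright.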


They also claim to be taking action against entire Pages and websites that repeatedly share false news, reducing their overall News Feed distribution. “And since we don’t want to make money off of misinformation or help those who create it profit, these publishers are not allowed to run ads or use our monetization features like Instant Articles,” Lyons further states.

Another part of Facebook’s strategy in some countries is partnering with third-party fact-checkers to review and rate the accuracy of articles and posts on Facebook. These fact-checkers are independent and certified through the non-partisan International Fact-Checking Network. When these organizations rate something as false, Facebook ranks those stories significantly lower in News Feed; on average, this cuts future views by more than 80%. The company also uses the information from fact-checkers to improve its technology so it can identify potential false news faster in the future, and it plans to bring the program to more countries this year.
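The cited figure of an average cut of more than 80% in future views corresponds to keeping at most about 20% of a story’s expected distribution once it is rated false. A minimal sketch of that demotion, with the exact multiplier assumed since the real mechanism is not public:

```python
# Illustrative model of the fact-check demotion described in the post.
# The 0.2 factor is an assumption derived from the ">80% fewer views" figure.

FALSE_RATING_FACTOR = 0.2  # retain at most ~20% of expected distribution

def demote_if_rated_false(expected_views, rated_false):
    """Scale a story's expected future views down once a fact-checker rates it false."""
    if rated_false:
        return expected_views * FALSE_RATING_FACTOR
    return expected_views
```

The important design choice the post describes is demotion rather than deletion: the story remains on the platform, but its reach collapses once independent fact-checkers flag it.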

3. Informing our community with additional context

Even with these steps, Facebook knows people will still come across misleading content on Facebook and the internet more broadly. To help people make informed decisions about what to read, trust and share, Facebook is investing in news literacy and building products that give people more information directly in News Feed.

“For example, we recently rolled out a feature to give people more information about the publishers and articles they see, such as the publisher’s Wikipedia entry,” Lyons says.

Another feature, called Related Articles, displays articles from third-party fact-checkers immediately below a story on the same topic. If a fact-checker has rated a story as false, Facebook will let people who try to share the story know there’s more reporting on the subject. Facebook will also notify people who previously shared the story on Facebook.

“Last year we created an educational tool to give people tips to identify false news and provided a founding grant for the News Integrity Initiative to invest in long-term strategies for news literacy,” Lyons concludes.

Facebook says they are working with their AI research team, learning from academics, and collaborating closely with third-party fact-checkers and other organizations in their effort to weed out fake news that, in their words, is “disruptive and has destructive consequences around the world.”