It has been rather peculiar to watch a corporation with virtually limitless resources like Facebook struggle to rein in the spread of fake news on its platform without compromising fairness. But that is easier said than done, and the fake news problem has once again beaten the social network.
The social network has tried, multiple times, to introduce new features that arrest the spread of fake news, yet none seem to have had the impact Facebook would have wished for. That pattern held once again, with the latest Facebook fake news test falling apart rather embarrassingly.
The newest attempt from Facebook involved automatically promoting comments that included the word 'fake' to the top of the comments section, in order to offer some counterpoint and objectivity to a news article that could potentially be fake. The insight behind the test was presumably that creating a sense of doubt in the minds of readers would encourage many of them to read further and verify the reports.
The BBC reports, "The trial, which Facebook says has now concluded, aimed to prioritise 'comments that indicate disbelief'. It meant feeds from the BBC, the Economist, the New York Times and the Guardian all began with a comment mentioning the word fake."
The observed top comments were of varied inclinations, but one common thread bound them together: the word fake. This threw even legitimate stories into the arena of questioning and doubt, which, understandably, amused none of the concerned parties.
"@facebook Every top comment showing on a political post is 'lie' or 'fake' Is this your new get-everyone-riled algorithm or bot inundation?" — joanna barrett (@jobrigitte), October 28, 2017
Facebook is already experiencing friction with publishers, and promoting such comments to the top of their stories was, rather obviously, bound to enrage them further.
"Clearly Facebook is under enormous pressure to tackle the problem of fake news, but to question the veracity of every single story is preposterous," Jen Roberts, a freelance PR consultant, told the BBC. "Quite the reverse of combating misinformation online, it is compounding the issue by blurring the lines between what is real and what isn't. My Facebook feed has become like some awful Orwellian doublethink experiment."
With Facebook being blamed as the tool Russians misused to manipulate the US Presidential Elections, Mark Zuckerberg has stated time and again that the company is taking the war on fake news very seriously. No effective solutions have surfaced yet, though; what can be said with certainty is only that Facebook is taking the problem seriously.
The recent test seems rather hasty and not well thought through, and the fact that the immediate backlash has resulted in Facebook scrapping the update is good news.
Facebook told the BBC, “We’re always working on ways to curb the spread of misinformation on our platform, and sometimes run tests to find new ways to do this. This was a small test which has now concluded. We wanted to see if prioritising comments that indicate disbelief would help. We’re going to keep working to find new ways to help our community make more informed decisions about what they read and share.”
The social network has also begun working with fact-checking websites to flag posts as suspicious or untrustworthy, but sadly, the results so far have been the same ol' same ol'.