Last updated on February 17th, 2017 at 06:53 pm
Soon after Trump’s victory over Hillary Clinton in the presidential race, Facebook came under fire for allegedly contributing to the upset result by allowing fake news to spread on its social networking site.
Although Mark Zuckerberg initially downplayed Facebook’s role, saying fake news made up only a minuscule percentage of content and had no bearing on the election outcome, the world’s most popular social networking site has since announced measures that would go a long way toward discouraging the spread of hoaxes and fake news.
Although Facebook and Google had earlier announced their intention to stop the spread of fake news on their respective platforms (read more: Facebook and Google Come Down Hard on Fake News Websites), Mark Zuckerberg’s most recent Facebook post goes further, describing concrete measures to do away with hoaxes and fake news on his site.
Speaking broadly about the strategy being implemented, he wrote:
“Today we’re making it easier to report hoaxes, and if many people report a story, then we’ll send it to third-party fact-checking organizations. If the fact checkers agree a story is a hoax, you’ll see a flag on the story saying it has been disputed, and that story may be less likely to show up in News Feed. You’ll still be able to read and share the story, but you’ll now have more information about whether fact checkers believe it’s accurate. No one will be able to make a disputed story into an ad or promote it on our platform.”
According to a December 15 post on Facebook’s newsroom (http://newsroom.fb.com/), the company cannot act as “arbiters of truth,” and hence the solution to the problem had to be approached carefully.
Basically, according to the update, there are four areas of work: “Easier Reporting,” “Flagging Stories as Disputed,” “Informed Sharing,” and “Disrupting Financial Incentives for Spammers.”
On the “Easier Reporting” aspect of the four-pronged strategy, the Facebook team is experimenting with different ways of making it easier for users to report hoaxes and fake news. Facebook believes it has to rely heavily on its community to help it “detect more fake news.”
The second area of work is “Flagging Stories as Disputed.” Flagging a story as disputed, a hoax, or fake is not easy work; hence, third-party involvement becomes essential.
“We’ve started a program to work with third-party fact-checking organizations that are signatories of Poynter’s International Fact Checking Code of Principles,” writes Adam Mosseri, VP of News Feed.
In simple terms, this means that once a story has been identified as disputed (fake or a hoax) by the third-party fact-checkers, it will be flagged as “disputed.”
In addition, a link to information explaining why the story has been disputed will also be provided, presumably to help users understand the flag.
“Informed Sharing” is the third area of activity in Facebook’s endeavour to tackle the menace. Again, community involvement is of prime importance; Facebook can only indicate and warn.
Last, but by no means least, is “Disrupting Financial Incentives for Spammers,” an area being tackled on a war footing. In fact, one is forced to think that this is the most important of the four, because monetary gain is the biggest driver of fake news across different sites, Facebook included.
“It’s important to us that the stories you see on Facebook are authentic and meaningful. We’re excited about this progress, but we know there’s more to be done. We’re going to keep working on this problem for as long as it takes to get it right,” concluded the FB newsroom piece on the subject.