Fake news stories touch on a wide range of subjects, from unproven cancer cures to celebrity hoaxes and more. However, fake political stories have drawn particular attention, raising the possibility that they distorted public perception and might have swayed the US presidential election.
Last week, Mark Zuckerberg's line was that over 99 percent of content on Facebook is authentic. Now, he still maintains that there is only a "small" amount of misinformation, but he outlines seven points describing programs the company is rolling out in response. These include plans we'd already heard about: cutting off advertising income for fake news sites and making it easier to report false news stories when they pop up.
Misinformation And Fake News Can Spread Faster Than Real News
Forty-two percent put this responsibility on social networking sites and search engines, and a similar percentage on the public itself. Fake news stories can go viral faster than news stories from mainstream sources. This is because they are created to be shared and easy to click, often built around inflammatory topics and stories that trigger emotional responses.
Fact-Checkers Will Step In
Stories that fail the fact check won't be removed from Facebook, but they will be publicly flagged as disputed, which will push them lower in people's news feeds. Users can also click on a link to learn why a story was flagged. And if people want to share the story with friends anyway, they still can, but they'll get a warning first.
Facebook is also collaborating with Poynter's International Fact-Checking Network, which includes Snopes.com, Factcheck.org, PolitiFact, the Associated Press, and ABC News, to assemble a small army of fact-checkers who will regularly review flagged news items on Facebook. In addition, the company is considering ways to effectively block fake news sites from amplifying their lies with news feed ads.