NEW YORK -- Facebook is taking new measures to curb the spread of fake news on its huge and influential social network. It will focus on the "worst of the worst" offenders and partner with outside fact-checkers and news organizations to sort honest news reports from made-up stories that play to people's passions and preconceived notions.
The social network will make it easier for users to report fake news when they see it, which they'll be able to do in two steps, not three. If enough people report a story as fake, Facebook will pass it to third-party fact-checking organizations that are part of the nonprofit Poynter Institute's International Fact-Checking Network.
Five fact-checking and news organizations are working with Facebook on this: ABC News, The Associated Press, FactCheck.org, PolitiFact and Snopes. Facebook said this group is likely to expand.
Stories that flunk the fact check won't be removed from Facebook, but they'll be flagged publicly as "disputed," which will force them to appear lower down in people's news feed. Users can click on a link to learn why. And if people decide they want to share the story with friends anyway, they can -- but they'll get another warning.
"We do believe that we have an obligation to combat the spread of fake news," said John Hegeman, vice president of product management on news feed, in an interview.
But he added that Facebook also takes seriously its role in providing people an open platform, and that it is not the company's place to decide what is true or false.
Fake news stories touch on a broad range of subjects, from unproven cancer cures to celebrity hoaxes and backyard Bigfoot sightings.
But fake political stories have drawn outsized attention because of the possibility they influenced public perceptions and could have swayed the U.S. presidential election.
There have been dangerous real-world consequences.
A fake story about a child sex ring at a Washington, D.C., pizzeria prompted a man to fire an assault rifle inside the restaurant.
By partnering with respected outside organizations and flagging, rather than removing, fake stories, Facebook is sidestepping some of the biggest concerns experts had raised about it exercising its considerable power in this area.
For instance, some worried Facebook might act as a censor -- and not a skillful one, either, being an engineer-led company with little experience making complex media-ethics decisions.
"They definitely don't have the expertise," said Robyn Caplan, researcher at Data & Society, a not-for-profit research institute funded in part by Microsoft and the National Science Foundation.
In an interview before Facebook's announcement, she urged the company to "engage media professionals and organizations that are working on these issues."
Facebook CEO Mark Zuckerberg has said fake news constitutes less than 1 percent of what's on Facebook, but critics contend that's wildly misleading.
For a site with nearly 2 billion users tapping out posts by the millisecond, even 1 percent is a huge number, especially since the total includes everything that's posted on Facebook -- photos, videos and daily updates in addition to news articles.
In a study released Thursday, the Pew Research Center found nearly a quarter of Americans said they have shared a made-up news story, either knowingly or unknowingly.