Facebook unveiled major changes Friday to the News Feed of its 2 billion users, announcing it will rank news organizations by credibility based on user feedback and diminish its role as an arbiter of the news people see.
The move comes after the company endured harsh criticism for allowing disinformation to spread on its social network and for favoring liberal outlets over conservative ones. In a blog post accompanying the announcement, chief executive Mark Zuckerberg wrote that Facebook is not "comfortable" deciding which news sources are the most trustworthy in a "world with so much division."
"We decided that having the community determine which sources are broadly trusted would be most objective," he wrote.
The new trust rankings will emerge from surveys the company is conducting. "Broadly trusted" outlets that are affirmed by a significant cross-section of users may see a boost in readership, while lesser-known organizations or startups that receive poor ratings could see their web traffic on the social network decline significantly. The company's changes include an effort to boost the content of local news outlets, which have suffered sizable subscription and readership declines as news consumption migrated online.
The changes follow another major News Feed redesign, announced last week, in which Facebook said users would begin to see less content from news organizations and brands in favor of "meaningful" posts from friends and family. Currently, 5 percent of Facebook posts are generated by news organizations; that number is expected to drop to 4 percent after the redesign, Zuckerberg said.
Facebook and other Silicon Valley giants are grappling with their roles as dominant distributors of information in an era of foreign manipulation of social media platforms and dwindling revenues for many media outlets. On Friday, Google announced it would cancel a two-month-old experiment that used its Knowledge Panel feature to inform users when a news article had been disputed by independent fact-checking organizations. Conservatives had complained the feature unfairly targeted a right-leaning outlet.
More than two-thirds of Americans now get at least some of their news from social media, according to the Pew Research Center. That shift has empowered Facebook and Google, putting them in the uncomfortable position of deciding what news to distribute to their global audiences. It has also led to questions about whether these corporations should be considered media companies.
Daniel Kreiss, a professor at the school of media and journalism at the University of North Carolina at Chapel Hill, said Facebook was now "offloading" its responsibilities for accuracy and quality onto its users. "Just by putting things out to a vote in terms of what the community would find trustworthy undermines the role for any serious institutionalized process to determine what's quality and what's not," he said.
Facebook has also been the target of accusations of political partisanship. In the summer of 2016, Facebook was accused of excluding conservative media outlets from Trending Topics, a list of top stories that runs on the upper right-hand side of Facebook pages. An internal investigation found that the company had left the ranking decisions up to low-level contractors.
Zuckerberg ultimately invited conservative media figures to Facebook headquarters in Menlo Park, Calif., to apologize for the misunderstanding.
He also fired the contractors who served as editors of Trending Topics, opting instead for a more technological approach. But outsourcing the decisions to software algorithms drew further criticism that the social network had become vulnerable to bad actors seeking to spread disinformation: quality news often got less exposure than clickbait, and users were not being exposed to diverse viewpoints.
These issues exploded during the 2016 presidential campaign, as false stories, such as one claiming the pope had endorsed Donald Trump for president, generated more traffic than reports from mainstream news outlets, and as investigators discovered that Russian operatives were using the social network to spread disinformation and divisive content.
Jay Rosen, a journalism professor at New York University, said Facebook took the wrong lesson from the Trending Topics episode: trying to avoid politics at all costs. "One of the things that can happen if you are determined to avoid politics at all costs is you are driven to illusory solutions," he said. "I don't think there is any alternative to using your judgment. But Facebook is convinced that there is. This idea that they can avoid judgment is part of their problem."
He acknowledged Facebook was in a tough position. "They are looking for a safe approach," Rosen added. "And sometimes you can be in a situation where there is no safe route out."
Surveys of media credibility have varied widely over the years, partly because political persuasion appears to shape how Americans view news organizations. A poll conducted last year by the Missouri School of Journalism found BuzzFeed and Breitbart to be among the least credible sources, while The Economist and "public television" were the most trusted.
Facebook revealed few details about how it is conducting its trust surveys, declining to share copies with The Washington Post. But Zuckerberg wrote that the decision came after substantial internal debate.
"The hard question we've struggled with is how to decide what news sources are broadly trusted," Zuckerberg wrote. "We could try to make that decision ourselves, but that's not something we're comfortable with. We considered asking outside experts, which would take the decision out of our hands but would likely not solve the objectivity problem. Or we could ask you — the community — and have your feedback determine the ranking."
Facebook's previous efforts to ask its users to assess the accuracy of news have not always turned out well. Last year, the company launched a feature that allowed users to flag news stories they felt were inaccurate. The experiment was shuttered after nine months.
Some experts wondered whether Facebook's latest effort could be gamed.
Renee DiResta, head of public policy at the nonprofit organization Data for Democracy, applauded efforts by technology companies to come up with innovative ways to measure quality and weed out disinformation. But she raised concerns over whether Facebook's survey could be manipulated, for example, if certain media organizations encouraged their readers to flood the surveys with identical responses.
"This seems like a positive step toward improving the news environment on Facebook," Diresta said. "That said, the potential downside is that the survey approach unfairly penalizes emerging publications."