Tom Gara, opinion editor at BuzzFeed News, wrote on Twitter that he expected partisan outlets to fare worse: "This sounds like extremely good news for news publishers that aren't hated by one side or the other". Facebook, for its part, says it surveyed a vast, broadly representative range of people in its community (a measure that, among others, helps guard against gaming or abuse of the system) to develop the roadmap for these changes, and that the changes are not meant to directly affect any specific group of publishers based on size or ideological leaning.
The company chose to use community input to rank news sources in an attempt to be as objective as possible, Zuckerberg said.
He wrote in the blog post: "We surveyed a diverse and representative sample of people using Facebook across the United States to gauge their familiarity with, and trust in, various different sources of news".
Facebook began surveying users last week on their familiarity with certain outlets and how much they trust them. "That's why it's important that News Feed promotes high quality news that helps build a sense of common ground", Zuckerberg wrote.
Under the first change, rolling out in the United States next week, publications deemed trustworthy by people using Facebook may see an increase in their distribution.
Facebook says it will start prioritizing news from outlets that its users think are "trustworthy". The company initially proposed fighting false stories by letting users flag them. "This data will help to inform ranking in News Feed", the company said.
Facebook has had a stormy relationship with news organizations, especially those with strong political leanings. The open question is whether users can actually tell reliable sources from unreliable ones; if they cannot, the rankings will simply reflect that people trust too many untrustworthy sources.
Letting Facebook users essentially decide which news outlets are reliable enough to be worthy of "distribution" on the site is a seriously risky endeavor.
The move is likely to send shockwaves through the media landscape in almost every country, given the ubiquity of the world's largest social network and how central it has become in some places to the distribution of news. Facebook has dealt with controversies in the past when conservative news sources alleged unfair treatment, so it views community feedback as the most suitable method and claims the new system will put the power in users' hands. Because, as Facebook is now making increasingly clear, we are all ultimately responsible for vetting our own media consumption, even on social media.