Facebook CEO Mark Zuckerberg recently announced major changes to the news feed that give less airtime to posts from publishers, businesses, and celebrities while prioritising content from friends and family.
Publications trusted by a broad cross-section of Facebook users will get priority over those that have low trust ratings.
Mr Zuckerberg says he isn't "comfortable" with judging the trustworthiness of news outlets, doubtless because he doesn't want to alienate users of any particular political stripe or affiliation.
When you select a person or page to see first, their posts appear at the top of your news feed.
"We decided that having the community determine which sources are broadly trusted would be most objective," Zuckerberg wrote. The company plans to establish those trust ratings by running surveys. Perhaps the survey results will be some indication of who the winners and losers on the Facebook platform will be. "Clearly we don't know." There has also been criticism that the change will have the unintended effect of reducing the visibility of high-quality, more thoughtful pieces that may not surface under the new algorithm. "Publishers have known that investing time and effort in Facebook gives the platform a terrible lot of power: more data about consumers and more control of the relationship," says Brian Wieser, a media analyst at Pivotal Research.
Now publishers will have to figure it out on their own. Facebook is always going to do what it thinks is right for itself. When people watch videos they tend not to comment, like, or converse with friends, he added. "I'm pretty sure most publishers will be able to find ways to compensate." Trusted publishers are not only likely to see upticks in referral traffic, and therefore potentially higher ad revenue; they will also receive a reputation bump that they can leverage to convince users to pay for subscriptions.
AdNews has been told that this latest move only impacts the United States - for now.
The change to the Facebook news feed comes as the online giant seeks to address charges that it - along with Google and Twitter - failed to prevent the spread of bogus news, most strikingly ahead of the 2016 election in the United States, Agence France-Presse reported.
Simply put, this means you'll see more posts from your friends and less content from pages.
In a report from First Draft News that looked at how disinformation works, Claire Wardle and Hossein Derakhshan pointed out that for many users, sharing is performative: in other words, they don't share fake news posts because they believe the posts are factually accurate, but because doing so fits the worldview of a specific group they would like to belong to.