Facebook makes drastic changes for ‘more meaningful’ News Feed
Facebook announced on Thursday it will be making changes to its News Feed in order to “prioritize posts that spark conversations and meaningful interactions between people.”
Mark Zuckerberg, Facebook founder and CEO, said he will shift the focus from helping users find relevant content to helping them find “more meaningful social interactions.”
“At its best, Facebook has always been about personal connections. By focusing on bringing people closer together – whether it’s with family and friends, or around important moments in the world – we can help make sure that Facebook is time well spent,” Zuckerberg wrote.
In the coming months, the company will begin rolling out updates that can “predict” which posts users may want to talk about with their friends and give these higher priority, according to a blog post by Adam Mosseri, the head of Facebook News Feed.
The social media giant said the updates could cause traffic to fall for certain “passive” public pages that do not generate conversations between people.
“The impact will vary from Page to Page, driven by factors including the type of content they produce and how people interact with it,” Mosseri wrote.
However, the updates could end up helping to spread more fake news and create more polarization among users.
With an average of two billion monthly users, Facebook became a main target of criticism for helping to spread fake news during the 2016 presidential election. Since then, the company has been pressured by Congress to crack down on misinformation and fake news shared on its platform.
Recently, the company partnered with fact-checkers and spent millions of dollars combing through posts and adding “disputed” tags in an effort to stop fake news from spreading on the platform. However, those efforts fell short after research showed the tags were not effective.
“Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs – the opposite effect to what we intended,” Facebook Product Manager Tessa Lyons said in a blog post in December.
The newly announced changes could also have the opposite effect to what the company intends.
By updating its algorithm to predict what a user is likely to engage with, Facebook may proliferate content that reinforces a user’s existing ideologies rather than content that challenges them or changes their minds.
Although users will still be able to post links to fake news articles and share them with friends under the new system, stories that garner many interactions will now be prioritized in a user’s News Feed, while factual news that debunks those stories will be deprioritized.
This could exacerbate the spread of fake news, as users will be less likely to see posts that offer a different perspective.
In its study on the effect of adding a “disputed” tag to fake news stories, Facebook found that a more effective way to slow the spread of fake news was simply to link to related articles, which gave readers more context.
“Indeed, we’ve found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown,” Lyons wrote.
When readers were given a contrasting view, they were better able to discern what was real and what was fake. By narrowing users’ viewpoints, Facebook may be helping fake news spread on its platform even faster.
However, Zuckerberg said he expects the changes to cause users to spend less time on the platform regardless.
“Now, I want to be clear: by making these changes, I expect the time people spend on Facebook and some measures of engagement will go down. But I also expect the time you do spend on Facebook will be more valuable. And if we do the right thing, I believe that will be good for our community and our business over the long term too,” Zuckerberg wrote.