Since late last year, when it came under fire for its role in helping spread “fake news” during the 2016 U.S. presidential cycle, Facebook has been working to find ways to rein in the spread of misinformation, disinformation and spam on its platform.
But last week, the social networking giant announced one of its most aggressive changes yet: to “reduce the influence of…spammers” who spread low-quality content, Facebook is updating its News Feed algorithm to “deprioritize” the content from users who share 50-plus links publicly a day.
According to Adam Mosseri, Facebook’s News Feed VP, “One of our core News Feed values is that News Feed should be informative. By taking steps like this to improve News Feed, we’re able to surface more stories that people find informative and reduce the spread of problematic links such as clickbait, sensationalism and misinformation.”
Mosseri says that the 50-plus limit will remain just one of many signals Facebook uses to determine how content is displayed in the News Feed and noted that the change will “only apply to links, such as an individual article, not to domains, Pages, videos, photos, check-ins or status updates.”
Why the 50-plus link limit?
According to Mosseri, sharing of 50-plus links per day is “one of the strongest signals we’ve ever found for identifying a broad range of problematic content.” As such, Facebook feels comfortable instituting an algorithm change based on this limit without adding additional filters that look at the actual content being shared. In other words, Facebook is targeting the users doing the sharing rather than the content being shared.
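A threshold-based signal of this kind is simple to sketch. The following is purely illustrative, assuming a flag-then-deprioritize flow; the function names, the 0.5 multiplier and the exact boundary semantics of “50-plus” are assumptions, and Facebook’s real News Feed ranking is of course far more complex than this:

```python
# Hypothetical sketch of the "50-plus public link shares per day" signal
# described in the article. All names and numbers besides the 50-link
# threshold are illustrative assumptions.

SHARE_THRESHOLD = 50  # public link shares per day, per the article


def is_high_volume_sharer(public_link_shares_today: int) -> bool:
    """Flag users sharing 50-plus public links in a single day."""
    return public_link_shares_today >= SHARE_THRESHOLD


def rank_score(base_score: float, sharer_flagged: bool) -> float:
    """Deprioritize (not remove) links shared by flagged users.

    The 0.5 multiplier is purely illustrative; the article only says
    such links are deprioritized, not by how much.
    """
    return base_score * 0.5 if sharer_flagged else base_score
```

Note that, per Mosseri, this would be just one signal among many, and it applies only to individual links, not to domains, Pages, videos, photos, check-ins or status updates.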
Facebook does not believe that its latest algorithm change will have much impact on publishers’ News Feed reach, but Mosseri did note that “publishers that get meaningful distribution from people who routinely share vast amounts of public posts per day may see a reduction in the distribution of those specific links.”
While all of this sounds fine in principle, there are bigger issues
First, Facebook’s change is a reminder of the power wielded by the world’s largest social network. Literally overnight, Facebook can make sweeping and rough-edged changes to how content is distributed on its platform. This specific change might have a limited impact on most publishers, but the potential for future changes that are more impactful remains.
Second, it remains to be seen just how effective Facebook’s latest change will be. After all, now that individuals using Facebook to intentionally spread low-quality content know about the algorithm change, they can alter their behavior. What will Facebook do if their behavior becomes harder to distinguish from the behavior Facebook considers legitimate?
Finally, it’s interesting that instead of banning accounts spreading high volumes of low-quality content, Facebook is simply deprioritizing their content in the News Feed. According to a tweet from TechCrunch’s Josh Constine, Facebook feels that “it can’t suspend spammers just for oversharing” but it’s somewhat hard to square that with the idea that many of these “spammers” are ostensibly violating Facebook’s terms.
Given this, skeptics might suggest that Facebook is trying to have its cake and eat it too: it doesn’t want to lose users – even if they’re intentionally flooding the company’s social network with low-quality content – but it wants to be able to say that it’s doing something to minimize the spread of their content.
Ultimately, it would appear that Facebook is in a lose-lose position
From both a practical and a principled standpoint, it seems Facebook will be hard-pressed to rein in the class of so-called spammers spreading low-quality content on its network while maintaining the moral high ground: it will face accusations either that it’s engaging in censorship or that it isn’t going far enough because doing so would hurt its financial interests.
Given this, publishers and marketers should expect that this latest News Feed algorithm change won’t be Facebook’s last, and they should continue to pay close attention to the changes Facebook is making in the fight against “fake news” because there’s no guarantee that future changes won’t affect them.