Facebook wants to reintroduce its algorithms to users. The company recently took several steps to bolster user trust in its ranking and recommendation systems. In a blog post, Facebook said it would make it easier for users to manage what appears in their feeds, pointing to new and existing tools. In an apparent attempt to back up that statement, Facebook’s vice president of global affairs, Nick Clegg, published a 5,000-word Medium post defending the company’s ranking algorithms and dismissing the argument that they create harmful echo chambers. Clegg made the same defense in a wide-ranging interview with The Verge published the same day.
Taken together, these moves appear to be part of a concerted effort by Facebook to rehabilitate the image of its algorithms, which many critics say actively promote and incentivize political polarization, disinformation, and extreme content. The effort comes as the company faces sharp scrutiny from lawmakers over its platform design, and just a week after CEO Mark Zuckerberg appeared before Congress at a hearing on misinformation.
In its new public relations push, Facebook is promoting the message that it isn’t to blame for the spread of polarization and extreme content on its platforms, and that it is taking reasonable steps to address both. That puts it at odds with longtime critics who argue that Facebook’s algorithms are designed to reward the most offensive content, a charge both the company and Clegg deny.
It’s worth noting that two of the resources Facebook announced on Wednesday were already available (Recode wrote about both last year). Users can prioritize up to 30 sources in their feed using “Favorites,” and “Why am I seeing this?” explains why a specific piece of content appears in their feed. Facebook also announced a “Feed Filter” tool on Wednesday, which will let users switch between an algorithmically curated feed, a feed limited to the pages they’ve “favorited,” and a reverse-chronological feed. Notably, there is no way to make the reverse-chronological view permanent.
In his Medium post published on Wednesday, Clegg argues that Facebook’s newly announced News Feed features give users more control over, and more visibility into, how the algorithm ranks content, alongside measures to demote clickbait and misleading or harmful material. He argues that promoting extreme content is not in Facebook’s commercial interest, because advertisers, the lifeblood of the company, dislike it. Clegg also told The Verge that Facebook has no need to “send people the kind of sugar rush of artificially polarizing content” if it wants to retain its users for the long haul.
Long-time Facebook critics, on the other hand, don’t seem to believe these claims.
Recode quoted a spokesperson for the Real Facebook Oversight Board, a coalition of academics and activists critical of Facebook, as saying, “Nick Clegg’s Medium post is a cynical, stunning show of gaslighting on a scale hard to fathom even for Facebook.” The spokesperson continued: “‘Where does FB’s reward lie?’ says Clegg. Perhaps a better question is: where does Nick Clegg’s motivation lie? The answer is unmistakable.”