Facebook Reveals What Content It Will Suppress, But Not Remove — Though Crucial Details Remain Hidden


TOPLINE Facebook published its guidelines on content distribution Thursday, reiterating that it throttles the reach of “problematic or low-quality” content including clickbait, misinformation and sensationalist health claims, as the company faces increasing pressure from users and lawmakers to reveal how it decides what content to display. 

KEY FACTS

There are dozens of reasons Facebook curtails the reach of content in its News Feed, according to the Content Distribution Guidelines published Thursday, including spam, unsafe reporting on suicide and “posts from broadly untrusted news publishers.”

Facebook said the guidelines, previously scattered across earlier announcements, have now been gathered in the platform’s months-old Transparency Center.


Facebook said there are three reasons it will reduce the reach of content: to incentivize creators to build “high-quality and accurate content,” to foster a “safer community,” and in response to direct user feedback. 

The measures apply to problematic posts that do not warrant removal from the site, and publishing them in one place is supposed to provide more clarity on Facebook’s internal process, the company said, as it faces mounting pressure around the world to reveal how it controls what users see.


According to the guidelines, Facebook also demotes posts by people who “probably have multiple accounts for the purposes of evading enforcement,” news articles lacking clear authorship and links that lead to pages containing explicit or shocking content. 

WHAT WE DON’T KNOW

The guidelines offered only a top-level look at how Facebook controls News Feed and did not, for example, explain how posts are demoted, by how much their reach is cut or whether different kinds of posts are throttled in different ways.


KEY BACKGROUND

The guidelines are not a new initiative of Facebook’s; they simply shine a small light on the company’s well-known but poorly understood practice of manipulating what users see. Recent investigations by the Wall Street Journal and the New York Times uncovered details of a project to show people positive stories about Facebook, separate rules for high-profile users and politicians, and evidence the company is fully aware its products can be harmful to users, including young girls. In 2014, Facebook revealed a clandestine experiment on nearly 700,000 users to determine whether it could manipulate their emotions through News Feed (the study found it could).

FURTHER READING

No More Apologies: Inside Facebook’s Push to Defend Its Image (NYT)
