Will The Kyle Rittenhouse Verdict Change How Facebook Treats Gun Violence—Again?

Kyle Rittenhouse Found Not Guilty In Kenosha Protest Shootings

When Kyle Rittenhouse fatally shot two men and maimed a third in August 2020, Facebook took relatively swift action. A day after the incident in Kenosha, Wisconsin, it removed his Facebook and Instagram accounts, started to prohibit posts praising him and blocked his name from the apps’ search function.

The moves came as part of a new Facebook policy around violence and mass shootings that debuted that same week, though it’s unclear whether the policy took effect before or after Rittenhouse shot those men. And as part of its decision to reduce Rittenhouse’s profile on the platform, the company officially designated him a “mass shooter.”

But the steps immediately drew criticism from within Facebook. In a post to Facebook’s internal Workplace message board days later, one employee wrote: “If Kyle Rittenhouse had killed 1 person instead of 2, would it still qualify as a mass shooting? Can we really consistently and objectively differentiate between support (not allowed) and discussion of whether he is being treated justly (allowed)?”


The Workplace post went on: “Can we really handle this at scale in an objective way without making critical errors in both under and over enforcement?”

The comment cuts to the core of Facebook’s dilemma. The company has spent years reckoning with what type of content to regulate and how to regulate it. Roundly criticized by liberals for not doing enough and by conservatives for doing too much, it is pulled in both directions and more often than not pleases neither side.


Recently, it has been pressed to take a stronger stance against violent content and posts that might lead to violence, a position that might seem likely to draw universal support. It hasn’t. And on Friday, matters grew even more complicated for Facebook: A jury found Rittenhouse not guilty, reigniting outcries from right-wing pundits that the company had unfairly rushed to penalize him. (His lawyer convinced the jury that Rittenhouse acted in self-defense that August evening in Kenosha, a city then engulfed in protests over the police shooting of Jacob Blake, a 29-year-old Black man.)

[Embedded tweet from Jack Poso 🇺🇸 (@JackPosobiec), November 19, 2021]

Facebook has long been reluctant to make judgment calls about what belongs on its site—and when it has prohibited material such as violent content, it has not always succeeded in keeping it off the platform. One of the most dramatic examples: the March 2019 Christchurch shooting in New Zealand, where the shooter livestreamed his onslaught on Facebook and YouTube. No one reported the video to Facebook until 29 minutes after it went up, and no part of the video triggered Facebook’s automated moderation software, according to an internal Facebook report about the Christchurch shooter. Facebook eventually shut off the feed, but it would spend the next day taking down 1.5 million copies of the video. In response, Facebook changed a number of its policies related to live videos, including speeding up how quickly its software reviews new live broadcasts. (Before Christchurch, a broadcast typically ran for five minutes before the software examined it; subsequent changes cut that to about 20 seconds.)

As with many Facebook policy changes, these were reactive, and the more recent past has seen Facebook struggling to keep up with events unfolding on its platform. In August 2020, shortly after the Rittenhouse shootings, Facebook CEO Mark Zuckerberg acknowledged the company had erred in not taking down a Facebook event page that encouraged a militia to form in the same Wisconsin city where Rittenhouse shot the three men. Users reported the militia group 455 times before Facebook removed it. And then in January, Facebook took measures against posts related to the U.S. Capitol riot only in the aftermath of the insurrection—even though sentiment delegitimizing the election had blossomed on Facebook in the months after Joe Biden’s victory, another internal Facebook report shows.

The Rittenhouse verdict raises a whole new set of questions. When should a “mass shooter” label get affixed to someone—before or after a trial? Ever? How exactly should Facebook tamp down on posts? Should it scrap the mass shooter policy entirely?


Over the weekend, Facebook, which didn’t return requests for comment, was backtracking once again. It lifted its block on searches for “Kyle Rittenhouse” on Facebook and Instagram, allowing posts about Rittenhouse from right-wing media personalities like Ben Shapiro and Dan Bongino to attract tens of thousands of comments, reshares and reaction emojis, the signals that push posts further up users’ Facebook feeds. One Facebook group, American Patriots Appeal, is advertising a $27.99 Rittenhouse T-shirt that shows him crouched, G.I. Joe-style, holding a semiautomatic rifle, emblazoned with the phrase: “Kyle Rittenhouse Did Nothing Wrong.”

The internal Facebook documents cited in this story come from the trove that Facebook whistleblower Frances Haugen turned over to the SEC; redacted versions have gone to Congress and a consortium of news organizations, including Forbes. They’re popularly known as The Facebook Papers.

By Abram Brown, Forbes Staff

