In Wake Of Shootings, Facebook Struggles To Define Hate Speech

In a post on Facebook, CEO Mark Zuckerberg wrote that the live-streamed images following a police shooting in Minnesota were "graphic and heartbreaking." (Josh Edelson / AFP/Getty Images)

In the wake of last week's shootings, Facebook has seen a significant spike in flagged content, with users calling out each other's posts as racist, violent and offensive. Facebook employees say the company is having a very hard time deciding who is right, or how to define hate speech.

Unpublished, and re-published

The day after Diamond Reynolds live-streamed her fiancé bleeding to death after he was shot by police in suburban St. Paul, Minn., the CEO of Facebook weighed in with a post. Mark Zuckerberg said the images were "graphic and heartbreaking," a reminder of why it's so important to "build a more open and connected world."

What he didn't mention is the downside of being connected. Employees inside Facebook tell NPR the company is struggling internally to deal with the fallout — the posting wars of the last week. The policy team has been working round-the-clock to respond to users.

(Facebook pays NPR and other news organizations to produce live videos for its site.)

Facebook user Robert Jones recaps a notice he got: "Because of repeated infractions, 'Son of Baldwin' has been unpublished. If you believe this is unfair, please let us know."

Jones is a black blogger in Brooklyn. The notice said his popular page (he has about 80,000 fans) had been unpublished.

He filled out a form where users can make their case to Facebook to get re-published. And then he got another notice. " 'Your account has been banned for 30 days. If you think this is unfair let us know.' And I just shook my head," he says.

Right after the Dallas shooting, in which a black gunman killed five police officers, Jones wrote a post that began:

"Dear Black Folks:

"Do NOT feel collectively responsible when an assailant is black.

"White folks do NOT feel collectively responsible when an assailant is white.

"If white people get to be individuals and presumed collectively innocent, then black people get to be individuals and presumed collectively innocent, too."

Jones did not call for violence. And he only managed to get his page back up because a fan he doesn't know happened to have a friend inside Facebook.

"I have never spoken to her on the phone," he says. "But I am eternally grateful."

Cop killer cartoon

Another controversial Facebook post included a graphic drawing of a person in a black hood slitting the throat of a white police officer. Blood is dripping from the officer's neck.

Many Facebook users reported it, including T.J. Dunne, who found it offensive.

"If it was the other way around, if I was posting a picture of me cutting a black guy's throat, you don't think they would throw a fit?" he says.

Dunne says he reported the image to Facebook immediately and got a response in about an hour.

The verdict: The image does not violate community standards — Facebook's rules on prohibited content. Dunne was floored, not just because of the post. He says there were more than 7,000 comments, some very charged. He decided to join in. "When I'm seeing death to white people and kill all the white cops — yeah I commented, quite a few times."

NPR was not able to review his or others' comments because Facebook did end up reversing its decision and pulling the post. Dunne has a theory about why: "The next day, five cops got killed. Then it got taken down."

Facebook responds

Both men take Facebook's initial decisions as a sign of political bias: The platform is stacked against their cause or community. They also didn't get an explanation for why the company reversed the initial decisions. (Jones, the blogger, just got a notice saying Facebook "made a mistake.")

Monika Bickert, Facebook's head of policy, wouldn't discuss these specific cases but offers this insight: "We look at how a specific person shared a specific post or word or photo to Facebook."

Meaning, if one person shared the cop killer cartoon to condemn it, that's OK. If another person shared it as a call to arms, it's not. And, she says, in no case does an algorithm automate the decision-making process. Her team will examine every individual share — which sounds remarkably labor-intensive.

She says Facebook considers "was it somebody who was explicitly condemning violence or raising awareness? Or was it somebody who was celebrating violence or not making clear their intention or mocking a victim of violence?"

According to LinkedIn data, many of the people at Facebook who make these editorial decisions about hate speech are recently out of college, and have a real range of bachelor's degrees — in business, math, managing medical records, and psychology. Bickert says senior managers regularly consult on cases, and she brings in outside experts to conduct training.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Aarti Shahani is a correspondent for NPR. Based in Silicon Valley, she covers the biggest companies on earth. She is also an author. Her first book, Here We Are: American Dreams, American Nightmares (out Oct. 1, 2019), is about the extreme ups and downs her family encountered as immigrants in the U.S. Before journalism, Shahani was a community organizer in her native New York City, helping prisoners and families facing deportation. Even if it looks like she keeps changing careers, she's always doing the same thing: telling stories that matter.