Facebook is doing some soul-searching.
In a new commentary, the social media giant acknowledges the possibility that social media can have negative ramifications for democracy. This comes after repeated criticism that it didn't do enough to prevent the spread of fake news that had the potential to impact the 2016 U.S. presidential election.
"Facebook was originally designed to connect friends and family – and it has excelled at that," writes Samidh Chakrabarti, Facebook's Civic Engagement Product Manager. "But as unprecedented numbers of people channel their political energy through this medium, it's being used in unforeseen ways with social repercussions that were never anticipated."
Chakrabarti adds: "In 2016, we at Facebook were far too slow to recognize how bad actors were abusing our platform. We're working diligently to neutralize these risks now."
This is a marked change in tone from the week of the 2016 election, when Facebook CEO Mark Zuckerberg said it's a "pretty crazy idea" that fake news could have influenced the poll.
"There's a profound lack of empathy in asserting that the only reason why someone could have voted the way that they did is because they saw some fake news," Zuckerberg said in November 2016, as NPR's Aarti Shahani reported.
Since then, Facebook has slowly shifted its view. Zuckerberg "is fast coming to terms with the power of his platform to cause harm," Aarti reported. In September, Zuckerberg wrote: "Calling that crazy was dismissive and I regret it. This is too important an issue to be dismissive."
Facebook has been reluctant to wade into the business of sorting fact from fake news, though last year it introduced a system relying on third-party fact checkers to flag particularly egregious examples.
The platform also was the target of a concentrated influence campaign from Russian entities. According to Facebook, "Russian actors created 80,000 posts that reached around 126 million people in the US over a two-year period."
Chakrabarti says Facebook is trying to increase transparency about where ads come from. Soon, he says, the company will "require organizations running election-related ads to confirm their identities so we can show viewers of their ads who exactly paid for them."
Facebook says that a few years ago, it was easier to say that social media was clearly positive for democracy. It cited the Arab Spring – where many protests were organized via Facebook – as an example.
Now it's less clear, says Chakrabarti. "If there's one fundamental truth about social media's impact on democracy it's that it amplifies human intent – both good and bad."
"I wish I could guarantee that the positives are destined to outweigh the negatives, but I can't," he writes. He describes the current discussion about the potential negative implications of social media as a "moral duty."
Facebook has a relationship with many news organizations, and until recently, Facebook paid NPR to produce videos that run on the social media site.
Tech firms including Facebook have faced increasing scrutiny on Capitol Hill. Most recently, executives from Facebook, YouTube and Twitter appeared before a Senate committee last week to discuss the "steps social media platforms are taking to combat the spread of extremist propaganda over the Internet."
Today's discussion on Facebook is part of a series of "Hard Questions" from the social media network. Another recent post asks, "Is spending time on social media bad for us?" It says that active interaction on social media is good for well-being. However, Facebook said that "when people spend a lot of time passively consuming information – reading but not interacting with people – they report feeling worse afterward."