Instagram Has A Problem With Hate Speech And Extremism, 'Atlantic' Reporter Says

Instagram has increasingly become a home for hate speech and extremist content, according to Taylor Lorenz, a reporter for The Atlantic. (Anadolu Agency / Getty Images)

Facebook announced on Wednesday that, starting next week, it will begin banning white nationalist and white separatist content on its platforms. That includes its popular photo-sharing app, Instagram.

While Facebook and Twitter have come under heavy criticism for the spread of misinformation and conspiracy theories, Instagram has flown relatively under the radar. That's allowed the platform to increasingly serve as a home for hate speech and extremist content, according to Taylor Lorenz, a reporter for The Atlantic.

In an article titled "Instagram Is the Internet's New Home for Hate," Lorenz writes that Instagram is "likely where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny."

Instagram is huge, with over 1 billion users. But policing the platform has its challenges, says Lorenz.

For example, users can set their accounts to "private" mode, meaning only approved followers can see what they post. That makes content on private accounts harder to police.

Lorenz said that Instagram relies on its users to report problematic content, so posts on private accounts can easily slip by unnoticed and go unreported.

NPR spoke with Lorenz about how extremist content spreads on Instagram — and what she thinks should be done to stop it.


Interview Highlights

On what Instagram's extremist content looks like

Extremist content on Instagram is essentially just a more visual way of presenting classic misinformation that we've seen on other platforms. So, a lot of racist memes, white nationalist content, sometimes screenshots of fake news articles.

On who extremists target on Instagram

A lot of these accounts are actually targeted towards younger people. Some of the heaviest engagers on Instagram are teenagers and sort of young millennials. A lot of these big right-wing extremist meme pages consider those people their audience and those are the users that they're targeting.

It's not all young people that are following these pages, but primarily it's a lot of teenagers, maybe college students, kids right out of school who are kind of looking to form their identity and learn about the world — learn about news events — and they're increasingly turning to social media to do that. Instagram and YouTube are the two most used platforms for Generation Z. So they're following these accounts and just becoming susceptible to their ideas.

On how memes are used to introduce people to extremist ideas

Memes and humor in general disarm people and make them almost more susceptible to extremist beliefs. Humor is a really good way to introduce people to ideas, especially extremist ideas and conspiracy theories. You kind of start by laughing at it. Then, you start questioning things a little bit, and you can end up believing and getting sort of sucked into a lot of this stuff through humor.

On how Instagram makes it easier to find extremist accounts

Instagram is built on a bunch of different algorithms, and one big algorithm that stimulates growth on the site is the page recommendation algorithm. So that's when you follow one Instagram page [and then] you're immediately prompted to follow a slew of more pages. So you can follow even what's considered a mainstream conservative meme page, and you're immediately recommended very extremist content from people like Alex Jones and other notorious conspiracy theorists.

On why extremist content can go unnoticed on private accounts

Instagram relies on users to report problematic content, and while they are developing algorithms that they say can catch some of this stuff (a lot of extremist memes, for instance), you might have a meme page with 10,000 followers, all of whom are very susceptible to white nationalist beliefs, and the account is set to private. So, it's kind of what we're seeing with Facebook groups too, where there's no outside person policing it. This type of stuff is not appearing on a lot of normal users' feeds.

On how popular Instagram is for Russian misinformation groups

A Senate report last year found that the IRA, which is the Internet Research Agency — a notorious Russian troll farm that promotes a lot of this nefarious misinformation — actually found Instagram to be their most valuable platform.

They ran tons of Instagram accounts aimed at stoking sort of divisive political opinions and promoting extremism to Americans.

On how to combat extremism on social media platforms

The media has covered a lot of this misinformation stuff and done a great job of it. You know, there can always be more coverage, but it's also up to people to hold people like Mark Zuckerberg, or the heads of YouTube and Instagram, accountable for this type of stuff. Because when they see public outcry or they see #DeleteFacebook type of movements, it really does move the needle. So, people can just be aware.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Korva Coleman is a newscaster for NPR.