WEOS Finger Lakes Public Radio

Instagram Has A Problem With Hate Speech And Extremism, 'Atlantic' Reporter Says

Mar 30, 2019
Originally published on April 6, 2019 3:02 pm

Facebook announced on Wednesday that starting next week, it will begin banning white nationalist and white separatist content on its platforms. That includes its popular photo-sharing app, Instagram.

While Facebook and Twitter have come under heavy criticism for the spread of misinformation and conspiracy theories, Instagram has flown relatively under the radar. That's allowed the platform to increasingly serve as a home for hate speech and extremist content, according to Taylor Lorenz, a reporter for The Atlantic.

In an article titled "Instagram Is the Internet's New Home for Hate," Lorenz writes that Instagram is "likely where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny."

Instagram is huge, with over 1 billion users. But policing the platform has its challenges, says Lorenz.

For example, users can set their accounts to "private" mode, meaning that only approved followers can see the content posted on their pages. That makes content on private accounts harder to regulate.

Lorenz said that Instagram relies on its users to report problematic content, so posts, especially on private accounts, can easily slip by unnoticed and unreported.

NPR spoke with Lorenz about how extremist content spreads on Instagram — and what she thinks should be done to stop it.


Interview Highlights

On what Instagram's extremist content looks like

Extremist content on Instagram is essentially just a more visual way of presenting classic misinformation that we've seen on other platforms. So, a lot of racist memes, white nationalist content, sometimes screenshots of fake news articles.

On who extremists target on Instagram

A lot of these accounts are actually targeted towards younger people. Some of the heaviest engagers on Instagram are teenagers and sort of young millennials. A lot of these big right-wing extremist meme pages consider those people their audience and those are the users that they're targeting.

It's not only young people that are following these pages, but primarily it's a lot of teenagers, maybe college students, kids right out of school who are kind of looking to form their identity and learn about the world and about news events, and they're increasingly turning to social media to do that. Instagram and YouTube are the two most used platforms for Generation Z. So they're following these accounts and just becoming susceptible to their ideas.

On how memes are used to introduce people to extremist ideas

Memes and humor in general disarm people and make them almost more susceptible to extremist beliefs. Humor is a really good way to introduce people to ideas, especially extremist ideas and conspiracy theories. You kind of start by laughing at it. Then you start questioning things a little bit, and you can end up believing and getting sort of sucked into a lot of this stuff through humor.

On how Instagram makes it easier to find extremist accounts

Instagram is built on a bunch of different algorithms, and one big algorithm that stimulates growth on the site is the page recommendation algorithm. So that's when you follow one Instagram page [and then] you're immediately prompted to follow a slew of more pages. So you can follow even what's considered a mainstream conservative meme page, and you're immediately recommended very extremist content from people like Alex Jones and other notorious conspiracy theorists.

On why extremist content can go unnoticed on private accounts

Instagram relies on users to report problematic content, and while they are developing algorithms that they say can catch some of this stuff, like a lot of extremist memes, you might still have a meme page with 10,000 followers, all of them very susceptible to white nationalist beliefs, where the account is set to private. So it's kind of what we're seeing with Facebook groups, too, where there's no outside person policing it. This type of stuff is not appearing on a lot of normal users' feeds.

On how popular Instagram is for Russian misinformation groups

A Senate report last year concluded that the Internet Research Agency, or IRA, a notorious Russian troll farm that promotes a lot of this nefarious misinformation, actually found Instagram to be its most valuable platform.

They ran tons of Instagram accounts aimed at stoking sort of divisive political opinions and promoting extremism to Americans.

On how to combat extremism on social media platforms

The media has covered a lot of this misinformation stuff and done a great job of it. You know, there can always be more coverage, but it's also up to the public to hold people like Mark Zuckerberg, or the heads of YouTube and Instagram, accountable for this type of stuff. Because when they see public outcry or they see #DeleteFacebook-type movements, it really does move the needle. So, people can just be aware.

Copyright 2019 NPR. To see more, visit https://www.npr.org.

KORVA COLEMAN, HOST:

We've had many conversations on this program about how misinformation and conspiracy theories spread on social media platforms like Facebook and Twitter. But one platform used by more than 1 billion people has gone relatively under the radar, and that's Instagram. For those of you who don't know, Instagram is a social media site that is owned by Facebook. On Instagram, users can post pictures, look at pictures from other people's accounts and message other users.

Instagram is increasingly providing a home for hate speech and extremist content. That's according to reporting by Taylor Lorenz, who wrote about this for The Atlantic. She says that Instagram is likely to be, quote, "where the next great battle against misinformation will be fought, and yet it has largely escaped scrutiny." And we should mention here that Facebook is among NPR's financial supporters.

Taylor Lorenz joins us on the line from New York. Welcome, Taylor.

TAYLOR LORENZ: Hi. Thanks for having me.

COLEMAN: Taylor, for people who mainly use, say, Facebook and Twitter, what does Instagram look like? How is it different?

LORENZ: Well, Instagram is primarily a visual platform. So unlike Twitter, where most tweets are made up of text, and Facebook, where there's a mix of links, videos, things like that, Instagram is primarily images and video. So you can see content from friends, family, news organizations, meme pages - really anyone.

COLEMAN: So what does extremist content look like on Instagram?

LORENZ: Extremist content on Instagram is essentially just a more visual way of presenting classic misinformation that we've seen on other platforms - so a lot of racist memes, white nationalist content, sometimes screenshots of fake news articles, sometimes people like Alex Jones ranting and promoting conspiracy theories via YouTube clips that are uploaded to Instagram TV, which is their sort of YouTube competitor. So it can take a lot of different forms.

COLEMAN: Who follows these accounts?

LORENZ: Millions and millions of people follow accounts that post content like this. A lot of these accounts are actually targeted towards younger people. So some of the heaviest engagers on Instagram are teenagers and sort of young millennials. And so a lot of these big right-wing extremist meme pages consider those people their audience. And those are the users that they're targeting.

COLEMAN: How easy is it to, say, follow one account and then get attracted to another account, and then another, and then another that might feature white supremacism?

LORENZ: It's extremely easy. I mean, Instagram actually pushes this and facilitates it. So Instagram is built on a bunch of different algorithms. And one big algorithm that stimulates growth on the site is the page recommendation algorithm. So that's when you follow one Instagram page, you're immediately prompted to follow more pages. So you can follow what's considered a mainstream conservative meme page and you're immediately recommended very extremist content from people like Alex Jones and other notorious conspiracy theorists.

COLEMAN: Now, Facebook has announced that next week it'll begin banning white nationalist and white separatist content on both Facebook and Instagram, which it controls. How did white separatism and white nationalism begin to flourish there in the first place?

LORENZ: All of these platforms have really taken a hands-off approach. They really haven't policed white nationalism or white separatism to the extent that they have other extremist movements. I mean, the New Zealand shooting, I think, was hopefully an inflection point, where it's becoming increasingly clear that they have to crack down on this stuff because not only is this kind of extremist content running rampant on the platforms, these platforms are facilitating its growth.

A big problem with Instagram, though, as opposed to Facebook and Twitter, involves a lot of these big white nationalist figures. For example, there's a huge cadre of people who are part of the Identity Evropa movement. This is a white nationalist, white supremacist movement. And they're not exactly espousing their ideas on Instagram, but they're normalizing themselves.

So a lot of them are adopting influencer strategies, where they're kind of actually just posting about their lifestyle, posting themselves at nice events, dressed up. And people will follow some of these white nationalist figures, aspire to their lifestyle and then end up becoming introduced to their ideas. You know, they'll go ahead and Google them. They'll start watching their YouTube videos. They'll start reading content on a blog, maybe. So they're more susceptible to that.

COLEMAN: That was Taylor Lorenz. She reports on tech news for The Atlantic. Thank you for joining us, Taylor.

LORENZ: Thank you so much for having me. Transcript provided by NPR, Copyright NPR.