Antisemitic posts are rarely removed by social media companies, study finds
Five major social media companies, including Facebook and Twitter, took no action to remove 84 percent of antisemitic posts, a new report from the Center for Countering Digital Hate (CCDH) found.
Despite promising to crack down on antisemitic hate, Facebook, Twitter, Instagram, YouTube and TikTok did not act on these posts even after they were flagged through the platforms' existing tools for reporting malignant content.
Researchers from the CCDH, a nongovernmental organization based in the United States and the United Kingdom, examined 714 anti-Jewish posts on the five platforms published between May and June. Collectively, they had been viewed 7.3 million times, the report said.
"The study of antisemitism has taught us a lot of things ... if you allow it space to grow, it will metastasize. It is a phenomenally resilient cancer in our society," Imran Ahmed, the CEO of CCDH told NPR.
He said social media spaces have been "unable or unwilling" to take action against antisemitic posts effectively. This study differs from others, he said, in that CCDH wanted to prove that social media companies aren't unable to moderate content — they just choose not to.
That's why Ahmed and his team chose to focus on posts that had already been flagged to social media companies through the companies' own internal systems. And still, even following their own standards, the social media companies failed to act.
Social media companies took no action on 89 percent of posts containing antisemitic conspiracy theories about 9/11, the pandemic and Jewish people controlling world affairs. The platforms also failed to act on 80 percent of posts denying the Holocaust and 70 percent of posts featuring neo-Nazi and white supremacist imagery.
In October, Facebook shifted its policy on handling hate speech and Holocaust denial, saying it would now "prohibit any content that denies or distorts the Holocaust."
CEO Mark Zuckerberg posted on Facebook saying, "I've struggled with the tension between standing for free expression and the harm caused by minimizing or denying the horror of the Holocaust ... with the current state of the world, I believe this is the right balance."
But the report from CCDH shows that of all five social media platforms examined, Facebook was the worst offender, failing to act on 89 percent of antisemitic posts.
"There is this enormous gulf between what they claim and what they do," Ahmed said.
The report also shows the lasting impact of hashtags on Instagram, Twitter and TikTok, all of which allow antisemitic hashtags. Hashtags such as #fakejews and #killthejews that appeared in the 714 posts gained 3.3 million impressions, the report said.
TikTok in particular is failing to ban accounts that directly abuse Jewish users, the CCDH said; according to the study, the platform removes just 5 percent of accounts that engage in abuse such as sending direct messages containing Holocaust denial.
And the hate speech that spreads online doesn't just stay online. Several studies show links between the prevalence of racist speech on social media platforms and hate crimes in the area. In Germany, for example, anti-refugee posts on Facebook were correlated with physical assaults against refugees.
"There is a reflexive interaction between online and offline racism, they reinforce each other," Ahmed said.
In an offline world, there are consequences to antisemitic behavior, he said.
But in the online space, Ahmed said, there are no limits, and people become radicalized without any boundaries.
"The online spaces then have an effect on offline spaces because these people have worsened," Ahmed said. "The failure of these companies is a cost that's paid in lives."
Editor's note: Facebook and Google, parent organization of YouTube, are among NPR's financial supporters.
Copyright 2021 NPR. To see more, visit https://www.npr.org.