A Sky News investigation found anti-Semitic, racist and white supremacist material in podcasts on one of the most popular streaming services, Spotify.
The company said it does not allow hateful content on its platform.
But we found podcasts, totaling several days of listening, promoting extreme views such as scientific racism, Holocaust denial, and far-right anti-Semitic conspiracy theories.
And while some of the more shocking elements were buried in episodes lasting several hours, in some cases explicit slurs could be found in the titles and descriptions of episodes, while the album artwork displayed images adopted by white supremacists.
Spotify removed the content after we reported it to the streaming giant.
But many of these podcasts stay online elsewhere, including in largely unmoderated directories like Google Podcasts.
Google did not respond to our request for comment.
And experts fear that the “easily accessible” nature of this material may draw people into extremism.
Content Disclaimer: This article includes references to racist, anti-Semitic and white supremacist language and ideas.
One of the first results returned on Spotify when searching for the phrase “Kalergi Plan” pointed us to a series that at the time had 76 episodes listed on the platform.
The so-called “Kalergi Plan” is a far-right anti-Semitic conspiracy theory that claims that Jewish elites are behind a deliberate plan to wipe out the white European race by promoting mass immigration.
We have chosen not to name any of the podcast series mentioned in this article to avoid publicizing their content.
In one episode, the speaker explicitly promotes the Kalergi Plan.
He claims that the European elite have been “replaced” by a “new urban nobility” made up of Jewish elites.
The nine-minute monologue ends with an explicit call for violence against the Jewish people.
Another episode by the same creator advances the racist and unfounded idea that whites are biologically superior to people of color.
“There is something about [white men] that makes us privileged, it’s in our blood,” he said.
He promotes this view, unchallenged, for 13 minutes. The monologue is littered with dehumanizing language and makes comparisons too offensive to include in this article.
The series’ album cover art depicts the Raven Banner – a symbol originating in Norse tradition, but which has been appropriated by some white supremacists in recent years.
We showed our findings to Maurice Mcleod of Race on the Agenda, a social research charity focused on issues impacting ethnic minorities in the UK.
“It’s incredibly dangerous,” he told Sky News.
“In May, we had the highest [monthly] number of reported incidents of anti-Semitism, and in the year to March we had 115,000 reported hate crime incidents. That’s just what’s reported, which is still just the tip of the iceberg.”
“It feels like it normalizes that kind of thing: if you can go on Spotify and listen to Adele, then you can listen to this stuff right next door,” he said.
The Kalergi Plan is a variation of the Great Replacement white nationalist conspiracy theory.
Jacob Davey, head of far-right and hate movements research and policy at the Institute for Strategic Dialogue, said it was a belief that has grown steadily in popularity over the past decade.
“It went from being a fairly marginal topic of discussion among a few European extremists to being the bread and butter of extremist discussion around the world,” he told Sky News.
But those ideas don’t exist in an online vacuum, he said.
“In 2019, when an individual committed a truly horrific terrorist attack in Christchurch, New Zealand, he was doing it directly in response to that theory,” Davey said.
“And after that attack, there were a number more throughout 2019. The spread of these ideas can have a really noticeable impact in driving people to go on and commit atrocious violence.”
This is just one of the series that we have encountered.
Another, hosted by US-based alt-right creators, uses racist slurs and white supremacist symbols in the titles and descriptions of its episodes.
The hosts openly and casually promote a range of anti-Semitic and racist beliefs and theories, including Holocaust denial and scientific racism.
A third series from a different creator included episodes discussing what they call the “beauty” of white supremacy, as well as essay and book readings by prominent Nazi Party figures, including Adolf Hitler and Joseph Goebbels.
The creator has often used the episode description box on Spotify to advertise videos shared on other platforms. A link directs users to a video of a reading of what he calls “Dylann Roof’s insightful manifesto”.
Other episode descriptions link to a Telegram channel that has a swastika as its icon.
These three series total nearly 150 hours of content.
Responding to our findings, a spokesperson for Spotify said: “Spotify prohibits content on our platform that expressly and primarily advocates or incites hatred or violence against a group or individual on the basis of characteristics, including race, religion, gender identity, sex, ethnicity, nationality, sexual orientation, veteran status or disability.
“The content in question has been removed for violation of our Hateful Content Policy.”
The platform allows users to report material that violates its content guidelines. The company also said it is developing new monitoring technology to identify material that matches content flagged as hateful in certain international registers.
But what Spotify currently does to moderate its podcasting platform, beyond responding to user reports, is not publicly known.
The sheer volume of online content means that tech companies need algorithms as well as people to moderate their platforms.
And while the technology that can detect hate speech in audio is under development, it is not yet widely deployed.
“One of the problems is that it takes a lot more memory to store long audio files. The other problem is that it’s complicated – you can have multiple speakers and fast dialogue,” said Hannah Kirk, an AI researcher at the Oxford Internet Institute and the Alan Turing Institute.
“There are also tons of additional linguistic clues in audio: tone, pitch of voice, even awkward silences or laughter. And that’s a problem because we don’t yet have the technology to precisely encode those kinds of extra-linguistic signals,” she told Sky News.
Ms Kirk said it was possible that companies like Spotify faced resource or technology constraints, which meant they were unable to moderate their audio content on a large scale.
But, she said, companies have the ability to transcribe audio content and run it through text-based models trained to detect hate, which are much more advanced.
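In outline, the approach Ms Kirk describes has two stages: transcribe the audio, then score the transcript with a text model. The sketch below is purely illustrative and is not Spotify's actual pipeline: the transcription step is a stub standing in for a real speech-to-text model, and a toy keyword matcher stands in for a trained hate-speech classifier.

```python
# Minimal sketch of a transcribe-then-classify moderation pipeline.
# Illustrative only: transcribe() is a stub for a speech-to-text model,
# and the keyword matcher is a stand-in for a trained text classifier.

def transcribe(audio_path: str) -> str:
    """Placeholder: a real system would run a speech-to-text model here."""
    raise NotImplementedError("plug in a speech-to-text model")

def flag_transcript(text: str, blocklist: set[str]) -> list[str]:
    """Return any blocklisted terms found in the transcript (toy classifier)."""
    words = {w.strip(".,!?\"'").lower() for w in text.split()}
    return sorted(words & blocklist)

def moderate_episode(transcript: str, blocklist: set[str]) -> bool:
    """True if the episode should be escalated for human review."""
    return bool(flag_transcript(transcript, blocklist))
```

In practice the second stage would be a trained model that scores language in context rather than a word list: keyword matching alone misses coded language and produces false positives, which is one reason human review remains part of moderation workflows.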
We also found some of the same series on Google Podcasts.
Google’s podcasting arm works as a directory rather than a platform, which means it doesn’t host content on its own server and instead collects podcast feeds that it automatically retrieves from the internet.
The company has previously been criticized for allowing users to access extreme and deceptive content on its interface. It’s one of the few places where users can still find infamous conspiracy theorist Alex Jones’s podcast.
We reported our findings to Google, but it did not respond. The series we reported remain available on its platform.
A spokesperson for the company previously told the New York Times that it “doesn’t want to limit what people can find” and only removes content in rare circumstances.
But experts fear that the easy accessibility of extreme material on these popular platforms could draw people towards radicalization.
The Data and Forensics team is a multi-purpose unit dedicated to delivering transparent Sky News journalism. We collect, analyze and visualize data to tell data-driven stories. We combine traditional reporting skills with advanced analysis of satellite imagery, social media, and other open source information. Through multimedia storytelling, we aim to better explain the world while showing how our journalism is done.