Substack’s “Nazi problem” won’t go away after push notification apology


Substack may be legitimizing neo-Nazis as "thought leaders," researcher warns.

After Substack shocked an unknown number of users by sending a push notification on Monday to check out a Nazi blog featuring a swastika icon, the company quickly apologized for the "error," tech columnist Taylor Lorenz reported.

"We discovered an error that caused some people to receive push notifications they should never have received," Substack's statement said. "In some cases, these notifications were extremely offensive or disturbing. This was a serious error, and we apologize for the distress it caused. We have taken the relevant system offline, diagnosed the issue, and are making changes to ensure it doesn’t happen again."

Substack has long faced backlash for allowing users to share their "extreme views" on the platform, previously claiming that "censorship (including through demonetizing publications)" doesn't make "the problem go away—in fact, it makes it worse," Lorenz noted. But critics who have slammed Substack's rationale revived their concerns this week, with some accusing Substack of promoting extreme content through features like its push alerts and "rising" lists, which flag popular newsletters and currently also include Nazi blogs.

Joshua Fisher-Birch, a terrorism analyst at the Counter Extremism Project, a nonprofit nongovernmental organization, has for years been closely monitoring Substack's increasingly significant role in helping far-right movements spread propaganda online. He's calling for more transparency and changes on the platform following the latest scandal.

In January, Fisher-Birch warned that neo-Nazi groups saw Donald Trump's election "as a mix of positives and negatives but overall as an opportunity to enlarge their movement." Since then, he's documented at least one Telegram channel—which currently has over 12,500 subscribers and is affiliated with the white supremacist Active Club movement—launching an effort to expand its audience by creating accounts on Substack, TikTok, and X.

Of those accounts created in February, only the Substack account is still online, which Fisher-Birch suggested likely sends a message to Nazi groups that their Substack content is "less likely to be removed than other platforms." At least one Terrorgram-adjacent white supremacist account that Fisher-Birch found in March 2024 confirmed that Substack was viewed as a backup to Telegram because posting content there was that much more reliable.

But perhaps even more appealing than Substack's lack of content moderation, Fisher-Birch noted, is that these groups see Substack as "a legitimizing tool for sharing content," specifically because the Substack brand—which is widely used by independent journalists, top influencers, cherished content creators, and niche experts—can help them "convey the image of a thought leader."

"Groups that want to recruit members or build a neo-fascist counter-culture see Substack as a way to get their message out," Fisher-Birch told Ars.

That's why Substack users deserve more than an apology for the push notification in light of the expanding white nationalist movements on its platform, Fisher-Birch said.

"Substack should explain how this was allowed to happen and what they will do to prevent it in the future," Fisher-Birch said.

Ars asked Substack to provide more information on the number of users who got the push notification and on its general practices for promoting "extreme" content through push alerts, in an attempt to find out whether there was an intended audience for the "error" push notification. But Substack did not immediately respond to Ars' request for comment.

Backlash over Substack’s defense of Nazi content

Back in 2023, Substack faced backlash from over 200 users after The Atlantic's Jonathan Katz exposed 16 newsletters featuring Nazi imagery in a piece confronting Substack's "Nazi problem." At the time, Lorenz noted that Substack co-founder Hamish McKenzie confirmed that the ethos of the platform was that "we don’t like Nazis either" and "we wish no one held those views," but because censorship (or even demonetization) wouldn't stop people from holding those views, Substack thought it would be worse to ban the content and hide those extreme views while the movements grew in the shadows.

However, Fisher-Birch told Ars that Substack's tolerance of Nazi content has essentially turned the platform into a "bullhorn" for right-wing extremists at a time when the FBI has warned that online hate speech is growing and increasingly fueling real-world hate crimes, the prevention of which is treated as a national threat priority at the highest level.

Fisher-Birch recommended that Substack take the opportunity of its latest scandal to revisit its content guidelines "and forbid content that promotes hatred or discrimination based on race, ethnicity, national origin, religion, sex, gender identity, sexual orientation, age, disability, or medical condition."

"If Substack changed its content guidelines and prohibited individuals and groups that promote white supremacism and neo-Nazism from using its platform, the extreme right would move to other online spaces," Fisher-Birch said. "These right wing extremists would not be able to use the bullhorn of Substack. These ideas would still exist, and the people promoting them would still be around, but they wouldn’t be able to use Substack’s platform to do it."

Fisher-Birch's Counter Extremism Project has found that the best way for platforms to counter growing online Nazi movements is to provide "clear terms of service or community guidelines that prohibit individuals or groups that promote hatred or discrimination" and take "action when content is reported." Platforms should also stay mindful of "changing trends in the online extremist landscape," Fisher-Birch said.

Instead, Fisher-Birch noted, Substack appears to have failed to follow its own "limited community guidelines" and never removed a white supremacist blog that promoted killing its enemies and violence against Jewish people, which CEP reported to the platform back in March 2024.

With Substack likely to remain tolerant of such content, CEP will continue monitoring how extremist groups use Substack to expand their movements, Fisher-Birch confirmed.

Favorite alternative platforms for Substack expats

This week, some Substack users renewed calls to boycott the platform after the push notification. One popular writer who long ago abandoned Substack, A.R. Moxon, joined Fisher-Birch in pushing back on Substack's defense of hosting Nazi content.

"This was ultimately my biggest problem with Substack: their notion that the answer to Nazi ideas is to amplify them so you can defeat them with better ideas presupposes that Nazi ideas have not yet been defeated on the merits, and that Nazis will ever recognize such a defeat," Moxon posted on Bluesky.

Moxon has switched his independent blog, The Reframe, to Ghost, an open source Substack alternative that woos users by migrating their accounts for them and ditching Substack's fees, which take a 10 percent cut of each Substacker's transactions. That means users can easily switch platforms and make more money on Ghost, if they can attract as broad an audience as they had on Substack.

However, some users feel that Substack's design, which can help more readers discover their content, is the key reason they can't switch, and Ghost acknowledges as much.

"Getting traffic to an independent website can be challenging, of course," Ghost's website said. "But the rewards are that you physically own the content and you’re benefitting your own brand and business."

But Gillian Brockell, a former Washington Post staff writer, attested on Bluesky that her subscriber rate is up since switching to Ghost. Perhaps that's because Substack's vaunted engagement boost doesn't materialize for everyone, but Brockell offered another theory: "Maybe because I'm less ashamed to share it? Maybe because more and more people refuse to subscribe to Substack? I dunno, but I'm happier."

Another former Substack user, comics writer Greg Pak, posted on Bluesky that Buttondown served his newsletter needs. That platform charges lower fees than Substack and counters claims about Substack's "network effects" by pointing to "evidence" that Substack "readers tend to be less engaged and pay you less."

Fisher-Birch suggested that Substack's biggest rivals—which include Ghost and Buttondown, as well as Patreon, Medium, BeeHiiv, and even old-school platforms like Tumblr—could benefit if the backlash over the push notification forces more popular content creators to ditch Substack.

"Many people do not want to use a platform that does not remove content promoting neo-Nazism, and several creators have moved to other platforms," Fisher-Birch said.

Imani Gandy, a journalist and lawyer behind a popular online account called "Angry Black Lady," suggested on Bluesky that "Substack is not sustainable from a business perspective—and that's before you get to the fact that they are now pushing Nazi content onto people's phones. You either move now or move in shame later. Those are the two options really."


Ashley is a senior policy reporter for Ars Technica, dedicated to tracking social impacts of emerging policies and new technologies. She is a Chicago-based journalist with 20 years of experience.
