After the Capitol Insurrection, Telegram Moved to Ban White Nationalists. Today, It’s a Different Story.


For a moment last month, it looked like Telegram was finally doing something about its neo-Nazi problem. The secure messaging app, whose largely untraceable platform has laudably provided a safe haven of communication for people in autocratic countries even under government pressure, has also proven popular among racists who have been kicked off other platforms. Telegram largely did nothing about those users until last month when, shortly after the January 6 riot at the Capitol, the company finally started taking down white supremacist and neo-Nazi channels, the app's term for public-facing chat groups that, similar to Twitter, allow creators to send messages to anyone who signs up.
But within weeks, Telegram appears to have lost its sudden interest in banning white supremacists, according to researchers monitoring the platform. "A lot of the banned groups just immediately reformed and gained back huge numbers of their followers," says Emmi Bevensee, a Mozilla Open Web Fellow and PhD student at the University of Arizona who tracks extremism online. "The terrorgram channels particularly are extremely agile," Bevensee said, referring to the loose network of Telegram channels where pro-Nazi, antisemitic memes using glitchwave and cyberpunk aesthetics are shared.
Bevensee and Max Aliapoulios, a PhD student at New York University studying cybersecurity, who are both contributors to the Social Media Analysis Toolkit, say there's evidence to suggest Telegram's limited actions had little effect. After looking at data they pulled from the platform, Bevensee said, "we did not notice a dramatic change in the volume of hate and conspiracies on Telegram in the time surrounding enforcement," with the caveat that while their data sets were comprehensive, it's possible that they missed some far-right channels.
Other researchers who track extremism online, including Marc-André Argentino, a PhD candidate at Concordia University, and activist Gwen Snyder, reported similar patterns of users of banned channels quickly regrouping elsewhere on Telegram. And I, monitoring the app during the crackdown, saw what they described firsthand. As Telegram cracked down on white supremacist channels, their operators immediately started spreading the names of backup accounts to their followers. In a lot of cases, they were brazen enough to just name the new channel their old name plus a "2."
When I first talked to Snyder in mid-January about how it looked like Telegram was finally taking action against its neo-Nazis and white supremacists, she was cautiously hopeful. She and other activists were pushing a Twitter campaign urging the mass-reporting of terrorgram channels, and it had shown initial results. At the time, any enforcement by Telegram against white nationalists was essentially unprecedented, aside from some minor content moderation in January of 2020 following Mother Jones reporting.
But when we spoke again in early February, her tone had changed. "About half the channels have reestablished themselves," Snyder said, explaining that Telegram's urge to moderate seemed to have "slowed after the first week or so."
"Some of the channels I'm reporting [to Telegram] are posting instructions on how to make bombs and they're targeting Planned Parenthood clinics," she said. "You can see in the channels that they're becoming more emboldened now that the wave has passed."
Some Telegram channels with followings in the thousands that espoused white nationalist messages have remained active throughout the last several months, including at least one that was directly acknowledged by Telegram founder Pavel Durov in a public channel as he was discussing banning certain groups from Telegram. That such high-profile offensive content remains on the platform suggests the company just isn’t interested in doing anything about it. 
Telegram did not respond to an email seeking comment on their moderation practices.
To Snyder, driving white nationalists onto more obscure platforms is an essential part of stifling their movement. When they get kicked off platforms like Telegram “they have a harder time to recruit,” Snyder said, “and it’s easier to discourage them and get them to disband.”
Right now, though, white supremacists are using Telegram to do just the opposite. They've continued to take advantage of the wealth of fresh users who arrived on Telegram after Parler was first taken offline, or after QAnon and Stop the Steal groups were removed by Facebook, and have worked to recruit them into more extreme politics.
The Atlantic Council’s Digital Forensic Research Lab has documented how the Proud Boys, a violent neo-fascist group with extensive ties to white nationalists, has tried to gain new members from this pool. Snyder told me that she had seen white nationalists and other hard-right groups on Telegram do the same.
“Deplatforming works, but it only works as long as the Nazis stay deplatformed,” she said. “When a platform seems to sympathize with them, they get bolder.”