How the Far Right Exploded on Steam and Discord

New research found that several of the major gaming platforms are hosting extremist activity, from racist livestreams to open support for neo-Nazis. 

Since the online harassment campaign known as Gamergate, in which sections of the gaming world hounded female journalists with rape, bomb, and death threats, it’s been presumed that gaming culture has an extremism problem. Yet the specifics of this relationship have remained unclear. How widespread is the problem? How do extremists use games? And, of course, a point of morbid curiosity: What games do extremists play?

New research published by the Institute for Strategic Dialogue (ISD), a counter-extremism think tank, attempts to answer these questions. Investigating the online strategies of the far right, the ISD has found that several major gaming platforms play host to extremist activity—from racially abusive livestreams to open support for neo-Nazi terrorists.

The ISD investigated four platforms: Steam, Discord, DLive, and Twitch. It analyzed 24 far-right chat servers on Discord, 45 public groups associated with the far right on Steam, 100 far-right channels on DLive, and 91 channels and 73 videos on Twitch. These spaces were publicly accessible and the ISD did not look at closed channels, such as private chats or groups requiring passwords. The authors speculate these would likely be home to more coordinated radical groups.

The entrenchment of these communities varied across platforms. Of the four, it’s Steam that has the most severe problem. The ISD found a “well established, large network” of far-right communities, some dating back as far as 2016. “The content we encountered on Discord and Steam was more egregious than the content you would expect to easily find on mainstream social media platforms, but at a smaller scale than you would expect to find on alt-tech platforms such as Gab and Telegram,” explains Jacob Davey, head of research and policy for far-right and hate movements at the ISD. “I think Steam in particular is noteworthy because the communities there are several years old, suggesting that the extreme right is well entrenched on the platform.”

The investigation found two Steam groups with links to violent terrorist organizations: one to the Nordic Resistance Movement, connected to bombings in Gothenburg in 2016 and 2017, and another to the Misanthropic Division, a Russian group active in Ukraine, Germany, and the UK.

Extremist usage of the platform varies. Some, such as groups linked to political movements like Generation Identity or Britain First, weren’t found to be posting gamer-specific content; instead, they used Steam as a social media platform, dumping propaganda to attract new recruits. Others, including some tied to neo-Nazi podcasts and forums, were set up explicitly to form gaming clans.

“[Steam] is essentially acting as a community hub for people who are affiliated with the extreme right to come together, to socialize, to communicate, to have fun with their friends in a relatively safe space, but also to discuss extreme right wing ideology, and some of those points are then being used to off ramp people onto the website of extremist organizations or other social media pages,” says Davey.

The ISD found that, in general, extremists do not play extremist games. This is primarily because these games are awful. While users might display an association with a game like Feminazi: The Triggering, this is largely just a badge of honor. “[Games like] Angry Goy 2 or Ethnic Cleansing—no one plays them, they’re barely available,” says Pierre Vaux, a research manager with the ISD. “They’re crude 16-bit titles that look awful and are badly designed and are probably filled with viruses—no one wants to download them.”

The ISD did note, however, that extremists enjoy historical games, such as Hearts of Iron, Europa Universalis, and Crusader Kings. In these games, extremists live out their fantasies: conquering the world as Hitler, for instance, or exterminating Muslims in the Middle East.

But the extremist game of choice was Counter-Strike: Global Offensive—not for ideological reasons but because it is fun and free. “I think the big takeaway is the most obvious one, but I think also the most counterintuitive for people who want to present a nefarious picture of gaming: It’s that when extremists game, they play the same popular games you or I play, and they game largely for the same reason you or I game, because it’s fun and it’s popular,” says Davey. “It’s a great pastime for spending time and building a community with your friends, particularly during Covid.”

While Steam was found to have the worst problem, extremists were gathering on every platform. The livestreaming service Twitch, for instance, plays host to “Omegle Redpilling,” where white supremacists—often dressed up in military gear, or as characters such as “Racist Super Mario” or the Joker—search Chatroulette and Omegle for victims to racially abuse. Clips from these streams, some of which stayed up for more than an hour before being blocked, are popular and have ended up on TikTok; one of these accounts, from the jailed white supremacist Paul Miller, was still up when the ISD last checked.

Redpilling is linked to the practice of “raiding,” often coordinated on Discord servers. “You might have a space where you stockpile gore pics or racist memes or whatever, and you’d have individuals suggesting another server, usually a server associated with political opponents, or what they see as their out groups: so LGBTQ communities, people of color,” says Davey. “Then you would all pile into that server at an agreed time and spam it with hateful activity or hostile activity, and unpleasant memes and unpleasant images.”

As is typical of the modern far right, the ISD found that these groups were transnational. Another important point to note is that little evidence was found of active recruitment; these were primarily groups of like-minded believers. Worryingly, however, some members were extremely young. On Discord, in particular, the ISD estimated that the average age was 15, with young people asking to learn more about certain neo-Nazi groups.

“We found content being shared by proscribed neo-Nazi organizations, groups which are banned in the UK,” says Davey. “It seems like Discord is acting as a hub for these communities of young internet trolls—people who are engaged in raiding, people who are engaged in 8chan or 4chan forum culture—but potentially provides a place for them to come into contact with some of the most egregious and actually illegal content out there.”

Steam’s particular problem is down to its laissez-faire approach to moderation. These groups were easy to find with just the most basic keyword searches for racial slurs and Nazi terminology.

“On DLive, for example, we noticed several accounts got taken down over the course of our analysis, and similarly we hypothesize that the relatively small size of the communities we found on Discord and Twitch is indicative of proactive, if imperfect, moderation efforts,” says Davey. “On Steam I think the fact that these communities have been active for a number of years is a good indication that they have limited moderation against this sort of harmful activity.”

When platforms have cracked down and moderated extreme right discussion, activity has decreased. DLive initially played host to Patriotic Alternative, a British far-right white nationalist group, which produced chat shows and livestreamed games. But, during the ISD analysis, DLive began to take action and deleted several accounts. Extremists then moved to alternative platforms where they felt they could reach more users without fear of censure.

Steam, famously, enforces only a very loose set of content guidelines. The company may fear inciting another Gamergate by tightening these restrictions, says Davey. But its moderation policy, he argues, seems out of step with that of other media platforms—and it may soon be forced to act regardless.

This story originally appeared on WIRED UK. 

