How neo-Nazis used the internet to instigate a right-wing extremist crisis

The number of active hate groups in the United States has fallen by about 10 percent in the past year. This isn’t necessarily good news.

There were 838 active hate groups this year, compared to 940 in 2019, according to an annual report by the Southern Poverty Law Center (SPLC). The organization attributes the drop to the fact that these groups have become more diffuse and difficult to track, largely because of changes in technology. The pandemic has also played a role in limiting in-person activities.

Even so, 838 is still a high number of active hate groups. In 2000, there were 599 hate groups on the list. The count peaked at 1,020 groups in 2018, reflecting a surge in extremism that has paralleled Donald Trump’s rise to national office. Even if the overall number is lower this year, the SPLC warns in its latest report of a “reactionary, authoritarian populism that is mobilizing on the heels of Trump’s loss.”

“Technology and the pandemic in the last year have changed how hate groups operate,” Margaret Huang, president of the SPLC, told reporters on Monday. “They now have the tools to disseminate their ideas beyond their members, beyond geography, and shift tactics and platforms to avoid detection. This likely represents a transition in far-right communities away from traditional organizational structures, and toward more diffused systems of decentralized radicalization.”

That’s because social media platforms have made it easier than ever for extremists to recruit new adherents and push their fringe beliefs into the mainstream. This was on full display on January 6, when militant white nationalist groups that have primarily used the internet to organize — the Proud Boys, the Three Percenters, and the Oath Keepers — stormed the Capitol alongside MAGA moms, QAnon adherents, and other groups brought together in recent years by their love of conspiracy theories and Donald Trump. Many members of all these groups had met online before the event, and their attack on the Capitol showed their alarming capacity for offline violence.

That public show of force was decades in the making — neo-Nazis have been using the internet since the early ’80s to recruit new followers. You can draw a line from the first neo-Nazi online bulletin boards to the online hate forum Stormfront in the ’90s to the alt-right movement that helped Donald Trump rise to power in 2016.

Over the years, these groups used an evolving set of organizing techniques to spread extremist messages to larger and more mainstream groups of people online. They found ways to game the algorithmic feeds of Facebook, Twitter, and YouTube, so that their new audiences didn’t necessarily know they were being radicalized. And there’s reason to believe this is only the beginning, since these platforms tend to amplify provocative content.

“Twitter, Facebook, and YouTube provided a safe space for these different strains of far-right thought to mix and breed. For years this stuff was allowed to spread algorithmically, and communities were able to form and self-radicalize,” Robert Evans, an investigative journalist who studies far-right groups, told Recode. “All that culminated on January 6 — although, of course, that will not prove to be the end of any of the chains of violence we’ve seen evolve over the last six years.”

Facebook helped enable the spread of extremist posts by pioneering the algorithmic distribution model for content shared on its platform when it introduced the “Like” button in 2009. The button was an early engagement tool: user feedback that helps train an algorithm to surface more of the content a user seems to like. That means if you click “Like” on a Facebook post about a conspiracy theory, like QAnon, you would probably see more posts about conspiracy theories in your News Feed. Other social media companies, including Twitter and YouTube, have adopted similar algorithm-based recommendation engines, and some say these systems have turned the platforms into radicalization machines.
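To make that feedback loop concrete, here is a minimal sketch of how engagement-based ranking can work. This is not any platform’s actual code; the FeedRanker class, the topic labels, and the one-point-per-like scoring are all simplifying assumptions. It shows the basic loop the paragraph describes: a single “Like” shifts a user’s affinity scores, and the reordered feed then invites more of the same engagement.

```python
from collections import defaultdict

# Hypothetical posts, each tagged with a single topic label (illustrative only).
POSTS = [
    {"id": 1, "topic": "sports", "text": "Last night's highlights"},
    {"id": 2, "topic": "conspiracy", "text": "What they aren't telling you"},
    {"id": 3, "topic": "cooking", "text": "A weeknight pasta recipe"},
    {"id": 4, "topic": "conspiracy", "text": "The 'real' story behind the news"},
]

class FeedRanker:
    """Toy engagement-driven feed: likes raise topic affinity, affinity drives rank."""

    def __init__(self):
        # Per-topic affinity scores, learned entirely from the user's own clicks.
        self.affinity = defaultdict(float)

    def record_like(self, post):
        # Each "Like" is treated as a training signal for that post's topic.
        self.affinity[post["topic"]] += 1.0

    def ranked_feed(self, posts):
        # Posts on high-affinity topics sort to the top of the next feed.
        return sorted(posts, key=lambda p: self.affinity[p["topic"]], reverse=True)

ranker = FeedRanker()
ranker.record_like(POSTS[1])  # the user likes a single conspiracy post...
for post in ranker.ranked_feed(POSTS):
    # ...and conspiracy posts now lead the feed, inviting further engagement.
    print(post["topic"], "|", post["text"])
```

Real recommendation systems weigh far richer signals (watch time, shares, comments, social connections), but the self-reinforcing structure is the same: engagement trains the ranker, and the ranker solicits more engagement.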

Recently, Facebook, Twitter, and YouTube have been making a public effort to crack down on extremist content, and after January 6, they promised to do better. Donald Trump has been banned from all three sites for his role in inciting violence at the Capitol. But at the same time, encrypted messaging apps like Telegram and Signal are seeing record numbers of new users, and some of them are extremists and conspiracy theorists who have been booted off the main platforms.

“As these technology companies began to crack down in an attempt to curb the extremist elements on their platform, we saw mass migrations to other spaces, that essentially provide very little or no content moderation,” explained Joanna Mendelson, associate director of the Center for Extremism at the Anti-Defamation League (ADL). “And unfortunately, it forces this population into an echo chamber, and surrounds them with propaganda — with video footage, with memes, with a kind of state of the art ways to communicate rapidly — further exacerbating the situation.”

On January 6, internet trolls joined up with Trump supporters, white nationalists, and right-wing militia groups to storm the US Capitol.
Samuel Corum/Getty Images

Efforts to push back against this are underway. The Biden administration is now working on a plan to combat domestic online extremism, while Congress considers a number of proposals to reform the laws that regulate free speech online. At least one bill would force social media companies to fix their algorithms and address their radicalization problem head-on. But it’s unclear whether any of these bills will become law, and even if one does, passing and enforcing it will take time.

In the meantime, extremist groups are splintering in somewhat unpredictable ways and finding new channels for spreading hate and conspiracy theories online. Because we can’t predict exactly what they’ll do, it helps to look to the past — white supremacists have been organizing online almost as long as the internet has existed — and understand how we got here.

A brief history of white supremacists, the internet, and the United States

White supremacists have historically been early to technological trends, sometimes even shaping how mainstream Americans experienced them. Consider that The Birth of a Nation, an influential 1915 film by D.W. Griffith based on a 1905 novel called The Clansman and credited with reviving the Ku Klux Klan, was the first film to be shown at the White House. One could argue that almost a century later, tech-savvy white supremacists played a critical role in putting Trump in the White House. From the beginning, they seemed to know just how powerful and transformative the internet would be.

In 1983, a white supremacist named George Dietz hooked a modem up to his Apple IIe, one of the first personal computers, and took the Liberty Bell Network online. This dial-up bulletin board system (BBS), a precursor to the World Wide Web, allowed anyone with a modem and computer to read through endless screens of Holocaust denial literature and anti-Semitic diatribes. Dietz also published most of this material in print, but because such literature was banned in places like Canada and Germany, the BBS offered international reach. Within two years of the network’s launch, the Anti-Defamation League identified Dietz, a former member of the Hitler Youth, as the largest distributor of neo-Nazi literature in the United States.

The concept of using computers to recruit and organize people into the white power movement took off. Not long after Dietz’s network went live, Louis Beam, a grand dragon of the Texas Ku Klux Klan, set up the Aryan Nations Liberty Net in 1984. “Imagine any patriot in the country being able to call up and access these minds,” Beam said in a post announcing the network. Around that time, Tom Metzger, another former Klansman, used his Commodore 64 to set up the White Aryan Resistance (WAR) network, another BBS. “The major reason for computer bulletin boards is that you’re reaching youth — high school, college and even grade school youths,” Metzger told the Washington Post in 1985.

The extremists’ efforts took a big technological leap in the 1990s, when the web enabled more advanced destinations for hate like Stormfront, a website that describes itself as “a community of racial realists and idealists” and allows registered users around the world to create basic profiles and post to a variety of message boards. The early aughts saw the emergence of imageboards, which work a lot like forums but revolve around the posting of images, and the rise of 4chan, an imageboard that started out as a place to discuss anime but later became a hub for the meme culture that propelled white nationalist ideals into the mainstream. (White supremacists believe that white people are generally superior; white nationalists share those beliefs but also call for the establishment of a white ethnostate.)

On 4chan and newer neo-Nazi hubs like the Daily Stormer, an evolution of the far right that became known as the alt-right began to attract attention in more mainstream venues about a decade ago through trolling and meme-making. The trolling, a tactic of making inflammatory statements simply to provoke, often to the point of harassment, wreaked havoc on online communities and spread misinformation.

This often went hand in hand with hiding extreme messages in coded memes, like Pepe the Frog, a once-obscure cartoon character that members of the alt-right included in racist or anti-Semitic images so often that Pepe himself became a symbol of hate. Such tactics helped racist and harmful memes hop from platform to platform, leaving the relative obscurity of 4chan and finding more mainstream traction on Reddit or Twitter as the alt-right learned how to game sorting algorithms to get its memes in front of bigger and bigger audiences. And because these groups at first just seemed like trolls being trolls, many people wrote them off.

“By the time we go from the memes about Obama to Pepe the Frog, the folks on the far right are incredibly adept at figuring out how to use the algorithms to push their content forward,” explained Jessie Daniels, a sociology professor at the Graduate Center, CUNY.

The “Unite the Right” rally on the University of Virginia campus in Charlottesville is described by some as the peak of the alt-right movement.
Zach D Roberts/NurPhoto via Getty Images

A powerful example of this alt-right strategy was Gamergate. What started out in 2014 as a harassment campaign aimed at women video game developers and critics became a full-fledged movement, driven not only by far-right figures but also by outright neo-Nazis, many of whom eventually rallied behind Donald Trump and his presidential campaign.

The alt-right’s racist messaging, white nationalist underpinnings, and anti-immigrant and anti-Semitic sentiment — all of which internet trolls had previously couched in irony — went uncondemned by Trump and his millions of followers. This was on full display when Trump said there were “very fine people on both sides” of the deadly “Unite the Right” rally in Charlottesville, Virginia, in 2017, which was organized by alt-right leaders and white supremacists. The ADL later pointed to Gamergate as the event that precipitated the rise of the alt-right, and to Charlottesville as the movement’s moment of triumph.

By the time Charlottesville happened, online hate groups had obviously expanded their reach beyond obscure internet forums. They were not only showing up in the streets but also very active on the major social media platforms, where they’d become adept at disseminating misinformation and stoking reactions that would increase engagement on their posts. As research has shown, the most engaged content often wins the favor of social media companies’ sorting algorithms, so these hateful posts tend to end up in front of increasingly mainstream audiences.

“The fundamental metric that all these major networks are built around is who can incite the most activating emotion, who can get people to feel the sharpest, quickest burst of emotion — and not only any emotion, but certain kinds of emotion,” said Andrew Marantz, author of Antisocial, a book about extremist propaganda online. “As long as the incentive structure is built around that, there’s going to be a tendency in this direction.”

Even in their early experiments with technology 100 years ago, white supremacists succeeded at inciting emotion. In 1915, The Birth of a Nation twice depicted a fictional Klan ritual, drawn from the novel, that involved setting a cross on fire. Ten months after the film’s debut, a former pastor named William J. Simmons invited a group of 15 men to the top of Stone Mountain, where they burned a 16-foot cross. It was a first for the Klan and ushered in its second era. Some historians say that what we’re witnessing in 2021 is the emergence of a fourth Klan — the third arose in response to the civil rights movement of the ’50s and ’60s — though this time, there’s not really an overarching organization.

“What’s different, though, is that we live in the era in which social media allows many disparate groups to communicate and make common plans — like their plans to invade the Capitol,” Linda Gordon, author of The Second Coming of the KKK, told Vox’s Anna North earlier this year. “In other words, they just have a very different communication structure. And that communication structure means that it really isn’t necessary for them to have one single large organization.”

This brings to mind “Leaderless Resistance,” an essay written nearly 30 years ago by Louis Beam, the white supremacist who founded the Aryan Nations Liberty Net. Beam argued that extremists should work in small groups and communicate through “newspapers, leaflets, computers, etc.” in order to avoid being disrupted by the federal government. That decentralized strategy doesn’t sound all that different from what’s happening today.

“Online spaces have really helped facilitate a more diffused structure within the far right,” Cassie Miller, a senior research analyst at SPLC, said. “Extremists can join a number of Facebook groups or Telegram channels, and get the same sense that they are part of an in-group or that they are participating in a movement that they may have gotten from joining a more formally organized structure in years past.”

That communication structure has evolved dramatically since a few ambitious neo-Nazis plugged their computers into dial-up modems and built the early networks of hate. Being an extremist is a mobile, multimedia experience now, thanks to smartphones, social media, podcasts, and livestreaming. And it’s not just the leaderless resistance strategy that has endured among right-wing extremists. A number of neo-Nazi themes — namely those drawn from a racist dystopian novel from the 1970s called The Turner Diaries — have also transcended the decades of technological advancement to crop up again during the Capitol riot in January.

Members of the Proud Boys militia group make white power hand gestures while posing for a photo in front of the US Capitol on January 6.
Amanda Andrade-Rhoades/For the Washington Post via Getty Images

The Day of the Rope is the culminating event in The Turner Diaries, in which a group of white supremacists tries to overthrow the federal government and kills several members of Congress. The novel is credited with inspiring at least 40 white nationalist attacks in recent decades, including the Oklahoma City bombing. (Amazon removed the book from its site following the Capitol riot.)

References to the Day of the Rope popped up in tweets and extremist chat rooms in the days leading up to January 6. Trump supporters showed up with nooses to the Save America March, the rally preceding the riot at which Trump told the crowd to march on the Capitol. On the steps of the Capitol, rioters chanted, “Hang Mike Pence!” The outcry came just after the vice president had refused to overturn the results of the election.

“To an extent, the Day of the Rope has been divorced from some of its white nationalist underpinnings in order to make it go viral,” said Evans, the investigative journalist. “But the fact that you saw people bringing gallows and trying to kidnap democratic legislators in real life on the Capitol is the culmination of an attempt to mainstream that idea.”

It’s just one example of white supremacist lore, no matter how absurd, continuing to find its way into the mainstream via the internet. Surprising as it may sound now, watchdogs have warned about extremists recruiting new members online since the early days of the internet. Back in 1985, the ADL published the first extensive report explaining how neo-Nazis were using the new technology to unite hate groups.

The algorithms that determine what people see on social media sites have simply supercharged these efforts. Some worry that it’s too late to reverse the damage, and that the hate is bound to spill over into the real world.

“The radicalization online — the brain just soaking in this poison — goes on so long that [people] just feel that they’re not going to be able to enact fascism with their house pets, and it becomes too frustrating. And they just need to see it in real life,” said Michael Edison Hayden, a senior investigative reporter at the SPLC. “There is that, and then there is the degree to which the echo chambers that social media creates presents a world in which doing such things no longer seems wrong.”


