The Gilroy Garlic Festival. The Poway Chabad synagogue. The Charleston Emanuel church. The El Paso Walmart. One common denominator in these mass shootings and countless others? A perpetrator whose interactions in online white supremacist networks played a part in inciting, energizing, and detonating racial hatred into real violence, says UNLV sociologist Simon Gottschalk.
Gottschalk and graduate students Celene Fuller, Jaimee Nix, and Daniel Okamura recently analyzed more than 4,400 discussion threads from eight blogs hosted on three prominent white supremacist websites. The comments were posted during and immediately after the 2017 rallies in Charlottesville and on the University of Florida campus, as well as on “a relatively uneventful day” in August 2017 in terms of media attention to the white supremacist movement and its activities.
Researchers developed a model that explains how individuals who join white supremacist networks transform private feelings of fear, anger, and shame into a sense of power, pride, belonging, and a desire for vengeance. Eventually, some of these individuals convert those emotions into violence.
Ahead of the study’s publication in an upcoming journal issue, we sat down with Gottschalk to learn why he believes it’s crucial to understand how interactions in these online networks recruit, transform, and radicalize members; how they can prompt some to engage in violent acts; and what can be done about it.
The Typical Profile
Gottschalk says it’s difficult to trace the profile of people who post on these sites because, except for a username and an icon, they are purposefully anonymous and invisible — which explains the attraction of those networks. He aims to develop a social-psychological profile instead.
Individuals who are denied the social recognition they expect — for example, love, esteem, respect, solidarity — can experience anger, fear, and shame. They typically repress those painful emotions, which only intensifies them, and are also motivated to blame others for their pain. This switch in the target of their negative feelings can produce variants of anger ranging from fury to the desire for revenge against the imagined victimizer.
“One of the key functions of white supremacist networks is to tap into and manipulate those repressed emotions,” Gottschalk says. “They do so by convincing recruits that the social psychological pain they experience at the personal level is actually caused by anti-white discrimination.”
Members, who now see themselves as victims, also find that publicly expressing their negative emotions in online white supremacist networks is encouraged, validated, and rewarded by their peers, which boosts feelings of acceptance, solidarity, power, and pride, and can even trigger pleasurable, potentially addictive neural rushes.
Gottschalk emphasizes that while these dynamics are not unique to American white supremacist networks, the desire for revenge and the potential for violence become especially volatile among members of social categories who confront a “reversal of fortune.”
“Individuals who are no longer granted the respect and esteem they expect and feel entitled to by virtue of their race will experience this condition as a frontal assault on their sense of self and identity.”
Escalating Online Hate to Violence
UNLV’s research on online white supremacist networks builds on previous findings that members interpret negative personal experiences as the result of discrimination. But the UNLV researchers argue that their model captures another important step on the path to violence: a switch in members’ perceptions whereby they are not only outraged because they believe that an “enemy” discriminates against them, but also afraid because they believe that this enemy threatens to physically harm them.
“When this switch occurs, variants of anger fuse with variants of fear to form an especially explosive compound. Under the ‘right’ conditions, some individuals motivated by those emotions can easily surrender to bloodlust and justify violence as self-defense,” Gottschalk says.
Gottschalk’s model suggests that interactions in online white supremacist networks produce those very conditions. He says the conversion to violent behavior is especially likely in fascist networks — regardless of race, religion, or nation — because their ideologies typically bestow the ultimate prestige not on those who talk a good talk about violence, but on those who actually act on it.
Assessing Calls for Violence
Gottschalk’s research team examined users’ posts to determine their prevailing emotions, grievances, and motivations.
Their analysis found that anger was the prevailing emotion (51%), and among the variants of anger, vengeance was the one most frequently expressed (37%). Among the different types of vengeance, more than 15% of those threads mentioned sadistic fantasies of physical harm, killing, mutilation, and extermination. The most frequently voiced accusation (41%) was that enemies — especially Jews — are intent on physically destroying white people.
Researchers called it “noteworthy” that there was a “complete absence” of comments seeking to temper calls for violence.
“Analyzing anonymous online discussion threads provided us the unique advantage of gleaning the uncensored, spontaneous views of white supremacists who might not have been so candid in a face-to-face interview,” Gottschalk said. “Online hate group members find that — under the double-cloak of anonymity and invisibility — they no longer have to censor or even moderate hostile dispositions that would be considered paranoid or criminal in most other settings.”
Why This All Matters
In contrast to physical networks, online ones provide portability, mobility, and 24/7 access. And as these networks are multiplying worldwide, their content can easily migrate to less extremist ones, contaminate them, normalize extremist beliefs, and shift the range of acceptable ideas that can be discussed in society.
Even more worrisome, Gottschalk says, algorithms on search engines and social media sites systematically increase the likelihood that increasingly extreme posts will circulate more broadly, thereby capturing members’ attention, stirring powerful emotions, and, in some cases, encouraging violent action.
“You can literally carry the network in your back pocket and stream its hateful ideology straight into your brain. It can provide you with constant and instant positive feedback whenever you voice the emotionally correct messages, paranoid beliefs, genocidal threats, or plans of action. Once individuals are hooked on the network, it becomes relatively simple for those in the control booth to modulate the anger-fear complex,” Gottschalk said.
“Obviously, how some people respond to what they see on the screens of online white supremacist networks can become a matter of life and death,” he continues. “And when you consider the combination of those powerful online group dynamics, the increasing tolerance for the online expression of hateful emotions, and the ready availability of weapons in our society, we should move quickly and intelligently.”
To illustrate the risks these networks represent, Gottschalk points to a study that exposed nearly 700,000 unaware users to mild negative emotions and found that these users quickly contaminated their own networks with those emotions.
“If you can achieve this effect with mild negative emotions, just imagine the velocity and ferocity with which strong primary negative emotions such as anger and fear can infect online extremist and other networks, especially in times of social confusion and instability. It could get even worse,” he says. “Remember how Russian trolls mobilized real individuals to participate in fictitious street protests during the 2016 elections? Well, bots will soon be able to perform this function more precisely, quickly, and efficiently than human beings. Add to that the growing concern with deep fakes, and we have a rather volatile situation on our hands.”
Possible Solutions
Gottschalk — who has spent a decade studying the social psychological impacts of technology on our lives — believes that until we better understand the influence of these networks, they should be shut down.
He points to the example of the French government, which in November 2018 blocked all access to the Démocratie Participative website — the French equivalent of The Daily Stormer — and similar networks.
“I know this suggestion sounds unrealistic to many, but is it really?” Gottschalk said. “While it does not guarantee immediate success, it will at least disrupt the dynamics driving the networks of the ‘fascosphere’ and hopefully contain their effects. At the same time, we should develop additional strategies that address white supremacists’ claims.”