Down the Rabbit Hole: Exploring Alt-Right Radicalization Online
Emily Wilson
November 26, 2019
This article is in collaboration with Contact Report, a blog run by the Centre for International and Defence Policy (CIDP) at Queen’s University. To read more about the security implications of alt-right terrorism, and for expanded takes from Dr. Amarnath Amarasingam, visit the Contact Report blog.
Content Warning: This piece contains discussion of the alt-right, as well as of the Christchurch Mosque Shooting.
After thoroughly destroying my search history and probably permanently altering my recommended results on Google, YouTube, and every other algorithm-based website (read: the entire internet), I still feel as though I’ve only scratched the surface of the alt-right presence online. This subculture is, by design, unmeasurable, obscured, and decentralized to the nth degree. The internet has taken this dangerous movement global, making the threat it poses to international security the topic of many policy debates.
Inspired by the YouTube series ‘The Alt-Right Playbook’, I set out in this piece to tease apart the ways the alt-right works to radicalize people. The experience left me feeling a little bit like a conspiracy theorist, but unfortunately this is a terrifyingly real phenomenon that doesn’t take red yarn and an evidence board to discover. For the purposes of this article, I will be using ‘the alt-right’ as an umbrella term for the ideological subculture associated with extreme conservatism, white supremacy, and reactionary politics. It is a decentralized movement that operates mainly in the online sphere under the protection of anonymity.
It is important to note that the alt-right is not a cohesive group. Through memes and shared ‘jokes,’ people (mostly young white men) find themselves welcomed into online spaces whose radicalizing intent gradually becomes more apparent. Under the guise of edgy humour, the subject is groomed until they are both aware of and okay with the ideologies being promoted and reproduced. As mentioned above, the movement is decentralized and scattered by design – there is no singular ‘alt-right’ subreddit, Facebook group, or 4Chan board.
Professor Amarnath Amarasingam works with the School of Religion at Queen’s University, researching extremism, terrorism, and hate movements. I sat down with him to discuss the growing presence of online alt-right hate, and he identified the appeal of recruitment in online spaces: “They’re much more clandestine, they exist on a variety of different platforms, so they’re actually easy to find. In that sense, the kind of active recruitment that we used to see with ‘boots on the ground’ [techniques], doesn’t really happen anymore.” He noted that the recruitment process is now more grassroots: interested young people seek out these avenues of recruitment themselves. In my efforts to understand the ease with which people find themselves within these circles, and the appeal of staying, I dove into a few platforms, starting with YouTube.
Many articles exploring the subject of alt-right recruitment point to YouTube’s recommendation feature as a rabbit hole in which people are exposed to increasingly reactionary ideas. I decided to test this out myself by clicking on the first video that came up when I searched Jordan Peterson, a divisive academic associated with the Intellectual Dark Web (IDW). The IDW is a phrase popularized by a New York Times article and used to describe “iconoclastic thinkers, academic renegades and media personalities who are having a rolling conversation that sound[s] unlike anything else happening, at least publicly, in the culture right now”. A recent study out of Brazil found that people consistently migrate from IDW media to alt-right media through YouTube. After watching eleven videos on the topic of Jordan Peterson from the recommended column, YouTube suggested I watch a Young America’s Foundation (YAF) video of Ben Shapiro (YAF is a conservative youth organization). Four videos later, YouTube recommended “BEST FEMINIST CRINGE COMPILATION 2016.” One video later, I was watching a ‘news story’ published by BlazeTV, a far-right media outlet offering an alternative to the ‘fake news’ of mainstream media.
The same channels showed up again and again: BlazeTV, Young America’s Foundation, and The Daily Signal – another right-wing media outlet, financed by The Heritage Foundation, an extremely conservative think tank. It isn’t any stretch of the imagination to assume that, had I continued clicking through the related videos, the content would have become more reactionary and polarizing. Kevin Roose, a writer for the New York Times, noted that “YouTube has inadvertently created a dangerous on-ramp to extremism by combining two things: a business model that rewards provocative videos with exposure and advertising dollars, and an algorithm that guides users down personalized paths meant to keep them glued to their screens.” Clearly, it isn’t hard to become inundated with increasingly conservative, and eventually alt-right, voices and media.
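To make the dynamic Roose describes a little more concrete, below is a minimal, purely illustrative sketch of an engagement-biased recommendation walk. It is emphatically not YouTube’s actual recommender, which is proprietary; the video catalogue, the extremeness scores, the recommend() helper, and the engagement_bias weighting are all invented for illustration. The only point is to show how a small, repeated preference for more ‘provocative’ neighbours can compound into a steady drift over a dozen clicks.

```python
# Toy model only: a hypothetical catalogue of videos, each tagged with an
# invented "extremeness" score in [0, 1]. Nothing here reflects YouTube's
# real system; it is a sketch of the incentive Roose describes.
import random

random.seed(42)

# Hypothetical catalogue of 200 videos.
CATALOGUE = [{"id": i, "extremeness": random.random()} for i in range(200)]


def recommend(current, catalogue, k=10, engagement_bias=2.0):
    """Pick one of k candidate videos, weighted toward 'provocative' content.

    Assumption: higher-extremeness videos earn more engagement, so this toy
    recommender up-weights them when choosing among videos similar to the
    one currently being watched.
    """
    # Candidates: the k videos closest in extremeness to the current one,
    # with a little random noise so the pool varies between clicks.
    candidates = sorted(
        catalogue,
        key=lambda v: abs(v["extremeness"] - current["extremeness"])
        + random.uniform(0.0, 0.2),
    )[:k]
    # Engagement-biased choice: more extreme candidates get heavier weights.
    weights = [(v["extremeness"] + 0.1) ** engagement_bias for v in candidates]
    return random.choices(candidates, weights=weights, k=1)[0]


# Simulate a viewer who starts on the mildest video and follows 15 recommendations.
video = min(CATALOGUE, key=lambda v: v["extremeness"])
path = [video["extremeness"]]
for _ in range(15):
    video = recommend(video, CATALOGUE)
    path.append(video["extremeness"])

print("Extremeness of each video watched:")
print(", ".join(f"{score:.2f}" for score in path))  # the scores tend to creep upward
```

The upward creep in the printed scores is the entire point of the sketch: even a modest preference for ‘engaging’ neighbours, applied click after click, pulls the walk toward the extreme end of the catalogue, loosely mirroring the drift I saw in my own click-through experiment above.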
Then I turned my attention to 4Chan, the “bulletin-board” platform notorious for its anonymity and cruel humour. It took me approximately thirty seconds to open the website, scan through the boards posted on the front page, and find one that sounded like it might appeal to someone looking for a place to be ‘edgy on purpose,’ or to talk freely about their genuine bigotry. The /pol/ board, or ‘Politically Incorrect’ board, is a peek into the underbelly of the internet that leaves an onlooker equal parts horrified and hopeful that it’s all a joke. The latter half of that sentiment raises an interesting question: does it even matter if it’s just a joke?
I spoke with a student who had briefly been drawn into these circles to find out. The student, who will remain anonymous, began consuming this type of content in high school around 2014, coinciding with the infamous GamerGate. GamerGate was a deeply misogynistic movement that spread across Reddit and other platforms, and it can be considered a microcosm of the greater alt-right phenomenon: a distillation of the recruitment techniques and content trends found in broader alt-right circles.
When I asked how he found himself drawn into the discussions happening on r/KotakuInAction, the subreddit that became a hotbed for GamerGate, the student responded that the communities he was already part of shifted in ideology and he simply went along with it. This shift within existing online groups is also identified in the most recent installment of ‘The Alt-Right Playbook’. In discussing the two most common pathways to alt-right recruitment, Ian Danskin, the YouTuber who created the series, notes that “the far-right [will either create] a community Gabe [a stand-in for any prospective member] is likely to stumble into, or [infiltrate] a community Gabe is already in.”
What is fascinating about this phenomenon is the speed with which members can emerge and immediately perform a 180-degree turn, embracing progressive left-wing thought instead. Take, for example, Caleb Cain, the 26-year-old who posted a now-viral YouTube video denouncing the alt-right and detailing his fall into the rabbit hole. Cain emerged from that hole when he discovered left-wing commentators and found that he resonated with their words as well. I asked Professor Amarasingam about this, and he pointed to the similarities between the “hyperemotional and intense” online presences of both ends of the political spectrum. He also noted that it may have something to do with the difficulty of properly identifying the causes of one’s grievances.
Many consumers of alt-right media, including Cain, start from a place of economic stress or insecurity. The stress caused by the inability to hold down a job or pay bills can push people to search desperately for answers. As Ian Danskin notes in his YouTube series, a person’s opinions and thoughts will be influenced by whoever gets to them first. The alt-right has become internet-savvy out of necessity and is frequently that first point of contact.
A few main patterns became apparent as I researched the alt-right. First, feigned apathy holds an overwhelming appeal. The anonymous student repeatedly mentioned a sentiment encoded into these spaces: that caring about anything too much or too genuinely was bad. “The whole point was getting mad at people who were getting mad,” he said. There is a sense that everything was fine before people started getting too sensitive and, for example, caring about feminism enough to make a Ghostbusters film with an all-women cast.
The second thread is that alt-right communities position themselves as (excuse the irony) ‘safe spaces’ for likeminded individuals to form bonds and discuss controversial topics freely. Many of these young men identify loneliness as part of what pushes them to engage in the ‘edgy’ jokes of these communities. Quickly enough, those jokes form the cracks in the ground for the rabbit hole to open up and send someone tumbling into the depths of alt-right media. Professor Amarasingam notes that social media companies are “very slow to catch on to this issue.” He continued: “I think a lot of these companies have a culture, or at least an idea of themselves as libertarian free speech havens.” This is just as true for websites like Gab, a site known for the concentration of neo-Nazis and other alt-right figures in its user base, as it is for popular social media platforms like Twitter.
The final common trend I noticed while writing this article is the ever-growing risk of real-world action being taken in support of alt-right ideologies. Although seemingly centred in the United States, the alt-right’s use of the internet for recruitment and organizing has made it an undeniably international presence. A New York Times article found that “at least a third of white extremist killers since 2011 were inspired by others who perpetrated similar attacks,” spanning Norway, Sweden, New Zealand, Canada, the US, and more. Professor Amarasingam noted that, “With the Trump campaign it really kind of ramped up. Not necessarily in numbers [of organizations] but more so in kind of, how emboldened they were, how open they were in talking about racist principles and those kinds of things.” The Overton Window, named for conservative scholar Joseph P. Overton, describes the range of ideas considered acceptable in political discourse, and it has moved considerably further right under Trump. More overt racism doesn’t shock as much as it would have four years ago – we have become desensitized. It wasn’t until the Christchurch shooting in March 2019 that governments really shifted in their handling of alt-right threats.
After the horrific attack, Professor Amarasingam said, the threat posed by alt-right extremism began to be discussed more openly. For instance, the United Nations (UN) and the Five Eyes network (an intelligence-sharing arrangement between Australia, Canada, New Zealand, the UK, and the US) have been facilitating policy discussions and directives. Although government pushback against violent alt-right extremism has increased, there are still obstacles to delegitimizing these groups when a major source of their validation currently holds office in the US. “The media has redrawn the boundaries of the conversation and made more things acceptable,” Professor Amarasingam noted, in discussing the Trump administration and the hateful rhetoric coming from the top down. “Those in power have the ability to recast the Overton Window in some ways.”
The validation that comes with having people in power echo your group’s ideas and opinions is integral to the longevity of most movements. Deplatforming has been identified as an effective way to combat the echo chamber of alt-right rhetoric by people from varying perspectives, including Professor Amarasingam and the anonymous student interviewed, as well as countless other journalists and academics. While it is incredibly difficult to deplatform Donald Trump (here’s looking at you, Election 2020), it is possible to deplatform alt-right figureheads like Milo Yiannopoulos.
I began this article with a discussion of how people may be sucked into this movement to demonstrate just how easily intensely hateful rhetoric can be incorporated into one’s digital diet. In an age where so much of our time as young adults is spent formulating thoughts and opinions online, the alt-right has found the perfect formula to sneak its ideologies in under our noses, bit by bit. As Professor Amarasingam said to me in closing: there will always be a guy on the corner with a sign saying ‘the end is near’; we just have to be mindful of how many people are listening, and of what pushed them to listen. Movements like the alt-right have existed throughout much of modern history and will probably persist in some form for the foreseeable future, but right now more people are listening to their hateful narratives. For the sake of international and domestic stability, individuals and governments alike need to work together to identify and remedy what is pushing people to the far right. The fight against violent hate will always be ongoing and unfinished, but it is neither hopeless nor already lost.
Contact Report is designed to be a venue for CIDP fellows, partners, and collaborators to provide their learned opinions on the topics of the day. The blog focuses on international and defence policy-related events, and is designed to relay information in a timely, accurate, and thoughtful manner that enables the CIDP to leverage the knowledge of its researchers to inform the debates that ensue.