Discussions surrounding far-right extremism following the attack on the U.S. Capitol Building on Jan. 6, 2021, led many Canadians to wonder whether the same actions could happen on our own soil.
Canada is certainly no stranger to the far-right. Incidents of violence against marginalized Canadians are documented every year, and major attacks fuelled by extremism, like the Toronto van attack, have left many fearing an increase in domestic terrorism.
Most recently, members of the House of Commons passed a motion urging the federal government to designate the alt-right group the Proud Boys as a terrorist entity.
While far-right groups are being documented and restricted by North American governments more urgently after Jan. 6, Canadian experts argue there must also be a movement to track and prevent the spread of far-right ideas.
When it comes to indoctrinating Canadians, especially Canadian youth, into far-right extremism, the internet is becoming right-wing extremists’ most potent tool.
Canada’s growing far-right presence
Dr. Barbara Perry is director of the Centre on Hate, Bias and Extremism at Ontario Tech University, an organization that works to research and raise public awareness about Canadian-based hate. According to her, the nature of extremism has changed drastically since she became the centre’s director in 2018.
“There have been such important quantitative and qualitative shifts in respect to manifestations of hate and far-right extremism, in particular white supremacist extremism,” she said. “We are at historical highs for police-reported hate crime data.”
“We’re still at unprecedented levels in terms of hate crime,” Perry said. “Also, we’ve seen quite a dramatic increase […] in the numbers and activity associated with the far-right in Canada.”
“We’re identifying closer to 300 active [far-right] groups now, as well as a number of individuals engaging on social media—especially on online venues—without necessarily affiliating with particular groups.”
Perry also referenced a report published by the Institute for Strategic Dialogue (ISD), a London-based research think-tank, in June 2020. The report, focused on right-wing extremism, identified 6,660 right-wing extremist channels and accounts linked to Canada on the platforms Twitter, Facebook, and YouTube, as well as fringe websites Iron March, Fascist Forge, 4chan, and Gab.
“There’s a lot more activity, and it’s a lot more blatant,” Perry said. “[Right-wing extremists] don’t feel the need to hide away as they used to.”
ISD’s report also found that anti-Muslim and anti-Trudeau conversations were the most prevalent on these right-wing channels and accounts.
“It sort of reached this crisis point for us when Corey Hurren crashed into the gates at Rideau Cottage […] to arrest the Prime Minister,” she added, referencing the incident that took place on July 2, 2020, in Ottawa, when an armed Canadian Armed Forces member forced his way onto Rideau Hall grounds.
After his arrest, it was found that Hurren had made posts alluding to a U.S. far-right conspiracy theory called QAnon.
“We like to blame Trump, but we’ve had our own homegrown xenophobes,” Perry said. “We see it in the mainstream, but we also see it on the online forums associated with some of these extremist groups […] That Islamophobia, anti-Semitism, anti-immigrant sentiment, those sort of things are always sort of simmering there.”
Canada has many well-documented extremist groups, but there isn’t much coverage of how exactly individual Canadians adopt extreme beliefs.
Less still is known about young Canadians’ participation in online extremism. In any digital community, only a select few members actively post and comment. Others simply watch—or “lurk”—making it difficult to track the overall demographics of people observing online conversations.
However, what researchers do know is that young people in Canada are more online than any generation before them. In 2019, Statistics Canada reported that nearly 100 per cent of youth aged 15 to 30 used the internet on a daily basis, and 77 per cent of them relied on the internet for their news and current affairs.
The same study found that 16 per cent of young men in Canada reported experiencing at least one aspect of social isolation.
Étienne Quintal, a Master’s student at the University of Ottawa researching hate groups and their appeal to young people, told The Pigeon that isolated young men are especially vulnerable to the pull of online hate groups.
“[For] far-right hate groups, the target market will definitely be younger white males, usually cisgender heterosexual ones,” he said. “Some groups make their appeal to young people who don’t feel as though they fit into society, people that are lonely [or] rejected.”
“On a more qualitative level, we know that young adulthood [is] hard on a person’s mental health […] A lot of people will experience that kind of loneliness and isolation.”
The actual process of online indoctrination for far-right groups is less straightforward.
“There are a lot of different pathways into those online communities […] the process of radicalization is becoming increasingly individual,” Quintal explained. “It’s not necessarily about those groups going out and outright recruiting, but rather creating content that reaches a variety of different people.”
In 2016, far-right users of image-posting site 4chan intentionally manipulated humorous images of cartoon character Pepe the Frog in increasingly offensive and inflammatory ways in order to offend “mainstream” audiences.
Since then, creating memes and other online “jokes” to depict offensive scenarios has gained popularity in far-right communities. Memes have been documented as a recruitment strategy for extremist spaces specifically used to target young people.
Quintal told The Pigeon that passing off far-right thought as humour has become a tactic for indoctrination.
“The weaponizing of irony and offensive humour [can be] tools for radicalization,” he said. “You can use them to simulate different messages under the cover of a joke.”
Far-right jokes and memes are also more likely to invite young people to engage in extremist discourse. If they find “ironic” commentary about marginalized groups funny, but their peers don’t, this can create the beginnings of what Quintal calls a “victim identity.”
“That victim identity is a very important factor in radicalization and that’s something that hate groups will capitalize on,” he explained. “It’s this idea of: ‘You like these jokes, but people don’t want you to make those jokes. They’ll cancel you, they’ll censor you, they’ll send you to prison.’
“[The far-right will] convince people that they’re actually victims […] and then that person could become weaponized in terms of directing their hate or resentment towards hating different subsets of individuals.”
Canada’s emerging online far-right communities
In 2017, UNESCO published a report mapping trends involving youth and online extremism. It found that chatrooms, as opposed to news sites, were more likely to be sites for radicalization because they fostered a sense of community.
The idea of community is vital for indoctrinating young Canadians into alt-right spaces, said Quintal.
“Online communities are an avenue for people who don’t necessarily have […] real-life support groups, real-life friends,” he explained. “Not ones that necessarily understand everything that they’re going through, or that understand their beliefs.”
Multiple far-right discussion websites focusing on U.S. politics and current events exist, and were even crucial for planning the Jan. 6 storming of the Capitol. While Canadians with similar views frequently read, participate in, and disseminate American extremist ideas—QAnon being the most recent transplant—there are also online forums specifically targeted towards Canadians.
One of the most infamous was the subreddit r/MetaCanada, which described itself as “your premier source for Can-Con.”
MetaCanada was shut down by Reddit in August 2020, but regrouped on a new website called Omega Canada, in the same network as “Patriots”—formerly “The Donald”—a site created by Reddit users from the banned subreddit r/TheDonald.
On Omega Canada, former MetaCanada users are able to discuss their far-right beliefs without the fear of being censored by Reddit. The Pigeon found recent discussions on the site where racist, sexist, anti-Semitic, and transphobic slurs were used frequently. Posts were primarily focused on Justin Trudeau, Ontario Premier Doug Ford, and COVID-19 public health measures.
One user stated that “the only good commie is a dead commie,” while in another discussion about immigration to Canada, a commenter asserted that “you need white genes to fully assimilate to Western culture.”
Posts about U.S. politics were common, too. One user described how they believed COVID-19 is a virus manufactured and politicized by President Joe Biden to bring down Donald Trump. When Biden selected Rachel Levine—a trans woman—as his assistant health secretary, one Omega Canada participant called her “a fat guy with a mental disorder.”
Discord is another platform to which far-right extremists have flocked, and it was even used to plan the 2017 “Unite the Right” rally in Charlottesville, Virginia. It’s a chat-based platform, meaning conversations are more private than on Reddit.
While Discord has cracked down on alt-right activity since 2017, it used to house forums with names like Atomwaffen, Nordic Resistance Movement, Ironmarch, and Pagan Pathway. These servers, named with thinly veiled references to Nazi imagery and political movements, disseminated pro-Nazi and violent ideologies; because they were invite-only, users felt more comfortable sharing extreme views.
Far-right Canadian servers on Discord are less documented, but just as vitriolic. In April 2019, The Globe and Mail examined 150,000 chat room messages between users of the “Canadian Super Players” server, a right-wing forum disguised as a video game chat.
The publication found that “many of the in-jokes and memes the members share resemble those propagated by the far right in the United States and Europe.”
Further, The Globe concluded that “the overarching goal for many in the Canadian Super Players chat group was the eventual creation of a white ethnostate,” a state in which all residents are white.
The Globe also found significant discussions about Canadian youth. Users of the Discord server who were university students bragged about posting signs around their campuses stating “It’s Okay To Be White,” while one teenage user complained to others about the multiculturalism he saw at his Vancouver high school.
Platforms like YouTube, Discord, and Reddit are now well-known for harbouring far-right extremists. As a result, the websites have cracked down on dangerous groups, banning individual channels or communities as a whole, with mixed results.
That doesn’t mean far-right groups have been stopped from using online platforms entirely. In fact, Quintal’s research has homed in on one fairly new app—video-sharing platform TikTok, which is populated mainly by young people.
TikTok has been responsible for countless political trends and discussions since its launch. Creators have normalized discussions about autism, celebrated Indigeneity, and even debated the merits of socialism.
It’s also been the platform for countless young right-wing influencers, who post videos and collaborate with the goal of celebrating Donald Trump and the Republican Party. Canadian teens have gotten involved in political TikTok videos, too, and NDP leader Jagmeet Singh is a frequent poster.
Quintal said TikTok’s unique format and youth-led content make it a prime contender for extremist messaging.
“We’ve known for at least the last year, if not a little more, that TikTok is a potent app for political mobilization,” Quintal said. “The engagement that people have with the app and the creators is really meaningful.”
TikTok has also sparked discussions about the power of its recommendation algorithm, which expertly tailors content to a user’s interests. Quintal said the app’s power to create a feedback loop around a single topic, paired with its other attractive engagement features, makes it an excellent platform for spreading far-right ideas.
“[TikTok] gives [viewers] a sense of community, a sense of belonging, and it gives them a list of influencers to admire and emulate,” he explained. “Those are all things that go towards building and reinforcing strong identities. So of course, leaders are going to capitalize on that.”
Before platforms like TikTok, Parler, or any number of other websites geared towards political discussion are able to take hold, Quintal and Perry hope governments and social media giants will take preventative measures.
“We need to continue to lobby and to advocate for social media giants to take more responsibility for [right-wing extremism] in terms of not just de-platforming groups, but individuals as well,” Perry said. “I also think that the government needs to continue to fund initiatives in programming that will help to build the capacity within communities […] to recognize and resist this kind of grooming.”
Quintal added that controlling the spread of far-right extremism is key if the Canadian government wants to avoid similar sentiments being pushed politically.
“People aren’t going to go out and do something atrocious, but they’re going to go on election day and cast a ballot for someone who is going to take [far-right] ideas and make them into actual policies.”