December 28, 2022

Article at Self-Published

Perfecting Propaganda: How online misogyny is radicalizing adolescent boys

Terrorism linked to online misogyny has been on the rise in recent years. Radicalization begins young, targeting adolescent boys when and where they’re most vulnerable.



The following resources, taken from the Center for Countering Digital Hate Incelosphere report, may be helpful if you find any of the content discussed in this article upsetting. 

  • Mental Health America – mental health support and advice. To reach a 24-hour crisis center call 1-800-273-TALK, or text MHA to 741741 for the Crisis Text Line.
  • The Trevor Project – LGBTQ+ support and advice. To reach a 24-hour phone hotline call 1-866-488-7386. Alternatively, you can text TREVOR to 1-202-304-1200 or use this link to find online webchat support.
  • Rape, Abuse and Incest Network - support for survivors of sexual violence. To reach a 24-hour hotline call 1-800-656-4673. Alternatively, you can click here for webchat support.
  • The Samaritans - 24 hour support helpline. To reach the hotline call 116 123. Alternatively, you can email

On April 23, 2018, a man in Toronto, Canada, gets into a rental van, drives it to one of the city’s busiest districts, and charges it three blocks down an occupied sidewalk, only stopping after his windshield is obscured by a pedestrian’s scattered beverage. Eleven people are killed in the attack. Fifteen more are injured.

In a videotaped police interview the next day, the detective’s line of questioning begins with, “When did you first go on 4chan?”

“May 23, 2014.” 

“And how are you able to remember that?” 

“Because I remember that was a very significant day.” 

On May 23, 2014, a man in Isla Vista, California uploads a hate-filled tirade to YouTube, directed at women for denying him sex, before stabbing three men to death in his apartment, shooting three women outside of a sorority house and going on a vehicular rampage that ends with a self-inflicted gunshot. In total, six people are killed and 15 injured. 

As recently as last month, a comment under the re-uploaded manifesto reads, “I've been identifying myself as an incel for longer than a year, and saint [name of the Isla Vista killer redacted] has being a hero for me from the very first day I've discovered his day of retribution… It breaks my heart to this day that women will never go for a good guy like me.”

Incel, short for involuntarily celibate, is a label used by online communities that claim to be “support groups” for men who lack sexual or romantic relationships.

Beginning with the 2014 Isla Vista killings, incel ideology has been directly linked to at least 55 deaths and 55 injuries from mass murder incidents. These numbers don’t even account for the hard-to-quantify deaths by suicide that result from this ideology of hopelessness. The frequency of attacks and the visibility of the content have increased significantly in the last few years.

A study released this September by the Center for Countering Digital Hate found that on an incel forum visited 2.6 million times a month, posts mentioning incel-inspired mass murders increased by 59% from 2021 to 2022. Every 29 minutes, someone posted about rape and received encouragement from nine in ten users. The misogynist ideology, which blames women’s education and autonomy for its adherents’ celibacy, plagues the virtual spaces where adolescent boys interact, with virulent intent.

From YouTube comments to video game streaming and chatting websites to loosely moderated internet forums like 4chan, Reddit or Discord, online misogyny and its right-wing influencers are working to grow and fuel an audience of frustrated teens. Nearly every young male, from the age he first accesses the internet, will come into contact with this content.

The biggest risk factors for falling into its self-defeating spiral resemble the risk factors for youth gang involvement: absentee parents, traumatic experiences, the desire to belong somewhere. But this path preys most successfully on the insecurities of puberty and young adulthood experienced by everyone. Difficulties dating, making friends, and climbing the social hierarchies of youth, combined with a storm of hormonal changes and even the increased isolation of the pandemic world, mean that any male between his early preteens and mid-twenties is likely to experience external pressures that open windows of vulnerability.

The best hope for derailing the ongoing online radicalization of young men is understanding the process well enough to recognize it in yourself and intervene when you see it in others. Where youth gang involvement could be spotted by indicators as simple as the need to leave the house and physically interact with others to participate, online misogyny is self-radicalizing and invisible, which makes it particularly difficult to spot, often not until an individual is so involved that his online thoughts and behaviors have developed consequences in the physical world.

“Even before behavior changes, language will change,” said Shannon Reid, over a video interview. She’s an assistant professor in the Department of Criminal Justice and Criminology at UNC Charlotte, whose work researching gang influence on juveniles intersected with misogynist radicalization through the movement of neo-Nazi recruitment tactics from punk shows to online spaces. 

Terminology made popular in incel communities has become so widespread online that you’re likely to encounter it nearly every day without recognizing the vitriol in which it originated. In his police interview, the Toronto van killer tells the detective, “It’s a movement of angry incels like myself who are unable to get laid. Therefore, we want to overthrow the Chads, which would force the Staceys to be forced to reproduce with some of the incels.” Along with the classification of men into a self-prescribed hierarchy of “alphas and betas,” the Chad/Stacey dichotomy (the idea that there’s a finite and genetically predetermined minority of alpha males and a vast majority of women who chase them) is subtly reinforced and spread through memes and in-group jokes before it escalates to targeted harassment and real-world violence.

“The sort of burden of losing your virginity, of being successful with women, getting a girlfriend, these are real milestones that young men [feel they] really have to overcome,” says Pasha Dashtgard, director of research at the American University Polarization and Extremism Research and Innovation Lab (PERIL), over a video interview. Dashtgard uses his Master’s Degree in mental health counseling and Ph.D. in social psychology to research male supremacy and online radicalization.

“It really does so much to categorize you early on... The pressure that young men feel to have 'high success' in the dating realm is very real,” he said. “For people who are young now, having this framework and having it be so prevalent online, it gives these young men a way of interpreting their world that is really toxic and really harmful to themselves.”

PERIL’s “The Parents and Caregivers Guide to Youth Radicalization” recommends discussing with children the sources where they receive news and consuming news together, like listening to a current events podcast during a car ride. Not only does this teach news literacy and reinforce defense against misinformation, but the conversations bring natural insight into what types of media are influencing their thoughts and beliefs.   

TikTok influencers like Andrew Tate make misogyny look masculine. Academics like Jordan Peterson make it look intellectual. Grifters like Alex Jones make it look profitable. Right-wing pundits like Ben Shapiro make it look ethical. Links to the original content of the above creators, or fan-made gotcha compilations like “Ben Shapiro owns college liberals”, find their way into the online communities where adolescent boys are already connecting. Large group chats and communities on the online gaming-oriented social media app Discord are particular hot spots due to little or no internal oversight or content moderation. 

“If you don't have a pro-social peer group, and especially if you don't have a peer group in the real world but you have a peer group online, where people are maybe not as conscientious or thoughtful about what they're saying or doing because you have that anonymity,” said Reid, “you have that freedom to kind of be a normal person in life and be not a normal person online.”

Social expectations around behavior also matter in online communities, and they serve a key role in the process of self-radicalization by making it voluntary. If an adolescent boy is actively participating in a Discord group to play games and make online friends, at some point someone is likely to make an edgy, racist or misogynistic joke or share a piece of content that peers might point out as offensive, factually incorrect or otherwise against the community-policed standards. But what if you laughed? Or the divisive content echoed a belief you've heard in a passing family conversation? Now its criticism is “political,” and you didn’t join an online gaming community for politics. Can’t anyone enjoy a good joke anymore?

When such divisions happen, kids leave the larger groups in favor of groups with even less moderation, where they can make jokes or share content with less pushback and more praise. This sorting can repeat many times over. Traveling further down the chain, you’ll find increasingly smaller groups with increasingly lower bars for acceptable in-group behavior. After all, it's just jokes, right?

Kids also turn to such communities for friendship and emotional support just as they would turn to in-person friendships. But where friend groups in person are limited by proximity, the online groups further down the chain unintentionally concentrate the most like-minded and increasingly rebellious teens from around the world with a tendency to feel isolated or outcast by peers outside of their niche online community. 

When one member of the group comes to his friends to express genuine sadness at something as benign as being rejected by a date to a dance, or other general difficulties with achieving the social markers of male success, a friend says, “Dude, it's not you, it’s them.” As evidence, he links to a YouTube video, a TikTok, a thread on an incel forum or an article on the incel wiki, where someone is using pseudoscience or misinformation to direct those emotions toward anger at women.

For example, one article on the incel wiki, a collection of pseudoscientific theories and incel terminology formatted to resemble Wikipedia, presents statements like, “Women are only capable of love in case of high status men. Basically that women can only love conditionally, which most wouldn't consider 'love',” as scientific fact. Another page cites a study finding that some women sometimes fantasize about scenarios involving sexual violence as evidence that women universally prefer “low-empathy males” and that “The ability to rape may also act as an honest signal of physical strength and high status.” In this group, there’s no pushback. The statement of “facts” isn’t political, but criticizing it is.

“That [pseudo] intellectualism is a very strong cover to sort of hide behind,” said Reid. “Like, it's not that Jordan Peterson's wrong, it's that you don't like Jordan Peterson.”

Not all influencers in right-wing spaces go as far as to assert that women are incapable of love, but they don’t have to. The overlap between Shapiro’s rhetoric, that liberalism is causing the breakdown of natural gender roles, and the people who advocate for a return to women entering social contracts that trade access to reproduction for financial stability, is large enough that YouTube’s algorithms will carry you from one to the other and beyond in the time it takes to brush your teeth.

Boys can be exposed to the rabbit hole through something as innocent as Googling, “how do I get a girlfriend?” Tech companies and even traditional media bear some responsibility for the proliferation of these spaces. The story you’re reading right now has refrained from linking directly to any of the incel material mentioned, because such hyperlinks boost their results in search engines and influence the content algorithms of anyone who clicks them. Preventative measures like de-platforming websites and creators and increasing news literacy in children are more effective than running damage control for incel terrorists.

“Deradicalization is way harder, way less scalable, much more amorphous,” said Dashtgard. 

“Deradicalization is a personal process of going to therapy, confronting your biases, doing a real soul-searching kind of change,” he said. “That's really hard to do, and it's really hard to scale. It's much easier to off-ramp, or prevent people or inoculate people against these kinds of radicalizing narratives and ideologies.”

On the incel forum studied by the Center for Countering Digital Hate, YouTube was the overwhelmingly most-linked-to website, with over 14,000 unique links. YouTube has refused to de-platform or de-monetize many of the largest incel content channels, which had a combined 24.2 million views as of the date of the report. 

“The bar is set really low for what it takes to get pushed down that path of really intense, really misogynistic incel ideology… it only takes a couple of videos to get you from here to there,” Dashtgard said. “Before, it was like, who has white supremacist pamphlets just like lying around? Now it's like the digital equivalent of, they're everywhere. They're just littered all over the floor.”

The link between online misogyny and white supremacy is a strong one. Over a video interview, Megan Squire, the deputy director of data analytics and open source intelligence for the Southern Poverty Law Center, pinpointed the connection to antisemitic conspiracies like globalism and white erasure theory. 

“From the neo-Nazi groups to the MAGA groups… The misogyny, I think of it like the lingua franca. It's the one thing that all these groups have in common,” she said. “They usually trace it back to the breakdown of the nuclear family. It’s this argument that you have some outside force to blame for all the ills of society. Women in the workplace, birth control, abortion, they lay it at the feet of Jewish people. Or, particularly popular now, is not to say Jewish people but to say globalists.” 

The August 2017 Unite the Right rally in Charlottesville, Virginia, which resulted in one death and thirty-five injuries from another vehicle-ramming attack, was largely organized in online forums. The January 2021 attack on the U.S. Capitol building was planned in Facebook groups and isolated forums far down the ladder of spaces increasingly radicalized by the same process described earlier. The right provides a target for incels’ grievances, as well as promises of political action or retribution.

“One of the things that motivates the great replacement and kind of white supremacist ideology is this fear that something is being taken from you,” Dashtgard said. “That doesn't motivate fear, unless you feel entitled to those things. If you're raised with a sort of, ‘Hey, boys will be boys. Like, I get to just kind of do what I want,’ and then somebody says, ‘well, like, actually, no, you have to be respectful of people,’ — 'this feels like you’re taking something from me.'”

That entitlement makes it incredibly difficult to derail the radicalization once it has taken hold. Members of incel spaces believe they have a biological right to reproduce, and therefore a right to be angry and violent toward anyone who impedes their access to sex. It’s a widespread belief among incel communities, and those who echo their talking points elsewhere, that there is a deeply entrenched conspiracy to destroy Western (to be interpreted as white) countries with interracial marriage, immigration, and an LGBTQ+ agenda. The incel wiki claims that psychiatric medication, such as antidepressants, prescribed to those who manage to get help are “Jew pills” made to pacify white men during this process.

Being told that you’re being pacified to prevent you from stopping an ongoing attack on your identity implies that the correct response is to actively defend that identity. Every researcher interviewed for this story reported that threats of death and sexual violence have been left not only in their own emails and voicemails but also in those of their coworkers and peers.

“When [they] can really build community is when they can find someone to target, and then they can entice you into, as a group, sort of piling on someone,” Squire said. “I've been harassed like that by groups of people. They sort of swarm all over you.”

“It's worth pointing out that that idea of a spectrum isn't necessarily always the best way to frame things,” she added. “It's not like they start off in something mild, and they end up at white extremism. Sometimes, we've got people who start off at inceldom and go ahead and murder someone.” 

The communities are structured in a way that targets boys in their most vulnerable years and sucks them into a self-defeating echo chamber that encourages and reinforces a hopeless worldview. Simultaneously, it fuels their rage and provides them with targets for their pain while surrounding them with praise for individuals who have reacted violently.

“These online forums become kind of like incubators, sort of labs where they can test the effectiveness of arguments, and through upvotes and downvotes, likes and reshares, they can see which arguments resonate the most with people,” Dashtgard said. “Anyone can contribute to this philosophy and the only way that it is lifted up or pulled down, is by how effective it is. So it's this terrifying thing where it's like, the perfecting of propaganda by a community of people.”