26 September 2025
- From Banal to Extreme: When Benign Online Communities Become Breeding Grounds for the Far-Right
SYNOPSIS
The recent case of a 14-year-old male Singaporean radicalised through extremist content illustrates how seemingly benign online communities and platform algorithms can become pipelines to radicalisation.
COMMENTARY
A 14-year-old male Singaporean was recently issued with Restriction Orders (RO) under Singapore’s Internal Security Act (ISA) for online self-radicalisation by what has been termed a “salad bar” of extremist ideologies, which led him to support both the cause of the Islamic State in Iraq and Syria (ISIS) and far-right ideologies such as the incel (a portmanteau of “involuntary celibate”) subculture.
The teenager’s descent began unintentionally: after he consumed true crime content, platform algorithms recommended to him dubious far-right material posted by foreign extremists. This eventually led him to consume content supportive of the far-right and to join communities espousing violent antisemitic beliefs.
His encounter with incel ideology was similarly unintentional. An incel is a member of an online subculture of mostly heterosexual males who define themselves as unable to find a romantic or sexual partner, often blaming or hating women as a result.
In 2023, after becoming more self-conscious about his appearance, he came across “looksmaxxing” content, a strand of the incel subculture focused on maximising one’s physical attractiveness. He began posting and sharing incel content online.
This transition reveals an interesting, albeit insidious, aspect of digital networks – the overlap and blurring of boundaries between extreme and benign communities, creating a pathway from the banal to the extreme. Increasingly, according to Matthew Kriner, managing director of the Accelerationism Research Consortium, “anything and everything is becoming a viable pathway to violence”.
Specifically, “antisocial, decentralised, online networks” are overlapping in ways that “encourage and inspire” young people to commit atrocities and various forms of violence. Amplified by algorithms, these ideologies are often deeply embedded within internet culture, making their extremist tenets difficult to detect.
Benign Pathways to the Far-Right
In the case of the self-radicalised 14-year-old, two seemingly benign avenues to radicalisation stand out: the true crime and looksmaxxing communities.
People have been “obsessed with” the true crime genre for a long time. Indeed, true crime is the number one podcast genre on Apple and Spotify, and its popularity has not been dampened by recent debates over the ethics of consuming such content.
A less frequently discussed issue is true crime’s overlap with far-right extremism. Some within the community, notably the True Crime Community (TCC), glorify the suspects behind mass killings, even though the TCC lacks the outright calls for violence common in neo-Nazi circles. Research shows that, in 2024, at least seven of its members were linked to school shootings or school shooting plots in the US.
As for looksmaxxing, it was initially an obscure subculture on Reddit but has been popularised mainly on TikTok, which masks its incel origins. Looksmaxxing is the belief that males must continuously improve their physical appearance to gain social acceptance, through means ranging from physical exercise to extreme measures like “bone smashing” one’s face to acquire a chiselled look.
While the community appears to harmlessly encourage self-care, some members find themselves pulled toward “increasingly extreme, age-inappropriate and violent incel and neo-Nazi content that often glorifies mass shooters and suicide”.
In the case of the 14-year-old referenced above, the algorithms involved are the pipelines themselves. Social media algorithms have fundamentally reshaped what was initially a free exchange of ideas on the Internet. Designed to maximise engagement, they favour controversial and contentious content, which can lead to the overrepresentation of fringe political opinions on social media.
According to 2023 research by the corporate accountability group Ekō, TikTok’s algorithms begin pushing like-minded content to new users after just ten minutes of online engagement. This creates a digital environment saturated with content tailored to the individual user – including the glorification of self-harm and depressive themes – all of which leaves the audience vulnerable to extremist ideologies.
Earlier, a 2021 report by the London-based Institute for Strategic Dialogue had highlighted the spread of incel content on TikTok, but measures to remove such content have been limited and insufficient. Similarly, a 2024 study by the Institute found that YouTube recommends right-leaning and Christian content to users who have had no previous interaction with such content.
Social media algorithms and far-right ideologies target people’s vulnerabilities, particularly feelings of loneliness and loss of control, and they “gamify harmful content”, i.e., package such content in game-like forms that make it enjoyable. Given the universality of these vulnerabilities, people outside the sociopolitical contexts in which these ideologies originated are also susceptible.
Where Extremism and the “Banal” Overlap
Overlaps between extremism and the banal are not confined to true crime or looksmaxxing communities. Similar dynamics can be seen in lifestyle subcultures that appear apolitical, wholesome or even aspirational, such as the “tradwives” (short for traditional wives) and homesteading communities, as well as wellness and gym cultures.
At first glance, these communities appear to reflect lifestyle choices rooted in domesticity, sustainability, and self-improvement; however, their overlap with conservative online spaces and amplification by algorithms have made them entry points where far-right ideas can subtly penetrate and circulate.
The tradwife movement, for instance, promotes a return to traditional gender roles, modesty, and homemaking. Tradwife content is usually framed as a personal or religious choice, obscuring the fact that the tradwife movement is a far-right subculture fixated on white supremacy and patriarchy – two essential components of the modern far-right. Within online far-right discourses, the tradwife is celebrated as a symbol of resistance to feminism, liberalism, and modernity.
Popular American content creator Hannah Neeleman, known by her social media username Ballerina Farm, has built a massive following through highly aestheticised portrayals of domestic life on a farm, showcasing baking, child-raising, and rural living. While much of her content is lifestyle-oriented, her prominence in conservative-coded spaces and her features in publications like Evie Magazine, which is opposed to feminism, illustrate how such content can overlap with and indirectly amplify far-right cultural narratives.
Similarly, a 2023 Media Matters study demonstrated how TikTok accounts that initially engaged only with tradwife content quickly spiralled into feeds dominated by conspiracy theories, extremist figures, and medical misinformation, illustrating how algorithms collapse the boundaries between the benign and the extreme.
Homesteading – the pursuit of self-sufficiency and rural living – operates in a similar way. Sometimes seen as an extension of tradwife content, homesteading is saturated with narratives of “returning to tradition”, and both have been criticised for platforming conspiratorial and exclusionary views. Profiles like Gwen The Milkmaid host extreme right-wing and conspiracy views (e.g., anti-vaccine and anti-feminist views) alongside innocuous content on pasta-making and gardening.
The wellness industry has long blurred the lines between health and conspiracy. A 2022 study by researcher Stephanie Alice Baker examined several alternative health influencers during the COVID-19 pandemic and found that many blended wellness content with conspiracy theories, anti-science narratives, and anti-establishment views.
The masculine-coded “gym bro” culture runs parallel to this: algorithms and influencers push young people toward cultures of hypermasculinity, fatphobia, and misogyny. Internet figures like Andrew Tate have capitalised on this convergence, packaging self-improvement with conspiracy theories and regressive gender politics. The insecurities associated with looksmaxxing, such as obsessions with facial features, overlap with the manosphere’s and gym bro culture’s valorisation of dominance and sovereignty, creating another entry point for far-right narratives.
Conclusion
The examples discussed, from true crime to tradwives and looksmaxxing to wellness, show how the seemingly benign can bleed into extremist narratives. The digital nature of these communities makes them porous: social media algorithms have collapsed the boundaries between “benign” and extremist communities, and these communities and their narratives have transcended borders and sociopolitical contexts. The case of the self-radicalised Singaporean teenager illustrates the reach of such seemingly innocuous, algorithmically amplified online content.
Because of overlapping core tenets, these communities function as a network in which the audience of one community is exposed, and made more susceptible, to messages from a parallel community.
The rise of these communities signifies the diversification of extremist entry points beyond overt political socialisation. Lifestyle subcultures demonstrate that “banal” communities should be read as warning signs of potential pathways to radicalisation when their narratives overlap with far-right ideologies. Their rise also signals the difficulty of detecting radicalisation when it is embedded within everyday spaces.
The banal is now a pathway to radicalisation. Digitalisation has transformed the trite into a launch pad for more insidious harm.
The way forward should involve greater scrutiny of, and accountability from, platforms and their algorithms, as well as a deeper understanding of young people and their insecurities. These insecurities, intertwined with social media’s propensity to force its audience “into a spiral of depression, hopelessness, and self-harm”, leave young users vulnerable to radicalisation.
About the Authors
Yasmine Wong and Antara Chakraborthy are Associate Research Fellows at the Centre of Excellence for National Security (CENS) at S. Rajaratnam School of International Studies (RSIS), Nanyang Technological University (NTU), Singapore.