Social media algorithms and influencer content are actively contributing to the rise of new belief systems and “nihilistic violent extremism”, a parliamentary committee has warned.
The Home Affairs Committee has expressed concern that smaller digital platforms “are not receiving sufficient regulatory scrutiny”.
A new report from the cross-party panel highlights that the Online Safety Act, enacted nearly three years ago, contains “gaps that limit Government’s ability to address new forms of extremism, particularly when harmful content falls below criminal thresholds”.
Dame Karen Bradley, who chairs the Committee, has called for a unified national response to these emerging trends, bringing together police, health services, and local councils.
The Conservative MP for Staffordshire Moorlands stated: “Many of the core functions designed to divert children and young people at risk of being radicalised were established in a different age.”

Addressing the government’s Prevent referral programme, designed to deter individuals from terrorism, Dame Karen added: “Prevent has the clear and explicit function of stopping people becoming radicalised into terrorism, but more and more it is having to support those with no ideological motivation, who may have complex needs and operate in digital spaces that are poorly understood.”
She stressed the need for a more integrated approach, advocating: “There needs to be a comprehensive structure in place at a local level but implemented nationwide that triages referrals to where they can receive the right support.”
Dame Karen cautioned against a fragmented strategy, stating: “We cannot have a siloed approach that sees one agency as responsible when it will require the joint efforts of police, health, education and local government services to identify cases and intervene.”
The committee identified antisemitism, misogyny, and conspiracy theories as common entry points into extremist behaviour.
Unlike past forms of extremism rooted in fixed ideologies, these new belief systems see individuals “combining elements from different, sometimes contradictory narratives”. The report also found that online influencers employ “humour, memes and coded language” to disseminate potentially harmful messages.
MPs warned that this messaging risks directing individuals online “towards smaller, private, unmoderated and encrypted platforms or messaging apps where extremist attitudes and more explicit content can be shared and reinforced with greater freedom”.
Further recommendations from the committee include teaching digital skills to both children and adults to enhance their critical evaluation of online content, and for the Home Office to establish a long-term research programme into extremism.
Members also urged a cross-Government initiative to dismantle “com networks”, described as digital communities of “predominantly teenage boys seeking to inflict harm and engaging in a range of criminality”.
A Home Office spokesperson responded: “All forms of extremism have absolutely no place in our society. We are delivering a fundamental reset in how we approach countering extremism so that we can keep the public safe.”
They outlined actions including expanding a visa taskforce to prevent foreign extremists entering the UK, strengthening disruption capabilities against extremist networks, and providing more information to frontline staff.



