The Home Office’s use of Artificial Intelligence (AI) in processing asylum claims could be unlawful, legal experts have warned, opening up the possibility of court action against the government.
Caseworkers at the Home Office use AI to summarise transcripts of interviews with asylum seekers. It is also used to search policy guidance, such as information about whether a country is deemed safe to return to.
Asylum seekers are not told when AI is used on their interview testimony, leaving them in the dark about the technology’s impact on their claims.
The government’s own evaluation of the AI tool that summarises asylum interview transcripts found that nine per cent of summaries were so flawed they had to be removed.
Five per cent of the caseworkers who used AI to summarise policy documents said they were “not confident in tool accuracy”. An evaluation of the 2024 pilot suggested that AI interview summaries could save 23 minutes per case, and that using AI to search for information about a migrant’s country of origin could save 37 minutes per case.
A legal opinion, produced by lawyers at Cloisters Chambers and Doughty Street Chambers for the Open Rights Group and seen by The Independent, argues the Home Office’s use of AI is “likely to be unlawful” as it does not meet a number of legal obligations nor the standards set out in the government’s AI playbook.
These include being transparent with the public about how AI is being used, and making sure alternatives to AI are considered before using these tools.
AI tools were first trialled by Home Office caseworkers as part of a pilot scheme in 2024, before being rolled out more widely in 2025. In an announcement in April 2025, then-home secretary Yvette Cooper promised that AI would help officials make swift decisions on claims, “preventing asylum seekers from being stuck in limbo at the taxpayers’ expense”.
There is no public data on how many asylum claims are decided with the assistance of AI.
While the backlog for an initial asylum decision has been slashed under Labour, the number of people waiting for an asylum appeal has soared.
New tribunal statistics released last week show that more than 100,000 people were awaiting an appeal on their asylum decision at the end of December 2025. Some 36 per cent of determined appeals are successful, according to analysis of the data by charity Refugee Council, and when Home Office reconsiderations are included this jumps to a 66 per cent success rate.
Imran Hussain, director of external affairs at the Refugee Council, said the figures demonstrate “poor quality decision-making by the Home Office”.
In the opinion, lawyers argue that the government has not put safeguards in place to ensure “meaningful human control” of the AI tools and say that there has not been adequate consideration of the ways decisions are influenced by AI content.
They warn that the adoption of AI tools creates the risk that decision-makers will consider inaccurate information and overlook relevant facts when determining an asylum seeker’s claim.
They also argue that the government has failed to adhere to a number of ethical AI principles to which ministers have committed. These include the fair treatment of people with protected characteristics, such as sex, race or disability, and commitments to transparency.
The Independent has previously reported on warnings about the Home Office’s plan to use AI facial-recognition technology to assess the age of unaccompanied asylum-seeking children.
Robin Allen KC and Dee Masters of Cloisters Chambers, who helped produce the legal opinion, said: “Where AI tools are used without adequate safeguards, there is a real risk that unlawful or unfair decisions may result”. They called for “full transparency” on how AI is used.
Sara Alsherif, migrants’ rights programme manager at Open Rights Group, called for an “immediate ban on the use of these tools”, adding that “these tools are not the answer”. The group believes the legal opinion could open the way for legal challenges against the government from asylum seekers affected by AI use.
She continued: “Determining whether someone can or cannot seek refuge in the UK is one of the most serious and life-changing decisions the government can make. There must be the utmost transparency, fairness and accuracy.
“But asylum applicants are not even being informed that opaque AI tools are being used in the assessment of their case, nor being given the opportunity to correct errors that might be made.”
A Home Office spokesperson said: “AI will not decide asylum claims. It will strengthen the support we give to caseworkers, ensuring faster, high-quality decisions made by trained officials.”

