Employees in the Department of Government Efficiency reportedly used a flawed artificial intelligence model to determine the necessity of contracts in the Department of Veterans Affairs, resulting in hundreds of contracts, valued at millions of dollars, being canceled.
Given only 30 days to implement President Donald Trump’s executive order directing DOGE to review government contracts and grants to ensure they align with the president’s policies, an engineer in DOGE rushed to create an AI to assist in the task.
Engineer Sahil Lavingia wrote code that told the AI to cancel, or in his words “munch,” anything that wasn’t “directly supporting patient care” within the agency.
However, neither he nor the model had the knowledge required to make those decisions. “I’m sure mistakes were made,” he told ProPublica. “Mistakes are always made.”
One of the key problems was that the AI reviewed only the first 10,000 characters (roughly 2,500 words) of each contract when deciding whether it was “munchable” – Lavingia’s term for work that could be done by VA staffers rather than outsourced, ProPublica reported.
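The released GitHub code is not reproduced here, but the truncation problem is straightforward to picture. A minimal Python sketch of the pattern – with illustrative names and a hypothetical llm_client standing in for whatever model interface was actually used – might look like this:

```python
# Illustrative sketch only: the names and the llm_client interface are hypothetical,
# not taken from the code Lavingia released on GitHub.
MAX_CHARS = 10_000  # roughly 2,500 words; text beyond this point never reaches the model


def is_munchable(contract_text: str, llm_client) -> bool:
    """Ask the model whether a contract's work could be done in-house by VA staff."""
    snippet = contract_text[:MAX_CHARS]  # everything after 10,000 characters is silently dropped
    prompt = (
        "Decide whether the following VA contract is 'munchable', meaning the work "
        "could be performed by VA staff rather than an outside contractor. "
        "Answer MUNCHABLE or NOT MUNCHABLE.\n\n" + snippet
    )
    reply = llm_client.complete(prompt)  # hypothetical client call
    return "NOT MUNCHABLE" not in reply.upper()
```

With a cutoff like that, a long contract whose patient-facing clauses appear only after the first few pages would be judged entirely on its opening boilerplate.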
Experts who reviewed the code also told ProPublica that Lavingia did not clearly define many critical terms, such as “core medical/benefits,” and used vague instructions, leading to multiple essential contracts being flagged as “munchable.”

For example, the model was told to kill diversity, equity and inclusion (DEI) programs, but the prompt failed to define what DEI was, leaving the model to decide on its own.
At another point in the code, Lavingia asked the AI to “consider whether pricing appears reasonable” for maintenance contracts, without defining what “reasonable” means.
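The difference the experts are pointing to comes down to how the criteria are spelled out in the prompt. The two fragments below are hypothetical rewordings written to illustrate that contrast; neither string is quoted from the released code:

```python
# Hypothetical prompt fragments illustrating vague versus explicit criteria.
# Neither string is quoted from the code released on GitHub.

VAGUE_PROMPT = (
    "Cancel anything not directly supporting patient care. "
    "Kill DEI programs. "
    "Consider whether pricing appears reasonable for maintenance contracts."
)

EXPLICIT_PROMPT = (
    "Flag a contract for human review (never cancel automatically) only if all of "
    "the following hold:\n"
    "1. The statement of work contains no clinical, benefits-processing, or "
    "facility-safety tasks.\n"
    "2. 'DEI program' means a contract whose sole stated purpose is diversity, "
    "equity and inclusion training or reporting.\n"
    "3. For maintenance contracts, 'reasonable pricing' means within 20 percent of "
    "the median price of the comparable contracts listed in the benchmark table "
    "provided below."
)
```

Under the vague version, the model alone decides what counts as patient care, DEI, or a reasonable price – exactly the discretion the reviewing experts flagged.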
In addition, the AI was built on an older, general-purpose model not suited to such a complicated task, which caused it to hallucinate, or make up, contract amounts – sometimes treating contracts worth a few thousand dollars as if they were worth tens of millions.
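Hallucinated dollar figures are, in principle, the kind of error a simple cross-check can catch. A hypothetical guard – not something present in the released code – could verify that any amount the model reports actually appears somewhere in the contract text:

```python
import re


def amount_appears_in_text(reported_amount: float, contract_text: str) -> bool:
    """Crude hallucination check: does the dollar figure the model reported
    actually appear anywhere in the contract text?"""
    # Pull every dollar figure out of the contract, e.g. "$35,000" or "$35000.00".
    for raw in re.findall(r"\$\s*([\d,]+(?:\.\d{2})?)", contract_text):
        try:
            value = float(raw.replace(",", ""))
        except ValueError:
            continue
        if abs(value - reported_amount) < 0.01:
            return True
    return False
```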
Cary Coglianese, a professor at the University of Pennsylvania who studies governmental use of AI, told ProPublica that understanding which jobs could be done by a VA employee would require “sophisticated understanding of medical care, of institutional management, of availability of human resources” – all things the AI could not do.
Lavingia acknowledged the AI model was flawed, but he assured ProPublica that all “munchable” contracts were vetted by other people.

The VA initially announced, in February, it would cancel 875 contracts.
But various veterans’ advocates sounded the alarm, warning that some of those contracts related to safety inspections at VA medical facilities, direct communications with veterans about benefits, and the VA’s ability to recruit doctors.
One source familiar with the situation in the department told the Federal News Network that some cuts demonstrated a “communication breakdown” between DOGE advisors, VA leaders, and lawmakers who oversee the VA.
The VA soon walked that number back, announcing in March that it would instead cancel approximately 585 “non-mission-critical or duplicative contracts,” redirecting around $900 million back to the agency.
Lavingia, who was fired from DOGE approximately 55 days into his job after sharing some of his work with journalists, has described the work he did on his blog and released the code he used at the VA on GitHub.
The Independent has reached out to the White House for comment.