A federal judge has overturned the Department of Government Efficiency's cancellation of more than $100 million in grants. Judge Colleen McMahon determined that using ChatGPT to identify and eliminate funds related to diversity, equity, and inclusion (DEI) lacked a legal basis. The case exposes the risks of delegating administrative decisions to artificial intelligence without human oversight.
The algorithm that decided without understanding context 🧠
Justin Fox, a former DOGE employee, admitted to using ChatGPT to analyze descriptions of grants from the National Endowment for the Humanities. His methodology was simple: ask the chatbot whether each project related to DEI, expecting a Yes or No plus a brief explanation. Fox neither defined the term DEI for the AI nor verified how it interpreted complex concepts like inclusion or equity. The result was the arbitrary cancellation of cultural and educational projects.
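To see why that workflow is so fragile, here is a minimal, hypothetical sketch of what such a pipeline looks like. The prompt wording, the grant description, and the model reply are all invented for illustration; only the overall pattern (an undefined "DEI" question and a brittle Yes/No check) comes from the article.

```python
# Hypothetical reconstruction of the naive workflow described in the article:
# one grant description, one yes/no question, and no definition of "DEI".

def build_prompt(grant_description: str) -> str:
    """Build the kind of underspecified prompt the article describes.
    Note that the model is never told what 'DEI' means."""
    return (
        "Does the following grant relate to DEI? "
        "Answer Yes or No, with a brief explanation.\n\n"
        f"Grant description: {grant_description}"
    )

def parse_verdict(reply: str) -> bool:
    """Brittle parsing: any reply starting with 'yes' flags the grant
    for cancellation, no matter what nuance the explanation contains."""
    return reply.strip().lower().startswith("yes")

# Invented example: a plausible grant and a plausible model reply.
prompt = build_prompt("Oral-history archive of rural Appalachian communities")
reply = "Yes - the project emphasizes including underrepresented voices."
print(parse_verdict(reply))  # the grant is flagged; the explanation is ignored
```

The failure mode is visible in `parse_verdict`: the "brief explanation" the chatbot was asked for is discarded entirely, so an ambiguous or hedged answer counts the same as a confident one.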
When the boss is a poorly written prompt 🤖
The most striking part of the case is that no one bothered to explain to ChatGPT what DEI meant. Fox admitted he did not know how the AI interpreted the term. In effect, the fate of millions of dollars was decided by asking a chatbot without context, like asking a waiter to choose the wine without knowing whether dinner is fish or meat. The result: legal chaos and arbitrarily cancelled grants.