Federal Judge Rules DOGE Used ChatGPT to Illegally Cancel $100M in Grants
A federal judge has ruled that the Department of Government Efficiency's cancellation of over $100 million in National Endowment for the Humanities (NEH) grants was unconstitutional, citing DOGE's use of ChatGPT to identify and eliminate funding related to diversity, equity, and inclusion (DEI). In a 143-page decision, US District Judge Colleen McMahon found that DOGE's AI-driven process violated constitutional protections by using "the mere presence of particular, protected characteristics to disqualify grants from continued funding."
The ruling stems from a 2025 lawsuit filed by humanities groups challenging DOGE's methodology for terminating grants. According to the court, DOGE deployed ChatGPT to scan grant materials and flag those containing DEI-related content, then used those AI-generated determinations as the basis for mass cancellations. Judge McMahon's decision makes clear that this approach crossed constitutional lines: it treated protected characteristics as automatic disqualifiers, without meaningful human review or individualized assessment of each grant's actual content and purpose.
The case raises significant questions about the use of AI tools in government decision-making, particularly when those decisions affect constitutionally protected categories. While AI systems like ChatGPT can rapidly process large volumes of text, the ruling suggests that relying on such tools to make determinations about protected characteristics—without robust human oversight—can lead to constitutional violations. The decision may prompt broader scrutiny of how federal agencies integrate AI into policy implementation, especially when automated systems are used to enforce ideological criteria on federally funded programs.