The Air Force Research Laboratory (AFRL), whose tagline is “Win the Fight”, has paid more than a hundred thousand dollars to a company that is providing generative AI services to other parts of the Department of Defense. But the AFRL refused to say what exactly the point of the research was, and provided page after page of entirely blacked out, redacted documents in response to a Freedom of Information Act (FOIA) request from 404 Media related to the contract.
The news shows that while AI continues to proliferate across essentially every industry and, increasingly, government departments, some parts of the military can be tight-lipped about their intentions around generative AI, even when the models used are sometimes the same ones everyone else has access to or are open source, and when the work is unclassified. 404 Media previously reported that the Air Force tested a surveillance-focused AI chatbot.
“Ask Sage: Generative AI Acquisition Accelerator,” a December 2023 procurement record reads, with no additional information on the intended use case. The Air Force paid $109,490 to Ask Sage, the record says.
Ask Sage is a company focused on providing generative AI to the government. In September the company announced that the Army was implementing Ask Sage’s tools. In October it achieved “IL5” authorization, a Department of Defense designation indicating a system meets the standards required to protect certain unclassified information.
404 Media made an account on the Ask Sage website. After logging in, the site presents a list of the models available through Ask Sage, which includes essentially every major model from well-known AI companies as well as open source ones. OpenAI’s GPT-4o and DALL-E 3, Anthropic’s Claude 3.5, and Google’s Gemini are all included.
The company also recently added the Chinese-developed DeepSeek R1, but includes a disclaimer. “WARNING. DO NOT USE THIS MODEL WITH SENSITIVE DATA. THIS MODEL IS BIASED, WITH TIES TO THE CCP [Chinese Communist Party],” it reads. Ask Sage is a way for government employees to access and use AI models in a more secure way. But only some of the models in the tool are listed by Ask Sage as being “compliant” with or “capable” of handling sensitive data.
In an associated Ask Sage Discord, apparent customers ask the company for support or make other comments. “Thanks for all the hard work and great enhancements that make our work lives so much easier,” one message posted this month reads. The username matches that of someone who lists their job as “AI Implementation, Information Warfare—Air Combat Command,” on LinkedIn.
But the Air Force declined to provide any real specifics on what it paid Ask Sage for. 404 Media requested all procurement records related to the Ask Sage contract. Instead, the Air Force provided a 19-page presentation that seemingly would have explained the purpose of the research, with 18 of the pages redacted. The only available page said “Ask Sage, Inc. will explore the utilization of Ask Sage by acquisition Airmen with the DAF for Innovative Defense-Related Dual Purpose Technologies relating to the mission of exploring LLMs for DAF use while exploring anticipated benefits, clearly define needed solution adaptations, and define clear milestones and acceptance criteria for Phase II efforts.”
Nicolas Chaillan, founder of Ask Sage and former chief software officer for the Air Force and Space Force, told 404 Media in an email that “This was a research contract for feasibility. This did not include any license of the product or any use of the product.” He added that the only deliverable was a report and the work was not classified.
The AFRL did not respond to a request for comment.