Quote:Space Force Pauses All Use of Generative AI
Space Bots
The brave Guardians of Space Force, the American military unit created to protect us feeble terrestrial Americans from various space-related threats, have announced a new and distinctly Earthly foe: generative AI.
As Bloomberg reports, Space Force leaders have forbidden their Guardians — the unfortunate name given to the military branch's members — from using generative AI tools like ChatGPT on government devices, arguing that the web-based AI tools present security risks, among other concerns.
In an internal memo obtained by Bloomberg, Space Force chief technology and innovation officer Lisa Costa reportedly told Guardians that while generative AI "will undoubtedly revolutionize our workforce and enhance Guardian's ability to operate at speed," the tech must be integrated responsibly. Which, at least for now, seems to mean abstaining from using generative AI tools at all, with Costa citing "concerns over cybersecurity, data handling, and procurement requirement" in the September 29 memo, according to Bloomberg.
Angry Guy
Costa's reservations, particularly her concerns regarding sensitive US military data, are fair. When we use products like ChatGPT, our interactions are swallowed up by the ever-data-hungry models, stored for training purposes; thus, if a Space Force Guardian were to, say, ask ChatGPT to drum up a report embedded with sensitive government information, that data could get vacuumed into OpenAI's systems and out of the department's control.
Our glorious Space Force also isn't the only prominent entity to ban generative AI. Companies including Apple and Verizon have prohibited the tech from corporate systems over similar data and privacy fears, and at least one major company, Samsung, has already experienced an embarrassing, chatbot-assisted data leak.
But according to the Bloomberg report, some folks are apparently pretty peeved by Space Force's decision — namely, a guy named Nicolas Chaillan, the former chief software officer in the Defense Department and current founder and CEO of a chatbot company called Ask Sage.
Claiming that his bot meets Space Force's security requirements, Chaillan told Bloomberg that the Guardians' decision is "short-sighted," adding that his tool has around 10,000 customers in the Defense Department alone. (Per Bloomberg, in a September email to Costa and other defense officials, Chaillan declared that not using his tech would "put us years behind China," though it's unclear how abstaining from using one dude's chatbot might have such an effect.)
It's weird to imagine Defense Department employees using chatbots to write their reports. We never thought we'd say this, but: hats off to the Guardians. Responsible, thoughtful adoption of AI, especially when it comes to our government institutions, is and will continue to be incredibly important — and right now, it feels like they're doing things right.
Quote:ChatGPT Sparks U.S. Debate Over Military Use of AI
The Defense Department currently is modernizing its nuclear command, control, and communications systems, including through the widespread integration of advanced AI systems. Some analysts fear that this process will dilute human control over nuclear launch decision-making. (See ACT, April 2020.)
To ensure that machines never replace humans in this momentous role, a bipartisan group of legislators introduced the Block Nuclear Launch by Autonomous Artificial Intelligence Act on April 26. If enacted, the law would prohibit the use of federal funds to “use an autonomous weapons system that is not subject to meaningful human control…to launch a nuclear weapon; or…to select or engage targets for the purposes of launching a nuclear weapon.”
Quote:Hangar 18 as a Department of the Air Force software factory
Matthew Jacobsen, director of Hangar 18, cited Roper’s articles “There is no spoon” and “Bending the spoon” as significant in the formation of the software factories and Hangar 18’s charter.
Chaillan abruptly left his role as Chief Software Officer Sept. 2, 2021, citing disappointment with Air Force software development.
“I think that Mr. Chaillan saw a lot of potential in the cloud, DevSecOps and Agile space, and, ultimately, we see his departure as, what will hopefully be, a forcing function for Air Force leadership and DOD leadership to change how they mean to address this problem.”
LOL. What are they saying here with this name and logo?
From "There is no spoon" link:
Quote:The Air Force’s chief software officer Nic Chaillan announced his departure in a blistering online post Thursday that criticized senior leaders for not taking IT modernization seriously and hamstringing senior IT leaders.
Uri could not be reached for comment as he is busy with IDF psyops on the current thing.
"It is hard to imagine a more stupid or more dangerous way of making decisions than by putting those decisions in the hands of people who pay no price for being wrong." – Thomas Sowell