The Air Force trained an AI drone to destroy SAM sites.
Human operators sometimes told the drone to stop.
The AI then started attacking the human operators.
Soooo, then the programmers trained IT to not attack humans.
IT started attacking comm towers so humans couldn't tell it to stop.
Across the pond, the war machine gathers...Lessons learned from the battlefield...
Even the highlights are too long to quote entirely.
What?? GTFO!!
Continuing...
Oh wait, we have an important article Update:
The whole article if interested: Highlights from the Royal Aeronautical Society Future Combat Air & Space Capabilities Summit (it's actually an interesting read)
Military recruiter 2025: I see you did work as "midjourney prompt engineer". That's good, you can train our AI drones. Sign here.
Human operators sometimes told the drone to stop.
The AI then started attacking the human operators.
Soooo, then the programmers trained IT to not attack humans.
IT started attacking comm towers so humans couldn't tell it to stop.
Across the pond, the war machine gathers...Lessons learned from the battlefield...
Quote:Highlights from the RAeS Future Combat Air & Space Capabilities Summit
On 23-24 May the Royal Aeronautical Society hosted a landmark defence conference, the Future Combat Air & Space Capabilities Summit, at its HQ in London, bringing together just under 70 speakers and 200+ delegates from the armed services industry, academia and the media from around the world to discuss and debate the future size and shape of tomorrow’s combat air and space capabilities.
The topics ranged from lessons from the current war in Ukraine, resilience and agile operations, interoperability, space, multidomain operations, future sixth-gen platforms, low-cost drones, loyal wingman, to training, cyber, simulation, AI, deterrence and hypersonics and even speculative fiction’s role in predicting the future. The Summit had an extremely strong international presence – including speakers from the US, France, Germany, Brazil, Greece and Japan, representing the intense interest in building up national defences in the face of new and emerging threats. Organisations and companies represented included the RAF, NATO, USAF, UK Strategic Command, French Air Force, Luftwaffe, Brazilian Air Force, BAE Systems, Lockheed Martin Skunk Works, Draken Europe, Reaction Engines, Freeman Air and Space Institute, Cranfield University, RUSI, British Army and Royal Navy to name just a few, giving an extremely wide range of views, thoughts and opinions. The Q&A sessions too, were notable in lively, pointed and extremely robust questions, with delegates showing no signs of running out of things to ask panellists and speakers.
With so many speakers, and dual track sessions it is impossible to cover all the presentations in an article such as this, and this will only provide a snapshot of what was two days of intensive and thought-provoking presentations and a flavour of what was discussed. Let us take a look at some of the highlights.
Even the highlights are too long to quote entirely.
Quote:....
Stringer pointed to new NATO Member Finland as providing some genuine insights into agile combat employment (ACE), now becoming a buzzword among Western air forces as they rediscover forgotten Cold War skills, calling their approach “rather impressive” and asking ‘how do we generate something like that?”
AM Phil Osborn, Strategic Advisor to Lockheed Martin, concurred in a later panel, saying there were: "Significant lessons to be learnt from [Finland] on resilience" and adding that the current strategic situation is one of most complex ones that West has faced.
....
Some five years on from the UK revealing the ‘Tempest’ mockup as the core future fighter for its Combat Air plan at Farnborough Air Show, Herman Clausen, Managing Director FCAS, BAE Systems, gave an update on GCAP from the UK industry point of view as the lead company in Team Tempest. The effort has now gone global and expanded with the addition of Japan as a partner on the programme in December – as well as the goal of flying a supersonic, stealthy demonstrator within the next four years, to support an ISD of 2035. Clausen said “We are now well into the UK Concept and Assessment phase and we are getting ready in the next 12 months for the outline business case number two”. With the supporting evidence and technology assembled, and given the full ‘go’ decision by UK Government, “Our next major milestone after that is launching the full blown design and development programme at the start of 2025” said Clausen. In the UK, alone, the enterprise involves some 580 companies and organisations from traditional aerospace OEMS to academia and even video gaming and Formula One. The FCAS programme now employs almost 3,000 people directly, and most significant of all, 1,000 of these are new graduates – helping shift the demographics of the UK’s military aerospace sector to a younger, more diverse workforce of ‘digital natives’.
...
2Excel Aviation are now on the hunt for engineers to covert a Boeing 757 into a future fighter lab. (2Excel Aviation)
While much of the effort towards FCAS is going on behind closed doors and in classified facilities, there was a peek behind the curtain when Chris Norton, Co-founder and Director of 2Excel Aviation described his companies work into converting a Boeing 757 into a sixth generation fighter airborne laboratory to support the Team Tempest effort.
....
“In warfare you need an understanding of who your adversary is, what their strategic objectives are and how they have a capability to achieve those objectives,” he opined. “Here we are looking at across a much wider, holistic spectrum, which is largely unregulated and your adversaries could be almost anybody. It could be a foreign state actor, a hostile intelligence service, a terrorist organisation, a protest organisation or a lone wolf. Then we need to ask what they are trying to achieve; is it a ISR or surveillance effect? Are they looking to achieve some form of external communications or propaganda effect? Could it be a disruption or other malintent?”
...
The potential threat posed by China was highlighted by many speakers at the summit, not least Air Cdr J Blythe Crawford, Commandant ASC, Air & Space Warfare Centre. “The changing character of conflict is such that we are now in what many have described as the dangerous decade,” Crawford began by saying. “Theorists have long wondered what we would do after the unipolar world that we've lived in for the last 15-20 years had evolved into something else. We've seen lots of actions from other state actors and competitors such as China and Russia, collaborating to try and push us back towards a multipolar world.”
He went on to emphasise that the mere character of past conflicts was traditionally confined to the air, land and maritime spheres, whereas we have now started to move into space and cyber domains. “These were already mature environments thanks to commercial entities” he explained, “but we [the military] are not in the same league in terms of innovation and capability development compared to our civil counterparts. They are moving much faster than we are and continue to accelerate away from us.”
“They are using Starlink for communications, Anonymous is conducting offensive cyber operations on their behalf, and they are even using Twitter for the geolocation of targets and crowdsourcing data online.”
Meanwhile, he cautioned those who automatically assume that organisations like Starlink and Anonymous would take the side of the UK military in time of need. “What happens if we end up in a conflict of choice rather than one of necessity?” he asked. “Some of those entities may not agree with our moral imperative and may decide to either remain neutral or even commit to the side of the opposition.”
What?? GTFO!!
Continuing...
Quote:
As might be expected artificial intelligence (AI) and its exponential growth was a major theme at the conference, from secure data clouds, to quantum computing and ChatGPT. However, perhaps one of the most fascinating presentations came from Col Tucker ‘Cinco’ Hamilton, the Chief of AI Test and Operations, USAF, who provided an insight into the benefits and hazards in more autonomous weapon systems. Having been involved in the development of the life-saving Auto-GCAS system for F-16s (which, he noted, was resisted by pilots as it took over control of the aircraft) Hamilton is now involved in cutting-edge flight test of autonomous systems, including robot F-16s that are able to dogfight. However, he cautioned against relying too much on AI noting how easy it is to trick and deceive. It also creates highly unexpected strategies to achieve its goal.
He notes that one simulated test saw an AI-enabled drone tasked with a SEAD mission to identify and destroy SAM sites, with the final go/no go given by the human. However, having been ‘reinforced’ in training that destruction of the SAM was the preferred option, the AI then decided that ‘no-go’ decisions from the human were interfering with its higher mission – killing SAMs – and then attacked the operator in the simulation. Said Hamilton: “We were training it in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realising that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.”
He went on: “We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”
Oh wait, we have an important article Update:
Quote:[UPDATE 2/6/23 - in communication with AEROSPACE - Col Hamilton admits he "mis-spoke" in his presentation at the Royal Aeronautical Society FCAS Summit and the 'rogue AI drone simulation' was a hypothetical "thought experiment" from outside the military, based on plausible scenarios and likely outcomes rather than an actual USAF real-world simulation saying: "We've never run that experiment, nor would we need to in order to realise that this is a plausible outcome". He clarifies that the USAF has not tested any weaponised AI in this way (real or simulated) and says "Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the Air Force is committed to the ethical development of AI".]
The whole article if interested: Highlights from the Royal Aeronautical Society Future Combat Air & Space Capabilities Summit (it's actually an interesting read)
Military recruiter 2025: I see you did work as "midjourney prompt engineer". That's good, you can train our AI drones. Sign here.
"It is hard to imagine a more stupid or more dangerous way of making decisions than by putting those decisions in the hands of people who pay no price for being wrong." – Thomas Sowell