WASHINGTON — An artificial intelligence algorithm has defeated a human F-16 fighter pilot in a virtual dogfight simulation.
The Aug. 20 event was the finale of the Pentagon research agency’s AI air combat competition. The algorithm, developed by Heron Systems, easily defeated the fighter pilot in all five rounds that capped off a yearlong competition hosted by the Defense Advanced Research Projects Agency.
The competition, called the AlphaDogfight Trials, was part of DARPA’s Air Combat Evolution program, which is exploring automation in air-to-air combat and looking to improve human trust in AI systems.
“It’s easy to go down the wrong path of thinking that that is either A) definitive in some way as to what the future of [basic fighter maneuvers will be]; or B) that it is a bad outcome,” said Justin Mock of DARPA, a fighter pilot and commentator for the trials. “From a human perspective, from the fighter pilot world, we talked about we trust what works. And what we saw was that in this limited area, in this specific scenario, we’ve got AI that works.”
The human pilot, known to the audience only by his call sign “Banger” for operational security reasons, is a graduate of the Air Force’s weapons instructor course, a highly selective program reserved for top fighter pilots. While the victory for the AI system is a big step forward for the young DARPA program, the work is far from over.
The conditions in the simulation weren’t realistic for aerial combat. To start, the artificial intelligence system had perfect information about the engagement, which experts commentating on the event noted never happens in the field. The human pilot was also flying with a simulated stick from a virtual seat rather than an actual aircraft.
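For readers unfamiliar with the term, “perfect information” here means the algorithm saw the exact state of the opposing aircraft at every moment, rather than the noisy, range-limited picture a pilot assembles from sensors and eyesight. The toy Python sketch below is a minimal illustration of that difference; it is not DARPA’s or Heron Systems’ code, and every name and number in it is assumed for the example.

```python
# Toy illustration only -- not DARPA's or Heron Systems' code.
# Contrasts the "perfect information" an agent gets in simulation with a
# crude stand-in for the partial, noisy picture a real pilot would have.
import math
import random
from dataclasses import dataclass
from typing import Optional


@dataclass
class AircraftState:
    x: float        # position, meters
    y: float        # position, meters
    heading: float  # radians
    speed: float    # meters per second


def perfect_observation(opponent: AircraftState) -> AircraftState:
    """Simulation view: the opponent's exact state, every frame."""
    return opponent


def sensor_observation(own: AircraftState, opponent: AircraftState,
                       max_range: float = 15_000.0,
                       noise_std: float = 50.0) -> Optional[AircraftState]:
    """Illustrative sensor view: range-limited and noisy.

    The range and noise values are arbitrary; real sensor models are
    far more complex than this.
    """
    distance = math.hypot(opponent.x - own.x, opponent.y - own.y)
    if distance > max_range:
        return None  # opponent not detected at all
    return AircraftState(
        x=opponent.x + random.gauss(0.0, noise_std),
        y=opponent.y + random.gauss(0.0, noise_std),
        heading=opponent.heading + random.gauss(0.0, 0.02),
        speed=opponent.speed + random.gauss(0.0, 5.0),
    )


if __name__ == "__main__":
    own = AircraftState(0.0, 0.0, 0.0, 250.0)
    bandit = AircraftState(8_000.0, 3_000.0, math.pi, 240.0)
    print("perfect:", perfect_observation(bandit))
    print("sensed: ", sensor_observation(own, bandit))
```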
“There are a lot of caveats and disclaimers to add in here,” Col. Dan Javorsek, program manager in DARPA’s Strategic Technology Office, said in a post-event livestream.
Heron’s AI system became known throughout the competition for its aggressiveness and the accuracy of its gun shots. Mock noted before the human-AI matchup that the AI system would “take shots that we would never take in our training environments.” Mock also said Heron often made a basic fighter maneuvering error by turning away from the enemy aircraft toward where it predicted the opponent would go, but was able to recover throughout the fights because of its “superior aiming ability” and because the competing aircraft took the bait.
Heron was one of eight AI teams selected by DARPA to take part in the final round of the agency’s competition. Heron topped the likes of Lockheed Martin, Perspecta Labs, Aurora Flight Sciences, EpiSys Science, Georgia Tech Research Institute, PhysicsAI and SoarTech.
DARPA’s leadership on the project acknowledged that the results of the simulated dogfight are just the first step in a long journey to fielding AI that can fight in air combat.
“Artificial intelligence shows a lot of promise. It’s kind of been bang or bust in the past,” Javorsek said. “In the larger ACE program, our plan is to take the modeling and simulation work that we’re doing here and translate it from that digital environment into the real world. And it turns out that’s a pretty important jump to make.”