AI and industry-level safety: Recent Airbus issue
- Melodena Stephens
- Nov 30
- 3 min read

Cartoon Image: Gemini Nano Banana
Updated 3 December 2025
It took 65 years for the airline industry to gain mass acceptance – for 50 million people to adopt it. That time allowed the industry to develop standards and regulations for safety. Yet behind the scenes a key change has been AI adoption, whether we are looking at airports or airplanes.
Yesterday it was Airbus’s turn to recall about two-thirds of its A320 family of jets, flown by roughly 350 operators, due to a limitation of an automated system. There are over 11,000 planes in this family in service globally. The affected aircraft needed a software update to the ELAC 2 computer: intense solar radiation was causing bit flips – when a binary digit changes from 0 to 1 (or 1 to 0) – corrupting the software’s data, and in one case this caused a flight to plunge mid-air (luckily the pilots recovered the situation, unlike in the Boeing 737 MAX accidents involving the MCAS system). The ELAC (Elevator and Aileron Computer) manages key flight-control surfaces and ensures the aircraft stays within its prescribed flight limits. The way the ELAC is built makes it technically NOT an AI as defined by the EU AI Act, but it is part of the automated flight systems the plane relies on.
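To make the bit-flip idea concrete, here is a toy sketch (hypothetical Python, nothing to do with Airbus’s actual avionics code) showing how a single radiation-induced upset can silently change a stored flight parameter, and how a simple parity check can at least detect a single flipped bit:

```python
def flip_bit(value: int, position: int) -> int:
    """Simulate a single-event upset: toggle one bit of a stored integer."""
    return value ^ (1 << position)

def parity(value: int) -> int:
    """Even/odd count of 1-bits; stored alongside the value as a check."""
    return bin(value).count("1") % 2

altitude = 37000                      # a flight parameter (feet), stored as an integer
stored_parity = parity(altitude)      # recorded when the value was written

corrupted = flip_bit(altitude, 14)    # cosmic ray flips bit 14 (worth 16384)
print(corrupted)                      # 53384 -- a very different altitude

# A single bit flip always changes the 1-bit count by one, so parity catches it:
assert parity(corrupted) != stored_parity
```

Real avionics use far stronger protections (error-correcting memory, redundant computers, watchdogs), but the principle is the same: assume bits can flip, and design so a flip is detected before it drives the control surfaces.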
This has happened before. In 2008, Qantas Flight 72, an Airbus A330 flying from Singapore Changi International Airport to Perth, Australia, pitched over. In the first event, the aircraft was at 37,000 feet when the autopilot automatically disconnected following warnings from one of the three Air Data Inertial Reference Units (ADIRUs). Two minutes after the disconnect, the aircraft abruptly pitched down to an angle of 8.4°. The captain corrected with aft stick, but the fly-by-wire system did not react for two seconds. The problem repeated a few minutes later, and the crew correctly decided to land as soon as possible (read more here).
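The Qantas 72 case is a reminder of why aircraft carry three ADIRUs rather than one: redundant units can be cross-checked so a single faulty sensor is outvoted. A minimal sketch of that majority-voting idea (hypothetical and highly simplified – real systems compare readings within tolerances over time, not by exact equality):

```python
from collections import Counter

def majority_vote(readings):
    """Return the value agreed on by at least two of three redundant units,
    or None if all three disagree (no quorum -- flag a fault instead)."""
    value, count = Counter(readings).most_common(1)[0]
    return value if count >= 2 else None

# One unit feeding a wild angle-of-attack value is outvoted by the other two:
print(majority_vote([2.1, 2.1, 50.3]))   # 2.1

# If no two units agree, there is no trustworthy value to act on:
print(majority_vote([1.0, 2.0, 3.0]))    # None
```

The design lesson from QF72 was that voting and filtering logic itself can have edge cases – the algorithm that decides which unit to trust is software too, and it can fail in ways its designers did not anticipate.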
We forget that AI systems are a combination of hardware, software, data and human beings. Here, the software had a limitation: it could not cope with high levels of solar radiation. Corrupted data was sent to the flight systems, which triggered automated maneuvers. The aircraft that faced the issue had the newest ELAC software installed, and Airbus determined that reverting to the previous version eliminates the risk.
There is also a hardware limitation: Airbus has said that 5,100 A320 aircraft could be updated quickly, but about 900 older models will need a replacement computer. In some cases, the boxes needed to be shipped to Thales, which manufactured the ELAC system, for the work. Thales initially responded that while it manufactures the ELAC hardware, which complies fully with Airbus specifications and regulatory certifications, the software vulnerability was outside Thales's responsibility.
The software updates could take anywhere from 15 minutes to two hours. Kudos to the European Union Aviation Safety Agency (EASA) and Airbus for promptly addressing the issue – which was not the case for Boeing, despite two crashes!
The recall shows:
(1) how AI continues to test airline safety standards;
(2) why pilots still need to be able to override computer systems, and need the training to do so;
(3) the intense fragmentation of supplier systems that AI-enabled components bring;
(4) the limits of our knowledge when we realize AI needs to operate in varied contexts (even edge cases).
Fly-by-wire technology – replacing mechanical controls with electronic systems – began as early as the 1980s. We see the same effect with airports too: there have been major cases of buggy software releases gone wrong, in the UK in 2023 and in India this year.
Here is an interesting research article on AI systems in airplanes: https://www.researchgate.net/publication/232632092_Architecture_Optimization_Based_on_Incremental_Approach_for_Airplane_Digital_Distributed_Flight_Control_System

