Aviation is constantly looking at ways to make flying safer and prevent accidents. The interaction between the humans in the flight deck and the increasingly automated jets they fly is a major topic of conversation in the industry, with the Royal Aeronautical Society arguing earlier this year for pilot training to be reset to meet the challenges of automation and skill fade.
[Airline pilot training: Time to revisit the basics? Royal Aeronautical Society]
The 2009 crash of an Air France Airbus A330 remains the industry's cautionary tale about a disconnect between pilots and automation, and the lessons of flight 447 are taught in flight schools across the world.
The makers of eye-tracking technology believe their systems have a role to play, both in simulators and on the line.
“Gaze scanning has gone from being an exciting idea, an area of research, to being installed in new car models all over the world and taking its place as one of the most important technologies for road safety,” Soli Shahmehr, vice president of research instruments at Swedish eye tracking company Smart Eye, tells us. “Right now, a lot is pointing to a similar development in the aviation industry.”
The most mature market for eye tracking technology is currently road hauliers, where various products can spot if the driver is distracted or in a microsleep and trigger a warning. Gaze tracking systems are also being installed in many new cars, to complement the increasing amount of automation in new models.
“There’s so much money invested in automation these days,” Patrick Nolan, head of the aviation business at Australian computer vision company Seeing Machines, tells us. “What our technology around head, eye and face tracking is about is spending money on technology but focused on the human in the loop, understanding how that human is interacting with the automation. Are they alert, are they aware, do they understand what is happening?”
In a flight simulator, eye tracking technology — like that from Seeing Machines or Smart Eye — provides a real-time overlay on an instructor’s screen of where the pilots are looking on the Primary Flight Display or the Navigation Display. The resulting information can also be reviewed in debriefs after the simulator session is over.
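Neither vendor's software is described in detail here, but the overlay-and-debrief idea can be sketched in a few lines: classify each gaze sample into a named area of interest on the panel and summarise how attention was divided. The display regions and coordinates below are entirely hypothetical, purely for illustration.

```python
# Illustrative sketch only: the area-of-interest rectangles and coordinate
# system below are invented, not taken from any vendor's product.
from collections import Counter

# Hypothetical areas of interest: name -> (x_min, y_min, x_max, y_max)
AREAS_OF_INTEREST = {
    "PFD": (0, 0, 400, 300),    # Primary Flight Display
    "ND": (400, 0, 800, 300),   # Navigation Display
}

def classify_gaze(samples):
    """Count how many (x, y) gaze samples land in each area of interest."""
    counts = Counter()
    for x, y in samples:
        for name, (x0, y0, x1, y1) in AREAS_OF_INTEREST.items():
            if x0 <= x < x1 and y0 <= y < y1:
                counts[name] += 1
                break
        else:
            counts["off-display"] += 1
    return counts

def dwell_share(samples):
    """Fraction of samples on each area -- the kind of attention summary
    an instructor might review in a post-session debrief."""
    counts = classify_gaze(samples)
    total = sum(counts.values())
    return {name: n / total for name, n in counts.items()}
```

For example, `dwell_share([(100, 100), (500, 100), (120, 90), (900, 400)])` reports half the samples on the PFD, a quarter on the ND and a quarter off-display; a real system would work from calibrated 3D gaze vectors rather than flat screen coordinates.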
A study carried out by researchers at ETH Zurich, in collaboration with Swiss, NASA and Smart Eye, found that the technology could help instructors analyse flying performance more accurately, by recognising weaknesses in scanning and helping them to assess the causes of pilot error. ETH Zurich researchers are now working on an eye gaze and gesture tracking project, PEGGASUS, which involves Swiss and Thales and has funding from the European Commission.
Emirates and Qantas have been using Seeing Machines' technology, which mounts a camera in the glare shield, in some of their simulators. Qantas has used it to help pilots adjust to the Head-Up Display (HUD) on their Boeing 787 flight simulators. Seeing Machines' Nolan says a particularly interesting simulator scenario is an engine failure just prior to V1, the speed at which the stop-or-go decision is made during takeoff.
“It’s fascinating to watch the decision making,” Nolan says. “Instructors have told us that they have never had that level of granularity on where the pilots were looking when decisions were being made.”
Another potential opportunity is supporting crash or incident investigations: recreating the relevant scenarios in a simulator and using the gaze-tracking outputs to better understand how the pilots were making decisions.
The technology has, of course, met some pushback over privacy concerns. Seeing Machines' Nolan says the company is in touch with aviation regulators and unions. Both Nolan and Shahmehr stress that their cameras record no video footage or still images, and that once this is explained the technology is generally well received.
Captain Tanja Harter, technical affairs board director at the European Cockpit Association, who has looked at studies of eye tracking using individually calibrated goggles, cautions that the accuracy of glare-shield-mounted systems, and the intended uses of the technology, could be pitfalls. For her, she explains, the value of the studies lay in the larger context: taking the statistics and adjusting training needs as a result.
“It becomes a lot more tricky if the targeted use is for the evaluation of individual pilots (one reason being the lack of accuracy) and trying to assess a specific performance or draw conclusions about e.g. situational awareness. In general people use a lot more than just eye movement to ‘get a picture’, so it’s a combination. This is not to say such devices are useless…they might have a value in an overall concept, but it’s way too early for me to really judge on that,” she says.
Seeing Machines and Smart Eye both say their systems are highly accurate. Nolan argues the glare shield is actually the ideal spot for their camera, while Shahmehr says the Smart Eye system derives its accuracy from using multiple cameras.
For both Seeing Machines and Smart Eye, the goal is to get their technology into real-life flight decks.
“The real aircraft solution is even more contentious,” Nolan admits. “There’s certainly still some hoops to get through if we want to get this technology into aircraft and that’s where we’re going. We make no apologies for that.”
For Smart Eye’s Shahmehr, eye tracking systems in the cockpit could prove important for the safety of passengers, crew and aircraft.
“By tracking a pilot’s gaze patterns, pupillometry and eyelids, the system could first of all determine if there is a pilot in the cockpit who is alive, awake and alert enough to fly the aircraft.”
The technology could go even further, she explains. “An advanced system could even predict which decisions the pilot is about to make and adapt to the crew’s state of mind and expected course of action. Should a pilot fail to pay attention to an important flight instrument, a gaze tracker could not only recognize this, but direct the pilot’s attention by highlighting important areas on the instrument board. Eventually, the aircraft could be able to adjust its automation settings according to the pilot’s attentional state and upcoming decision-making.”
The alertness or fatigue issue is one Nolan raises too, especially as the industry moves inexorably closer to single-pilot operations.
“One of the major US carriers that we work with is a leader in fatigue risk management and they see this technology as key,” he explains, referring back to how in trucks the aim is to sense and eventually predict microsleeps. “In aircraft, they are interested in those signals that could capture microsleep and incapacitation events and ultimately predict the early onset of these conditions. If the industry chooses to proceed towards reduced crew or single pilot operations, all the OEMs, carriers and operators will want to know a lot more about the state of the pilot, before they get to being incapacitated.”
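One widely used drowsiness signal in driver and pilot monitoring research is PERCLOS, the proportion of time within a window that the eyes are mostly closed. The sketch below illustrates the general idea only; it is not Seeing Machines' or Smart Eye's actual algorithm, and the thresholds are arbitrary placeholders.

```python
# Illustrative sketch of PERCLOS, a standard drowsiness metric from
# fatigue-monitoring research. Thresholds here are arbitrary examples,
# not values from any vendor's system.

def perclos(eyelid_openness, closed_threshold=0.2):
    """Fraction of samples where eyelid openness (0.0 = fully shut,
    1.0 = fully open) falls below the 'mostly closed' threshold."""
    if not eyelid_openness:
        raise ValueError("need at least one sample")
    closed = sum(1 for o in eyelid_openness if o < closed_threshold)
    return closed / len(eyelid_openness)

def drowsiness_alert(eyelid_openness, perclos_limit=0.15):
    """Raise a flag when PERCLOS over the window exceeds a limit --
    the kind of signal a fatigue-monitoring system could act on."""
    return perclos(eyelid_openness) > perclos_limit
```

A production system would fuse this with head pose, gaze stability and blink dynamics rather than rely on eyelid closure alone, but the window-and-threshold structure is the common core.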
The technology has a long way to go before it reaches flight decks at 37,000 feet. But in the simulator, eye tracking looks like a genuinely useful training tool as the industry tries to build the best possible relationship between humans and automation.
Author: Victoria Bryan
Published: 7th September 2021
Photo credit: Jan Huber on Unsplash