Airlines can accrue bottom-line benefits from the use of cockpit visual guidance systems which enable increased operations in adverse weather, as Bernie Baldwin reports.
“The Party told you to reject the evidence of your eyes and ears.” This quote from George Orwell’s 1984 might not readily spring to mind when the topic of aircraft cockpit avionics arises. The latest visual guidance systems, however, could certainly be said to ask pilots to follow that maxim, at least when it comes to their eyes.
Whether called a head-up display (HUD), a head-up guidance system (HGS), a head-mounted display (HMD) or even a head-worn display (HWD), these systems all bring major benefits to pilots, their airlines and, ultimately, to passengers. The cockpit, though, is the starting point from which those benefits accrue.
Grant Blythe, senior manager, product marketing at Collins Aerospace, explains that the direct role of such a system is to improve the interface between an aircraft and the pilot operating it. “While pilots are highly skilled at flying with instruments, head-up augmented reality displays present information to a pilot in a way that allows them to view their instrument data and the outside world simultaneously without having to choose between the two,” he begins. “With this higher fidelity flight information, pilots can operate aircraft both more safely and with expanded operational capabilities.
“These systems are usually made up of three major components – a transparent display, a software application with vision algorithms, and various integrated cameras and vision sources,” Blythe continues. “The transparent display may take the form of a fixed HUD or a head- or helmet-mounted display and can utilise a number of different projection or waveguide optics technologies. These displays are focused at infinity to eliminate the need for the pilot’s eyes to refocus rapidly between the distant environment and close-in head-down displays.
“Software applications are the heart of the content displayed on HUDs. These applications, like primary flight displays, consume data from all available aircraft sensors and generate the symbology that is displayed on the HUD. That symbology is specially formatted to align precisely with the outside world and is often focused on the flightpath and energy management of the aircraft, two areas in which HUDs excel,” Blythe remarks. “In the case of HGSs, sophisticated guidance control algorithms are added to the core HUD software to enable manually flown CAT III instrument approaches and low-visibility takeoff (LVTO), which are major operational benefits for HGS users.
“Finally, a number of sources of vision system video are also integrated with the core symbology. These include enhanced vision and synthetic vision video that are precisely scaled and aligned to generate the full augmented reality view for pilots,” the Collins exec adds.
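Blythe’s description of symbology “aligned precisely with the outside world” comes down to drawing flight data at the same angles at which the real world appears through the glass. As a rough, hypothetical illustration only (the function, variable names and simplifications here are ours, not Collins Aerospace’s), a conformal flight path marker might be positioned like this:

```python
import math

def flight_path_marker(ground_speed_mps, vertical_speed_mps,
                       track_deg, heading_deg, pitch_deg):
    """Toy placement of a conformal flight path marker (FPM) on a HUD,
    in degrees relative to the aircraft boresight.

    Illustrative simplification only: small angles, no roll compensation,
    and none of the sideslip or installation corrections a certified HGS applies.
    """
    # Flight path angle: climb or descent relative to the horizon.
    gamma_deg = math.degrees(math.atan2(vertical_speed_mps, ground_speed_mps))

    # Drift angle: difference between where the nose points and where the
    # aircraft is actually tracking over the ground.
    drift_deg = (track_deg - heading_deg + 180.0) % 360.0 - 180.0

    # Conformal placement: the horizon sits 'pitch' degrees below the boresight,
    # and the FPM sits 'gamma' degrees relative to that horizon.
    vertical_offset_deg = gamma_deg - pitch_deg
    horizontal_offset_deg = drift_deg
    return horizontal_offset_deg, vertical_offset_deg

# Example: 140 kt ground speed, 700 ft/min descent, 5 deg of drift, 2 deg nose-up.
print(flight_path_marker(72.0, -3.6, 95.0, 90.0, 2.0))
```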
Leading turboprop manufacturer ATR offers the ClearVision system, developed in collaboration with Elbit Systems, as an option in the avionics suite on both the ATR 42-600 and ATR 72-600. As with the Collins products, the system has several components, among which is a head-mounted display composed of three main elements, reports Bernard Krier, the OEM’s head of aircraft systems and propulsion.
The first element is Skylens, an HMD visor worn by the pilot on which flight parameters and guidance information are projected. Next comes the Optical Tracker Fixed unit (OTF), installed in the cockpit, which enables the system to follow the pilot’s line of sight. “This ensures that, if a pilot moves their head left, right, up or down, the projection on the visor adjusts accordingly,” Krier explains. “The runway will appear where it actually is, for instance, instead of being a fixed image projected at the centre of the visor.”
The third element is a computer unit, which collects data from the aircraft systems and the OTF, processes them and then generates the information to be displayed on the Skylens.
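To picture the head-tracking correction Krier describes, consider a minimal Python sketch. The function name, field-of-view figures and coordinate convention are invented for illustration; a real Skylens/OTF pipeline also handles head roll, head position and latency compensation.

```python
def visor_position(target_azimuth_deg, target_elevation_deg,
                   head_yaw_deg, head_pitch_deg,
                   fov_half_width_deg=20.0, fov_half_height_deg=15.0):
    """Illustrative head-tracking correction: keep a symbol overlaid on a
    real-world feature (e.g. the runway threshold) as the pilot's head moves.
    """
    # Angular offset of the feature from the pilot's current line of sight.
    az_offset = target_azimuth_deg - head_yaw_deg
    el_offset = target_elevation_deg - head_pitch_deg

    # Only draw the symbol while the feature lies within the visor's field of view.
    if abs(az_offset) > fov_half_width_deg or abs(el_offset) > fov_half_height_deg:
        return None  # feature currently outside the display

    # Normalise to display coordinates in the range [-1, 1].
    return az_offset / fov_half_width_deg, el_offset / fov_half_height_deg

# Pilot looks 10 deg right and 3 deg down; the runway threshold is dead ahead,
# 3 deg below the horizon, so the symbol shifts left on the visor.
print(visor_position(0.0, -3.0, 10.0, -3.0))
```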
“One of the benefits of such a system is that it reduces pilot workload and increases safety margins. It allows pilots to keep looking straight ahead, with all the necessary information in view during approaches, which are critical flight phases, without having to look down at the cockpit LCD screens,” Krier emphasises.
While basic HUDs have been around for some time, their capabilities have been improved considerably with the addition of enhanced vision systems (EVS) and synthetic vision systems (SVS). ATR’s ClearVision, for example, integrates an EVS which displays an augmented view of the terrain on the visor in real time, thanks to a multi-spectral camera and sensors fitted on the nose of the aircraft. The different images captured are recombined to provide the clearest image to the pilot through the HMD.
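The recombination step is, at its simplest, an image-fusion problem. The following toy Python sketch blends co-registered sensor frames weighted by their contrast; it is a naive stand-in, not ClearVision’s proprietary multi-spectral processing.

```python
import numpy as np

def fuse_bands(frames, weights=None):
    """Naive multi-spectral fusion: a weighted average of co-registered
    sensor frames (e.g. long-wave IR, short-wave IR, visible).

    Illustration only; production EVS fusion involves per-pixel quality
    metrics, registration and tone mapping.
    """
    frames = [f.astype(np.float32) for f in frames]
    if weights is None:
        # Weight each band by its overall contrast (standard deviation),
        # so the band carrying the most detail dominates the blend.
        weights = [f.std() + 1e-6 for f in frames]
    weights = np.asarray(weights, dtype=np.float32)
    weights = weights / weights.sum()

    fused = sum(w * f for w, f in zip(weights, frames))
    # Rescale to 8-bit for display on the HMD or HUD.
    fused = (fused - fused.min()) / (fused.max() - fused.min() + 1e-6)
    return (fused * 255).astype(np.uint8)

# Synthetic 480x640 frames standing in for IR and visible cameras.
ir = np.random.rand(480, 640)
vis = np.random.rand(480, 640) * 0.2   # low-contrast visible image, e.g. in fog
display_frame = fuse_bands([ir, vis])
```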
“There are several benefits to this,” says Krier. “Firstly, the EVS enhances the pilot’s vision by capturing, magnifying and displaying on the HMD elements which are invisible to the human eye. For example, during a foggy approach, the EVS will enable the pilot to ‘see through’ the fog, enhancing their situational awareness.
“Secondly, the EVS provides operational benefits to the airline, as it enables pilots to carry out approaches with a reduced runway visual range (RVR) and gain operational credit equivalent to CAT II operations. For airlines operating at airports subject to frequent fog, this system can drastically reduce flight disruptions or cancellations and therefore boost carriers’ profitability.
“A study showed that in the space of a year, ClearVision could have saved 24 of the 48 landings that Guernsey airline Aurigny was prevented from making,” the ATR exec reports. “This would naturally have a significant impact on the costs the airline incurs when its operations are disrupted by delays, diversions or cancellations.” Since the report was published, Aurigny has incorporated ClearVision on its ATR 72-600s.
“ClearVision offers pilots increased visibility and improved situational awareness without requiring expensive upgrades to an airport’s infrastructure, which in many regions of the world may not even be feasible,” Krier comments.
Grant Blythe reports a similar set-up in Collins Aerospace’s EVS, which adds a multi-spectral sensor made up of multiple infrared and visible-light cameras installed in the nose of the aircraft. “These EVS cameras allow pilots to ‘see through’ darkness and poor visibility conditions like fog, rain, smog, or snow. Aircraft equipped with EVS are allowed to fly enhanced flight vision system (EFVS) approaches in lower visibility conditions than non-EVS-equipped aircraft, leading to more reliable operations,” he remarks.
“Based on a computer database instead of cameras, a synthetic vision system generates an artificial view of what the pilot would be able to see in clear conditions. Specifically, SVS takes the current position and orientation of the aircraft to render an artificial view of the outside world. This rendered imagery includes terrain, airfields, and tall obstacles like towers and bridges. Synthetic vision provides great situational awareness and reduces the risk of experiencing spatial disorientation in instrument conditions.
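A crude way to picture the rendering Blythe describes: the SVS takes the aircraft’s current position and attitude and projects database features into the pilot’s forward view. The database entries, flat-earth geometry and conversion factors in this Python sketch are purely illustrative.

```python
import math

# Hypothetical terrain/obstacle database entries: (name, lat, lon, elevation_ft).
# Real SVS databases are far denser and include airfields and obstacles worldwide.
DATABASE = [
    ("tower", 47.45, -122.30, 650.0),
    ("ridge", 47.50, -122.25, 1800.0),
]

def project_features(ac_lat, ac_lon, ac_alt_ft, ac_heading_deg, features=DATABASE):
    """Toy synthetic-vision projection: convert database features into
    azimuth/elevation angles relative to the aircraft, ready to draw.
    Uses a flat-earth approximation; a real SVS renders full 3D terrain.
    """
    ft_per_deg_lat = 364_000.0                                   # rough conversion
    ft_per_deg_lon = 364_000.0 * math.cos(math.radians(ac_lat))
    drawn = []
    for name, lat, lon, elev_ft in features:
        north = (lat - ac_lat) * ft_per_deg_lat
        east = (lon - ac_lon) * ft_per_deg_lon
        dist = math.hypot(north, east)
        bearing = math.degrees(math.atan2(east, north)) % 360.0
        azimuth = (bearing - ac_heading_deg + 180.0) % 360.0 - 180.0
        elevation = math.degrees(math.atan2(elev_ft - ac_alt_ft, dist))
        drawn.append((name, azimuth, elevation))
    return drawn

print(project_features(47.40, -122.35, 3000.0, 30.0))
```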
“Finally, our combined vision system (CVS) merges imagery from both EVS and SVS to present pilots with the best possible vision information available,” Blythe adds. “While EVS can still be affected by weather and SVS doesn’t provide real-time detail, these complementary systems can be merged into a full CVS solution that provides the best of both sources in all conditions.”
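One simple reading of that merging logic: prefer the real-time EVS picture wherever it carries detail, and fall back to the synthetic rendering where the sensor image is washed out. The block size, threshold and function below are arbitrary assumptions, not how a certified CVS actually blends imagery.

```python
import numpy as np

def combine_vision(evs_frame, svs_frame, contrast_threshold=8.0, block=16):
    """Illustrative combined-vision blend: keep EVS tiles that contain usable
    detail and substitute the SVS rendering elsewhere (e.g. deep inside cloud).
    """
    evs = evs_frame.astype(np.float32)
    svs = svs_frame.astype(np.float32)
    out = svs.copy()
    h, w = evs.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = evs[y:y + block, x:x + block]
            if tile.std() > contrast_threshold:      # EVS tile has usable detail
                out[y:y + block, x:x + block] = tile
    return out.astype(np.uint8)

# Synthetic stand-ins for co-registered EVS and SVS frames.
evs = (np.random.rand(480, 640) * 20).astype(np.uint8)    # low-contrast sensor image
svs = (np.random.rand(480, 640) * 255).astype(np.uint8)   # rendered terrain view
cvs = combine_vision(evs, svs)
```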
Naturally, in an industry as competitive as this, new technologies and digital applications are constantly being developed. Companies are already working on the next ‘layer’ for these types of systems, whatever that might eventually be.
“Today, we’re only just scratching the surface of what vision systems can do,” Blythe declares. “Though we can provide these impressive vision capabilities now, pilots are still solely responsible for interpreting what they are seeing and taking appropriate action. In the future, more intelligent processing of vision information will enable automated, early detection of safety concerns like other aircraft or drone traffic, runway incursions, or wrong surface operations, and then alert pilots to act.
“New vision sensors will also improve the performance and capabilities of EVS. While today’s infrared-based EVS sensors are limited to about a 33% improvement in visibility, new sensor technologies could provide for operations in effectively zero-visibility conditions,” he reports enthusiastically. “These sensors on the aircraft will also expand EFVS use to cover full gate-to-gate operations in all weather conditions, including taxi, takeoff and landing.”
ATR is also exploring possibilities to develop the ClearVision system. “We’re doing this with our customers’ needs at heart, taking into account their operational constraints and the value each technology will bring for them,” Krier says. “The objective will always be to provide further versatility to our aircraft, which in turn means more connectivity for local communities, either in challenging environments or in areas where airport infrastructure is limited.”
“As noted, the system integrates an EVS to enhance landing capability, which today is limited to 100 ft; below that height, pilots must be able to see ground elements with their own eyes. We are now working towards its certification down to 0 ft, namely ground level,” he adds, echoing the goal that Collins is aiming to achieve.
“We are also exploring the potential of HMD operations without the EVS, for increased situational awareness. Enhancing our takeoff capabilities in low visibility conditions could also be considered, as well as a night vision system for operations in environments without any lights,” Krier reports.
More completed flights, fewer occasions for airlines to make compensation payments, passengers happy to be reaching their destinations on time: cockpit visual guidance systems deliver them all, with increased safety. The further capabilities discussed here can only benefit stakeholders, who just might be able to believe the results their eyes are seeing.
Author: Bernie Baldwin
Published: 10th May 2022
Images courtesy of Collins Aerospace/ATR