Crash: Computers setting up airlines for disasters?
Aircraft makers are increasingly letting computers fly planes. Driverless cars are next. But is reliance on automation dangerously diminishing pilots' skills? When should a computer intervene, and what should the interface between man and computer look like? WOLE SHADARE asks
On a visit to British Airways' massive training facility in London six years ago, facilitated by the airline's Country Commercial Manager for Nigeria and West Africa, Kola Olayinka, I got to fly a Boeing 777. Or, at least, I got as close as someone who is not a licensed commercial pilot can: I sat at the controls of a multi-million-pound simulator and learned how to “land” at Lagos and Heathrow airports.
My instructor, after explaining all I needed to do, adjusted the airplane’s pitch. I experienced first-hand how the airline’s pilots are ultimately responsible for landing their planes safely. I kept the plane as level as I could as computer-generated facsimiles of low-rise buildings and the Grand Central Parkway slipped quickly beneath me. The controls around me adjusted themselves to keep us moving at the right speed, and with a big bump I made it onto the runway. The simulator, again by itself, came to a stop at the Murtala Muhammed Airport, Lagos.
I stepped out of the simulation with a renewed appreciation for the sophistication of autopilot, but also for the human pilots who make sure planes operate safely.
With airline industry watchers in a seemingly perpetual state of worry about a pilot shortage, some think automation could obviate the need for those human pilots altogether. But, most experts say, the technology, the industry and the passengers are not quite ready for fully autonomous flying.
For almost as long as planes have been in the sky, aviators and manufacturers have worked to make flying a simple experience for pilots and a smooth one for passengers.
The crash of Ethiopian Airlines Boeing 737 Max 8 flight ET302 near Bishoftu, outside Addis Ababa, killing all 157 people on board, has again brought to the fore the advance of technology in aircraft cockpits, highlighting how automation is fast taking over pilots’ jobs and, at times, complicating matters.
Safety concern
While it is too early to know what caused the Ethiopian Airlines crash, the fact that both it and a Lion Air 737 Max 8 that crashed in Indonesia in October went down shortly after take-off has raised questions about the safety of the new aircraft. In a statement on Monday, Boeing said it had “engaged our customers and regulators on concerns they may have” and that “safety is our number one priority.”
The increasing amount of automation in airliners’ cockpits has simplified the job of piloting. Automatic systems are particularly desirable during long cruise phases at steady speeds and altitudes.
Dark side
But they may have a dark side. A new study indicates that pilots who rely too heavily on cockpit automation can lose the critical thinking skills that allow them to adapt to unexpected situations. Automation has created new opportunities for mistakes by pilots who do not understand what the machine is doing and are not necessarily paying attention. The study differentiates between the manual skills that pilots use to operate an airplane’s controls and the cognitive abilities they need for tasks such as troubleshooting and maintaining awareness of the plane’s position. In the study, changes in the level of automation included turning off the autopilot, which forced the pilots to fly by hand while following the flight director’s cues; turning off the flight director, leaving pilots to determine their heading from readings on their instruments; and shutting off everything.
Understanding the system
One thing, however, stands out: the Maneuvering Characteristics Augmentation System (MCAS), whether through a software malfunction or through pilots’ inability to fully understand it, has been narrowed down as a possible cause of the accident, because of the similarity between the Ethiopian crash and the Lion Air crash in Indonesia last October, which followed the same pattern.
The Max is outfitted with bigger, more fuel-efficient engines than earlier 737s, mounted further forward and higher on the wing, a change that increases the potential for the nose to pitch up after take-off. Boeing created software, known as MCAS, to counteract this risk.
MCAS uses angle-of-attack sensors to push the nose of the plane down if they detect that it has pitched too high and the aircraft could be at risk of stalling.
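To make the idea concrete, here is a deliberately simplified sketch, in Python, of the kind of logic such a system embodies. The names, thresholds and structure are invented for illustration; this is not Boeing’s software, only a toy model of an automatic nose-down intervention triggered by a sensor reading.

    AOA_LIMIT_DEGREES = 14.0   # hypothetical stall-risk threshold, for illustration only
    TRIM_STEP_DEGREES = 0.5    # hypothetical nose-down trim applied per check

    def anti_stall_check(angle_of_attack, pilot_override):
        """Return a nose-down trim command (in degrees) for one control cycle."""
        if pilot_override:
            return 0.0                 # crew has switched the automation off: do nothing
        if angle_of_attack > AOA_LIMIT_DEGREES:
            return -TRIM_STEP_DEGREES  # sensed pitch too high: quietly push the nose down
        return 0.0                     # normal flight: no intervention

    # A sensed angle of 16 degrees with no override yields -0.5, i.e. the
    # automation trims the nose down on the pilots' behalf.
    print(anti_stall_check(16.0, pilot_override=False))

The point of the toy example is the article’s wider argument: the intervention happens silently, on the strength of a sensor reading, unless the crew both understands what the system is doing and knows how to switch it off.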
Skill erosion
The question people are asking is: who is flying your plane? Pilots’ manual flying skills have become dangerously eroded because they rely too much on automated systems. That’s one conclusion of a leaked report on air safety commissioned by the United States Federal Aviation Administration (FAA).
Based on voluntary incident reports from concerned pilots, crash data and evidence from cockpit observers on more than 9000 flights, the report found that some pilots were “reluctant to intervene” with automated systems or to switch them off in risky situations.
Poor training and lack of manual flight experience, it says, meant some pilots had neither the knowledge to keep up to date with changes to automated systems nor the manual skills to take over when flight computers malfunction.
Technology dependence
The automated systems at issue run the whole gamut of computerised flight aids, including autopilot and automatic control of speed and landing, which spare the pilot work that computers are supposedly better at. Cockpit computers also run safety checks that ensure, for instance, that the plane’s wing always bites into the airflow at the right lift-producing angle.
But the FAA report found that pilots can get “addicted” to the automation – and that that dependence must be combated with fresh training. One reason is that trouble can arise when pilots believe flight parameters are being automatically maintained when, for some reason, they are not. Boeing recently began studying autonomous flying technologies that one day could do away with the need for pilots in commercial airliners. Meanwhile, Airbus already is working on a self-flying plane that would serve as a local air taxi for one, a technology that eventually could be applied to larger aircraft, including airliners.
Last line
The paradox of automation, then, has three strands to it. First, automatic systems accommodate incompetence by being easy to operate and by automatically correcting mistakes. Because of this, an inexpert operator can function for a long time before his lack of skill becomes apparent – his incompetence is a hidden weakness that can persist almost indefinitely. Second, even if operators are expert, automatic systems erode their skills by removing the need for practice. Third, automatic systems tend to fail either in unusual situations or in ways that produce unusual situations, requiring a particularly skilful response. A more capable and reliable automatic system makes the situation worse.