High anxiety: air safety is under threat from breakdowns in communication between pilots and the increasing application of complex computer technology on aircraft

DAVID WOODS: One pilot described it to me: he got a clearance that was going to be difficult for the aeroplane to meet. It was a late clearance, in terms of the descent of the aircraft. And as soon as he heard it, he knew he had to hurry, so he quickly put his head down, looked at the computer screen, started typing madly. And after about the first three keystrokes, he sort of thought to himself 'Why am I doing this? This is difficult. I should have just negotiated with the air traffic controller, and found a different flight path to fly today.'

STAN CORREY: Professor David Woods, of Ohio State University, a world expert in the use of advanced technology aircraft - the big commercial passenger aeroplanes, which are beginning to dominate the world aviation industry.

Hello, I'm Stan Correy. Welcome to Background Briefing.

This week: High Anxiety - pilot error, automation surprises and global aviation safety.

David Woods believes that improved technology has been a significant factor in improving the overall safety of world aviation. And despite 1996's bad record in big passenger jet crashes, global aviation has one of the best safety records of any industry.

But there's a new element in the debate about aviation safety, which is worrying many experts. What happens when the people and the computers that run the aircraft have a communication breakdown?

PETER LADKIN: There was one A320 accident in Bangalore in which they were coming in to land on a perfectly clear day, and the aeroplane behaved in a way they weren't expecting, and they were discussing how to get the aeroplane to behave in the way they were expecting. These were pilots fairly recently trained by Airbus, and they were discussing the system state; and as they were trying to figure out which buttons to push and what to do, the aeroplane flew itself into the ground.

KEN FUNK: The aircraft was on approach to Miami International Airport. They attempted to lower the landing gear; there was a problem with the nose gear indicator not coming on to show that the gear was down and locked. The entire flight crew and a jump-seat occupant turned their attention to solving the problem.

They tried several hypotheses - at one point they were actually fiddling with a light bulb - and they had engaged the auto-pilot. And we're not really sure - the National Transportation Safety Board is not absolutely certain - but it is believed there is a very good likelihood that the Captain bumped the control yoke, inadvertently disengaging the auto flight system. The aircraft gradually descended, and by the time they realised what had happened, it was too late.

STAN CORREY: Inexplicable failures and breakdowns between flight crews and technology won't go away just because the technology is more complex and sophisticated, or because the new generation of 'Nintendo kids' becoming pilots is more comfortable and relaxed with computer screens.

Aviation authorities in many countries are putting big money into research on human factors and aviation automation.

David Woods was a consultant on a US report on aviation automation released earlier this year. And our own Bureau of Air Safety Investigation also published an interim report on advanced technology aircraft.

Here's an extract from the introduction to that report, which collected opinions from pilots, air traffic controllers and other flight management personnel in Australia.

EXTRACT

BUREAU OF AIR SAFETY INVESTIGATION REPORT: The aim of Phase 2 of this project is to identify potential pilot-system interface factors in advance of serious consequences. These areas of conflict are not necessarily being identified by existing government and airline safety systems, for the following reasons:

Human-factors incidents tend to be under-reported. There's often a resistance to reporting for fear of adverse consequences and, perhaps most importantly, pilots may perceive errors as very minor, perhaps not recognising that they may be indicators of larger problems.

STAN CORREY: Public perceptions about aviation safety can also add to political pressure on experts to find answers. In the coming year, paranoia about air transport may rise to new heights, with Michael Crichton, of 'Jurassic Park' fame, having just released his new techno-thriller, 'Airframe'.

It deals with a passenger flight from Hong Kong to Denver which, without warning, goes out of control and causes several deaths before the plane is stabilised. From sketchy reports of the novel - the book is only now reaching the bookshelves in the US - the accident investigation is threatened by a cover-up because of possible damage to a commercial sale of the aircraft. Movie rights are already being negotiated.

When accidents occur, pilot error is often blamed, and on the surface, the statistics support this. But David Woods thinks that's too simplistic an approach, one that doesn't help when it comes to understanding aviation accidents.

DAVID WOODS: While it is typical for people to see the pilot as the 'cause' of the accident, that's often because the pilot's the last person who had the opportunity to take an action to make the sequence of events move in a different direction. But if we look behind that individual pilot or flight crew, we find a whole story, we find a whole series of factors that come together to create the hazard.

The level of safety we've achieved in the aviation industry means that we can't have a single thing go wrong in the system and create a problem, create a near miss or an accident situation. Instead it takes several things to go wrong and to build up. It's in some sense a side-effect of our success at reliability engineering and at creating a very effective and safe system. So when it breaks down now, it breaks down through a very complicated story where there's not a single cause; it takes several kinds of things, each necessary but only jointly sufficient, to create the conditions that breed potential accidents.

STAN CORREY: Nancy Leveson is the Boeing Professor of Computer Science and Engineering at the University of Washington in Seattle. Last year she published a book called 'Safeware: System Safety and Computers'. She's won awards for her work on system safety in aeronautics and astronautics, and was the Chair of the US National Academy of Sciences committee that examined space shuttle software for NASA.

When I spoke to Nancy Leveson from her home in Seattle, I expressed my surprise that as a computer scientist she was so critical of what computer technology could deliver in providing safety.

NANCY LEVESON: Well, I specialise in making them safe, not necessarily in encouraging everybody to use them for every possible purpose. I actually got started in this field a long time ago, in 1980, when computers were first starting to be used in safety-critical systems, and I got a call from someone at a large aerospace firm who was making a torpedo. And they told me that they weren't so concerned that the torpedo might miss the other guy; they were concerned that it might turn around 180 degrees and hit them. And they thought this was a software safety problem, and I said 'Well, I'd never heard of such a thing, but I'd look at it for them.'

And later, when they tested this torpedo, they told me, they called me up and they said 'Well, you know, Nancy, we took her out into this testing ground and we tested this torpedo and every time we tried to fire it, it came out of the torpedo tube, it turned itself off and went down to the bottom and it just sort of lay there.' And I said 'Well, it's safe.' And they said 'Well, the Navy didn't want to pay for this safe torpedo.' And this is really, I guess, when I started realising that there are trade-offs between safety and reliability, and what they had to do was one by one, take off safety devices in order to make this thing more reliable and more effective.

And this is true in a lot of our systems, that making things safe may require some compromises. It's true in non-high tech systems, and it's going to be true in other kinds of systems.

STAN CORREY: Nancy Leveson.

Peter Ladkin is Professor of Computer Science at the University of Bielefeld in Germany. He publishes on the Internet a compendium of computer-related incidents with commercial aircraft.

And like David Woods and Nancy Leveson, Ladkin is no Luddite; but he is concerned that if problems are emerging with computer systems in high-tech aircraft, solutions need to be found quickly.

I asked Peter Ladkin what we mean by aviation automation.

PETER LADKIN: Flight management has essentially been placed on a computer for a long time. Before the days of digital computers, one had auto pilots. Auto pilots exist in some form in virtually every aeroplane that's flying now. They're also a form of computer; but they're not digital, they're analogue. What goes in and what comes out tends to be very predictable.

There's also a trend recently to replace the mechanical controls, the hydro-mechanical controls - the indications that the pilot gives to the control surfaces on the aeroplane - with electrical wiring, because it's been found a lot easier, and certainly a lot cheaper, to send electrical signals and even optical signals down thin fibres than to have heavy control cables and hydraulic systems running through the aeroplane.

So there's been a transition towards essentially computer-controlled flight controls. So there are at least three different aspects: first, simple passenger things - services to passengers; secondly, navigation and flight management; and thirdly, the actual physical control of the aeroplane, which can go through computers.

The passenger stuff is relatively new. The flight management has been around for at least 15 years. It first flew really on the 757 and 767. We're talking the beginning of 1983. Those aeroplanes have been flying with computers doing the flight management ever since then.

Actual fly-by-wire controls, where the aeroplane is controlled by computer, are much more recent. They really arrived with the Airbus A320, and that's just a bit more than half a decade old now.

These all bring various questions into play: questions about how the design is actually made, how the designers conceive the computer should work, and also how the pilots use them. And there are at least these three aspects.

And there's the further aspect of maintenance. What happens when aeroplanes get old? Computer code doesn't get old, and computers don't get old, but wires do get old and various things like that.

So we're seeing different sorts - or we will be seeing different sorts of maintenance problems.

EXTRACT

UNIDENTIFIED: While we were in steady state level flight, the go-around button was hit by accident by a clipboard. This caused the auto pilot to disengage and the throttles to advance to take-off thrust. Following the confusion of the moment, the aircraft started to climb. By the time we re-engaged the auto pilot and stopped the climb, we'd gained about 400 feet.

STAN CORREY: That's a real incident from NASA's Aviation Safety Reporting System.

Pilots are encouraged to anonymously submit safety incidents of any kind to NASA, and these incidents are studied to improve safety. Other countries, including Australia, have similar systems. A lot of incidents mention problems with auto pilots.

Now auto pilots have been in existence since the 1930s. In fact David Woods told me that one of his colleagues found a patent for a piece of flight control automation that predated the Wright Brothers' first flight in 1903.

But does the introduction in recent years of fly-by-wire planes, where the computer can control the aircraft from take-off to landing, mean we've entered a different era? Peter Ladkin.

PETER LADKIN: I would say, yes, we are in a different period. The automation that is being used now is fundamentally different in the sense that many aeroplanes are now fly-by-wire. The Airbus A320 is an example; the Airbus A330; the Airbus A340; the Boeing 777 which has just come into service. They're all fly-by-wire aeroplanes. And there is talk that the new Boeing 747 is also going to be fly-by-wire. Same body, but a different way of controlling the aeroplane. This is definitely new. It's not new to the military, but it's new to commercial transportation.

Also the flight management systems have gotten an order of magnitude more complex. Fifteen years ago, they were relatively manageable and now they've gotten ten or more times as large, and in a complex system that's an awful lot of extra complexity. So, I think we're now looking at complexity issues that are much more difficult than we saw 10 years ago. Whether we actually have the tools to handle that or not is a question that the experts can discuss, but that is a significant difference.

On the other hand, I should say that the rate of aviation accidents has stayed pretty constant for quite some time, and the extra worry is because people are seeing an enormous rise in the use of aeroplanes for travel. And if the rate of aeroplane accidents stays constant, that means that in another 15 or 20 years we're going to be seeing one major aeroplane accident a week. One has to understand the difference between the frequency of individual accidents - one per week - and the rate, which is measured in terms of how many accidents per passenger mile, or something like that.

And aviation is very safe in terms of the rate, but that rate stays constant and aviation people are simply not happy with that. They want to reduce it, and it's not known how to reduce that rate yet. We're working on it, lots of people are working on it in different aspects, but this is a new problem. The rate has never been this low and it has to go even lower.
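Ladkin's arithmetic can be spelled out with a rough, back-of-the-envelope sketch. The figures below are assumptions for illustration only - they are not from the program - but they show how a constant rate and rising traffic combine into rising frequency:

```python
# Ladkin's point: if the accident *rate* per departure stays constant while
# traffic grows, the *frequency* of accidents rises in proportion.
# All figures below are illustrative assumptions.

rate_per_departure = 1 / 1_000_000      # assumed: one major accident per million departures

traffic = {
    "mid-1990s": 18_000_000,            # assumed annual jet departures
    "15 to 20 years later": 50_000_000, # assumed, after projected growth
}

for era, departures in traffic.items():
    accidents_per_year = rate_per_departure * departures
    days_between = 365 / accidents_per_year
    print(f"{era}: ~{accidents_per_year:.0f} accidents/year, "
          f"about one every {days_between:.0f} days")
```

Holding the rate fixed while traffic nearly triples takes the frequency from roughly one accident every three weeks to about one a week - exactly the distinction Ladkin draws between rate and frequency.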

EXTRACT

UNIDENTIFIED: The Captain then said 'What's going on?' - at which point the aircraft was observed 300 feet high. It had entered a subtle climb, seemingly of its own accord. This is another case of learning to type 80 words a minute instead of flying the aircraft. The more automation there is in the aircraft, the harder the flight crew should work to remain an active and integral part of the loop.

STAN CORREY: This type of incident is no surprise to Ken Funk, who's Assistant Professor of Industrial and Manufacturing Engineering at Oregon State University.

He's one of a team of researchers who have just completed phase one of a study for the Federal Aviation Administration in the US. The study is called 'Possible Problems with Flight Deck Automation'.

KEN FUNK: You almost have to look at the flight deck now as a place where there are multiple actors engaged, and by actors I mean entities, human or machine, that do certain goal-directed things, perhaps at cross purposes. So, we have on the flight deck, human actors and machine actors.

And I think one of the critical issues - and this is based upon my opinion, this doesn't really necessarily reflect any official position of Oregon State University, much less the Federal Aviation Administration - but I believe one of the critical issues is that in many cases the human actors, the pilots, are really unaware of just exactly what it is the machine actors, the automation, is doing.

Now the information is there. There are flight mode annunciators on the primary flight display and elsewhere, depending upon the aircraft. There are other ways of knowing what the automation is doing.

But given the workload, the cognitive complexity, the tempo of things, especially in abnormal circumstances, it's sometimes difficult for them to tell just exactly what the automation is trying to do. If it's a human pilot you can ask him or her, and in fact you may not even need to ask, you may know from routines, from maybe personal experience with that other individual, what he or she is attempting to do. But it's not necessarily so easy when that other actor is a machine, to know what it's trying to do.

STAN CORREY: David Woods and his colleagues at Ohio State are perhaps the world experts in collecting and analysing flight crew experiences in the 'glass cockpit'. That's what the pilots call the modern cockpit, filled with computer screens.

DAVID WOODS: Now if the computer's going to fly the aeroplane, you have to tell the computer what you want it to do. And one of the ways you do that is by using the keyboard, just like you would use on any other kind of computer, except they've customised the keyboard to make it fit the aviation context. So it's smaller, the layout of keys is a little different, and there are more special keys set up to handle the kinds of instructions a pilot would want to give the computer - instructions adapted to the kinds of commands you would want to give a co-pilot, for example. So pilots will interact through that keyboard to tell the computer how they want the computer to fly the aeroplane.

Now sometimes that might seem to be a very slow process. We all know that we are not all the best typists, and pilots are the same. So sometimes when we need to do things quickly, there's another kind of interface we can use to tell the computer what we want to do. And that looks more conventional, but it's still an interface to talk to the computer. If we reach up above the different computer displays - CRT displays - there's a series of knobs and dials that you can turn to give instructions. For example, you could dial in a heading: you would turn the knob, a little display would show the heading indication that you were entering, and when you got to the one that you wanted the computers to use to fly the aeroplane, you could enter that, and the computer would then bring the aircraft to that new heading, or to a new altitude.

STAN CORREY: David Woods.
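To make the two interfaces Woods describes concrete, here is a minimal, purely hypothetical sketch - the names are invented, and nothing here models a real flight management system - of the same instruction, a new heading, entered by either path:

```python
# A hypothetical sketch of the two entry paths Woods describes: typing into
# the flight computer versus dialling a knob on the glareshield panel.
# Both are just interfaces for telling the computer what to do.

class AutoflightSketch:
    def __init__(self):
        self.target_heading = 360  # degrees, current commanded heading

    def keyboard_entry(self, keystrokes: str):
        """Head-down path: type the heading into the flight computer."""
        self.target_heading = int(keystrokes) % 360

    def dial_heading(self, clicks: int):
        """Head-up path: turn the knob; each click is one degree, and the
        little display shows the value as it changes."""
        self.target_heading = (self.target_heading + clicks) % 360

ap = AutoflightSketch()
ap.keyboard_entry("270")   # slower: several keystrokes, eyes on the screen
ap.dial_heading(5)         # quicker: reach up, turn, confirm on the readout
print(ap.target_heading)   # 275 - the computer now steers to this heading
```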

If we can talk about pilot error, what about software reliability?

Nancy Leveson states it's a myth that simply changing the software will increase safety. Leveson says it's easy to change software, but making changes without introducing errors is extremely difficult.

NANCY LEVESON: What we're finding is that the software is just not behaving in a way that is consistent with the human model, the pilot's model, of how it should operate - or maybe the pilot doesn't even understand the complexity of how it's operating. It's really more that the accidents are not arising because of failures or errors in a single component of these complex systems, but instead from the interaction between the components: the interaction between the pilot and the software, or the software and the plane. And so it's very difficult.

Most of these - for example, the Airbus A320 accidents, of which there have been quite a few - have all been related to this sort of interface problem: the software doesn't behave in the way that the pilot expects, or it doesn't behave in a way that is reasonable in a crisis situation, or in an extreme situation under certain rare, unplanned or unpredicted circumstances. And so we can't say 'Oh, there's this missing semicolon', or that there's something in particular wrong in the software; it's just that the whole conception of what the software should have done, and how it should have worked, was incorrect.

STAN CORREY: Nancy Leveson.

Peter Ladkin analyses some actual accidents with A320s in his compendium of computer-related incidents with commercial aircraft. One accident occurred with a Lufthansa A320 at Warsaw in September 1993. For non-aviation people, wind-shear is a sudden change in the direction or strength of the wind - a particular hazard when a plane is coming in to land.

Peter Ladkin.

PETER LADKIN: One recent example is the Airbus A320 accident in Warsaw. That was a Lufthansa aeroplane. It landed in Warsaw in a thunderstorm, and there was some potential for wind-shear. They'd received a wind report from the ground which was different from the wind report they had aloft. They were expecting wind-shear on the final approach, so they came in faster than normal. They were expecting to lose suddenly about 15 or 20 knots.

That never happened. So they landed the aeroplane on a very wet runway 15 to 20 knots faster than they thought that they would have been landing, and it simply didn't brake. They put the aeroplane nice and gently down on the runway, and they waited, and they waited nine seconds. Now when you're in a fast-moving aeroplane on a runway, waiting nine seconds is a very long time. We can count them off, but I won't bother to do so. You can look at your watch and see.

Nothing happened. The brakes didn't work, the spoilers didn't come out, and the thrust reverse didn't operate. And in fact, after nine seconds, then the spoilers came out and the thrust reverse started to operate, and four seconds later they had positive braking action on the wheels, and it's the wheels that are in fact the most important braking system in the aeroplane.

So the aeroplane over-ran the end of the runway and ran into an earth bank that was at the end of the runway; hit the earth bank, ran over it, broke into pieces and eventually burned.

STAN CORREY: One passenger and the First Officer died in the crash. All other passengers and flight crew escaped. Why did the brakes fail? Was this accident an example of pilot error or computer system failure?

PETER LADKIN: According to the official report, there are some aspects of pilot error. The pilots misjudged their speed on arrival. However, there is a system failure component to that, namely that they thought they were getting a different kind of weather forecast from the one they actually got. They were getting one which was old; they were assuming they were getting one which was actually current. That's a system failure.

Their judgment to come in fast can be questioned. If that's regarded as a wrong decision, that would be a human-factors component from the pilots; that would be pilot error, if you like.

But in fact the understanding of how the system was going to work in those circumstances is a bit like looking at a computer program: it's written in the pilot's operating manual how things are supposed to work. But as I found out, the pilot's operating manual is not complete, and it's not necessarily precise in certain circumstances. And so their understanding about how the system functioned, was not sufficient to enable them to predict the exact behaviour of the system in those circumstances.

Now that is also a system failure, as far as I see - failure to ensure that the exact working of the system is understood by all of the people who need to know it. And that's something that has been pointed out also by pilots in other circumstances, namely, one sees a lot of comments about how the operating manuals are not quite sufficient to let people know how the plane will be behaving.
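Published accounts of the Warsaw accident, including Ladkin's compendium, trace those nine seconds to the aircraft's air/ground interlock logic: the braking devices are inhibited until the computers decide the aeroplane is on the ground. A simplified sketch of that kind of logic - the thresholds and structure here are illustrative assumptions, not Airbus's actual design - shows how an aeroplane can be rolling down a runway while the logic still classifies it as flying:

```python
# A simplified, illustrative sketch of air/ground interlock logic of the
# kind described in published accounts of the Warsaw accident.
# Thresholds and structure are assumptions, not Airbus's actual code.

def spoilers_allowed(left_strut_compressed: bool,
                     right_strut_compressed: bool) -> bool:
    # Ground spoilers deploy only with weight on *both* main gear struts.
    return left_strut_compressed and right_strut_compressed

def wheel_braking_allowed(wheel_speed_kt: float) -> bool:
    # Wheel braking waits for wheel spin-up; aquaplaning on a wet runway
    # can hold wheel speed below the threshold even after touchdown.
    return wheel_speed_kt > 72

# Conditions at touchdown (simplified): crosswind technique left one strut
# lightly loaded, and the wheels aquaplaned instead of spinning up.
print(spoilers_allowed(True, False))   # False -> spoilers stay stowed
print(wheel_braking_allowed(40.0))     # False -> no wheel braking
# The aeroplane is on the runway, but the logic still treats it as airborne,
# so nothing brakes until the conditions are finally met seconds later.
```

Landing fast in a crosswind on a very wet runway, neither condition was met at first - exactly the kind of system behaviour that, as Ladkin says, the operating manual did not equip the pilots to predict.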

EXTRACT

PILOT: After take-off checklist.

CO-PILOT: After take-off checklist: landing gear up and off; flaps are up; checked up; altimeters later. After take-off completed.

PILOT: Okay. Central auto-pilot on, please.

CO-PILOT: Centre auto-pilot is on command.

PILOT: Thank you. One-zero-one-three.

CO-PILOT: One-zero-one-three.

PILOT: Rudder ratio Mach airspeed trim.

CO-PILOT: Yes, trim.

PILOT: There's something wrong. There are some problems. Something crazy, do you see it?

CO-PILOT: There is something crazy there. At this moment 200 only is mine and decreasing, Sir.

PILOT: Both of them are wrong. What can we do? Let's check the circuit-breakers.

STAN CORREY: A re-enactment of the cockpit conversation of Birgen Air 757 just after take-off from Puerto Plata in the Dominican Republic on February 6th, 1996.

EXTRACT

CO-PILOT: Thrust.

PILOT: Disconnect the auto-pilot. Is auto-pilot disconnected?

CO-PILOT: Already disconnected. Disconnected, Sir.

PILOT: Not time! What am I to do?

CO-PILOT: You may level off, altitude okay. I am selecting the altitude hold, Sir.

PILOT: Select, select!

CO-PILOT: Altitude hold. Okay, 5,000 feet.

PILOT: Thrust levers, thrust! thrust! thrust! thrust!

CO-PILOT: Retard.

PILOT: Thrust! Pull back. Don't pull back! Don't pull back! Okay. Open! Don't pull back, please! Don't pull back!

CO-PILOT: Open. Sir, open!

PILOT: What's happening?

STAN CORREY: Minutes later, the plane crashed into the Caribbean, killing all passengers and crew.

PETER LADKIN: In the case of the Birgen Air accident in the Dominican Republic, the final report has just come out and I haven't seen it yet. But I do know that the pilots' behaviour was quite severely criticised, and I believe that is basically correct. There is adequate justification for saying that they really should have known more and behaved differently than they did.

There are some things, which as a pilot myself, I simply cannot understand about the way they behaved. They knew there was a failure on take-off, and they correctly identified the failure. What they didn't do, was do the normal things about it. There are specific things to do that are contained in the Operating Manual. When you have a failure of this sort, you essentially switch to what is called Alternate Air Data. They didn't do that.

And when the Captain turned on the auto-pilot, the centre auto-pilot a little while later, the centre auto-pilot got its air data, what it thought the aeroplane was doing, from this false instrument that they'd already identified as giving false readings. So, the auto pilot was controlling the aeroplane according to these completely false air data readings.

Now after that point, I can understand the confusion. But as a pilot, what I didn't understand was, first, why they didn't switch to Alternate Air Data as soon as they had identified the problem with the Captain's air speed indicator on take-off; and secondly, why, if they didn't know what was going on and they were worried about control problems, they were allowing the plane to be flown by auto pilot at all.

As a basic grassroots pilot, if the plane is doing something I don't know about, I switch off the auto pilot and I fly it by hand. They also didn't use their stand-by instruments. There is a completely electro-mechanical set of stand-by instruments, a completely different system on that aeroplane. There is no evidence that they actually tried to fly by reference to those stand-by instruments, which in fact were reading correctly at the time. So there's a lot to criticise in the way the pilots behaved.

On the other hand, it's quite understandable for them not to know that the centre auto pilot was connected to the false Air Data source. I in fact queried a couple of 757 pilots about that myself before I got the exact information, and they weren't able to tell me exactly where the Air Data source for that auto pilot was coming from. So something about the automation that was crucial in this accident is something even well-trained 757 pilots generally do not expect to have to know.

STAN CORREY: Peter Ladkin.
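The coupling Ladkin describes - the centre auto pilot quietly consuming air data from the source the crew had already identified as false - can be sketched in miniature. Everything below is hypothetical wiring for illustration; it is not the 757's actual air data architecture:

```python
# A hypothetical miniature of the Birgen Air situation: several air data
# sources, one feeding false readings, and an autopilot whose source
# selection is fixed and not obvious to the crew.

air_data = {
    "captain":       {"airspeed_kt": 350, "reliable": False},  # false readings
    "first_officer": {"airspeed_kt": 220, "reliable": True},
    "alternate":     {"airspeed_kt": 220, "reliable": True},
}

CENTRE_AUTOPILOT_SOURCE = "captain"   # assumed fixed wiring, for this sketch

def centre_autopilot_sees() -> float:
    # The crew had identified the Captain's indicator as false, but the
    # autopilot goes on consuming that same source regardless.
    return air_data[CENTRE_AUTOPILOT_SOURCE]["airspeed_kt"]

print(centre_autopilot_sees())   # 350 - the autopilot "sees" an overspeed
# Switching to Alternate Air Data, as the Operating Manual procedure calls
# for, would have put every consumer onto a source reading correctly.
```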

Some other notable accidents involving a breakdown between flight crew and automated aviation systems are the China Airlines crash of an Airbus A300 at Nagoya in Japan in 1994, and more recently, the crash of a Boeing 757 operated by American Airlines near Cali, Colombia, in December 1995.

David Woods, from Ohio State University, was a consultant to an FAA report on flight crews and modern flight deck systems.

While the report emphasised the positive benefits of automation in reducing the workload of flight crews, it also highlighted real and perceived problems in dealing with high tech aircraft. The experts call these problems 'automation surprises'.

DAVID WOODS: Automation surprise is a new phenomenon. It's in some sense just like the kind of co-ordination breakdown we can have between two people. If you don't anticipate or understand the person you're supposed to co-operate with in some task, they can do something that surprises you. And this can happen on the flight deck.

It seems to arise from several different factors. People might make a typing mistake; they might mis-enter something into the keyboard; they might inadvertently grab the wrong dial or grab the correct dial, but in the wrong mode, and have a small mis-entry. And the key is that it's a small error to start out with.

It may also start because the computers do something that the pilot hadn't directly told it to do. Again they're very strong systems, which means that they are capable, they're autonomous, they're capable of actions on their own. And so this is what's been called the problem of indirect mode changes where the pilot will make an instruction to the computer and the computer may say, 'Ah, if you want me to do this, you probably also want me to do something else.' And the computer will go ahead and do that on its own, even though the pilot may not be thinking ahead far enough.

So we start out on an automation surprise, with a small error. And that error produces a mismatch. The computer is trying to do one thing, and the pilot thinks it's doing something different.

STAN CORREY: David Woods. So which actor is to blame for the automation surprise? The pilot or the machine actor?
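Woods's 'indirect mode change' can be shown as a toy state machine. The mode names and the coupling between them are invented for illustration - no claim is made that any real autoflight system behaves exactly this way:

```python
# A toy illustration of an indirect mode change: the pilot commands one
# thing, and the automation changes a second mode on its own initiative.
# Mode names and coupling are invented; no real system is modelled.

class ToyAutoflight:
    def __init__(self):
        self.vertical_mode = "VERTICAL_SPEED"
        self.thrust_mode = "SPEED"

    def capture_altitude(self):
        # The pilot's direct instruction...
        self.vertical_mode = "ALTITUDE_HOLD"
        # ...and the automation's unrequested follow-on: "if you want that,
        # you probably also want this." The pilot never asked for it.
        self.thrust_mode = "IDLE"

ap = ToyAutoflight()
pilot_belief = ("ALTITUDE_HOLD", "SPEED")   # what the pilot expects
ap.capture_altitude()
actual = (ap.vertical_mode, ap.thrust_mode)
print(pilot_belief == actual)   # False - a small, silent mismatch
# The computer is doing one thing; the pilot thinks it is doing another.
# That mismatch is the seed of an automation surprise.
```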

Professor Nancy Leveson believes, like David Woods, that it's just too easy to blame people who are suffering what the experts call 'mode confusion' - confusion, that is, about who, or what, is flying the aircraft.

The problem, according to Leveson, lies with the design of the technology as much as it does with the limitations of people.

NANCY LEVESON: We're now having the computer make a lot of decisions, and many of the problems that are arising we label under the term 'mode confusion', which means that it's not really clear who's in charge at any time: the human gets confused when the computer changes the mode of the aircraft and the human doesn't know about it. The complexity of the logic in these computer systems is getting very high. And it seems to be contributing to accidents.

Now, I think we have to be careful and understand that whenever we introduce new technology into systems, we tend to have more accidents. This has been true throughout history. And unfortunately we're not very good at understanding the potential for a new kind of accident and preparing for it ahead of time.

We tend to be horribly optimistic and think that new technology is just all going to be wonderful. But new regulation, new government agencies, new concern about these things almost always post-date the accidents - they come after several, and sometimes many, accidents. As I say, we don't tend to be able to predict these ahead of time and deal with them.

STAN CORREY: Nancy Leveson.

Remember her story about the Navy wanting a reliable, rather than a safe, torpedo? The safe torpedo sank to the bottom of the ocean. The point is that the rationale behind many automation systems is not necessarily safety, although increased safety may be a result.

Ken Funk of Oregon State University.

KEN FUNK: It is my understanding - and I think most people understand - that many of the design decisions were based on economic considerations. The flight management system is there to compute optimal climb and descent profiles in order to optimise fuel consumption. The engine indicating and crew alerting system, which is an automated monitoring and alerting system for the pilot, was implemented in large part to replace the flight engineer - to reduce the cost and the weight associated with that individual.

I think that implicitly everyone sort of knows that that was the case, and I'm certainly not privy to all the design decisions, really even to any of the design decisions. It is my feeling that economic incentives have predominated in the design of aircraft automation, or at least they have been given more weight than they should have been. I believe that the human has been somewhat short-changed in the process.

STAN CORREY: So the answer to the problem of automation surprise, which could lead to what David Woods calls a 'going sour' accident, is more user-centred control systems.

DAVID WOODS: When we talk about these automation surprises, in some ways the easiest way to think about an automation surprise is to use the words the pilots use themselves. We've tried to understand a lot about pilots' experience with these highly automated aircraft, and we do it in lots of different ways from a research point of view. But it all boils down to the common statements that pilots make when they're flying with these very sophisticated flight computers.

Well, the most common statements on this automated flight deck have been described as statements like 'What's it doing now? What will it do next? How did we get into this mode? Why won't it do what I want? Stop interrupting me when I'm busy. I know there must be some way to get it to do what I want.' These kinds of questions that pilots sometimes have to ask themselves, or their co-pilot, are indicative of this kind of communication breakdown between the strong, but relatively silent automation, and the human flight crew.

Now what's interesting is that when people develop these highly automated systems, they usually think that 'I need less human expertise.' There's less knowledge that people have to acquire in order to be part of the system. That's usually touted as one of the benefits of automation. But what we've found in the aviation case as well as other cases, is that the new automation creates new knowledge requirements. People have to know more now, because they have to know how the automation works, not just how the aeroplane works. They have to know more now because they have to know about how to manage the automation in a wide range of different kinds of operational circumstances.

So when we talk to the people who are responsible for training pilots to fly these very sophisticated aeroplanes, we hear some very interesting messages - and again, these statements are ways to focus in on the heart of the research results while staying out of any kind of technical language. And one of the things we hear from the training managers is simply: 'They're building a system that takes more time than we have available.' Right, because there's pressure on the training people to get pilots out flying the line.

And so they're saying, 'Whoops, this system is very complex to learn. We don't have that much time to train people. There's more to know; we have to know how it works, but especially we have to know how to work the system in different situations.'

Another statement we've heard from the training manager: 'The most important thing to learn is when to click it off.' Back to this idea that the new skill that's required is when to go to the automation and make use of it as a resource, and when to go to a less automated way to accomplish the same goal.

STAN CORREY: David Woods.

Clive Irving covers aviation safety issues for Conde Nast 'Traveller' magazine. His most recent story, in the November 1996 issue, dealt with Boeing's problems with the 737 rudder, a problem that some experts believe has caused at least two major accidents in the US in the past few years.

Irving thinks that the safety issue with high-tech aircraft isn't so much the interface between the flight crew and technology, but between the aeroplane and current air traffic control systems. It's an issue that's set to become the next hot phase of the aviation automation debate.

CLIVE IRVING: The desired object of all of this is something called 'zero accidents', which in a sense is never going to be attainable. But the fact is that flying is incredibly safe already, and what we're arguing about here really is making it safer than it is, within a regime which has already achieved enormous advances in safety. And I think that's why I feel particularly strongly that the focus should be shifted to all the logistical support systems, of which air traffic control is one, and the training of air traffic controllers is another - so that some kind of common level can be achieved, one you have confidence in whether you're flying through Asian air space, or Russian air space, or Latin American air space.

This isn't what's happening. What's happening is a kind of steady growth in technical sophistication in one tier of flying, and a sort of standstill, or even a falling back, in another. We are at a watershed, I think. We're at a point where the safety regime, and the way the safety regime is policed and regulated, has not caught up with the way the industry now operates. It's operating on such an enormous scale; air travel is increasing, and it will increase dramatically in the Asia-Pacific region in the next 10 or 15 years.

I mean just to take one example that pinpoints the importance of safety, you have to consider how many people are likely to get killed if a plane goes down. I mean it's as cynical a calculation as that. And now to meet a demand that's largely generated by the Asia-Pacific region, we're going to have a new generation of big jets, of Jumbos, both the 747 and Airbus, which are going to be carrying 550 people.

Well, when you have a situation like the one in India, where a current 747 was virtually full and you lose up to 400 people in a crash - now you're going to go to 550 people. This means that the whole system, from the ground up - the way that safety is designed into a plane from the very beginning, and then the way that the flying is regulated and supervised - needs to come up to that point. And I think we've grown out of this idea that the industry needs boosters within it.

Every government that regulates the airline industry should make its first priority a totally undivided attention to maintaining safety standards. You cannot have this dichotomy, this conflict. You just can't reconcile those two interests.

STAN CORREY: Clive Irving.

This mismatch between sophisticated aircraft and less sophisticated ground-based air traffic control is a recognised problem for aviation safety.

In June this year, the Australian Bureau of Air Safety Investigation released its interim report on advanced technology aircraft. It was noted that airline management expressed the concern that the greatest threat posed to the operation of these aircraft is the co-ordination of the airborne and ground-based systems.

Nancy Leveson has just received a major government grant to study automated air traffic control systems. One of the proposals in the US is for what's called free flight, where the Captain of the aircraft can basically choose their own flight path, rather than flying the fixed routes currently set by air traffic control.

Leveson points out that as with flight deck automation, the initial reason for introducing automation in air traffic control isn't safety.

NANCY LEVESON: The main reason for introducing them is to get more planes in the air, so that we can put them closer together. There are plans to fly them directly - what's called 'free flight' - from one place to another, instead of along lanes the way we do it now. That would save a lot of fuel, and again, as I said, it saves a lot of money for the airlines. So there's tremendous pressure to use computers to replace the human controllers.

What we have to notice, first of all, is that we've already tried to build a new air traffic control system, an automated one - which actually wasn't that highly automated compared with what some of the plans are. I'm not sure how much we spent; I think it was originally scheduled to be $5 billion or $6 billion, I guess, and it was supposed to be tried out here in Seattle, actually, in 1992.

It's interesting that they always try these things on the west coast of the United States and the people making these decisions are on the east coast. I guess I shouldn't read anything into that.

But the project was finally cancelled after spending something like $6, $7, $8 billion - I'm not even sure what the final figure was - and that was a couple of years ago, and the company building it announced that they couldn't possibly have even a trial run of the thing until the turn of the century. The cost over-runs at that point would have been just tremendous.

There are air traffic control systems, automated systems, being built right now in other countries, and every one I've heard about is in trouble technically. So it's not so easy, first of all, to build these things. Our grasp sometimes exceeds our reach, or our reach .. no, it's the other way round, isn't it: our reach often exceeds our grasp. We want to do a lot of these things; it's not clear that we can. And it's not clear to any of us yet - we're looking at these problems - whether we can reduce spacing between aircraft, increase the number of aircraft, make these kinds of free flight or direct flights, these changes in routings, and still maintain the level of safety that we have.

STAN CORREY: Nancy Leveson.

Discussion of aircraft accidents and incidents is a difficult subject, prone to media hype and exaggerated conspiracy theories.

The growth of air traffic in the past 30 years and the overall safety record of large passenger aircraft means that modern aviation technology must be doing something right. Yet all the experts interviewed for this program made the point that this shouldn't preclude discussion of any problems.

David Woods believes preventing future accidents means the public and the experts should avoid making easy assumptions about the benefits of automation.

DAVID WOODS: Computers are here. They bring capabilities that we can't really do without, and the real challenge for aviation, as well as other areas, is how do we get them to work effectively with people? If we can accomplish that, we will get great benefits from the power of the technologies that we have today. And when we don't quite get it right, unfortunately the glitches will show up in these kinds of going sour accidents - accidents that will be seen by people as very preventable, because they build from a small sequence of actions and reactions, mis-assessments and mis-communications.

People will look back in hindsight and say 'How could we have let that happen?' But we can't get caught up in our own mythology about automation that it makes the job simpler, that it reduces the demands for human expertise. Instead, the increased pace of technology creates new opportunities and new needs for even more kinds of human expertise. And if we match human expertise with computer power, then the system will be very effective and even safer than it is today.

STAN CORREY: Professor David Woods of Ohio State University.

That's all for Background Briefing this week. Technical producers Greg Richardson and Mark Donn. Readings by Brendan Higgins and Ron Falk. Research Suzan Campbell; Executive Producer, Jeune Pritchard.