Two decades after the above L-1011 accident, an Airbus A300 experienced a similar in-flight incident off the coast of Florida (NTSB, 1998a). At the start of a descent into the terminal area, the autothrottles were holding airspeed constant; unknown to the pilots, however, the system stopped controlling airspeed when the aircraft leveled off at an intermediate altitude. The aircraft slowed gradually to almost 40 knots (kts) below the last airspeed set by the pilots and stalled after the stall warning activated. There was no evidence of autothrottle malfunction. The crew apparently believed that the automated system was controlling airspeed; in fact it had disengaged. In this aircraft a single press of the disconnect button disengages autothrottle control of airspeed. When the system disengages, the green mode annunciator in the primary flight display changes to amber and the illuminated button on the glareshield used to engage the system turns off.
The NTSB (1998a) noted that the change in the annunciators could serve as a warning. However, the passive way in which the displays were formatted did not attract attention. The NTSB also pointed to autothrottle disconnect warning systems in other aircraft that require positive crew action to silence or turn off. These systems incorporate flashing displays and, in some cases, aural alerts that capture the pilot's attention in the case of an inadvertent disconnect. These systems more rigorously adhere to the principle of providing important feedback to the operator about the state of an automated system. Internal transitions between different machine states or modes are sometimes hidden from the user, and as a result the user is unaware of the true state of the machine. This might lead to annoyance or frustration with simple systems, such as VCR/TV controls, where the user fumbles with adjusting the TV while the control is actually in VCR mode. In more complex systems the lack of salient feedback about automation states can lead to catastrophe (Degani, 2004; Norman, 1990).
In 1994, an A300 crashed in Nagoya, Japan, after the pilots inadvertently engaged the autopilot's go-around mode. The pilot attempted to counter the unexpected pitch-up and continue the approach by manually deflecting the control column, but these inputs proved ineffective (Billings, 1997). In all other aircraft, and in this aircraft in all modes except the approach mode, this action would normally disconnect the autopilot. In this particular aircraft, however, the autopilot had to be manually deselected and could not be overridden by control column inputs. Consequently, a struggle developed between the pilot and the autopilot, with the pilot attempting to push the nose down through elevator control and the autopilot attempting to lift the nose up through trim control. This drove the aircraft so far out of trim that it could no longer be controlled.
These types of misunderstandings result from a mismatch of the pilot’s mental model and the behavior of the automated system programmed by the designers (Sherry and Polson, 1999). Several other examples of incidents and accidents resulting from these system misunderstandings have been reported (Billings, 1997; Funk et al., 1999; Sarter and Woods, 1995). While some have had benign outcomes and simply become “lessons learned,” others have involved serious loss of life (Leveson, 2004).
In 1997, a single-engine airplane operated by a non-instrument-rated pilot took off under instrument meteorological conditions. About two hours later, after following a meandering course, which included reversals and turns of more than 360 degrees, the aircraft crashed into trees at the top of a ridge. No mechanical problems with the airplane’s controls, engine, or flight instruments were identified. A person who spoke with the pilot before departure stated that the pilot “... was anxious to get going. He felt he could get above the clouds. His GPS was working and he said as long as he kept the [attitude indicator] steady he’d be all right. He really felt he was going to get above the clouds.”
Undoubtedly, many factors played a role in this accident, but the apparent reliance on GPS technology, perhaps to compensate for insufficient training and lack of ratings, stands out as a compelling factor. This general aviation accident further exemplifies the danger of over-reliance on automated systems (NTSB, 1998b).