
6.10 Can Behavioral Science Provide Design Requirements to Engineers?



The engineering of hardware and software has become very sophisticated. Design data and mathematical modeling tools abound, backed up by well-established laws of physics.

Human understanding of human behavior is much less developed. The applied discipline of human factors engineering (or human-machine systems), like the discipline of medicine, is based mostly on empirical study, with relatively few equations or substantially “hard” laws. Design engineers have tended to dismiss human factors for this reason, or to accept design reviews by human factors professionals only grudgingly, late in the system design cycle. This has often proven ineffective: by that point the human factors professionals can do little beyond raising problems, and they are seen as naysayers opposing almost completed system designs.


Providing design requirements that are directly usable by design engineers is the challenge for human-automation interaction and for human factors engineering in general. Human performance in defined tasks must become representable in the same terms as those used by engineers—in both static and fast-time dynamic simulations that include mathematical models of human operators as well as other system components. Real-time simulations with real humans in the loop can lead the way.
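To make this concrete, the sketch below shows (in Python) how a fast-time simulation can embed a mathematical model of the human operator as just another system component. It is a minimal illustration only, not a method from this report: the operator is approximated by a simple gain-plus-reaction-delay model in the spirit of McRuer's crossover model, the controlled element is a first-order lag, and every parameter value is assumed for demonstration.

```python
# Minimal fast-time simulation embedding a human-operator model as one
# system component. All parameters are illustrative assumptions, not
# values from this report.

DT = 0.05             # simulation time step, s
T_END = 30.0          # simulated duration, s
REACTION_DELAY = 0.3  # operator reaction time, s (illustrative)
OPERATOR_GAIN = 1.5   # operator gain on perceived error (illustrative)
PLANT_TAU = 2.0       # time constant of controlled element, s (illustrative)


def reference(t: float) -> float:
    """Command the operator tries to track: a unit step at t = 5 s."""
    return 1.0 if t >= 5.0 else 0.0


def simulate():
    """Fast-time loop: operator model plus controlled element, no real human."""
    delay_steps = int(REACTION_DELAY / DT)
    error_buffer = [0.0] * delay_steps  # FIFO buffer models the reaction delay
    output = 0.0
    trace = []
    for k in range(int(T_END / DT)):
        t = k * DT
        error_buffer.append(reference(t) - output)
        delayed_error = error_buffer.pop(0)  # operator acts on stale error
        control = OPERATOR_GAIN * delayed_error
        # First-order controlled element, Euler step of
        # d(output)/dt = (control - output) / PLANT_TAU
        output += DT * (control - output) / PLANT_TAU
        trace.append((t, output))
    return trace


if __name__ == "__main__":
    for t, y in simulate()[::40]:  # sample the trace every 2 s
        print(f"t = {t:5.1f} s   response = {y:6.3f}")
```

In practice the operator model would first be validated against human-in-the-loop data, and the controlled element would be the actual aircraft or ATC dynamics; the point is only that, once cast in this form, human performance enters the engineering simulation on the same footing as any other component.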

6.11 The Blame Game: The Need to Evolve a Safety Culture




The current ATM culture supports what has been called a “blame game”: all failures, including infractions of safety rules, have causes; responsibility for those failures must be determined and penalties meted out. This approach to safety is exacerbated by the decades-old standoff between labor and management within the ATC operating staff. One result is that infractions are only partially reported; line controllers are loath to call attention to their own or their colleagues’ shortcomings.


A different, and many believe more enlightened, approach is an operating culture that acknowledges that errors will happen—one in which operating staff are encouraged not only to report errors but also to suggest ways to ameliorate the factors that allow them to occur. The American statistician and industrial engineer W. Edwards Deming contributed greatly to U.S. military production during World War II and lectured extensively in Japan after the war. He taught the Japanese quality-control techniques and the importance of workers’ sensitivity to their own work efficiency. He also fostered open communication about errors and problems, both horizontally among worker groups and vertically between layers of management. The techniques worked: the Japanese became the global model for industrial production, and Deming became a demigod in Japan.


More recently, the Institute of Medicine of the U.S. National Academy of Sciences (Kohn et al., 2000) published the report To Err Is Human, calling upon the medical community to desist from its well-known “blame game,” in which medical errors are closely guarded and underreported. Physicians operate in fear of malpractice suits, safety performance data are not shared among hospitals, and physician training emphasizes personal responsibility rather than teamwork or systems-improvement thinking (along the lines of Deming). Largely as a result of this report, there are new efforts to change the culture. No one is saying it will be easy; the U.S. culture of litigation also needs to change. One Harvard medical malpractice attorney told the writer that, in her experience, when physicians being sued openly admitted their errors, juries were always understanding and the defendants were almost always acquitted.


NGATS may offer an opportunity to bring about a more enlightened safety culture in aviation.

6.12 Concluding Comment



The famous physicist Richard Feynman, in his last book, What Do You Care What Other People Think?, is quoted by Degani (2004) as describing the inspiration he received from a Buddhist monk who told him, “To every man is given the key to the gates of heaven; the same key opens the gates of hell.” We can all agree with Degani when he concludes, “I believe the same applies when it comes to designing and applying automation.” Automation may be a key to a much improved air transportation system, but it can also precipitate disaster.

7.0 REFERENCES


Air Safety Week (2002). Human factors issues emerge from Concorde crash investigation. February 11, 2002.


Bainbridge, L. (1987). The ironies of automation. In New Technology and Human Error, eds. J. Rasmussen, K. Duncan, and J. Leplat. London: Wiley.


Bar-Hillel, M. (1973). On the subjective probability of compound events. Organizational Behavior and Human Performance 9:396-406.


Billings, C.E. (1997). Aviation Automation: The Search for a Human-Centered Approach. Mahwah, NJ: Erlbaum.


Casey, S. (2006). The Atomic Chef, and Other True Tales of Design, Technology and Human Error. Santa Barbara CA: Aegean.


Dekker, S., and Hollnagel, E. (1999). Coping with Computers in the Cockpit. Brookfield, VT: Ashgate.


Degani, A. (2004). Taming HAL: Designing Interfaces Beyond 2001. New York: Palgrave Macmillan.


Edwards, W. (1968). Conservatism in human information processing. In Formal Representation of Human Judgment, ed. B. Kleinmuntz, 17-52. New York: Wiley.


Einhorn, H.J., and Hogarth, R.M. (1978). Confidence in judgment: Persistence of the illusion of validity. Psychological Review 85:395-416.


Evans, J.B.T. (1989). Bias in Human Reasoning: Causes and Consequences. Mahwah, NJ: Erlbaum.


Fischhoff, B. (1975). Hindsight ≠ foresight: The effect of outcome knowledge on judgment under uncertainty. Journal of Experimental Psychology: Human Perception and Performance 1:288-299.


Fischhoff, B., Slovic, P., and Lichtenstein, S. (1977). Knowing with certainty: the appropriateness of extreme confidence. Journal of Experimental Psychology: Human Perception and Performance 3:552-564.


Fitts, P.M. (1951). Human engineering for an effective air navigation and traffic control system. Ohio State University Foundation report. Columbus, OH.


Forester, J., Bley, D., Cooper, S., Lois, E., Siu, N., Kolaczkowski, A., and Wreathall, J. (2004). Expert elicitation approach for performing ATHEANA quantification. Reliability Engineering and System Safety 83:207-220.


Funk, K., Lyall, B., Wilson, J., Vint, R., Miemcyzyk, M., and Suroteguh, C. (1999). Flight deck automation issues. International Journal of Aviation Psychology 9:125–138.


Gao, J., and Lee, J.D. (2006). Extending the decision field theory to model operators’ reliance on automation in supervisory control situations. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 36(5):943-959.


Hogarth, R.M., and Einhorn, H.J. (1992). Order effects in belief updating: The belief-adjustment model. Cognitive Psychology 24:1-55.


Hollnagel, E., Woods, D., and Leveson, N. (2006). Resilience Engineering. Williston, VT: Ashgate.


Infield, S., and Corker, K. (1997). The culture of control: Free flight, automation and culture. In Human-Automation Interaction: Research and Practice, eds. M. Mouloua and J. Koonce, 279-285. Mahwah, NJ: Erlbaum.


Kletz, T.A. (1982). Human problems with computer control. Plant/Operations Progress, 1(4), October.


Kohn, L.T., Corrigan, J.M., and Donaldson, M.S. (2000). To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press.


Lee, J.D., and See, K.A. (2004). Trust in automation: Designing for appropriate reliance. Human Factors 46:50-80.


Leplat, J. (1987). Occupational accident research and systems approach. In New Technology and Human Error, eds. J. Rasmussen, K. Duncan, and J. Leplat, 181-191. London: Wiley.


Leveson, N.G. (2001). Evaluating accident models using recent aerospace accidents. Technical Report, MIT Dept. of Aeronautics and Astronautics. Available at http://sunnyday.mit.edu/accidents.


Leveson, N.G. (2004). A new accident model for engineering safer systems. Safety Science, 42(4):237-270.


Leveson, N.G., Allen, P., and Storey, M.A. (2002). The analysis of a friendly fire accident: using a systems model of accidents. Proceedings of the 20th International Conference on System Safety.


Metzger, U., and Parasuraman, R. (2001). Automation-related “complacency”: Theory, empirical data, and design implications. In Proceedings of the Human Factors and Ergonomics Society 45th Annual Meeting, 463–467. Santa Monica, CA: Human Factors and Ergonomics Society.


Michaels, D., and Pasztor, A. (2006). As programs grow complex, bugs are hard to detect; a jet’s roller coaster ride. Wall Street Journal, May 30, 2006.


National Transportation Safety Board (1973). Eastern Air Lines, Inc., L-1011, N310EA, Miami, Florida, December 29, 1972 (AAR-73-14). Washington, DC.


National Transportation Safety Board. (1998a). Brief of accident NYC98FA020. Washington, DC.


National Transportation Safety Board. (1998b). Safety recommendation letter A-98-3 through -5, January 21, 1998. Washington, DC.


Norman, D.A. (1990). The problem with automation: Inappropriate feedback and interaction, not “overautomation.” Philosophical Transactions of the Royal Society of London, B 327:585-593.


Nunes, A., and Laursen, T. (2004). Identifying the factors that contributed to the Ueberlingen midair collision. In Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting, New Orleans, September 2004.


Parasuraman, R., Sheridan, T.B., and Wickens, C.D. (2000). A model for types and levels of human interaction with automation. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 30(3):286-297.


Perrow, C. (1984). Normal Accidents: Living with High-Risk Technologies. New York: Basic Books.


Pew, R.W., and Mavor, A.S. (Eds.) (1998). Modeling Human and Organizational Behavior: Application to Military Simulations. Washington, DC: National Academy Press.


Reason, J. (1990). Human Error. Cambridge: Cambridge University Press.


Sarter, N.B., and Amalberti, R. (2000). Cognitive Engineering in the Aviation Domain. Mahwah, NJ: Erlbaum.


Sarter, N., and Woods, D. D. (1995). How in the world did we ever get into that mode? Mode error and awareness in supervisory control. Human Factors 37:5–19.


Sheridan, T.B. (1992). Telerobotics, Automation, and Human Supervisory Control. Cambridge, MA: MIT Press.


Sheridan, T.B. (2000). Function allocation: algorithm, alchemy or apostasy? International Journal of Human-Computer Studies 52:203-216.


Sheridan, T.B. (2002). Humans and Automation. New York, NY: Wiley.


Sheridan, T.B., and Verplank, W.L. (1979). Human and computer control of undersea teleoperators. Man-Machine Systems Laboratory Report. Cambridge, MA: MIT.


Sheridan, T.B., and Parasuraman, R. (2006). Human-automation interaction. In Reviews of Human Factors and Ergonomics, ed. R. Nickerson. Santa Monica: Human Factors and Ergonomics Society.


Sherry, L., and Polson, P. G. (1999). Shared models of flight management system vertical guidance. International Journal of Aviation Psychology 9:139–153.


Tversky, A., and Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology 5:207-232.


Tversky, A., and Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science 185:1124-1131.


Tversky, A., and Kahneman, D. (1980). Causal schemas in judgments under uncertainty. New York: Cambridge University Press.


Tversky, A., and Kahneman, D. (1981). The framing of decisions and the psychology of choice. Science 211:453-458.


Vicente, K. (2004). The Human Factor. New York: Routledge.


Weber, E. (1994). From subjective probabilities to decision weights: the effect of asymmetric loss functions on the evaluation of uncertain events. Psychological Bulletin 115:228-242.


Wickens, C.D., Mavor, A.S., Parasuraman, R., and McGee, J.P. (Eds.) (1998). The Future of Air Traffic Control: Human Operators and Automation. Washington, DC: National Academy Press.


Wiener, E.L., and Nagel, D.C. (1988). Human Factors in Aviation. New York: Academic Press.


Wiener, N. (1964). God and Golem, Inc. Cambridge, MA: MIT Press.

Winkler, R.L., and Murphy, A.H. (1973). Experiments in the laboratory and in the real world. Organizational Behavior and Human Performance 10:252-270.

8.0 ACKNOWLEDGMENTS


The writer especially acknowledges the contributions of Prof. Kevin Corker of San Jose State University, who served as a valuable consultant throughout this project, and of Dr. Richard John, former director of the Volpe Center, who was instrumental in initiating the project and eliciting the author’s participation as principal investigator.


Several colleagues are also to be acknowledged for their pioneering research on human-automation interaction and human error and their reports on various accident situations. I particularly drew on the work of Prof. James Reason of Manchester University in the UK, Dr. Asaf Degani of NASA Ames Research Center, Prof. Raja Parasuraman of George Mason University, Dr. Steven Casey of Ergonomic Systems Design, Prof. Nancy Leveson of the Massachusetts Institute of Technology, and Prof. Kim Vicente of the University of Toronto.

