
Copyright © 2012 Azad Madni

Human Failure Modes

Dr. Azad M. Madni, Professor, Epstein Department of Industrial and Systems Engineering

Director, SAE Program
Co-Director, CSSE

March 6, 2012


Outline

■ Human Failure Modes
■ Demanding System Requirements
■ Implications for Humans
■ Evolving Human Roles
■ Systems Engineering Mindset
■ The Remarkable Human Brain
■ Human Error Sources
■ Potential Remedies and Opportunities


Human Failure

■ Comprises
  human errors, which are unintentional behaviors
  violations, which are willful disregard of rules and regulations

■ Human errors fall into specific categories (written out as a small code sketch below)
  slips, lapses of memory
  mistakes in following rules and procedures
  mistakes in understanding
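As an illustration only, the taxonomy above can be captured as a small data structure. The sketch below is in Python; the class names, members, and the example incident are assumptions introduced here, not part of the original slides.

```python
from enum import Enum


class HumanFailure(Enum):
    """Top-level split from the slide: errors are unintentional, violations are willful."""
    ERROR = "unintentional behavior"
    VIOLATION = "willful disregard of rules and regulations"


class HumanErrorCategory(Enum):
    """Categories of unintentional human error listed on the slide."""
    SLIP = "slip (execution error arising from inattention)"
    MEMORY_LAPSE = "lapse of memory"
    RULE_MISTAKE = "mistake in following rules and procedures"
    UNDERSTANDING_MISTAKE = "mistake in understanding"


# Hypothetical example: tagging an observed incident with the taxonomy.
incident = {
    "description": "operator skipped a checklist step",
    "failure": HumanFailure.ERROR,
    "category": HumanErrorCategory.MEMORY_LAPSE,
}
print(incident["failure"].name, "-", incident["category"].value)
```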


Demanding System Requirements

■ Adaptability
■ Reconfigurability
■ Composability
■ Resilience

These requirements pose formidable challenges for humans who work with and within complex systems.


Implications for Humans

■ System adaptability implies changing contexts and potential changes to human-system interactions

■ System reconfigurability implies potential changes to human roles and human-system function allocation

■ System resilience implies potential dynamic changes to human roles and attendant changes to cognitive load

■ System composability (as in SoSs) implies potential changes in collaborators (lack of shared conceptual model)

These changes can increase the likelihood of human error.


Evolving Human Roles

■ From that of operator outside the system to that of agent within an adaptable system
  decision maker
  supervisor
  monitor with override authority
  re-assignable participant (peer, assistant)

These roles require new behaviors.


Systems Engineering Mindset

• Humans are suboptimal job performers who need to be shored up and compensated for during task performance

• This perception leads to systems that are inherently incompatible with human conceptualization of work

• The resulting mismatch inevitably creates human reliability issues that show up as human error

This mindset fails to capitalize on human ingenuity.


The Remarkable Human Brain

■ Yuor Barin Can Raed This

■ For emaxlpe, it deson’t mttaer in waht oredr the ltteers in a wrod aepapr, the olny iprmoatnt tihng is taht the frist and lsat ltteer are in the rghit pcale. The rset can be a toatl mses and you can sitll raed it wouthit pobelrm.

■ S1M1L4RLY, Y0UR M1ND 15 R34D1NG 7H15 4U70M471C4LLY W17H0U7 3V3N 7H1NK1NG 4B0U7 17.

■ How?

Source: LiveScience.com


Human Error Sources (Examples)

■ Erroneous/Incomplete Mental Model
  often traceable to poor design; results in mistakes
  lack of complete info causes user to make unwarranted assumptions about system state
  also results from misrecognition of cues/state info

■ Drop in Vigilance/Arousal during Monitoring
  occurs with infrequent stimulus, leading to missed cue detection (see the sketch after this list)

■ Loss of Focus during Task Performance
  results in slips (execution errors) arising from inattention

■ Cognitive Overload
  causes: multi-tasking, context switching, decision making under stress
  can lead to suboptimal behaviors and human errors (mistakes and slips)
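To make the vigilance point concrete, here is a minimal simulation sketch assuming a simple linear decline in detection probability over time on watch and a low cue rate; the model form and all parameter values are illustrative assumptions, not data from the slides.

```python
import random

# Minimal sketch (assumed model, not from the slides): detection probability
# declines with time on watch, a simple stand-in for the vigilance decrement,
# while cues arrive only rarely.

random.seed(42)

WATCH_MINUTES = 120          # length of the monitoring session
CUE_RATE_PER_MIN = 0.02      # infrequent stimulus: roughly one cue every 50 minutes
P_DETECT_START = 0.95        # detection probability at the start of the watch
DECAY_PER_MIN = 0.004        # assumed per-minute drop in detection probability

missed, presented = 0, 0
for minute in range(WATCH_MINUTES):
    if random.random() < CUE_RATE_PER_MIN:                      # a rare cue appears
        presented += 1
        p_detect = max(0.2, P_DETECT_START - DECAY_PER_MIN * minute)
        if random.random() > p_detect:                           # operator misses it
            missed += 1

print(f"cues presented: {presented}, cues missed: {missed}")
```

Under these assumptions, cues arriving late in the watch are the ones most likely to be missed, which is the qualitative pattern the bullet describes.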


Key Findings

■ Humans change cognitive strategies under overload
■ Inverted-U relationship between performance and stress (illustrated after this list)
■ Humans are unable to distribute attention under stress
■ Adaptability of a human-in-the-loop system is upper-bounded by the acceptable human error rate
■ System inspectability facilitates human intervention and avoids having to make erroneous assumptions
■ For robust performance
  minimize multi-tasking and context switching
  employ alerting/automation to monitor and flag rare events
  understand cognitive strategies under overload for effective aiding


Potential Remedies

■ Design human work to avoid multi-tasking and frequent context switching to the extent possible

■ Assign rare event monitoring to automation or alerting mechanisms

■ Provide decision aiding and performance support for decision making under stress

■ Design appropriate incentives to counter risk compensation tendency

■ Employ automation and dynamic function allocation to assure manageable cognitive load (a simple allocation policy is sketched below)

Most complex problems will require a combination of many of these remedies.
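As one way to read the last remedy, the sketch below shows a threshold-based dynamic function allocation policy in Python; the load threshold, task names, load estimates, and priorities are hypothetical values introduced for illustration, not the author's method.

```python
# Minimal sketch of threshold-based dynamic function allocation (illustrative
# assumptions only): when the estimated operator load would exceed a threshold,
# lower-priority tasks are shifted to automation.

LOAD_THRESHOLD = 0.8   # assumed upper bound on acceptable cognitive load

# Hypothetical task set: (name, load contribution, priority; higher is more critical)
tasks = [
    ("route monitoring", 0.30, 2),
    ("radio coordination", 0.25, 3),
    ("fuel bookkeeping", 0.20, 1),
    ("anomaly watch", 0.35, 2),
]


def allocate(tasks, threshold=LOAD_THRESHOLD):
    """Return (human_tasks, automated_tasks, estimated_load), keeping load under threshold."""
    human, automated = [], []
    load = 0.0
    # Consider the most critical tasks first; offload the rest when load gets too high.
    for name, cost, priority in sorted(tasks, key=lambda t: -t[2]):
        if load + cost <= threshold:
            human.append(name)
            load += cost
        else:
            automated.append(name)
    return human, automated, load


human, automated, load = allocate(tasks)
print(f"human keeps {human} (estimated load {load:.2f}); automation takes {automated}")
```

The sketch simply keeps the most critical tasks with the human and offloads the rest once the estimated load would cross the threshold.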


Potential Opportunities

■ Exploit human ingenuity and creativity in:

adapting to shifting contexts

generalizing from specifics

recognizing novelty and improvising

aggregating information in the absence of an algorithm

detecting and filling gaps (e.g., in narratives)

Most complex problems will require a combination of human creativity and ingenuity.


So… Is Human Error a Cause or a Consequence?

Thank You


My References

■ Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, 22nd Annual Systems & Software Technology Conference, Salt Lake City, Utah, April 26–29, 2010.

■ Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Systems Engineering (INCOSE journal), Vol. 13, No. 3, 2010.

■ Madni, A.M. “Integrating Humans with Software and Systems: Technical Challenges and a Research Agenda,” Keynote Presentation, INCOSE 2010 LA Mini-Conference, Loyola Marymount University, October 16, 2010.

■ Madni, A.M. “Integrating Humans With and Within Software and Systems: Challenges and Opportunities,” (Invited Paper) CrossTalk, The Journal of Defense Software Engineering, May/June 2011, “People Solutions.”