
Organizational models of accidents

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.
Revision as of 10:36, 1 September 2009


The Swiss Cheese model of accident causation is a model used in the risk analysis and risk management of human systems. It likens human systems to multiple slices of Swiss cheese, stacked together, side by side. It was originally propounded by British psychologist James T. Reason in 1990, and has since gained widespread acceptance and use in healthcare, in the aviation safety industry, and in emergency service organizations. It is sometimes called the cumulative act effect.

Figure caption: In the Swiss Cheese model, individual weaknesses are modelled as holes in slices of Swiss cheese, such as an Emmental. They represent the imperfections in individual safeguards or defences, which in the real world rarely approach the ideal of being completely proof against failure.

Reason hypothesizes that most accidents can be traced to one or more of four levels of failure: Organizational influences, unsafe supervision, preconditions for unsafe acts, and the unsafe acts themselves. In the Swiss Cheese model, an organization's defences against failure are modelled as a series of barriers, represented as slices of Swiss cheese. The holes in the cheese slices represent individual weaknesses in individual parts of the system, and are continually varying in size and position in all slices. The system as a whole produces failures when all of the holes in each of the slices momentarily align, permitting (in Reason's words) "a trajectory of accident opportunity", so that a hazard passes through all of the holes in all of the defenses, leading to a failure.
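The alignment mechanism above can be sketched numerically: if each barrier is treated as an independent layer that a hazard slips past with some probability, an accident occurs only when every layer fails at once, so defences in depth multiply their protection. A minimal Monte Carlo sketch of this idea (the `simulate_accident_rate` function and its parameters are illustrative, not from Reason's work, which treats hole sizes and positions as continually varying rather than as fixed independent probabilities):

```python
import random

def simulate_accident_rate(hole_prob, n_slices, trials, seed=0):
    """Estimate the chance that a hazard passes every barrier.

    Each barrier (cheese slice) independently fails to stop the
    hazard with probability `hole_prob`, i.e. its hole happens to
    lie on the hazard's trajectory.  An accident occurs only when
    the holes in all slices line up for the same trajectory.
    """
    rng = random.Random(seed)
    accidents = sum(
        all(rng.random() < hole_prob for _ in range(n_slices))
        for _ in range(trials)
    )
    return accidents / trials

# Four barriers, each 30% porous: accidents need all four holes
# to align, so the rate is near 0.3**4 = 0.81%.
rate = simulate_accident_rate(hole_prob=0.3, n_slices=4, trials=200_000)
```

Under these independence assumptions the accident rate falls geometrically with each added barrier, which is the intuition behind layering defences.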

The Swiss Cheese model includes, in the causal sequence of human failures that leads to an accident or an error, both active failures and latent failures. Active failures encompass the unsafe acts that can be directly linked to an accident, such as (in the case of aircraft accidents) pilot errors. The concept of latent failures is particularly useful in aircraft accident investigation, since it encourages the study of contributory factors in the system that may have lain dormant for a long time (days, weeks, or months) until they finally contributed to the accident. Latent failures span the first three levels of failure in Reason's model. Preconditions for unsafe acts include fatigued air crew or improper communication practices. Unsafe supervision encompasses, for example, pairing two inexperienced pilots and sending them on a night flight into known adverse weather. Organizational influences encompass such things as reduced expenditure on pilot training in times of financial austerity.

The same analyses and models apply in the field of healthcare, and many researchers have provided descriptive summaries, anecdotes, and analyses of Reason's work in the field. For example, a latent failure could be the similar packaging of two different prescription drugs that are then stored close to each other in a pharmacy. Such a failure would be a contributory factor in the administration of the wrong drug to a patient. Such research has led to the realization that medical error can be the result of "system flaws, not character flaws", and that individual greed, ignorance, malice, or laziness are not the only causes of error.

Lubnau, Lubnau, and Okray apply Reason's Swiss Cheese model to the engineering of human systems in the field of firefighting, with the aim of reducing human errors by "inserting additional layers of cheese into the system", namely the techniques of Crew Resource Management. Frosch describes Reason's model in mathematical terms as being a model in percolation theory, which he analyses as a Bethe lattice.
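Frosch's percolation reading can be illustrated with the standard branching-process calculation for bond percolation on a Bethe lattice: each non-root site has z − 1 onward edges, each open with probability p, and an infinite cluster exists only when p exceeds the critical value 1/(z − 1). The sketch below is the generic textbook fixed-point computation, not Frosch's own formulation:

```python
def bethe_survival(p, z, iters=10_000):
    """Probability that the open cluster containing the root of a
    Bethe lattice with coordination number z is infinite.

    q is the extinction probability of the subtree reached through
    one open edge; it satisfies the fixed point
        q = (1 - p + p*q)**(z - 1),
    since each of the z-1 onward edges is either closed (prob 1-p)
    or open and leading to an extinct subtree (prob p*q).
    """
    q = 0.0
    for _ in range(iters):
        q = (1 - p + p * q) ** (z - 1)
    # The root itself has z onward edges, not z - 1.
    return 1 - (1 - p + p * q) ** z

# With z = 3 the threshold is p_c = 1/(z-1) = 0.5: below it the
# survival probability is 0, above it an infinite cluster appears.
below = bethe_survival(0.4, 3)
above = bethe_survival(0.7, 3)
```

In the safety analogy, crossing the percolation threshold corresponds to the point at which local weaknesses stop being isolated and start connecting into system-wide paths to failure.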

References

  1. Smith, D. R., Frazier, D., Reithmaier, L. W. and Miller, J. C. (2001). Controlling Pilot Error. McGraw-Hill Professional. p. 10. ISBN 0071373187.
  2. Stranks, J. (2007). Human Factors and Behavioural Safety. Butterworth-Heinemann. pp. 130–131. ISBN 0750681551.
  3. Wiegmann, D. A. and Shappell, S. A. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Ashgate Publishing. pp. 48–49. ISBN 0754618730.
  4. Hinton-Walker, P., Carlton, G., Holden, L. and Stone, P. W. (2006). "The intersection of patient safety and nursing research". In J. J. Fitzpatrick and P. Hinton-Walker (eds.), Annual Review of Nursing Research Volume 24: Focus on Patient Safety. Springer Publishing. pp. 8–9. ISBN 0826141366.
  5. Lubnau, T., Okray, R. and Lubnau, T. (2004). Crew Resource Management for the Fire Service. PennWell Books. pp. 20–21. ISBN 1593700067.
  6. Frosch, R. A. (2006). "Notes toward a theory of the management of vulnerability". In P. E. Auerswald, L. M. Branscomb, T. M. La Porte and E. Michel-Kerjan (eds.), Seeds of Disaster, Roots of Response: How Private Action Can Reduce Public Vulnerability. Cambridge University Press. p. 88. ISBN 0521857961.

Further reading

Related peer reviewed papers:

  • Palmieri, P. A., DeLucia, P. R., Ott, T. E., Peterson, L. T. and Green, A. (2008). "The anatomy and physiology of error in adverse healthcare events". In E. Ford and G. Savage (eds.), Advances in Health Care Management. Vol. 7. Emerald Publishing Group. pp. 33–68. doi:10.1016/S1474-8231(08)07003-1. — This paper offers a comprehensive transdisciplinary analysis of the etiology of error that contributes to healthcare adverse events. The authors offer an adaptation of the Swiss Cheese model, the Healthcare Error Proliferation Model, which incorporates complex adaptive systems and dispositional attribution, among other features.
  • Perneger, T. V. (2005). "The Swiss cheese model of safety incidents: Are there holes in the metaphor?". BMC Health Services Research. 5 (71): 71. doi:10.1186/1472-6963-5-71. — This paper provides a balanced commentary on critical issues in applying the Swiss Cheese model to healthcare.

Related books:

  • Bayley, C. (2004). "What medical errors can tell us about management mistakes". In P. B. Hofmann and F. Perry (ed.). Management Mistakes in Healthcare: Identification, Correction, and Prevention. Cambridge University Press. pp. 74–75. ISBN 0521829003.
  • Westrum, R. and Adamski, A. J. (1998). Organizational factors associated with safety and mission success in aviation environments. Lawrence Erlbaum Associates. p. 84. ISBN 0805816801. — This chapter relates the Swiss Cheese model to the Human Envelope model, where "around every complex operation there is a human envelope that develops, operates, maintains, interfaces, and evaluates the function of the sociotechnological system".
