Journal of Anesthesia & Critical Care: Open Access (JACCOA)
ISSN: 2373-6437
Opinion
Volume 1 Issue 6 - 2014
Patient Safety, Surgical Checklists, Stone Soup
Stavros Prineas*
Department of Anaesthesia, Prince of Wales/Sydney Children's Hospitals, Australia
Received: October 28, 2014 | Published: November 27, 2014
*Corresponding author: Stavros Prineas, Department of Anaesthesia, Prince of Wales/Sydney Children's Hospitals, Randwick, Sydney, Australia, Tel: +447454013652
Citation: Prineas S (2014) Patient Safety, Surgical Checklists, Stone Soup. J Anesth Crit Care Open Access 1(5): 00034. DOI: 10.15406/jaccoa.2014.01.00034

Abstract

Since the launch of the World Health Organization Surgical Checklist, early data have suggested a dramatic reduction in certain types of peri-operative error. Advocates of the checklist claim it has the potential to transform healthcare. This article seeks to examine the deeper relationship between practitioners and checklists, and the key to their success or failure in keeping patients safe.
Keywords: Patient safety; Surgical checklist; WHO checklist; Ergonomics; Human factors

Opinion

In many European cultures there is a traditional folktale [1] about a stranger who comes to a remote village with an empty cauldron, seeking food. When the locals turn him away, he fills the pot with water from a nearby stream, and sets it over an open fire. He places an ordinary stone in the pot and sits back. A couple of villagers pass by, and when they enquire, he says he is making “stone soup”. “It’s delicious on its own, although a little salt would really make it special…if only I had some.”
Eventually one of the locals offers to part with a little salt and the stranger adds it to the pot. "Thank you", he says eagerly, "now if only it had a few herbs…" The other villager says, "I have oregano – but if I give you some, I'll want some soup too…" Word gets around, and soon a small crowd gathers around the cauldron. Generally there is disbelief – but by now the "stone soup" does smell rather nice. "Can I have some?" asks an onlooker. "Hold on," says the stranger, "I can't just give you some when others have put in for their share." "Well – I've got a turnip…" "Hmm, let me check… yes, a turnip should work… but two would make it fit for a king." Soon people are offering vegetables and even meat to get some soup. As everyone has their fill, spirits rise and wine appears. They laugh, sing and feast. And the stone soup is delicious.
So how does this imaginary tale relate to patient safety? Since the World Health Organisation launched its surgical checklist in 2009 [2], early data have suggested a dramatic reduction in certain types of peri-operative error [3]. One of the authors of this study, Atul Gawande, already a popular best-selling author in his own right, published a book about the development of this checklist [4], advocating more extensive use of formal checklists in healthcare: "if something so simple can transform intensive care, what else can it do?" [5]. And why not? Formalised checklists have enjoyed dramatic successes elsewhere in healthcare [6-8], so it is not a groundless aspiration. However, closer examination of Gawande's book suggests to me a deeper reason why checklists may improve safety – a reason well understood by its architects, yet perhaps under-publicised or misunderstood elsewhere.
In examples throughout the book, a conspicuous amount of clinical governance groundwork is needed to make the checklists work – clinical leaders to champion the cause, negotiated improvements in communication and collaboration, new assertiveness protocols allowing nurses to make doctors comply, resources allocated to audit outcomes, and so on. In this sense a checklist may act like the ‘stone’ in ‘stone soup’, i.e. a catalyst for others to contribute and collaborate in ways that otherwise were lacking. Perhaps it is in fact these changes, as opposed to whether a particular box is ticked, that are making things safer.
Now this in itself may be no bad thing – except if it gives managers licence to believe that any safety problem can be solved by generating another set of mandatory tick-boxes without the substantial preparation, evidence gathering, diligence and follow-through that exemplify the successes cited in Gawande's "manifesto". This raises the question: if a surgical checklist is introduced without a designated clinician resourced to champion it, without enforcing the communication and team-working behaviours that would have to accompany it (e.g. making sure everyone participates at the same time), and without a prospective evaluation framework to monitor its impact, would it actually make a difference? This is not a frivolous question: for example, while electronic prescribing systems are no doubt a promising technology, the evidence that they reduce prescriber errors, drug administration errors and adverse drug events has not been compelling [9-11], and they may even contribute to errors in some situations [12]. This suggests that 'it isn't what you do, but the way that you do it' that appears to improve safety. Might the same be true of surgical checklists?
Safety culture – the way we perceive and prioritise safety in our day-to-day practice – is far from ideal in many places. Like any professional group, doctors tend to cherish their autonomy (in many arenas rightly so) and can regard externally imposed protocols as a form of bureaucratic intrusion to be rejected on principle. I have observed surgeons and anaesthetists actively refuse to participate in checklists, yet in truth these same clinicians would probably use their own mental checklists and mnemonics for many aspects of their clinical practice – from the Advanced Trauma Life Support primary survey and the minimum requirements for rapid sequence induction, to remembering cranial nerves and checking an anaesthetic machine. By contrast, I have also observed that other clinicians are happy to proceed if the paperwork is complete, even when actual criteria have not been met or verified. Going through the motions of documenting, rather than the actual checking, becomes the endpoint. So, for example, the anaesthetist or the surgeon may not be paying attention to the checklist, but the theatre nurse does not stop to ensure everyone is listening, and carries on. The purpose of the checklist as a collective briefing – to ensure all members of the team have a shared mental model [13,14] of what is being done to whom – is undermined; yet as long as all the boxes have been ticked, the procedure goes ahead. Practices such as these subvert many of the safety gains possible from a checklist.
At the risk of stating the obvious, part of the point of a 'checklist' is to check that which should already have been done. Why wait until the 'roll-call' to introduce yourself robotically to the team, only to forget everyone's name a minute later? Introductions are better remembered when they are made person-to-person, on arrival. Discussion of a particular case is better done as an interactive narrative, rather than as a disengaged series of bullet points. The checklist is there to ensure the team has not forgotten critical elements of the story – it is not the story. Nor is it really a substitute for a team briefing unless all minds in the team are engaged. This is like putting the stone in the pot and assuming all the other ingredients will automatically appear – when in fact, without conscious and willing participation, it could just be a recipe for hot water.
Moreover, post-implementation audits [15] may be open to misinterpretation: operating theatre teams who perform the checklist well may produce better outcomes for their patients, but it may be that the sort of teams partial to running checklists are the ones whose clinical outcomes are good anyway, as they may generally interact better in other aspects of their care as well. In other words, where the 'soup' is already good, the 'stone' (the checklist) may be neither an active ingredient nor a catalyst – just a heavy garnish. At least one recent study suggests that the use of checklists improves the perceived quality of teamwork and communication in some settings while having a negative impact in others [16].
More studies are needed. Meanwhile one could argue that in a superlative facility, where good clinical leaders championed safety initiatives, where uniformly competent clinicians worked comfortably in cohesive inter-professional teams, communicated respectfully and efficiently with each other in a timely manner all the time, and where mutual monitoring of individual and team performance was not only tolerated but encouraged, checklists would not be needed at all.
So, am I in favour of WHO checklists? Absolutely. Not for scientific reasons (I don't think we have the right data yet) but on philosophical grounds – on faith and principle, if you will.
Firstly, as an industry we remain obsessed with the economic and technical aspects of delivering health care, even though there is now a substantial body of evidence that communication, teamwork and leadership failures are major contributors to medical adverse events [17,18]. The competence of many individual healthcare workers in these 'paratechnical' or 'non-technical' skills is highly variable, and the superlative workplace described above is utopian. So, in the real-world context, enforcing a surgical checklist may be a crude but effective way to raise the bar – the beginning of a process of setting minimum standards for communication, teamwork and leadership behaviours in our hazard-rich environments. Hopefully, by setting such standards, appropriate training programmes and ultimately cultural change would follow, taking us a little closer to where we need to be. In the field of medical crisis management, it appears this may already be happening through high-fidelity simulation.
Second, it should be clear that 'systems' (departments, hospitals, health authorities, etc.) are not just abstract infrastructures. They are made of people – dozens, hundreds or even thousands of individual humans, interacting countless times on a day-to-day basis. To put it another way, 'systems are people, too' [19]. There are physical and cognitive limits to the performance that can be expected of each and every human component in that system. It is not merely that 'to err is human': just like everyone else, healthcare workers regularly make mistakes in their routine work [20]. Most of these errors are either of themselves inconsequential, or are rapidly detected and corrected [21,22]. This speaks as much to the resilience of patients and healthcare workers alike as it does to their vulnerabilities. Yet while practice, scholarship and motivation are all necessary to improve human performance, in real life they can never make us perfect [23]. We build policies, procedures, equipment, hospitals and public expectations around assumptions of idealised 'system' performance, yet when the inherent fallibility of each human component is multiplied by the number of interactions necessary for modern clinical pathways, we should not be surprised that the likelihood of sinister error chains forming increases dramatically.
These chains of fallibility are further compounded by the revolutions in scale that characterise modern life and modern health care. Taken globally, humans are expected to perform in concert increasingly complex and critical tasks, faster and in higher volumes than could possibly have been imagined a hundred years ago. In the last 20 years new technology has taken the industrialisation of healthcare to a whole new order of magnitude. This has increased exponentially the number of interactions involved in the care of a single patient, and with it the likelihood that a chain of very human errors will lead to adverse outcomes. Systemic errors and catastrophes will not only occur in complex organisations, they should be expected; they can be regarded, as Perrow does, as 'normal accidents' [24]. The frequency with which these adverse events occur will depend on the intrinsic qualities of a system and the people in it.
In this juggernaut that the business of medicine has become, only a relatively small proportion of human errors lead to harm, but in absolute terms this represents a major cause of morbidity and mortality. Tackling this is not just an administrator’s problem, it’s everyone’s problem: clinicians, patients, lawmakers too, and society as a whole. Accepting the limits of human performance, and embracing both the training and ergonomic support required to redress these limits, are essential to reducing the likelihood of medical adverse events over time. We all need to make ourselves aware, and pitch in.
An increasing amount of thought and effort must be devoted to designing systems that minimise the opportunities for human error and maximize the efficacy of human effort. Conceptually, there are two broad approaches: adaptation of human workers to fit the needs and limitations of the work environment (‘training’), and adaptation of the work environment to fit the needs and limitations of human workers (‘ergonomics’) [25]. Traditionally, clinicians have focused on training to make things safer, although anesthesia also has an established track record in promoting and campaigning for better design in equipment, drugs and critical care facilities [26-31].
A well-designed and appropriately supported checklist, used in a carefully selected clinical setting, is just one of many ergonomic tools proven in other safety-critical fields of human endeavour, patiently waiting for us to realise their place in improving patient safety. However, like most ergonomic solutions, checklists must be supported by training, diligence and goodwill if they are to work properly. So before we start rolling out checklists for everything, perhaps we would do well to remember one thing: it's not just the stone that makes the soup, but what we are each willing to bring to the pot.

References

  1. Author Unknown (1978) Stone Soup or St Bernard’s Soup. In: Brewer EC (Ed.), Brewer’s Dictionary of Phrase and Fable (Classic Edition). Avenel Books, New York, USA, pp. 1181.
  2. The Joint Commission on Accreditation of Healthcare Organisations. Speak Up - The Universal Protocol.
  3. Haynes AB, Weiser TG, Berry WR, Lipsitz SR, Breizat AH, et al. (2009) A surgical safety checklist to reduce morbidity and mortality in a global population. N Engl J Med 360(5): 491-499.
  4. Gawande A (2010) The Checklist Manifesto – How to get things right. Profile Books, London, UK.
  5. Gawande A (2007) The Checklist. The New Yorker, USA.
  6. Pronovost PJ, Needham DM, Berenholtz S, Sinopoli D, Chu H, et al. (2006) An intervention to reduce catheter-related bloodstream infections in the ICU. N Engl J Med 355(26): 2725-2732.
  7. Berenholtz SM, Milanovich S, Faircloth A, Prow DT, Earsing K, et al. (2004) Improving care for the ventilated patient. Jt Comm J Qual Saf 30(4): 195-204.
  8. Lingard L, Espin S, Rubin B, Whyte S, Colmenares M, et al. (2005) Getting teams to talk: development and pilot implementation of a checklist to promote interprofessional communication in the OR. Qual Saf Health Care 14(5): 340-346.
  9. Ammenwerth E, Schnell-Inderst P, Machan C, Siebert U (2008) The effect of electronic prescribing on medication errors and adverse drug events: a systematic review. J Am Med Inform Assoc 15(5): 585-600.
  10. Reckmann MH, Westbrook JI, Koh Y, Lo C, Day RO (2009) Does computerized provider order entry reduce prescribing errors for hospital inpatients? A systematic review. J Am Med Inform Assoc 16(5): 613-623.
  11. Van Doormaal JE, Van den Bemt PM, Zaal RJ, Egberts ACG, Lenderink BW, et al. (2009) The influence that electronic prescribing has on medication errors and preventable adverse drug events: an interrupted time-series study. J Am Med Inform Assoc 16(6): 816-825.
  12. Van den Bemt PM, Idzinga JC, Robertz H, Kormelink DG, Pels N (2009) Medication administration errors in nursing homes using an automated medication dispensing system. J Am Med Inform Assoc 16(4): 486-492.
  13. Cannon-Bowers JA, Salas E, Converse SA (1993) Shared mental models in expert team decision making. In: Casstelan NJ (Ed.), Individual and Group Decision Making: Current Issues. Lawrence Erlbaum (Now Taylor and Francis) Pubs, UK, pp. 221-245.
  14. Mathieu JE, Heffner TS, Goodwin GF, Salas E, Cannon-Bowers JA (2000) The influence of shared mental models on team process and performance. J Appl Psychol 85(2): 273-283.
  15. Darzi A, Sevdalis N, Moorthy K, Mansell J, Russ S, et al. (2012) Implementing and Evaluating the WHO Surgical Checklist – Project summary and commentary.
  16. Russ S, Rout S, Sevdalis N, Moorthy K, Darzi A, et al. (2013) Do safety checklists improve teamwork and communication in the operating room? A systematic review. Ann Surg 258(6): 856-871.
  17. Manser T (2009) Teamwork and patient safety in dynamic domains of healthcare: a review of the literature. Acta Anaesthesiol Scand 53(2): 143-151.
  18. Reader T, Flin R, Lauche K, Cuthbertson BH (2006) Non-technical skills in the intensive care unit. Br J Anaesth 96(5): 551-559.
  19. Dawe RL (1996) Systems are people too. Transp Distrib 37(1): 86-87.
  20. Barker KN, Flynn EA, Pepper GA, Bates DW, Mikeal RL (2002) Medication errors observed in 36 healthcare facilities. Arch Intern Med 162(16): 1897-1903.
  21. Cooper JB, Newbower RS, Kitz RJ (1984) An analysis of major errors and equipment failures in anaesthesia management: considerations for prevention and detection. Anesthesiology 60(1): 34-42.
  22. Khan FA, Hoda MQ (2001) A prospective survey of intra-operative critical incidents in a teaching hospital in a developing country. Anaesthesia 56(2): 171-182.
  23. Leape LL (1994) Error in medicine. JAMA 272(23): 1851-1857.
  24. Perrow C (1999) Normal Accidents. Princeton University Press, Princeton, New Jersey, USA, pp. 464.
  25. Wickens CD, Hollands JG (2000) Engineering Psychology and Human Performance (3rd edn) Prentice Hall Pubs, Upper Saddle River, New Jersey, USA, pp. 69-118.
  26. Australia and New Zealand College of Anaesthetists. (2012) Professional document PS55: Recommendations on minimum facilities for safe administration of anaesthesia in operating suites and other anaesthetising locations.
  27. Diba A (2005) Ergonomics. In: Davey A & Diba A (Eds.), Ward’s Anaesthetic Equipment. Elsevier, Philadelphia, USA, pp. 600.
  28. Tremper KK (2007) Equipment and Monitoring, In: Atlee JL et al. (Eds.), Complications in Anaesthesia, (2nd edn), Elsevier, Philadelphia, USA, pp. 518-521.
  29. Royal College of Anaesthetists (2001) Safety notice on prevention of hypoxic gas mixtures. RCoA Bulletin 8: 354.
  30. Royal College of Anaesthetists (2014) Syringe labeling in critical care areas.
  31. Webster CS, Merry AF (2007) Colour-coding, drug-administration error and the systems approach to safety. Eur J Anaesthesiol 24(4): 385-386.
© 2014-2016 MedCrave Group, All rights reserved. No part of this content may be reproduced or transmitted in any form or by any means as per the standard guidelines of fair use.
Creative Commons License Open Access by MedCrave Group is licensed under a Creative Commons Attribution 4.0 International License.
Based on a work at http://medcraveonline.com