Journal of Surgical Simulation 2017; 4: B: 1-1

Published: 11 May 2017

DOI: https://doi.org/10.1102/2051-7726.2017.B001

Oral presentation

Special Issue: Human factors in surgical error

Ken Catchpole
Corresponding author: Ken Catchpole, Medical University of South Carolina, Charleston, USA. Email: catchpol@musc.edu

Abstract

Worldwide, the incidence of medical accidents in hospitalized patients is approximately 10%, and such accidents are amongst the leading causes of death in both the USA and the UK. A recent article1 suggested that medical error is the third most frequent cause of death in the USA. Rather than being related purely to skill, expertise or negligence, these events are now seen as the consequence of inadequate components of the healthcare system. Since the Bristol enquiry2 in the late 1990s it has been increasingly recognized that injuries to surgical patients are not simply due to a small number of ‘bad apples’ but reflect systems of work that have failed the humans working within them.

Systems of work are a combination of humans, technologies, processes, policies, management and training. Human errors are predisposed by system designs that do not account for natural human limitations, and instead create mismatches between what humans are required to do in increasingly complex technological systems and their ability to do it. When things go wrong, to look only at human failures is to ignore the complexity of those accidents. Indeed, it is people who create safety in complex systems, by accounting for variations that system designers cannot appreciate. Studies that have observed and analysed in detail sequences of events involving system threats – from the organization, environment, task, technology and patient – describe how a cascade of problems leads to a far riskier and potentially adverse situation.

The erosion of human cognitive capacity creates opportunities for failure that further reduce human capacity, leading to a spiral of increased risk. Fatigue compromises perceptual abilities; noise can mask important communications and can induce or exacerbate fatigue. Interruptions and distractions take attention away from primary tasks, increasing the chances of forgetting or omitting steps and causing delays. Temperature and humidity increase physiological stress, can lead to dehydration and fatigue, and can also create interruptions, for example while the surgeon wipes their brow or clears a fogged lens or goggles.

Equipment designs can predispose to errors, or can guide users towards the right methods and modes of operation. The wrong buttons in the wrong place, displays that are unclear, labels that are ambiguous or devices that allow unsafe configurations can all contribute to an error, yet frequently go unnoticed. Badly designed technology can add complexity, uncertainty and new ways to fail without adding to the care of the patient. Many of these issues have been uncovered in infusion pumps3, electronic health records4, laparoscopic surgery5, surgical robots6 and a range of other clinical and non-clinical contexts. In essence, we have learned that discussions which focus on replacing the human with technology usually underestimate the extent and value of human contributions to performance and safety, and will likely create a range of new problems. Examples from cardiac7, laparoscopic8, vascular9, orthopedic10, trauma11, robotic12, neuro and maxillofacial surgery13 are used to show how an emphasis on training and checklist solutions is slowly giving way to a more complex and richer understanding of how socio-technical system configurations contribute to success or failure in surgery. While this complexity may take time to elucidate and understand, it offers many new ways to think about how improvements in the efficiency, safety and quality of surgical care might be delivered. If we approach systems design from the point of view of helping the human to achieve their goals, by supporting adaptive human sense-making and decision-making within a complex system, we stand a good chance of avoiding catastrophes and creating success in the future.

References

1. Makary MA, Daniel M. Medical error-the third leading cause of death in the US. BMJ. 2016; 353: i2139. https://doi.org/10.1136/bmj.i2139

2. Kennedy I. Learning from Bristol: the report of the public inquiry into children's heart surgery at the Bristol Royal Infirmary 1984-1995. 2001 Command Paper: CM 5207. http://webarchive.nationalarchives.gov.uk/20090811143745/http:/www.bristol-inquiry.org.uk/final_report/the_report.pdf (accessed 28 April 2017)

3. Perry SJ. An overlooked alliance: using human factors engineering to reduce patient harm. Jt Comm J Qual Saf 2004; 30: 455-459. https://doi.org/10.1016/s1549-3741(04)30052-3

4. Ratwani R, Fairbanks T, Savage E, Adams K, Wittie M, Boone E et al. Mind the gap. A systematic review to identify usability and safety challenges and practices during electronic health record implementation. Appl Clin Inform 2016; 7: 1069-1087. https://doi.org/10.4338/ACI-2016-06-R-0105

5. Way LW, Stewart L, Gantert W, Liu K, Lee C, Whang K et al. Causes and prevention of laparoscopic bile duct injuries: analysis of 252 cases from a human factors and cognitive psychology perspective. Ann Surg 2003; 237: 460-469. https://doi.org/10.1097/00000658-200304000-00004

6. Randell R, Greenhalgh J, Hindmarsh J, Dowding D, Jayne D, Pearman A et al. Integration of robotic surgery into routine practice and impacts on communication, collaboration, and decision making: a realist process evaluation protocol. Implement Sci 2014; 9: 52. https://doi.org/10.1186/1748-5908-9-52

7. Catchpole KR, Giddings AE, de Leval MR, Peek GJ, Godden PJ, Utley M et al. Identification of systems failures in successful paediatric cardiac surgery. Ergonomics 2006; 49: 567-588. https://doi.org/10.1080/00140130600568865

8. Mishra A, Catchpole K, Dale T, McCulloch P. The influence of non-technical performance on technical outcome in laparoscopic cholecystectomy. Surg Endosc 2008; 22: 68-73. https://doi.org/10.1007/s00464-007-9346-1

9. Catchpole K, Mishra A, Handa A, McCulloch P. Teamwork and error in the operating room: analysis of skills and roles. Ann Surg 2008; 247: 699-706. https://doi.org/10.1097/SLA.0b013e3181642ec8

10. Catchpole K, Giddings A, Wilkinson M, Hirst G, Dale T, De Leval M. Improving patient safety by identifying latent failures in successful operations. Surgery 2007; 142: 102-110. https://doi.org/10.1016/j.surg.2007.01.033

11. Catchpole K, Ley E, Wiegmann D, Blaha J, Shouhed D, Gangi A et al. A human factors subsystems approach to trauma care. JAMA Surg 2014; 149: 962-968. https://doi.org/10.1001/jamasurg.2014.1208

12. Catchpole K, Perkins C, Bresee C, Solnik MJ, Sherman B, Fritchet J et al. Safety, efficiency and learning curves in robotic surgery: a human factors analysis. Surg Endosc 2015; 30: 3749-3761. https://doi.org/10.1007/s00464-015-4671-2

13. Catchpole KR, Dale TJ, Hirst DG, Smith JP, Giddings TA. A multicenter trial of aviation-style training for surgical teams. J Patient Saf 2010; 6: 180-186. https://doi.org/10.1097/PTS.0b013e3181f100ea

Keywords

human factors; surgical error; equipment design; patient safety

Additional Information

This presentation was given at the one day symposium, Current Approaches to Understanding Surgical Error, University of Leeds, Leeds, UK, on 9 December 2016.

Conflicts of interest: none declared.