November 2021 Volume LVI Number 6

Feature Story: Safety is Vital

Systems and Safety in Dentistry
 
by Dr. Charles S. Czerepak
AAPD Foundation President
 
What comes to your mind when someone brings up the subject of safety? Do you reflexively think about wearing an N-95 respirator? Counting gauzes in the OR? Maybe it's not even dental related: seat belts in cars, or passwords on your computer. Safety has a wide and nuanced meaning, and that is why it is so difficult a subject to quantify in the health care setting. You can examine patient safety, staff safety, and even the safety of the building. It all depends on your point of view and what you are examining.
 
This article reviews several building blocks of the current science of health care safety. From there we will apply these principles to two case histories. Let's begin with training. I was trained at Northwestern like the physicians at our hospital – to be the "captain of the ship." The idea was that if a dentist was trained to a high degree of knowledge and experience, accidents would be avoided, or at least managed well. Back then it was thought that good doctors don't make mistakes. That point of view changed, however, with the publication of a modest commentary in the Dec. 21, 1994, issue of the Journal of the American Medical Association. The author, Dr. Lucian Leape, a pediatric surgeon and researcher at Harvard Medical School, entitled it "Error in Medicine."1 Leape focused particular attention on a paper written by Dr. Elihu Schimmel,2 which examined the patient records of a large teaching hospital.
 
Schimmel found that 20 percent of patients suffered iatrogenic injuries (injuries caused by their medical care) and that 20 percent of those injuries were serious or fatal. Leape posed the question, "How could the error rate be so high?" The physicians involved were trained at major institutions to the highest standards, yet patients were dying due to physician errors.
 
To quote Leape:
"Physicians are socialized in medical school and residency to strive for error free practice.19 There is a powerful emphasis on perfection, both in diagnosis and treatment. In everyday hospital practice, the message is equally clear: mistakes are unacceptable. Physicians are expected to function without error, an expectation that physicians translate into the need to be infallible. One result is that physicians, not unlike test pilots, come to view an error as a failure in character—you weren’t careful enough, you didn’t try hard enough. This kind of thinking lies behind a common reaction by physicians. "How can there be an error without negligence?1
 
Though physicians were trained for perfection – Leape called it "perfectibility" training – errors were still occurring. With medical errors came blame and guilt, and an atmosphere in which errors were rarely discussed, or shared only privately.
 
Leape looked for an explanation outside of the medical safety literature. He turned to the work of human factors specialists and psychologists who studied human cognition – a field that examines how human behavior can cause an accident. One of its most accomplished researchers was James Reason, a professor of psychology at England's University of Manchester, who studied the accidents at Three Mile Island, Bhopal, and Chernobyl, as well as the Challenger shuttle failure, to understand their fundamental causes.3 Reason's work led to the Swiss cheese model of the causes of an accident (Figure 1). Each layer represents a defense in the system to prevent an accident, but each layer of defense has a flaw (the hole), and on rare occasions all the flaws of the system align, allowing an accident to occur. He referred to that phenomenon as the trajectory of the accident, and his key point was that an accident is caused not only by the human error but also by the flaws in the system. Those system flaws were called "latent errors" – accidents waiting to happen.
 
On examining these industrial accidents, Reason found that although there were operator errors, they were only part of the explanation for why these complex systems failed. He pointed out that the disasters were rooted in major design flaws that had been in place long before the operators introduced their errors.
 
Reason's work helped Leape realize that in the arena of health care errors, it was not enough to look at the physician's error; the system flaws surrounding the physician also had to be examined. To reduce errors, each part of the system had to be inspected for these flaws, or latent errors. In hospitals, he suggested a review of all the delivery systems in an attempt to decrease medical errors. As you can imagine, Leape's commentary was not initially met with universal support from the medical profession,4 but it did lead the Institute of Medicine to report on the state of human safety in health care. The report, To Err Is Human: Building a Safer Health System,5 published in 1999, stated that errors occurred not because of bad people working in health care, but because of good people working in bad systems. Systems needed to improve; the intent was not to remove the physician from responsibility but to find a way to deliver health care more safely. The report outlined several recommendations to create a safer health care system:
  1. Establish a national center for patient safety in the U.S. Department of Health and Human Services, which became the Agency for Healthcare Research and Quality (AHRQ).
  2. Create mandatory and voluntary reporting systems in health care.
  3. Announce new standards on safety from the Joint Commission, along with "Safe Practices for Better Health Care," a consensus report by the National Quality Forum on evidence-based safe practices in health care.
  4. Require health care organizations to create an environment in which safety becomes the top priority.
The second recommendation, mandatory and voluntary reporting systems, caused a lot of consternation in the medical community. Yet reporting makes data collection possible: physicians could learn from their own accidents and those of their colleagues, and an open learning environment could lead to improvements in systems and a decrease in preventable errors. At the time of the report, some physicians feared the public would conclude that health care was unsafe and that lawsuits would increase in frequency. Twenty-one years later, it has been shown that an organization or hospital that is more transparent has fewer lawsuits and lower dollar settlements.6,7

When we look at dental reporting, the key phrases are "non-discoverable" and "degree of acuity." Non-discoverable means that no one would know who did the reporting; this allows the doctor to disclose an error without admitting guilt or worrying that the disclosure could trigger a lawsuit. In regard to acuity, there would need to be a lower limit on what gets reported: a hospitalization resulting from complications of a dental procedure certainly must be reported, but dropping an orthodontic band on the floor need not be. Currently in dentistry we do not have a single accepted means of non-discoverable reporting, but the safety committees of both the AAPD and the ADA are working on it.
 
The following is a brief review of how the concepts described above can be applied. The medical community has changed its focus from a provider-centric to a system-centric analysis of health care errors. Instead of blaming a practitioner, the key to error reduction is to examine the system.
 
Two distinct adverse events, one in industry and one in the dental office, offer an opportunity to use the lens of system thinking to examine the accidents.
 
The first has become an industry standard for error analysis: the space shuttle Challenger disaster,8 in which all seven astronauts aboard perished on Jan. 28, 1986. At that time, Challenger held the record for the most successful space flights of any of the space shuttles. On this flight the vehicle failed catastrophically 73 seconds after liftoff. The primary failure mode was the erosion and ultimate failure of an O-ring on a joint of a solid rocket booster. Examination of the accident found that the O-rings on the solid rocket boosters had shown some degree of failure on most of the previous Challenger flights. The engineers who designed the booster felt there was enough of a safety margin to warrant continued flights without a redesign of the O-ring joint. For James Reason, this would be the latent error waiting for an accident to happen.

But there were other flaws, resting mainly with the NASA hierarchy and how decisions on whether to launch the rocket were made. Complicating the circumstance (Reason's hole in the Swiss cheese) was that it was very cold that day, around 32 degrees Fahrenheit; pictures of the launch platform showed icicles on the equipment. The O-rings were never meant to function at that low a temperature. When NASA officials spoke with engineers from the Marshall Space Flight Center and Morton Thiokol Inc., which developed the solid rocket boosters, a miscommunication gap was identified. So, even though some engineers recognized that the cold could negatively affect the functioning of the O-rings, the launch occurred anyway. Part of the failure of the system was the launch director's decision to go ahead with the mission; there was a real-time error in deciding to launch, but latent factors doomed the shuttle flight from the beginning. This very complicated system had numerous defense mechanisms, each with a vulnerability, and when they all lined up the accident occurred.

Because of its high media profile, the Challenger accident has been studied by many organizations, but one investigator, Diane Vaughan, deserves special mention. In her book The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA,9 she added to the safety lexicon the concept of "the normalization of deviance." The easiest way to describe the normalization of deviance is by observing the person who chooses to text while driving a car. The very first time the phone is checked, there is a heightened awareness that this is a dangerous thing to do. But because nothing bad happened, the next time the phone is looked at there is less of a sense of danger. The driver keeps driving distracted without feeling unsafe, until the unfortunate day when an accident occurs. In the case of the Challenger, the partial O-ring failures on the previous flights should have served as a warning that the system was not as safe as it could be, but the engineers began to look at O-ring erosion as normal rather than as a possible failure mode – an unconscious lowering of the margin of safety.
 
The normalization of deviance is an important lesson for anyone practicing clinical dentistry. As months and years of practice accumulate, there can be a natural erosion of situational awareness, lowering the threshold for an accident to occur in your office. The moral of this story is that if the O-rings (the system) had been properly addressed, the catastrophe that took the lives of the seven astronauts would not have occurred.
 
For our second case history, let's examine a dental adverse event. I would like to thank Dr. Ronald Zentz of CNA insurance and Dr. Jennifer Flynn of Dentist's Advantage for this actual case history.
 
The plaintiff/patient was a minor who underwent extraction of a baby tooth to allow the development of her permanent tooth. The patient alleged that the defendant dentist extracted the patient's adult tooth, which was next to the baby tooth he intended to extract. The patient alleged that the defendant dentist was negligent in failing to confirm that he was extracting the correct tooth, and negligent in failing to stop the extraction when the tooth was difficult to remove – a sign that he was attempting to remove a mature tooth instead of a baby tooth. The patient also alleged that the defendant dentist did not obtain informed consent. According to the report, the verdict ordered the dentist to pay punitive damages.10,11
 
In the classical approach, the supposition was that the dentist could not make that error – yet an error was made. To determine its cause, the dentist would have to self-examine and evaluate why his professional training failed, or was not followed, leading to the erroneous extraction. Certainly, the error would not be shared with the profession.
 
Under the lens of the IOM report To Err Is Human, what systemic issue allowed the permanent tooth to be extracted? The second primary molar can sometimes resemble the first permanent molar in shape and size. They can be mistaken for each other, especially if the dentist is distracted, perhaps by a non-cooperative patient or a busy schedule. So, what systems-fix could the dentist have made to increase awareness of which tooth to extract? Certainly, marking the correct tooth would uniquely identify it and would be an easy system-fix, just as orthopedic surgeons mark a surgical site preoperatively. Marking would add another layer of defense to the system. Another system-fix could be to call a time-out with the assistant and the patient, verbally identifying which tooth is going to be treated. Pre-Dr. Leape, the dentist's judgment was the weak link; post-Dr. Leape, the system you work in is the weak link.
 
The practice of pediatric dentistry involves systems of care. All of these systems have latent flaws that can lead to errors in patient care. Think of the number of potential errors existing in any cycle of care we use. Through review of our systems, we have the chance to achieve "Zero Harm".
 
I would like to end by recommending Dr. Leape's latest work, Making Healthcare Safe: The Story of the Patient Safety Movement, a great resource for anyone with an interest in patient safety. And finally, the last paragraph of "Error in Medicine," written 26 years ago, says it all:
"But it is apparent that the most funda- mental change that will be needed if hospitals are to make meaningful prog- ress in error reduction is a cultural one. Physicians and nurses need to accept the notion that error is an inevitable accom- paniment of the human condition, even among conscientious professionals with high standards. Errors must be accepted as evidence of systems flaws not charac- ter flaws. Until and unless that happens, it is unlikely that any substantial progress will be made in reducing medical errors."
 
Acknowledgements
Dr. Kevin Donly and Dr. John Rutkauskas, who organized the first task force on patient safety for the AAPD.
Dr. Paul Casamassimo, for his steady hand in promoting quality improvement in both the AAPD and the ADA.
Dr. Jade Miller, the chairman of the first Committee on Safety for the AAPD.
Clare Conte, for her work with the AAPD safety committee.
Dr. Steven Geiermann, the ADA liaison to the ADA safety task force.
Dr. Nancy Hijjawi, for her drive to create a safe dental practice.
The dentists, children, and parents I work with every day.
MaryAnn Czerepak, for her unwavering support.
 
References
  1. Leape LL. Error in medicine. JAMA 1994;272(23):1851-7.
  2. Schimmel EM. The hazards of hospitalization. Ann Intern Med 1964;60:100-110.
  3. Reason J. Human Error. Cambridge: Cambridge University Press; 1990.
  4. Zinman D. Study finds hospitals "harm" some. Newsday. March 1, 1990:Sect. 17A.
  5. Kohn LT, Corrigan JM, Donaldson MS, editors. To Err Is Human: Building a Safer Health System. Washington, DC: National Academy Press; 1999.
  6. The National Patient Safety Foundation's Lucian Leape Institute. Report of the Roundtable on Transparency. Boston, MA: NPSF; 2015.
  7. Boodman SG. Should hospitals – and doctors – apologize for medical mistakes? The Washington Post. March 12, 2017.
  8. Report of the Presidential Commission on the Space Shuttle Challenger Accident. Washington, DC: Government Printing Office; 1986.
  9. Vaughan D. The Challenger Launch Decision: Risky Technology, Culture, and Deviance at NASA. Chicago: The University of Chicago Press; 2016.
  10. Personal communication with Dr. Zentz, with permission to use the case study, Sept. 1, 2021.
  11. Downloaded from the Dentist's Advantage website with permission.
 
The 19 in the Leape quote references:
19. Hilfiker D. Facing our mistakes. N Engl J Med 1984;310:118-122.
 
Figure 1. Trajectory of an accident, adapted from Human Error, by James Reason, 1990, Cambridge University Press, p. 208, figure 7.8.
 
 
 
 
