Legal Matters

John R. Clark, JD, MBA, NRP, FP-C, CCP-C, CFC, CMTE

When Good Isn’t Good Enough

The Critical Care Transport Medicine Conference held in Austin, TX, in April 2014 hosted many great speakers, and a common theme ran through the presentations: improving the care we provide to make it both better and safer. Michael Frakes captured my interest with his talk “Eyes On the (Wrong) Prize” (oral communication, April 2014), which examined how and why errors are still present even though everyone has programs, systems, and strategies in place to minimize them. He argued clearly and concisely that despite all the focus on reducing errors, we are still making the same mistakes today that we made a decade ago.

We’ve all made mistakes, but why do we keep making them? From my own experience: as a solo paramedic working a cardiac arrest with the help of a basic life support engine crew, I somehow managed to spike and hang a premixed bag of lidocaine for my intravenous access instead of normal saline. On another occasion, during the vibration that accompanies the transition from forward flight to landing, I heard a “thunk” as a handgun fell from under our trauma patient onto the cabin floor and slid to the clamshell doors. And, in a different context, I answered the door of the crew quarters late at night in a rainstorm to find a very soggy and somewhat unhappy Civil Air Patrol volunteer who informed me that his team had spent half the night searching in a howling storm for the source of an emergency locator transmitter (ELT) ping they had triangulated to our home airport. That ping happened to be emanating from our aircraft, warm and snug in the hangar; apparently, when putting her away for the night, someone had inadvertently triggered the ELT. In all 3 of my examples, no untoward events occurred, but they were all mistakes that could have resulted in harm. 
In my cardiac arrest incident, the lidocaine helped explain why there was no ectopy when we got a rate and rhythm back. Our trauma patient was paralyzed, sedated, and intubated, so the handgun was of little use to him. And, finally, the Civil Air Patrol guy was wet, but he and his team got some great practice tracking down the source of that constant pinging.

The Institute of Medicine’s 1999 report To Err Is Human: Building a Safer Health System1 opened Pandora’s box on the scale of medical errors, reporting that up to 98,000 people a year die because of mistakes made in hospitals. Initially disputed, the number is now widely accepted as a reasonable estimate of the problem. Ten years later, in 2010, the Office of Inspector General for the Department of Health and Human Services estimated that substandard care in hospitals contributed to the deaths of 180,000 Medicare patients alone in any given year.2

In 2013, John T. James, PhD, examined 4 studies of medical errors.3 All of the studies used the Institute for Healthcare Improvement’s Global Trigger Tool,4 which flags specific evidence in medical records that points to an adverse event that may have harmed a patient. Using a weighted average of the 4 studies, he concluded that at least 210,000, and perhaps as many as 440,000, deaths a year are caused by preventable errors. The numbers from the James analysis would place medical errors as the third-leading cause of death in the United States, behind only heart disease and cancer. Commentary on these findings points out that the real number is elusive because of inaccuracies in medical records and the fact that not all providers report mistakes.5

Returning to Mr Frakes, he presented the following statistics to illustrate his point: (1) 2%-6% of all hospitalized patients have an adverse event while hospitalized, adding an average of 1.7 to 5.5 days to their length of stay; (2) 12% of discharged patients have a preventable event related to the hospitalization; and (3) nearly 270 patients in the United States die every day from some type of error.6 Even if we were able to achieve 99.9% reliability, 3,000 newborns would still be dropped during delivery every year. Has society been programmed to accept a certain amount of error?

When we consider how errors affect the patient in transport, we have to consider what happens on the aviation side too. In February 2014, the Federal Aviation Administration (FAA) published its long-awaited final rule covering helicopter air ambulance, commercial helicopter, and Part 91 helicopter operations.7 The final rule mandates new operational procedures and additional equipment requirements for helicopter air ambulance operations, with the aim of addressing an increase in fatal helicopter air ambulance accidents. The belief is that the revised requirements will make helicopter operations safer. 
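The 99.9% reliability illustration is worth checking with back-of-the-envelope arithmetic. The sketch below assumes roughly 3 million hospital deliveries per year, a round number chosen to match the figure cited in the talk, not a sourced statistic:

```python
# Back-of-the-envelope check of the "99.9% reliable" illustration.
# The delivery count is an assumed round number, not a sourced statistic.
annual_deliveries = 3_000_000      # assumed US hospital deliveries per year
reliability = 0.999                # 99.9% success rate
dropped = annual_deliveries * (1 - reliability)
print(f"~{dropped:,.0f} newborns dropped per year at 99.9% reliability")
```

Even an error rate of 1 in 1,000 becomes a startling absolute number once it is multiplied across millions of events, which is the point the statistic is meant to make.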
The FAA has estimated that the requirements of the final rule will cost the US helicopter emergency medical services (HEMS) industry around $280 million over the next decade. The single most expensive provision is the requirement for operational control centers, which the FAA estimated will cost $77 million over the next 10 years; required technology, including helicopter terrain awareness and warning systems, flight data monitoring systems, and radio altimeters, will cost a combined $95 million.

Weather has a significant impact on HEMS operations, and because many of the off-airport destinations that helicopters typically serve lack accurate weather reporting, mistakes crop up. Add to that en route obstacles like towers and ridges, plus locating and landing at an unprepared landing zone with wires, trees, and uneven ground, and the complexity climbs. Even beyond the operational challenges of HEMS, the potential for mechanical failure because of defective parts or maintenance cannot be overlooked. Everyone doing this job knows that HEMS operations are complex and not without danger, but being dangerous is not the same as being unsafe, and yet that seems to be how much of the public and some regulatory bodies view us. Some statistics show HEMS aircraft to have a fatal accident rate 6,000 times that of commercial airliners.8

All of this leads to the categorization of critical care transport as a High Reliability Organization (HRO). An HRO is defined as an organization with systems in place that are exceptionally consistent in accomplishing their goals and avoiding potentially catastrophic errors.9 HRO operations are hypercomplex and tightly coupled, overseen by multiple decision makers using a complex communication network, in an environment in which errors may have severe consequences and immediate, continuous feedback drives adjustments, all within a compressed time frame. Does that sound like your last day at work?

Even the most reliable operations recognize that small failures are often early warning signals of bigger trouble ahead. Near misses and small errors are great learning opportunities to better understand future problems and to prevent them before they happen. Airway management is 1 of those tasks that critical care transport providers perform every day with great success and few complications, but when there is an error, it is often significant. Consider the case of 62-year-old Susan Kalitan,10 who had outpatient elective surgery to treat carpal tunnel syndrome in her wrist. In the operating room, she was intubated by a student under direct supervision. After an uneventful surgery, Ms Kalitan complained of severe chest pain and pain on swallowing. Even with these complaints, she was discharged home. 
The next day a neighbor checked on her and found her nearly unconscious. She was transported to the hospital, where it was discovered she had an esophageal rupture that caused a massive chest infection, resulting in a 2-month hospitalization. Kalitan has never returned to work and has been declared disabled by her physicians. A lawsuit was filed against the hospital and the anesthesia care team, which consisted of an anesthesiologist, a certified registered nurse anesthetist (CRNA), and a student registered nurse anesthetist whom the CRNA was supervising. The lawsuit claimed negligence during the insertion of the endotracheal tube, which, because of the application of too much force, caused the tear in the esophagus. A jury awarded $4.7 million: $2 million for past pain and suffering, $2 million for future pain and suffering, and an additional $718,000 for related medical expenses. The jury verdict sheet apportioned liability as follows: the hospital 35%, the anesthesiologist 50%, the CRNA 10%, and the student registered nurse anesthetist who inserted the endotracheal tube 5%.

The Kalitan case is a technical error, but the next case is clearly bad judgment. Paramedics were dispatched for a 48-year-old man who was not feeling well and presented with diaphoresis and shortness of breath. Rather than treating the patient aggressively in his apartment, they elected to walk the man down 3 flights of stairs to the ambulance before initiating treatment. Thirty minutes later, he was pronounced dead in the emergency department after suffering an extensive anterior wall myocardial infarction. To make matters worse, it appeared that the EMS patient care report was falsified to show that the patient was carried down the stairs. The paramedic who wrote the report told investigators that he documented that the patient was carried down the stairs out of force of habit and because he was in a rush to get the paperwork to the hospital. In their defense, the paramedics were prepared to testify that they assisted the decedent down the stairs and did not make him walk on his own, rationalizing that time was of the essence and it was faster and safer to assist him down the stairs than to secure him in a stair chair and carry him. The jury found for the decedent and awarded his wife $1 million.

In both of these cases, procedures and processes were in place to reduce the errors that occurred. In the Kalitan case, esophageal rupture is a known complication of endotracheal intubation, and because the student nurse anesthetist was being overseen by a CRNA who was in turn supervised by a board-certified anesthesiologist, it is reasonable to assume that this was not an out-of-control, random insertion of an endotracheal tube. Instead, the true error in this case is that Ms Kalitan was discharged home even though she was complaining of chest pain and difficulty swallowing. Somewhere in the steps before releasing her, someone erred by not recognizing the significance of her complaints. The error in the EMS case was more about judgment, even though the case record shows that the paramedics violated policy when they walked the patient down the stairs. 
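The liability percentages reported on the Kalitan verdict sheet can be applied to the total award with simple arithmetic. In this sketch, the award components and percentages come from the case as described in the text; the per-party dollar amounts are our own illustrative calculation, not figures from the verdict:

```python
# Apportioning the Kalitan award by the jury's liability percentages.
# Award components and percentages are as described in the text; the
# per-party dollar amounts are illustrative arithmetic, not court figures.
award = 2_000_000 + 2_000_000 + 718_000   # past pain + future pain + medical expenses
liability = {
    "hospital": 0.35,
    "anesthesiologist": 0.50,
    "CRNA": 0.10,
    "student nurse anesthetist": 0.05,
}
assert abs(sum(liability.values()) - 1.0) < 1e-9   # percentages must total 100%
shares = {party: round(award * pct) for party, pct in liability.items()}
for party, amount in shares.items():
    print(f"{party}: ${amount:,}")
```

Note that even the student, at only 5% liability, would carry a six-figure share of a verdict this size, which is why apportionment matters to every member of a care team.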
The other interesting component of this case is the paramedic’s statement that he documented 1 thing but did another “out of force of habit,” driven by the pressure to get the “paperwork to the hospital.” Have we created systems so complicated, collecting so much data, that the system itself creates errors? I think the elephant in the room is that documentation, although in its purest form it memorializes the care we provide, does not always reflect that care in the most accurate way.

Humans are not perfect, and medical errors are not cloaked in malice. Errors are often the result of flaws in the system that allow them to happen and perhaps even go unnoticed (or unreported). Sometimes errors simply happen, but it is our responsibility to create a system that minimizes the potential for error and also mitigates the risks associated with an error when one occurs. What policies or guidelines exist in your organization for reporting errors? In aviation, there are reporting requirements for violations of the rules, but they generally do not carry the veiled threat of litigation. For example, if a pilot busts his duty time requirement because of a long flight, he must report it, and the sanctions are not punitive. Disclosure of medical errors, however, may expose the individual and the organization to potential litigation. To encourage disclosure of errors in health care, every state has a law on the books that protects disclosures made for the purpose of quality improvement from discovery in litigation.

It is easy to rationalize that being error free costs too much. The reality is that it is more expensive not to address even a 0.1% error rate. Critical care transport relies on a core of highly experienced and proficient staff, a shared experience across the whole team, individual accountability for team performance, and continuous training to support it all. Red rules, reference cards, and continuing education are all strong tools to help combat the threats of errors and lapses in judgment, but the best way to reduce errors is to cultivate and reward a workforce that strives to be better every day: exceptional clinicians who are caring and compassionate caregivers, always doing what is right for the patient.

References

1. Kohn LT, Corrigan JM, Donaldson MS, eds. To Err Is Human: Building a Safer Health System. Committee on Quality of Health Care in America, Institute of Medicine. Washington, DC: National Academies Press; 2000.
2. Adverse Events in Hospitals: National Incidence Among Medicare Beneficiaries. Washington, DC: Department of Health and Human Services, Office of the Inspector General. https://oig.hhs.gov/oei/reports/oei-06-09-00090.pdf. Accessed April 1, 2014.
3. James JT. A new, evidence-based estimate of patient harms associated with hospital care. J Patient Saf. 2013;9:122-128.
4. IHI Global Trigger Tool for Measuring Adverse Events. Cambridge, MA: Institute for Healthcare Improvement. http://www.ihi.org/resources/Pages/Tools/IHIGlobalTriggerToolforMeasuringAEs.aspx. Accessed April 1, 2014.
5. How many die from medical mistakes in U.S. hospitals? Health News from NPR. http://www.npr.org/blogs/health/2013/09/20/224507654/how-many-die-from-medical-mistakes-in-u-s-hospitals. Accessed April 1, 2014.
6. Frakes M. Eyes on the (wrong) prize. Critical Care Transport Medicine Conference. April 1, 2014; Austin, TX.
7. Department of Transportation, Federal Aviation Administration. 14 CFR Parts 91, 120, and 135 Final Rule. https://www.faa.gov/regulations_policies/rulemaking/recently_published/media/2120-AJ53.pdf. Accessed April 2, 2014.
8. As medical helicopter industry has grown, so have fatal crashes. Washington Post. August 21, 2009. http://www.washingtonpost.com/wp-dyn/content/article/2009/08/20/AR2009082004500.html?hpid=topnews. Accessed April 1, 2014.
9. McKeon LM, Oswaks JD, Cunningham PD. Safeguarding patients: complexity science, high reliability organizations, and implications for team training in healthcare. Clin Nurse Spec. 2006;20:298-304; quiz 305-306.
10. Kalitan v Alexander et al, 08-029706 (17th Judicial Circuit of Florida 06-16-11).

John R. Clark, JD, MBA, NRP, FP-C, CCP-C, CFC, CMTE, is a member of the board of directors for the Board for Critical Care Transport Paramedic Certification (BCCTPC) and legal advisor and member of the board of directors for the International Association of Flight and Critical Care Paramedics (IAFCCP).

Editor’s Note: While the information in this article deals with legal issues, it does not constitute legal advice. If you have specific questions related to this topic, you are encouraged to consult an attorney who can investigate the particular circumstances of your individual situation. If you have an issue you would like to see addressed in a future issue of AMJ, please contact the author at [email protected] to suggest a topic.

1067-9991X/$36.00
Copyright 2014 by Air Medical Journal Associates
http://dx.doi.org/10.1016/j.amj.2014.04.011
Air Medical Journal 33:4, July-August 2014
