Decision-making Errors in Anesthesiology

Marjorie P. Stiegler, MD Department of Anesthesiology, University of North Carolina at Chapel Hill, Chapel Hill, North Carolina

Anahat Dhillon, MD Department of Anesthesiology, University of California, Los Angeles, California



What is a “Cognitive Error”?

Cognitive error is a term used to describe a faulty thought process, as opposed to a gap in knowledge or a deficiency in technical skill. Faulty thought processes and preferences are important factors in medical decision-making mistakes. Although many authors have used traditional human factors paradigms to classify errors, the application of decision-making psychology to the specialty of anesthesiology is an advance in the understanding of error cause, prevention, and recovery. Decision-making psychology seeks to explain the errors that occur despite skill, knowledge, and good intentions. Latent conditions have been defined by Reason as gaps that are present in a system and ultimately allow an adverse event to take place; therefore, latent conditions should represent the primary targets of any safety management system.1 We believe that cognitive errors are caused by deeply ingrained and subconscious thought processes and should therefore be viewed as latent conditions that can be targeted and minimized. Many catalogs of biases and cognitive errors have appeared in the popular press; they sometimes vary in their precise definitions and are often perceived to be overlapping or redundant. It seems unlikely that strict terminology will be agreed upon, but this is less important than a general understanding of thought-process pitfalls. This chapter explores principles of cognitive behavior and specific cognitive errors that are likely to be responsible for anesthesiologists’ mistakes.

Reprints: Marjorie P. Stiegler, MD, Department of Anesthesiology, University of North Carolina, Chapel Hill, N2198 UNC Hospitals, CB 7010, Chapel Hill, NC 27599-7010. E-mail: mstiegler@aims.unc.edu

International Anesthesiology Clinics, Volume 52, Number 1, 84–96. © 2014, Lippincott Williams & Wilkins

84 | www.anesthesiaclinics.com


Key Elements of Cognition That Lead to Error

As a framework for understanding specific cognitive errors, it is important to explore some of the processes that are inherent to human cognition on a subconscious level. Because these processes are largely subconscious, they are challenging to recognize (particularly in oneself), and are also generally difficult to detect by direct observation or traditional root cause analysis techniques.2

Error Blindness

Error blindness describes a phenomenon in which we are unaware of an error at the time we are committing it. Although the interval between error and recognition of that error may be very brief, prolonged, infinite, or anything in between, we cannot have insight into our errors at the time that we are making them. A corollary is that “being wrong” feels identical to “being right.”3 Our brains have a strong preference for being right and generally assume that we are.

Bias

A bias is simply an inclination or tendency to prefer one thing to another. Bias represents a proclivity toward judgment that may have roots in anecdotal or social situations. In medicine, it may represent a perspective or decision that is based not on evidence but on one’s own experience. It is well established that accumulation of experience does not necessarily result in the acquisition of expertise.4

“Bias Blind Spot” and Overconfidence

Overconfidence is a pervasive human trait. For example, only 1% of the general population believes their driving ability to be below average.5 Physicians are no exception. For example, Mele6 reports that 94% of academic professionals rate themselves in the top half of their profession. Both overconfidence and bias are often easy to detect in others, but difficult to recognize in one’s own judgments.7 Believing ourselves to be impervious to the influence of bias is called a bias blind spot by cognitive psychologists. This blind spot is not attenuated by cognitive sophistication and in fact may be worse among people with higher cognitive abilities.7

Illusion of Personal Accuracy

There is a widespread illusion of personal accuracy in knowing one’s own strengths and weaknesses.8 People are often bewildered at how others can be so bad at self-assessment, while usually believing that they themselves have good accuracy. It has long been known that physicians do not possess good self-assessment skills. Perhaps not surprisingly, this phenomenon is so pervasive that the majority of people think they are above average in self-assessment ability.9 To illustrate a key pitfall in faulty logic about self-assessment, Eva suggests thinking of the world as a 2 × 2 table, in which there are some activities at which we actually excel and others at which we actually perform dismally, crossed with some activities we think we do well and others we think we do poorly. Although it is perhaps easy to fill the cell containing activities we think we perform poorly and in fact do perform poorly (in their paper, for example, a colleague knows that he will never be a professional football player), that calibration does not necessarily extend to the other 3 cells.

Memory Shifting

Memories are not static; they are prone to distortion by a variety of mechanisms. The mere process of memory retrieval has 2 important consequences. Retrieval results in reactivation of the neural processes associated with the initial learning of information, which causes consolidation and increases the durability or permanence of the memory. However, the same neural process activation can cause information that diverges from the original event to be incorporated into the memory representation, thereby promoting memory distortion.10 In simple terms, this means that memories can become less accurate with each retrieval, possibly to the point of becoming completely false. In addition, memories are not precise records of events. Memories encode and store the perceptual gist of a situation, and retrieval is a reconstruction of events. People generally accept that memories are prone to the error of complete loss (ie, forgetting something altogether) but may be less receptive to the idea that the content of their own recollections is flawed. Just as we display overconfidence in our abilities, we also have overconfidence in our memories.

Heuristics

Heuristics are efficient cognitive processes that ignore some information; essentially, they are mental shortcuts, often based on pattern recognition. They are familiar in medicine, and dictums are often repeated to trainees (“common things are common” and “when you hear hoofbeats, think of horses instead of zebras”). By definition, heuristics require expertise and experience to establish, and as such they are often adaptive mechanisms used by experts to be efficient. They may be particularly beneficial when faster decisions are better decisions. Because using heuristics saves effort, one controversial view has been that heuristic decisions may result in more errors than do “rational” decisions using analytical methods.11

Dual-Process Model of Cognition

A dual-process cognitive model of reasoning influences many medical decisions, particularly when making a diagnosis.12 Cognitive psychologists and medical educators have long understood that some decisions are made by hypothetico-deductive, Bayesian, analytical processes. These consist of generating a list of hypotheses, assigning probabilities to those hypotheses, and subsequently modifying the list when new data warrant an adjustment. Other decisions are intuitive, heuristic based, and automatic. For most medical decision making, context and circumstances will influence whether an intuitive or analytical approach is more appropriate. For example, preoperative consults or daily ICU rounds are perhaps more likely to be suited to analytical processes, as is an educational session that critiques recent journal articles. In contrast, operating room emergencies evolve rapidly over a background of uncertainty and are life threatening; hence they are more prone to “quick and dirty” intuitive decision making. Fortunately, there is good evidence that heuristic-based decisions are frequently accurate; some studies suggest that they are more accurate than hypothetico-deductive methods.11

Loss Aversion and Regret

Loss aversion describes a phenomenon in which the negative association of a loss is more than twice as powerful a motivator as the positive association of an equivalent gain.13 As a result, our actions do not seek to maximize gains, but rather, to avoid losses. Studies have consistently shown that subjects regret actions that resulted in negative consequences more than they regret inaction, even when the outcomes are just as bad.

The “Free Agent” Fallacy

People are generally considered to be free agents, who deliberately select a specific course of action, and are therefore held responsible for an error that results in an adverse outcome. However, psychologists have long known that many subconscious influences affect our decision making and question whether we are the “free agents” we believe ourselves to be.14,15 Functional brain imaging studies have shown that the outcome of a decision is encoded in the brain activity of the prefrontal and parietal cortex up to 10 seconds before it enters a subject’s awareness. In other words, the moment a decision seems to be made may actually be long after it is made, in terms of neural processing.16 This may explain why we seem to simultaneously realize we do not want to do something, yet are already doing it, and why the recovery from a “split-second” change in plans is so awkward.
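The analytical arm of the dual-process model described above (generate hypotheses, assign probabilities, revise them as new data arrive) can be sketched as a simple Bayesian update. This is an illustrative sketch only: the diagnoses, priors, and likelihoods below are invented for the example and are not clinical estimates.

```python
# Sketch of hypothetico-deductive (Bayesian) updating over a
# differential diagnosis. All numbers are hypothetical.

def bayes_update(priors, likelihoods):
    """Return posterior P(diagnosis | finding) for each diagnosis.

    priors:      P(diagnosis) before the new finding
    likelihoods: P(finding | diagnosis)
    """
    unnormalized = {dx: priors[dx] * likelihoods[dx] for dx in priors}
    total = sum(unnormalized.values())
    return {dx: p / total for dx, p in unnormalized.items()}

# Initial differential for intraoperative hypotension (hypothetical priors).
priors = {"hypovolemia": 0.5, "anesthetic overdose": 0.3, "anaphylaxis": 0.2}

# New finding: no response to fluids and phenylephrine.
# Hypothetical likelihoods of observing that finding under each diagnosis.
likelihoods = {"hypovolemia": 0.1, "anesthetic overdose": 0.2, "anaphylaxis": 0.8}

posterior = bayes_update(priors, likelihoods)
for dx, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(f"{dx}: {p:.2f}")
```

Under these made-up numbers, a diagnosis that started as the least likely becomes the leading hypothesis once the nonresponse to initial therapy is weighed; failing to perform this revision is exactly the analytical step that intuitive, heuristic-based reasoning can skip.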



Cognitive Errors in Anesthesiology

The psychological processes described above provide the framework for understanding cognitive errors. For purposes of illustration, we will explore several specific cognitive errors along with clinical examples of their manifestation. An expanded catalog is also included, although it is not comprehensive (Table 1). This episode, which has been modified slightly to preserve anonymity, was relayed to the author (M.P. Stiegler) by personal written communication, 2013. A narcotic-tolerant middle-aged gentleman presented with excruciating pain for a late afternoon lumbar discectomy. He received 200 mcg IV fentanyl with good improvement in his pain and no respiratory depression. Induction was uneventful with lidocaine, propofol, and rocuronium, and intubation was accomplished without difficulty. The patient was turned prone, a second peripheral IV was placed for planned anesthetic infusions, and the blood pressure cuff was changed to the opposite arm. Subsequently, hypotension occurred. Fluids were given without response. Phenylephrine was given, also without significant improvement, and anesthetic depth was decreased. Consideration was given to a variety of causes, including relative overdose of induction and pre-induction agents, extended NPO time and relative hypovolemia, positional hypotension due to turning prone and the effects of the table on venous return, undiagnosed asymmetry between upper extremity blood pressures, and delayed surgical stimulation with relative overdose of anesthetic. Fluids and phenylephrine were continued without improvement, and troubleshooting efforts were directed at the blood pressure cuff. Surgical stimulation was encouraged, but blood pressure did not improve when the operation began. Finally, epinephrine was chosen, but the patient developed ST segment elevations on the EKG before it was administered.
The blood pressure improved slightly, the procedure was aborted, and the patient was transferred to the cardiac catheterization laboratory for angiography, which showed no significant disease. The correct diagnosis was anaphylaxis to antibiotics that were administered immediately after the induction of anesthesia. The combination of morbid obesity, hypertension, diabetes, and other cardiac risk factors in his history, coupled with his lack of response to initial therapies, led his caregivers to presume a cardiac cause rather than consider anaphylaxis, with which they were certainly all familiar. Several cognitive errors probably occurred in this story, including confirmation bias, framing, anchoring, and representativeness bias.

Table 1. Cognitive Processes That May Foster Cognitive Error

Feedback bias: Significant time elapses between actions and consequences; lack of outcome data reporting. Absence of feedback is subconsciously processed as positive feedback

Confirmation bias: “Believing is seeing”: seeking confirming evidence to support a diagnosis while discounting disconfirming evidence, despite the latter sometimes being more definitive

Availability bias: Error due to an emotionally memorable past experience (usually negative); subconsciously ignoring important differences between the current presentation and that prior experience

Omission bias: Tendency toward inaction rather than action, out of fear of causing harm (to patient, if action failed; to self, by damaging professional reputation if wrong). May be especially likely when a significant authority gradient is perceived or real

Commission bias: Tendency toward action rather than inaction, even when those actions are unindicated or founded on desperation

Sunk costs: Phenomenon during which the more effort and commitment invested toward a plan, the harder it may become psychologically to abandon or revise that plan

Anchoring/fixation: Focusing on 1 feature exclusively, at the expense of comprehensive understanding. This may lead to misdiagnosis of a single problem, or to missing concurrent diagnoses by focusing on just 1

Framing effect: Allowing key descriptive features to unduly influence decisions; may be related to transfer of care from one person or team to another

Overconfidence: Inappropriate boldness, misplaced certainty of abilities. Refusal to acknowledge a dire situation when faced with it

Outcome bias: Judging a decision on the eventual outcome, rather than the merits of the decision at the time it was made

Premature closure: Accepting the first plausible diagnosis before it has been fully verified. Although similar to other diagnostic errors such as availability bias, confirmation bias, and framing effect, this represents the “not otherwise specified” cessation of thinking

Representativeness bias: Comparing a patient’s presentation to the “typical” or “classic” presentation as a rapid “pattern recognition” diagnostic strategy

Visceral bias: Countertransference; our negative or positive feelings about a patient influencing our decisions. May include “VIP treatment” or circumstances surrounding “difficult” patients

Psych-out error: Medical causes that manifest as personality or behavioral problems (eg, agitation due to hypoglycemia) are missed in favor of a psychological diagnosis

It illustrates how an entire team of anesthesiologists, nurses, and surgeons could miss a seemingly “classic” diagnosis, despite knowledge, skill, and good intentions.


Prevalence and Relevance

Unpublished data from 2 pilot studies conducted at the University of California, Los Angeles, illustrate the prevalence of cognitive errors in clinical practice. The first is a national survey of anesthesiologists, and the second is an observational study of residents participating in simulation scenarios.

American Society of Anesthesiologists’ Survey

We sought to estimate how often cognitive errors are responsible for adverse outcomes in our specialty, and also to identify the perceived importance of particular errors in practice and in training. We hypothesized that the majority of respondents would have identified ≥1 of these cognitive errors in their own practices. We also hypothesized that the majority of respondents would have vicariously experienced these errors through conversations with colleagues or formal case analysis (ie, morbidity and mortality conference, root cause analysis, etc). The American Society of Anesthesiologists selected a random sample of members and invited them to participate through email with a link to an anonymous electronic survey. Respondents were queried about 8 cognitive errors (listed in Table 2) selected from a previously established catalog.17 For each, the cognitive error was defined and illustrated with a short clinical example. Participants were asked 2 questions about each cognitive error. The first question asked respondents to indicate whether such processes had occurred to them personally, to someone else, or never that they knew about, selecting all that applied. The second question asked respondents to estimate how frequently they perceive each error to occur in anesthesiology practice overall, among near misses and actual error events, using a 5-point Likert-style scale with the following anchors: never/very rarely, rarely, sometimes, often, and very often. Results are shown in Table 2. We received 518 completed responses from 2000 invitations (26% response rate). Our respondents were mostly male (76%), with 31% in academic practice, 58% in private practice, and the remainder in a combined practice. Nearly half had over 20 years of experience (44%), with 85% in practice for at least 5 years.
Table 2. Self-reported Cognitive Error Prevalence Through Survey

Cognitive Error       Identified in Own Practice (%)   Identified in Other’s Practice (%)
Anchoring                          55                               60
Feedback bias                      22                               47
Availability                       28                               50
Premature closure                  45                               70
Overconfidence                     28                               81
Omission bias                      31                               56
Sunk cost                          27                               77
Framing bias                       50                               68

When considering the impact of cognitive errors in one’s own practice, every cognitive error was identified by at least 20% of participants as having occurred to them personally, with anchoring, premature closure, and framing bias occurring to nearly 50% of respondents. Feedback (22%), sunk cost (27%), availability (28%), and overconfidence (28%) were reported to have occurred to self the least. When considering the impact of cognitive errors on a colleague’s practice, every cognitive error was identified by >50% of respondents, with premature closure (70%), overconfidence (80%), and sunk costs (76%) occurring most often. Feedback (47%), availability (50%), and omission (56%) bias were reported to have occurred the least to a colleague or trainee, yet still resonated with approximately half of the respondents. Regarding the second question about estimated frequency of occurrence overall, some items that respondents reported most often in actual experience were nonetheless perceived to occur “rarely” or “never/very rarely.” For example, roughly one third of respondents estimated overconfidence to play a role in error “rarely” or “very rarely/never,” yet 80% of respondents reported that overconfidence played a role in an error of a case they heard about. Similarly, omission bias was estimated as “very rare” by 58% of respondents, yet one third had identified it in personal experience, and more than half had identified it in someone else’s event. Of the respondents, 80% reported that overconfidence played a role in someone else’s error, yet only 28% stated that overconfidence was responsible for their own errors. Every single item was identified as being more prevalent in someone else’s practice. This suggests that it may be easier to identify bias and error in others than in oneself, consistent with the previously discussed bias blind spot phenomenon. Of the respondents, 43% perceived that feedback bias never plays a role in errors. Ironically, the brain interprets the absence of feedback as positive feedback. Very few institutions rigorously capture data about adverse events such as peripheral nerve injury from positioning, unintended awareness, catheter-associated line infection, ophthalmologic injury, and so on, with immediate feedback to the attending anesthesiologist involved in the case. In fact, in many institutions, the denominator is completely unknown even to the department at large (ie, aggregate data on an annual basis).
From these results, we conclude that even those items perceived to be least common are still occurring with considerable frequency. Of the respondents, 85% agreed that education on the topic of cognitive errors would be important for their own practice, and 94% agreed that this education would be important for trainees. The limitations of this study include selection bias and recall bias; it is unknown whether our results overestimate the true prevalence because of the former or underestimate it because of the latter.

Simulation Study

The second study is a simulation-based evaluation of cognitive errors committed by anesthesiology residents, looking at the frequency as reported by the residents themselves and trained faculty observers. The study was based on an existing educational curriculum, consisting of a case library of 20 predeveloped and standardized simulation scenarios. The cases include management of a variety of anesthesiology emergencies and mishaps for all levels of training, including airway emergencies (endobronchial intubation, mucus plug obstruction, laryngospasm, “can’t intubate, can’t ventilate”), pharmacologic emergencies (anaphylaxis, local anesthetic toxicity, malignant hyperthermia), equipment emergencies (power failure, oxygen/nitrous oxide pipeline swap), and surgical emergencies (hemorrhage, venous air embolism, pneumothorax). A total of 39 residents participated, each taking the lead role of “hot seat resident,” acting as the primary manager of a unique case that they had not seen before, while their colleagues observed through video feed. Each simulated case included a script with actor cues and preprogrammed evolution of physiological parameters. Residents were asked to “think aloud” and verbalize their thoughts and actions. Actors and patient’s vital signs responded appropriately to resident maneuvers. The “patient” was a high-fidelity computerized mannequin with physiological monitoring, heart and lung sounds, blinking eyes with reactive pupils, and the capacity for defibrillation and other invasive procedures (Laerdal SimMan 3G, Norway). Each of the 39 residents completed an assessment tool that specifically queries cognitive errors.18 Trained faculty observers completed the same evaluation. Frequencies and correlations for each error were calculated, and data from resident self-identifications and faculty identifications were compared (Table 3). 
Table 3. Frequency of Cognitive Errors and Correlation Between Resident Self-Identification and Faculty Observers During a Simulation Exercise

Cognitive Error       Resident Identified, N (%)   Faculty Identified, N (%)   Fisher Exact P
Commission                     3 (7.7)                    12 (30.8)                 0.169
Omission                       6 (15.4)                   15 (38.5)                 0.101
Overconfidence                 4 (10.3)                    2 (5.1)                  1
Anchoring                     18 (46.2)                   11 (28.2)                 0.762
Sunk cost                      4 (10.3)                    9 (23.1)                 0.657
Premature closure             11 (28.2)                   21 (53.8)                 0.54
Availability                   6 (15.4)                    7 (17.9)                 1
Framing                       11 (28.2)                   16 (41.0)                 0.019
Confirmation                   3 (7.7)                    13 (33.3)                 0.552

The errors were reported with considerable frequency by both residents and faculty, but there was no significant correlation between their scores (except for framing bias). Capturing and studying cognitive errors is challenging. These results indicate that the errors may be occurring frequently, but that perhaps cognitive errors cannot be evaluated by traditional methods of rater observation and “think aloud” methodology (also known as protocol analysis).19 The thought process underlying the erroneous behavior was not apparent to the observer, despite verbalization, possibly because of the stressful, time-pressured nature of anesthesiology emergencies and the inconsistent degree of verbalization among participants. Also, as cognitive errors are largely subconscious processes, an interview with the subject may not be sufficient to examine these errors.19 A recent report by Ogdie20 suggests that a reflective writing approach may be able to capture language consistent with these types of thought-process errors.
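P values like those in Table 3 come from Fisher exact tests on 2 × 2 cross-tabulations of resident versus faculty identifications. As a sketch of how such a value is obtained, the following implements a two-sided Fisher exact test from first principles; the cell counts used here are hypothetical, since the chapter reports only the marginal frequencies, not the per-scenario agreement tables behind each P value.

```python
# Two-sided Fisher exact test for a 2x2 table, implemented directly so the
# sketch has no external dependencies. The example cell counts below are
# HYPOTHETICAL and do not reproduce any row of Table 3.
from math import comb

def fisher_exact_2x2(a, b, c, d):
    """P value for the table [[a, b], [c, d]] with all margins fixed."""
    row1, row2, col1 = a + b, c + d, a + c
    n = row1 + row2

    def p_table(x):
        # Hypergeometric probability that cell (1,1) equals x.
        return comb(row1, x) * comb(row2, col1 - x) / comb(n, col1)

    p_obs = p_table(a)
    lo, hi = max(0, col1 - row2), min(row1, col1)
    # Two-sided: sum every table at least as extreme as the observed one.
    return sum(p_table(x) for x in range(lo, hi + 1)
               if p_table(x) <= p_obs * (1 + 1e-9))

# Hypothetical agreement table for one error type across 39 scenarios:
# rows = resident identified the error (yes/no),
# cols = faculty identified the error (yes/no).
p = fisher_exact_2x2(2, 9, 14, 14)
print(f"P = {p:.3f}")
```

Because the test conditions on the margins of each table rather than on the marginal percentages alone, two error types with similar marginal frequencies can still yield very different P values, which is why the table's marginals cannot be used to re-derive its P column.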

Cognitive Interventions to Prevent and Recover

Many methods have been proposed for preventing error in medical decision making, and a comprehensive treatment is beyond the scope of this review. It has been shown that intuitive, heuristic-based diagnoses made by experts are more accurate than those selected by hypothetico-deductive reasoning.21 Heuristics can, however, result in flawed decisions, especially in the setting of rare events. The following strategies are intended to “de-bias” or counterbalance our intuitive decision making. Some paradigms are better suited to “fast” thinking situations (eg, sudden cardiovascular collapse) and some to “slow” thinking situations (eg, ICU rounds), as described in the dual-process section above.

Explicitly Describe Heuristics and Impact on Clinical Reasoning

This may be accomplished using traditional lecture approaches, problem- or case-based discussions, simulation, and other teaching methods. Trowbridge22 asserts that increased familiarity with specific cognitive traps among learners may increase their ability to reflect upon their own decision-making processes. This process of “thinking about thinking” is called metacognition. It is clearly an activity not well suited to time-pressured, urgent clinical decisions, but routine reflection as an after-action review may result in lasting skills that make recognition and management of heuristic- or bias-related error more effective during emergencies.

Engage in Prospective Hindsight

“Prospective hindsight” describes the perspective that is revealed when one looks into the future, sees that an error in the working diagnosis has been made, and considers what might have been missed. First described by military strategists, this concept can be applied to medicine by asking “what if we are wrong?” and considering alternatives.23 Asking how this situation will look to an outsider, perhaps a medical expert witness or a jury member, can often prompt reconsideration of details. Predicting how the clinical picture will unfold during a morbidity and mortality conference is another variation on this prospective hindsight theme. Importantly, the goal of this exercise is not to practice “defensive” medicine, making choices exclusively to “cover” all bases in the case of litigation. Rather, it is intended to allow doctors to step back from their own decision-making processes and search for additional data or an alternate point of view.

Worst-Case Medicine

Popular in emergency and trauma medicine, the mental exercise of considering the worst case is a way to counterbalance availability and representativeness heuristics, and also to act as a safety net when true Bayesian probabilistic reasoning would rule out rare but life-threatening diagnoses. Rare diseases, or those presenting without classic symptoms or findings, are possibilities with a smaller but nonetheless real probability. Heuristics may miss these “high stakes” conditions, and thus a deliberate strategy is needed to ensure their consideration.

Feedback

Feedback is most likely to change a physician’s behavior when it is timely, specific, and includes the rationale for why certain behaviors are right and why others are wrong.23 Although significant patient safety gains have been made by addressing “systems” and the latent error-provoking conditions within them, rather than by focusing on individual mistakes, novices, experts, and teachers alike benefit from deliberate practice that includes feedback.4,24 We must craft systems that give us feedback on our quality of practice that is timely, specific, and individualized.


Rule of 3

This approach to both diagnostic and therapeutic interventions reminds the clinician to perform a mental stop before selecting a firm diagnosis or repeating a therapeutic maneuver for the third time.25 Before committing to a diagnosis, 3 alternatives must be considered. Similarly, before a third administration of the same vasopressor, alternate treatments and causes must be considered. This forces consideration of alternatives and may prevent, among others, premature closure, anchoring, sunk costs, framing, and confirmation bias.

Conclusions

A variety of hard-wired and potentially unavoidable features of cognition act as latent conditions that foster error in medical decision making. Although the use of heuristics may often be advantageous, and the influence of bias and imperfect memory cannot be avoided, the regular practice of counterbalancing strategies may allow doctors to recognize and recover from cognitive errors, thereby enhancing patient safety.

M.P.S. received support for a portion of the work presented in this chapter from the Foundation for Anesthesia Education and Research (FAER). The authors have no conflicts of interest to disclose.



References

1. Reason J. Beyond the organisational accident: the need for “error wisdom” on the frontline. Qual Saf Health Care. 2004;13(suppl 2):ii28–ii33.
2. Croskerry P. The importance of cognitive errors in diagnosis and strategies to minimize them. Acad Med. 2003;78:775–780.
3. Schulz K. Being Wrong: Adventures in the Margin of Error. New York, NY: Harper Collins Publishers; 2010.
4. Ericsson KA. Deliberate practice and acquisition of expert performance: a general overview. Acad Emerg Med. 2008;15:988–994.
5. Reason JT, Stradling SG. Errors and violations on the roads: a real distinction? Ergonomics. 1990;33:1315–1332.
6. Mele AR. Real self-deception. Behav Brain Sci. 1997;20:91–102; discussion 103–136.
7. West RF, Meserve RJ, Stanovich KE. Cognitive sophistication does not attenuate the bias blind spot. J Pers Soc Psychol. 2012;103:506–519.
8. Eva KW, Regehr G. “I’ll never play professional football” and other fallacies of self-assessment. J Contin Educ Health Prof. 2008;28:14–19.
9. Pronin E, Lin DY, Ross L. The bias blind spot: perceptions of bias in self versus others. Pers Soc Psychol Bull. 2002;28:369–381.
10. Bridge DJ, Paller KA. Neural correlates of reactivation and retrieval-induced distortion. J Neurosci. 2012;32:12144–12151.


11. Gigerenzer G, Gaissmaier W. Heuristic decision making. Annu Rev Psychol. 2011;62:451–482.
12. Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14(suppl 1):27–35.
13. Tversky A, Kahneman D. The framing of decisions and the psychology of choice. Science. 1981;211:453–458.
14. Radder H, Meynen G. Does the brain “initiate” freely willed processes? A philosophy of science critique of Libet-type experiments and their interpretation. Theory Psychol. 2013;23:3–21.
15. Rolls ET. Willed action, free will, and the stochastic neurodynamics of decision-making. Front Integr Neurosci. 2012;6:68.
16. Soon CS, Brass M, Heinze HJ, et al. Unconscious determinants of free decisions in the human brain. Nat Neurosci. 2008;11:543–545.
17. Stiegler MP, Neelankavil JP, Canales C, et al. Cognitive errors detected in anaesthesiology: a literature review and pilot study. Br J Anaesth. 2012;108:229–235.
18. Stiegler M, Dhillon A, Huang Y, et al. Non-Technical and Cognitive Skills (NTCS) Self-Reflection and Faculty Evaluation Tools. MedEdPORTAL; 2011. Available at: http://www.mededportal.org/publication/9024.
19. Fox MC, Ericsson KA, Best R. Do procedures for verbal reporting of thinking have to be reactive? A meta-analysis and recommendations for best reporting methods. Psychol Bull. 2011;137:316–344.
20. Ogdie AR, Reilly JB, Pang WG, et al. Seen through their eyes: residents’ reflections on the cognitive and contextual components of diagnostic errors in medicine. Acad Med. 2012;87:1361–1367.
21. Coderre S, Mandin H, Harasym PH, et al. Diagnostic reasoning strategies and diagnostic success. Med Educ. 2003;37:695–703.
22. Trowbridge RL. Twelve tips for teaching avoidance of diagnostic errors. Med Teach. 2008;30:496–500.
23. Graber ML, Kissam S, Payne VL, et al. Cognitive interventions to reduce diagnostic error: a narrative review. BMJ Qual Saf. 2012;21:535–557.
24. Baker K. Clinical teaching improves with resident evaluation and feedback. Anesthesiology. 2010;113:693–703.
25. Stiegler MP, Ruskin KJ. Decision-making and safety in anesthesiology. Curr Opin Anaesthesiol. 2012;25:724–729.

