Methodology

Using Functional Magnetic Resonance Imaging to Improve How We Understand, Teach, and Assess Clinical Reasoning

STEVEN J. DURNING, MD, PHD; MICHELLE COSTANZO, PHD; ANTHONY R. ARTINO JR, PHD; CEES VAN DER VLEUTEN, PHD; THOMAS J. BECKMAN, MD; ERIC HOLMBOE, MD; MICHAEL J. ROY, MD, MPH; LAMBERT SCHUWIRTH, MD, PHD

Clinical reasoning is essential to the practice of medicine. There have been many advances in the understanding of clinical reasoning and its assessment, yet current approaches have a number of important limitations. Functional magnetic resonance imaging (fMRI) is promising because it permits investigators to directly view the neuroanatomical changes that occur with thinking. In this article, we briefly review current approaches to assessing clinical reasoning, discuss the emerging role and utility of fMRI in understanding clinical reasoning, and suggest directions for future research, continuing education, and practice.

Key Words: clinical reasoning, assessment, fMRI, evaluation-educational intervention, innovative educational interventions

Disclosures: The authors report this work was supported in part by an American Board of Internal Medicine Foundation grant. The views expressed herein are those of the authors and do not necessarily reflect those of the Department of Defense or other federal agencies.

Dr. Durning: Professor of Medicine and Pathology, Uniformed Services University of the Health Sciences; Dr. Costanzo: Uniformed Services University of the Health Sciences; Dr. Artino: Associate Professor of Medicine and Preventive Medicine & Biometrics, Uniformed Services University of the Health Sciences; Dr. van der Vleuten: Professor of Education, Maastricht University; Dr. Beckman: Professor of Medicine and Medical Education, Mayo Clinic; Dr. Holmboe: Senior Vice President and Chief Medical Officer, American Board of Internal Medicine; Dr. Roy: Professor of Medicine, Uniformed Services University of the Health Sciences; Dr. Schuwirth: Professor of Medical Education, Flinders University.

Correspondence: Steven J. Durning, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814-4799; e-mail: [email protected].

Clinical reasoning may be defined as the cognitive processes involved in arriving at a diagnosis or treatment plan.1 Our understanding of clinical reasoning is limited by an inability to directly observe cognition, yet understanding clinical reasoning is important because cognitive errors are common and often cause harm to patients.2,3 Neuroimaging methods, such as functional magnetic resonance imaging (fMRI), are promising because they allow researchers to view the neuroanatomical activation changes that occur during thinking, which could enhance our understanding of the clinical reasoning process. In this article, we first review current approaches to clinical reasoning assessment. We review these developments to illustrate how they have improved our understanding of the nature of clinical reasoning, but also to show that, to date, these assessment techniques have made only limited contributions to that understanding. We then discuss the emerging potential role of fMRI in understanding clinical reasoning. We conclude with suggested directions for future research, continuing education, and practice.

Clinical Reasoning: Assessment Approaches and Our Limited Understanding

In recent years, the nature of clinical reasoning and the best methods for assessing it have been debated.4,5 Some investigators have focused on assessing the final or "decision steps" when individuals respond to stimuli such as vignette-based multiple-choice questions (MCQs) or extended matching questions (EMQs).6 Clinical decision making typically refers to these final steps and is a research field in and of itself. Although these assessment methods capture observable final steps in clinical reasoning, they do not scrutinize the full pathway of clinical reasoning, which includes the cognitive processes involved from the initial patient encounter to the final treatment decision.1,7 Others have sought to capture the intermediate steps that lead to a diagnosis or choice of therapeutic approach as proxies for internal mental processes.8–11 Such intermediate steps are integral to the process of clinical reasoning but provide an incomplete view of the internal processes involved.

Attempts to assess the full process of clinical reasoning are not new. Indeed, such investigations gained momentum from the 1960s through the 1980s with the introduction of long simulation-based assessments such as computer-based examinations (CBXs)12 and patient management problems (PMPs).13 However, knowledge is highly domain specific, and so is problem-solving ability.14,15 Consequently, these long simulations are not well suited to the broad content sampling (eg, multiple case scenarios)16 that is needed to provide a more complete view of reasoning. Furthermore, individuals with an intermediate level of expertise and experience have tended to outperform more experienced "expert" clinicians.17

Dual-processing theory (analytic and nonanalytic reasoning)18,19 holds that most successful decision making is nearly instantaneous, without conscious analytical processing (eg, through pattern recognition, nonanalytic reasoning, or System 1 processing). Accordingly, experts accumulate many experiences, flexibly select from numerous problem-solving strategies, and even adapt their strategies according to the demands of a given problem.1 Dual-process theory also helps explain why so many hours are required to develop expertise, at least from the standpoint of deliberate practice theory. Deliberate practice assumes that approximately 10 000 hours of practice is required to master a skill, whether it is playing the piano, earning an international rating in chess, competing as an elite athlete, or practicing medicine.20–23 Dual-process theory would suggest that deliberate practice enables learners to store experiences and strategies more effectively and efficiently, so that they can be retrieved more efficiently to solve various guises of a problem.

Thus, an assessment method that requires experts to explicitly expose each of their individual decision steps may hinder experts and force them to apply analytic reasoning (ie, actively comparing and contrasting options) to a problem that they would normally solve with a nonanalytic or pattern recognition approach. For example, many current assessment methods attempt to focus on the key nodes (or steps) in the problem-solving process. Nonetheless, these assessment methods (eg, key-feature exams, script concordance testing, and concept maps)8,11,24 could still interfere with the mental processes associated with the efficiency and accuracy of expertise. We endorse a view of clinical reasoning as a much more unpredictable process in which there are multiple correct pathways within boundaries, rather than one "single best pathway."1 The quality of clinical reasoning thus depends on the possession of many different strategies, the ability to choose flexibly from among them, and, as needed, the ability to make adaptations based on the reaction of the patient or other features of the environment (the specifics of the situation).

This is consistent with the finding that expert performance in clinical reasoning is not a general problem-solving ability or trait; it is highly knowledge dependent, situation specific, and idiosyncratic.1 Thus, successful clinical reasoning relies on the ability to select strategies within the bounded space of reasonable care for a patient in a given situation (within which there are multiple potential "paths" leading to success) and to avoid paths or trajectories that may lead to poor care, something more experienced physicians clearly do better than novices. Two key questions, then, are how one can clarify the internal mental processes that enable an individual to gain such expertise and, more importantly, how one can foster its development.

Thus far, even the most modern methods designed to capture the process of clinical reasoning have not been able to look beyond behavior; even completing a Script Concordance Test, extended matching question, or key-feature examination is a behavioral task from which we try to infer the process of problem solving.5 Think-aloud studies, for example, are among the best measures applied to date, but they require subjects to verbalize their thought processes, which constitutes an indirect, potentially reactive, and biased account; think-aloud protocols are not direct observations of the brain activity that characterizes thought processes.25 In other words, think-aloud protocols may interfere with the very mental processes that are inherent to expert performance. Greater understanding of clinical reasoning may lead to future assessment strategies that enable more direct exploration of thought processes without interfering with them. To further understand the nature of clinical reasoning, and in particular how individuals with greater expertise are able to cope more effectively with increasingly difficult and complex tasks, a more direct method of evaluation would be useful. Such direct methods could enhance our understanding of the clinical reasoning process and foster interventions to help physicians develop and maintain this competency. One promising method for testing educational theories is fMRI, a neuroimaging approach that examines spatial patterns of brain activation in real time. Functional MRI has the potential to reveal the complex neurobiological substrates that underlie clinical reasoning without interfering with those processes, thereby overcoming many of the limitations that plague our ability to understand and assess this complex process.

How Can Neuroimaging and fMRI Assist in Assessing Clinical Reasoning?

Advances in neuroimaging over the past 3 decades have revolutionized the field of neuroscience.26 They provide an unprecedented capability to study brain activity noninvasively "in vivo." Functional MRI relies on the fact that cerebral blood flow and brain activation are closely coupled: when an area of the brain is activated, its blood supply increases relative to other areas through a feedback loop that fosters regional vasodilation within a few seconds.

This blood flow change is visualized with BOLD (blood oxygen level–dependent) techniques, which rely on magnetic differences between oxygenated and deoxygenated hemoglobin, so intravenous contrast agents are not required.27 Functional MRI provides excellent spatial resolution and thus is a powerful method for identifying the brain regions involved in the completion of tasks.

In our investigations, we have compared regional brain function during the reading, answering, and reflecting phases as clinicians completed vignette-based clinical reasoning MCQs in the scanner. A number of studies have demonstrated the favorable psychometric properties of vignette-based MCQs for licensing examination purposes (ie, reliability and validity), but there remains limited knowledge of the actual cognitive processes participants use to solve such problems.28 We therefore combined a think-aloud protocol with the administration of MCQs in the scanner to better understand the neurobiological cognitive operations employed in responding to MCQs.23 To test well-validated MCQs, we employed questions from the American Board of Internal Medicine and the National Board of Medical Examiners, which have been rigorously tested on thousands of individuals. We separated the major tasks involved in responding to the MCQs (eg, reading and answering) in order to examine task-specific differences in neuroimages of the clinical reasoning process.

Because the BOLD signal is a relative measure, the response at a given voxel (ie, a volumetric or 3-dimensional pixel) must be compared to a baseline task or condition in order to reveal the brain response of interest. This is typically done with contrast tests that determine whether activation in one condition exceeds that in another (condition 1 > condition 2). In our studies we compared answering versus reading, reflecting versus reading, and answering versus reflecting. These comparisons allowed us to isolate the brain processes associated with answering (answering > reading), which encompass both nonanalytical and analytical reasoning; with reflecting (reflecting > reading), which requires analytical reasoning; and with answering relative to reflecting (answering > reflecting), which isolates nonanalytical processes.
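
To make the contrast logic concrete, the sketch below shows, in schematic form, how a voxelwise general linear model (GLM) can estimate a contrast such as answering > reading. This is not our actual analysis pipeline; the repetition time, trial onsets, durations, and simulated data are hypothetical stand-ins, and a real analysis would operate on preprocessed BOLD volumes and apply proper statistical thresholding.

```python
# Illustrative sketch only: a voxelwise GLM contrast ("answering > reading") on
# simulated data. Timings, array sizes, and data are hypothetical, not the study's.
import numpy as np
from scipy.stats import gamma

TR = 2.0          # assumed repetition time in seconds
N_SCANS = 200     # assumed number of volumes acquired in the run

def hrf(t):
    """Simplified double-gamma canonical hemodynamic response function."""
    return gamma.pdf(t, 6) - gamma.pdf(t, 16) / 6.0

def make_regressor(onsets, duration):
    """Boxcar for one task phase (reading, answering, or reflecting) convolved with the HRF."""
    frame_times = np.arange(N_SCANS) * TR
    box = np.zeros(N_SCANS)
    for onset in onsets:
        box[(frame_times >= onset) & (frame_times < onset + duration)] = 1.0
    return np.convolve(box, hrf(np.arange(0, 32, TR)))[:N_SCANS]

# Hypothetical onsets (in seconds) for the three phases of each vignette.
reading = make_regressor(onsets=[0, 60, 120, 180], duration=20)
answering = make_regressor(onsets=[20, 80, 140, 200], duration=10)
reflecting = make_regressor(onsets=[30, 90, 150, 210], duration=15)

# Design matrix: one column per condition plus an intercept column.
X = np.column_stack([reading, answering, reflecting, np.ones(N_SCANS)])

# Stand-in for preprocessed BOLD data: N_SCANS time points by 5000 voxels of noise.
Y = np.random.default_rng(0).standard_normal((N_SCANS, 5000))

# Fit the GLM at every voxel by ordinary least squares.
betas, _, _, _ = np.linalg.lstsq(X, Y, rcond=None)

# Contrast weights encoding "answering > reading" (condition 2 minus condition 1).
contrast = np.array([-1.0, 1.0, 0.0, 0.0])
effect_map = contrast @ betas   # one contrast estimate per voxel

print(effect_map.shape)  # (5000,): this map would then be thresholded statistically
```

In practice such maps are computed with established neuroimaging packages after standard preprocessing; the point of the sketch is only the underlying logic of comparing one condition against another at every voxel.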

In our first study, we examined how experts (board-certified internists) answered vignette-based MCQs, focusing on different response types: correct versus incorrect and guess versus nonguess.29 Such an investigation provides insight into current theoretical perspectives, which postulate that successful clinical problem solving is based almost entirely on fast, largely unconscious recognition of the problem and its solution through experience (so-called System 1 processing). Our results revealed that experts demonstrated significant activation in the medial prefrontal cortex (mPFC), a region critical to executive functions,30 when answering items incorrectly (see Figure 1a in APPENDIX S1, which is available as supporting information in the online version of this article). These findings suggest that cognitive regions of the brain may have had to work harder to address more difficult (ie, incorrectly answered) questions. Based on the isolation of nonanalytical processes (answering > reflecting), the mPFC may, in this context, serve as a component of the System 1 arm of the dual-process model that is often referred to as the control network or domain-general network.29 In addition, this study revealed that experts demonstrated less activation in the bilateral precuneus while guessing (compared with not guessing) (see Figure 1b in APPENDIX S1). Recent fMRI studies connect the precuneus with self-referential, goal-directed actions as well as memory retrieval.31 Such a process would not be expected of the control network; rather, it is related more to analytical processing (System 2), which has been shown in several studies to marginally influence the correctness of the diagnosis and/or treatment plan.29 Although System 1 is believed to share neural networks with the slower System 2, studies suggest that experts can use both systems.29

In the next series of studies, we wanted to examine expertise further, exploring the notion that expertise is a state (item dependent) rather than a trait (defined by group or level of training). We argue that what makes experts unique is that their considerable experience affords a higher probability of having an exemplar for the correct answer, not that they apply a different process to solving the problem. Indeed, we would expect experts and novices to utilize similar brain processes, with experts demonstrating more efficient and accurate performance. Neurobiological investigations of skilled performance reveal that expertise is typically associated with a relative reduction in brain activation, or "neural efficiency," during task completion.32–39 Think of the brain as a lightbulb with a limited amount of potential activity: expertise conveys a more focused signal, thereby "freeing up watts" for other activities. This idea is consistent with cognitive load theory (CLT).40–42 CLT postulates that there are limits to working memory capacity and benefits to long-term memory and automation; it theorizes that we can pay attention to and manipulate only so many pieces of information in our environment in a given period of time. Working memory is very limited in terms of both capacity and duration, whereas the capacity of long-term memory is essentially limitless. Thus, expertise involves building long-term memory so that working memory can be "freed up" to attend to other information in the environment.

To examine whether neural efficiency is a marker of clinical expertise, we compared experts and novices during the same clinical reasoning task. This approach to understanding clinical expertise is of particular interest because it is consistent with dual-processing theory; notably, System 1 processing has been shown to be more efficient, invoking less cognitive load and permitting observation and processing of more information in the situation.40,41

Thus, in a study that is currently under review for publication, we explored the neural basis of nonanalytic reasoning (System 1) in internal medicine interns (intermediates) and board-certified internists (experts) by contrasting answering of validated United States Medical Licensing Examination and American Board of Internal Medicine multiple-choice questions with reflecting on those questions (answering > reflecting). The results demonstrate that intermediates and experts share a common neural network supplemented by nonoverlapping neural resources. Group comparisons indicated that experts demonstrated neural processing efficiency in regions such as the prefrontal cortex (see Figure 1c in APPENDIX S1). Such findings support a broader neuroscience model of neural efficiency as a hallmark of skill and suggest that processing efficiency could be a marker of expertise.

Taking into account this observed pattern of neural efficiency, we next examined how external factors such as sleep and burnout modulate the brain response in experts and novices during clinical reasoning. These external influences are of interest to medical educators, given the well-documented detrimental impact of burnout and sleep deprivation on physicians and the resultant impairment of patient care. Prior to entering the fMRI scanner, participants completed a validated measure of sleepiness, the Epworth Sleepiness Scale (ESS),43 as well as a 2-item version of the Maslach Burnout Inventory (MBI),44 which has been shown to be a reliable and valid measure of burnout in physicians.

First, we focused on how ESS sleepiness scores were related to brain processes in experts during nonanalytical reasoning (answering > reflecting). We found that higher ESS scores were associated with deactivation in the mPFC (see Figure 2a in APPENDIX S1). The mPFC was also significantly activated in our first investigation when participants answered incorrectly.29 Such a relationship suggests that sleepiness, as measured by the ESS, may act as a cognitive load on brain processes associated with nonanalytical reasoning in experts, such that as sleepiness increases, mPFC activation decreases (a schematic sketch of this kind of covariate analysis appears below). This may have implications for clinical error detection, since our findings indicate that the mPFC is sensitive to correct versus incorrect responses.45

This sleep investigation was subsequently expanded with a pilot study examining actigraphy in experts and interns. Actigraphs are watchlike devices worn on the wrist that detect motion, providing a proxy measure of actual sleep. Actigraphy data have been found to correlate well with polysomnography, the gold standard for sleep assessment,46 and thus serve as an important means of validating the self-report ESS data. Our results revealed significant deactivation in the medial prefrontal cortex (in both experts and interns) with respect to mean sleep as measured by actigraphy (see Figure 2b in APPENDIX S1).
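
As a companion to the description above, the following sketch illustrates one simple way a behavioral covariate such as an ESS score could be related to per-subject regional contrast estimates. The numbers, the region-of-interest averaging, and the use of a Pearson correlation are hypothetical illustrations rather than the study's actual statistical model.

```python
# Illustrative sketch with made-up numbers: relating a sleepiness covariate (ESS)
# to per-subject mPFC contrast estimates (answering > reflecting).
import numpy as np
from scipy import stats

# Hypothetical per-subject mean contrast values extracted from an mPFC region of interest.
mpfc_contrast = np.array([0.42, 0.31, 0.55, 0.12, 0.05, 0.38, 0.20, 0.09])

# Corresponding ESS scores (range 0-24; higher values indicate greater daytime sleepiness).
ess_scores = np.array([4, 6, 3, 11, 14, 5, 9, 15])

# A negative correlation would mirror the reported pattern: greater sleepiness,
# lower (more deactivated) mPFC response during nonanalytical reasoning.
r, p = stats.pearsonr(ess_scores, mpfc_contrast)
print(f"r = {r:.2f}, p = {p:.3f}")
```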

Collectively, these findings suggest that the mPFC is influenced by sleep during nonanalytical reasoning.

In addition, we examined whether and how burnout modulates the neural substrates of clinical reasoning. Burnout is a syndrome of emotional exhaustion, depersonalization, and a low sense of personal accomplishment. It is highly prevalent across the continuum of medical education and has a meaningful impact on clinical reasoning performance. We expected that novices would be particularly vulnerable to such exogenous stressors because they are still developing the neural processing networks associated with expertise, so their networks may be less stable and more susceptible to change. Our results indeed revealed that interns had higher emotional exhaustion (EE) and depersonalization (DP) scores than experts and that these measures were significantly correlated with fMRI findings for interns, but not for experts. Higher EE was associated with middle frontal gyrus (MFG) and posterior cingulate cortex (PCC) activation (see Figure 2c in APPENDIX S1). Higher DP was associated with MFG, precuneus, and dorsolateral PFC deactivation (see Figure 2d in APPENDIX S1). In summary, in this study interns appeared to be more susceptible to burnout effects on brain processes associated with cognition, which indicates that interns may need both cognitive and emotional support to improve quality of life and to work and learn effectively.

Future Directions

We believe that our recent fMRI work has provided some insight into the neurobiological processes involved in clinical reasoning and into how factors such as sleepiness and burnout may affect how physicians process information related to clinical reasoning. So far, our work has examined the neuroimaging correlates of expert clinical reasoning performance, nonanalytical reasoning in experts and novices, and how such processes are modulated by burnout and sleep. Such findings help to illuminate contemporary reasoning theories such as dual-process theory, neural efficiency, and cognitive load theory. In this section, we outline potential future educational directions for this work, including informing theory, assisting with instrument development, and supplementing diagnosis and remediation.

Informing Theory

The use of fMRI has the potential to bolster our understanding of various cognitive and learning theories. In particular, one can view the relationship between fMRI and clinical reasoning theory as bidirectional: fMRI data can be used to inform clinical reasoning theory, and clinical reasoning theory can motivate novel investigations using fMRI.47 Functional MRI can illuminate the functional neuroanatomy of a construct (in our case, clinical reasoning), but clinical reasoning theory can also facilitate the interpretation of neuroimaging data.

We therefore identify relevant educational theories and neuroscience theories that could be combined and used as a means of interpreting fMRI findings. For example, research into the role of the prefrontal cortex in error detection and task switching (ie, knowing when to switch from nonanalytical to analytical clinical reasoning) could be used to further explore and refine the neurobiological basis of learning. This could be accomplished by incorporating a learning "intervention" into the paradigm used in our investigations and imaging individuals before and after the intervention to assess for neurobiological changes. Functional MRI research could also be used to better understand when people fail to retrieve existing knowledge from memory or do not manage to apply existing knowledge successfully to clinical problems; the fMRI literature on memory tasks could inform such work for clinical reasoning. Apart from the instructional side of education, fMRI could also be used to enhance our understanding of transformative learning theories48 and of theories of motivation and emotion in complex clinical problem solving.49 Such investigations could not only enhance our understanding but also potentially lead to needed interventions to assist physicians, such as steps that could be taken to mitigate burnout.

Instrument Development

Functional MRI could also play a role in the future development of educational assessment instruments. For example, with advances in fMRI, different stimuli can be studied to enhance our understanding of how best to stimulate and assess physician cognition. Participants can view images, listen to sounds (such as heart sounds for interpretation, or a consultant speaking, while in the scanner), or speak (jaw motion is now permissible in some fMRI scanners). For instance, comparing experts and novices, or high and low performers within a group, could enhance our understanding of how such assessments function in different sets of learners by revealing the neurobiological processes involved. Such work could also include an intervention phase (eg, feedback). Through these advances, possible studies could examine whether there are functional neuroimaging differences between assessment stimulus formats: for example, reading a description of an X-ray or electrocardiogram (ECG) versus viewing it; listening to a heart sound versus reading a description of it; or thinking aloud in the scanner versus thinking aloud after leaving it. Videos could also be viewed in the fMRI scanner, enabling more authentic testing of workplace performance.

Supplementing Diagnosis and Remediation

In terms of diagnosis and remediation, future studies could explore differences among groups of physicians such as aging physicians, physicians with performance difficulties, and struggling medical students.

Questions with still images, sounds, or video could be incorporated into the fMRI methodology to provide scenarios that are closer to actual practice than MCQs. From an assessment standpoint, we would argue that fMRI, if linked to theory, may be most helpful for struggling medical students or physicians by providing better means of identifying and remediating focused problems in their learning pathways or professional functioning. Understanding the process and, at times, the complexity of clinical reasoning is essential, and through such work diagnostic and therapeutic interventions for clinical reasoning could emerge. Given cost and limited access, such interventions would seem most appropriate for "at risk" physicians whose clinical reasoning may be suspect for one reason or another (eg, struggling physicians or aging physicians). Their performance and fMRI activation patterns could be compared with those of a nonstruggling cohort, which could elucidate potential strategies for helping these struggling physicians.

A futuristic view of physician certification and recertification could involve not only identification of correct responses but also activation of preferred brain pathways. This might involve asking more straightforward questions to clarify that the appropriate areas of the brain are engaged, and it could enable the provision of more tailored feedback to facilitate improvement. Given the relatively few brain areas involved in our investigations to date, it is plausible that such an approach could be considered for medicine.

Findings from explorations of physicians' clinical reasoning could also have implications for other professions, as well as for education in general. For example, dual-process theory has been demonstrated to be useful in diverse fields such as economics, chess, and sports. Enhancing our understanding of this theory through fMRI investigations could lead to insights that apply to other fields. Furthermore, identifying an expertise "network" in medicine, as well as understanding how such networks emerge with growing expertise, could affect our understanding of educational policies and practices. In this light, combining educational theory, advanced neuroimaging, and a validated stimulus (eg, high-stakes examination questions) would have serious implications for the care of patients and for our understanding of cognition in medical education. We believe that the ability to more directly observe the brain areas activated in the clinical reasoning process, to build and revise existing theory, and to correlate and validate other modalities and existing instruments helps us to ascertain the strengths and weaknesses of MCQs and think-aloud methods, and it establishes a valuable role for fMRI in the medical education community both today and into the future.

Lessons for Practice

● Assessing clinical reasoning is difficult due, in part, to an inability to directly observe cognitive processes.
● Functional MRI provides a means of more directly "observing" cognition.
● A series of studies and opportunities for future research and practice are discussed.

Supporting Information

Additional Supporting Information may be found in the online version of this article at the publisher's web site:

APPENDIX S1. Using functional magnetic resonance imaging to improve how we understand, teach, and assess clinical reasoning.

As a service to our authors and readers, this journal provides supporting information supplied by the authors. Such materials are peer reviewed and may be reorganized for online delivery, but are not copy edited or typeset. Technical support issues arising from supporting information (other than missing files) should be addressed to the authors.

Acknowledgment

We would like to thank Rebecca S. Lipner, PhD, senior vice president of evaluation, research, and development, American Board of Internal Medicine.

References

1. Durning SJ, Artino AR Jr, Schuwirth L, van der Vleuten C. Clarifying assumptions to enhance our understanding and assessment of clinical reasoning. Acad Med. 2013;88(4):442–448.
2. Graber M. Diagnostic errors in medicine: a case of neglect. Jt Comm J Qual Patient Saf. 2005;31(2):106–113.
3. Croskerry P. Clinical cognition and diagnostic error: applications of a dual process model of reasoning. Adv Health Sci Educ Theory Pract. 2009;14(Suppl 1):27–35.
4. Elstein AS, Schwartz A. Clinical problem solving and diagnostic decision making: selective review of the cognitive literature. BMJ. 2002;324(7339):729–732.
5. Schuwirth L. Is assessment of clinical reasoning still the Holy Grail? Med Educ. 2009;43:298–300.
6. Case SM, Swanson DB. Extended-matching items: a practical alternative to free-response questions. Teach Learn Med. 1993;5(2):107–115.
7. Eva KW. What every teacher needs to know about clinical reasoning. Med Educ. 2004;39:98–106.
8. Page G, Bordage G, Allen T. Developing key-feature problems and examinations to assess clinical decision-making skills. Acad Med. 1995;70(3):194–201.

9. Bordage G. An alternative approach to PMPs: the "key-features" concept. Paper presented at: Proceedings of the Second Ottawa Conference; 1987; Montreal, Canada.
10. Charlin B, Tardif J, Boshuizen HP. Scripts and medical diagnostic knowledge: theory and applications for clinical reasoning instruction and research. Acad Med. 2000;75(2):182–190.
11. Lubarsky S, Charlin B, Cook DA, Chalk C, van der Vleuten CP. Script concordance testing: a review of published validity evidence. Med Educ. 2011;45(4):329–338.
12. Norcini JJ, Keskausas J, Langdon J, Webster G. An evaluation of a computer simulation in the assessment of clinical competence. Eval Health Prof. 1986;9(3):286–304.
13. Rimoldi HJ. The test of diagnostic skills. J Med Educ. 1961;36:73–79.
14. Swanson DB, Norcini JJ, Grosso LJ. Assessment of clinical competence: written and computer-based simulations. Assess Eval Higher Educ. 1987;12(3):220–246.
15. Chi MTH, Glaser R, Rees E. Expertise in problem solving. In: Sternberg RJ, ed. Advances in the Psychology of Human Intelligence. Hillsdale, NJ: Lawrence Erlbaum; 1982:7–76.
16. Eva KW. On the generality of specificity. Med Educ. 2003;37:587–588.
17. Schmidt HG, Boshuizen HP. On the origin of intermediate effects in clinical case recall. Mem Cognit. 1993;21(3):338–351.
18. Evans JS. Dual-processing accounts of reasoning, judgment, and social cognition. Annu Rev Psychol. 2008;59:255–278.
19. Boreham NC. The dangerous practice of thinking. Med Educ. 1994;28(3):172–179.
20. Ericsson KA. An expert-performance perspective of research on medical expertise: the study of clinical performance. Med Educ. 2007;41(12):1124–1130.
21. Ericsson KA, Charness N. Expert performance: its structure and acquisition. Am Psychologist. 1994;49(8):725–747.
22. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–S81.
23. Ericsson KA. The Cambridge Handbook of Expertise and Expert Performance. Cambridge/New York: Cambridge University Press; 2006.
24. Suen HK, Sonak B, Zimmaro D, Roberts DM. Concept map as scaffolding for authentic assessment. Psychol Rep. 1997;81(3):734–734.
25. Russo JE, Johnson EJ, Stephens DL. The validity of verbal protocols. Mem Cognit. 1989;17(6):759–769.
26. Bandettini PA. Twenty years of functional MRI: the science and the stories. Neuroimage. 2012;62(2):575–588.
27. Huettel SA, Song AW, McCarthy G. Functional Magnetic Resonance Imaging. 2nd ed. Sunderland, MA: Sinauer Associates; 2008.
28. Schuwirth LWT, Verheggen MM, Van der Vleuten CPM, Boshuizen HPA, Dinant GJ. Do short cases elicit different thinking processes than factual knowledge questions do? Med Educ. 2001;5(4):348–356.
29. Durning SJ, Graner J, Artino AR Jr, et al. Using functional neuroimaging combined with a think-aloud protocol to explore clinical reasoning expertise in internal medicine. Mil Med. 2012;177(9 Suppl):72–78.
30. Stuss DT, Benson DF. The Frontal Lobes. New York: Raven Press; 1986.
31. Cavanna AE, Trimble MR. The precuneus: a review of its functional anatomy and behavioural correlates. Brain. 2006;129(Pt 3):564–583.
32. Babiloni C, Marzano N, Infarinato F, et al. "Neural efficiency" of experts' brain during judgment of actions: a high-resolution EEG study in elite and amateur karate athletes. Behav Brain Res. 2010;207(2):466–475.
33. Capotosto P, Perrucci MG, Brunetti M, et al. Is there "neural efficiency" during the processing of visuo-spatial information in male humans? An EEG study. Behav Brain Res. 2009;205(2):468–474.
34. Del Percio C, Babiloni C, Marzano N, et al. "Neural efficiency" of athletes' brain for upright standing: a high-resolution EEG study. Brain Res Bull. 2009;79(3–4):193–200.

35. Del Percio C, Rossini PM, Marzano N, et al. Is there a "neural efficiency" in athletes? A high-resolution EEG study. Neuroimage. 2008;42(4):1544–1553.
36. Grabner RH, Neubauer AC, Stern E. Superior performance and neural efficiency: the impact of intelligence and expertise. Brain Res Bull. 2006;69(4):422–439.
37. Neubauer AC, Fink A. Intelligence and neural efficiency. Neurosci Biobehav Rev. 2009;33(7):1004–1023.
38. Sayala S, Sala JB, Courtney SM. Increased neural efficiency with repeated performance of a working memory task is information-type dependent. Cereb Cortex. 2006;16(5):609–617.
39. Strait DL, Kraus N, Skoe E, Ashley R. Musical experience and neural efficiency: effects of training on subcortical processing of vocal expressions of emotion. Eur J Neurosci. 2009;29(3):661–668.
40. van Merriënboer J, Sweller J. Cognitive load theory and complex learning: recent developments and future directions. Educ Psychol Rev. 2005;17(2):147–177.
41. van Merriënboer JJ, Sweller J. Cognitive load theory in health professional education: design principles and strategies. Med Educ. 2010;44(1):85–93.

42. Sweller J, Chandler P. Why some material is difficult to learn. Cogn Instr. 1994;12(3):185–233.
43. Johns MW. A new method for measuring daytime sleepiness: the Epworth sleepiness scale. Sleep. 1991;14(6):540–545.
44. Maslach C, Jackson SE, Leiter MP. Maslach Burnout Inventory Manual. 3rd ed. Palo Alto, CA: Consulting Psychologists Press; 1996.
45. Leppink J, Paas F, Van der Vleuten CP, Van Gog T, Van Merriënboer JJ. Development of an instrument for measuring different types of cognitive load. Behav Res Methods. 2013 Apr 10.
46. Sadeh A. The role and validity of actigraphy in sleep medicine: an update. Sleep Med Rev. 2011;15(4):259–267.
47. Wixted T, Mickes L. On the relationship between fMRI and theories of cognition: the arrow points in both directions. Perspect Psychol Sci. 2013;8(1):104–107.
48. Mezirow J. Transformative learning: theory to practice. New Directions Adult Contin Educ. 2002;74:5–12.
49. Bechara A, Damasio H, Damasio AR. Emotion, decision making and the orbitofrontal cortex. Cereb Cortex. 2000;10(3):295–307.
