Medical Teacher

ISSN: 0142-159X (Print) 1466-187X (Online) Journal homepage: http://www.tandfonline.com/loi/imte20

The costs of medical education assessment
Craig Brown, Jennifer Cleland & Kieran Walsh

To cite this article: Craig Brown, Jennifer Cleland & Kieran Walsh (2016) The costs of medical education assessment, Medical Teacher, 38:2, 111-112, DOI: 10.3109/0142159X.2015.1105946
To link to this article: http://dx.doi.org/10.3109/0142159X.2015.1105946

Published online: 26 Nov 2015.



COMMENTARY

The costs of medical education assessment

CRAIG BROWN1, JENNIFER CLELAND1, & KIERAN WALSH2

1University of Aberdeen, Scotland; 2BMJ, UK

Medical schools and postgraduate training providers need to answer to governments, regulators, funders and the public in terms of whether what they are delivering is fit for purpose. "Fit for purpose" can be considered from a number of angles. Are we selecting the best applicants to successfully complete medical education and training? Are we producing the right doctors in terms of skills, knowledge and behaviours to meet the health needs of the communities they serve? And are we delivering these outcomes not just to a high standard but in a fiscally responsible way, given current financial challenges to the academic and healthcare institutions which deliver this education and training?

Medical education is inherently expensive (Walsh & Jaye 2013), yet little is known about the precise nature of the costs. We propose that assessment is a major component of medical education and, therefore, of spend on medical education. Assessment in medical education starts even before medical school, with the costs of selection being borne mostly by the academic institution, in terms of staff time for examining applications and interviewing students (Rosenfeld et al. 2008), but also by the applicants, in terms of direct payment for national selection exams (e.g., the US Medical College Admission Test, https://www.aamc.org/students/applying/mcat). The costs of assessment continue through medical school, with seemingly endless – from the perspective of both staff and students – rounds of written exams, objective structured clinical examinations (OSCEs) and workplace-based assessments (Wilkinson et al. 2008), all of which have direct (e.g., consumables) and indirect (e.g., administrative, clinical and academic staff time) costs. It does not stop there: postgraduate exams can cost thousands of pounds (Adam 2015), costs usually borne by the trainees or residents themselves.
This assessment burden is not likely to reduce – indeed, in our own country (the UK), it is due to increase with the proposal of an "entrance" exam for all doctors wishing to practise in the UK, akin to the United States Medical Licensing Examination (the USMLE). The format and costs of this "UKMLE" are currently unknown (USMLE costs are publicly available: $1200 for Step 1 [written exams] and $1275 for Step 2 [practical exams]), as is who will pay for it – the candidates, the medical schools, or the postgraduate training providers?

Regardless of the payer, examining the cost of assessments is a crucial component of planning and running high-quality, fit-for-purpose assessments. We are not the first to think this (Reznick et al. 1993; van der Vleuten 1996), and so are amazed that, relatively speaking, there is so little empirical research into the cost of assessment in medical education and training. While such research may be emerging in relation to "formal", examination-like assessments (Brown et al. 2015a), we could find little literature considering cost explicitly in relation to methods of assessment that are embedded into daily education practice, e.g., workplace-based assessment (WPBA; see Nair et al. 2014). One reason for this neglect may be that, whilst training departments may have a duty to release clinical staff to perform WPBAs, in reality this often means the clinician "squeezing" trainee/resident assessment around clinical duties. One outcome of this squeeze is that there are no data specifically relating to the costs of WPBA, yet in many countries an enormous number of WPBAs are required for doctors in training, across specialties and stages of training. The costs of the direct staff time required are clearly significant, as are the administration costs of producing assessment formats and of storing and managing assessment data (Brazil et al. 2012; Walsh 2013).

Going back to where we started, the first step in fiscal responsibility is to know what assessments cost. We can make few assumptions about the cost and value of different methods until there is an evidence base which encompasses "fitness for purpose" both educationally and fiscally. For example, one common assumption is that computer-based testing will save costs: yet a study by Mandel et al. (2011) showed that "computer-based formatting and evaluation of paper-based exams" was associated with lower costs than either a pure paper-based or pure computer-based method (see also Brown 2015).
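The utility framing cited above (van der Vleuten 1996) is often sketched as a weighted multiplicative model in which cost is one component alongside reliability and validity. One common rendering of that sketch (ours, with the weights as conceptual placeholders rather than a direct quotation of the original) is:

```latex
% Utility of an assessment method, after van der Vleuten (1996):
% a multiplicative trade-off in which no single component can be
% ignored, since a near-zero factor drags down the whole product.
U = w_R R \times w_V V \times w_E E \times w_A A \times w_C C
% R = reliability, V = validity, E = educational impact,
% A = acceptability, C = cost-efficiency; the weights w are
% context-dependent and set by the purpose of the assessment.
```

Read this way, cost-efficiency is not an afterthought: however reliable and valid a method is, its overall utility is capped by a poor cost-efficiency term, which is precisely why the cost term needs to be measured rather than assumed.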
There are likely to be numerous ways to save costs while still ensuring high-quality assessment. For example, OSCEs should be used for assessing clinical examination, communication or practical procedure skills; however, it may be that some procedural skills could be assessed more cost-efficiently over the whole programme using clinical skills passports or workplace-based assessments (Brown et al. 2015b). That the OSCE is an expensive examination (Brown et al. 2015a) is unarguable, but it should not be considered prohibitively so, especially when there is currently no other suitable instrument in terms of validity and reliability (Harden 2015). However, alternative, potentially more cost-effective, models of the OSCE could be considered (Pell et al. 2013; Currie et al., in press), and redundant assessment should be avoided. For example, in the UK, many medical programmes assess cardiopulmonary resuscitation (CPR) skills via OSCE, yet many medical schools also mandate passing the Immediate Life Support (ILS) course, which includes CPR. It would be cost-effective for institutions to remove CPR from OSCEs if it is being assessed elsewhere in the programme.

Ultimately, the method of assessment must balance cost with value, and must be decided upon in a sophisticated manner. A high-cost, low-value assessment method is the worst of all possible outcomes. A low-cost, low-value assessment method will not meet anyone's needs in the long term. A high-cost, high-value assessment method would be acceptable. But the ultimate goal must be a low-cost, high-value assessment method.

In conclusion, our knowledge of the broad literature on assessment tells us that much attention has been paid to the reliability and validity aspects of the van der Vleuten utility equation (van der Vleuten 1996), and less to the cost-effectiveness of assessments. This is short-sighted: those delivering medical education need to be explicit about the costs of high-quality, appropriate assessment (indeed, of all aspects of education and training) in order to know when cost saving is appropriate and when it is not. This information will arm education and training providers with the information necessary to manage their budgets and decide on resource allocation, maximising returns and minimising waste. Without it, providers are vulnerable to educationally inappropriate cost-saving measures being dictated from above.

By knowing the actual costs of medical education and assessment, and justifying your approaches with the best evidence, you will be in a stronger position to resist cost cutting where this could threaten the quality of your product (capable doctors), and hence patient safety. In the meantime, studies of the utility of assessment methods should mention the costs of these methods as a matter of routine, and should seek to balance these costs against the other components of their utility.

Correspondence: Craig Brown, Division of Medical & Dental Education, Polwarth Building, University of Aberdeen, Aberdeen, Scotland. E-mail: [email protected]

© 2015 Taylor & Francis. ISSN 0142-159X print/ISSN 1466-187X online. DOI: 10.3109/0142159X.2015.1105946

Notes on contributors

CRAIG BROWN, MBChB, BScMedSci(Hons), MRCEM, PGDipMedEd, FHEA, MAcadMEd, is an honorary clinical lecturer at the University of Aberdeen and a registrar in Emergency Medicine.

JENNIFER CLELAND, BSc, MSc, PhD, D Clinical Psychol, is the John Simpson Chair of Medical Education at the University of Aberdeen, Scotland, and Chair of the Association for the Study of Medical Education (ASME).

KIERAN WALSH, FRCPI, is the Clinical Director of BMJ Learning, London.

Declaration of interest: The authors report no conflicts of interest. The authors alone are responsible for the content and writing of the article.

References

Adam E. 2015. The cost of training. Emerg Med J. [Epub ahead of print]. doi: 10.1136/emermed-2015-205161.
Brazil V, Ratcliffe L, Zhang J, Davin L. 2012. Mini-CEX as a workplace-based assessment tool for interns in an emergency department – does cost outweigh value? Med Teach 34(12):1017–1023.
Brown C. 2015. Tablet- or iPad-based marking of OSCEs and MMIs: An imaginative cost-saving approach. Med Teach. [Epub ahead of print]. doi: 10.3109/0142159X.2015.1072270.
Brown C, Ross S, Cleland J, Walsh K. 2015a. Money makes the (medical assessment) world go round: The cost of components of a summative final year Objective Structured Clinical Examination (OSCE). Med Teach 37(7):653–659.
Brown C, Sinha S, Cooper A. 2015b. The consumables cost of practical procedure teaching, revision and assessment. Unpublished data.
Currie G, Sivasubramaniam S, Cleland J. [in press]. Sequential testing in the Objective Structured Clinical Examination: Determining the number of screening tests. Med Teach. [Epub ahead of print]. doi: 10.3109/0142159X.2015.1079309.
Harden R. 2015. Misconceptions and the OSCE. Med Teach 37(7):608–610.
Mandel A, Hörnlein A, Ifland M, Lüneburg E, Deckert J, Puppe F. 2011. Cost analysis for computer supported multiple-choice paper examinations. GMS Z Med Ausbild 28(4):Doc55.
Nair B, Searles A, Ling R, Wein J, Ingham K. 2014. Workplace-based assessment for international medical graduates: At what cost? Med J Aust 200(1):41–44.
Pell G, Fuller R, Homer M, Roberts T. 2013. Advancing the objective structured clinical examination: Sequential testing in theory and practice. Med Educ 47(6):569–577.
Reznick R, Smee S, Baumber J, Cohen R, Rothman A, Blackmore D, Berard M. 1993. Guidelines for estimating the real cost of an objective structured clinical examination. Acad Med 68(7):513–517.
Rosenfeld J, Reiter H, Trinh K, Eva K. 2008. A cost efficiency comparison between the multiple mini-interview and traditional admissions interviews. Adv Health Sci Educ Theory Pract 13(1):43–58.
Van der Vleuten CP. 1996. The assessment of professional competence: Developments, research and practical implications. Adv Health Sci Educ Theory Pract 1(1):41–67.
Walsh K. 2013. The cost and utility of the mini-CEX. Med Teach 35(9):789.
Walsh K, Jaye P. 2013. Cost and value in medical education. Educ Prim Care 24(6):391–393.
Wilkinson J, Crossley J, Wragg A, Mills P, Cowan G, Wade W. 2008. Implementing workplace-based assessment across the medical specialties in the United Kingdom. Med Educ 42:364–373.
