The American Journal of Bioethics, 14(1): 18–35, 2014
Copyright © Taylor & Francis Group, LLC
ISSN: 1526-5161 print / 1536-0075 online
DOI: 10.1080/15265161.2013.861038

Open Peer Commentaries

Connecting Certification and Education

Toby Schonfeld, Emory University
Cory Labrecque, Emory University
Hugh Stoddard, Emory University

In their article “Structuring a Written Examination to Assess ASBH Health Care Ethics Consultation Core Knowledge Competencies,” Bruce D. White, Jane B. Jankowski, and Wayne N. Shelton (2014) describe a process for developing and overseeing a written certification exam as part of the credentialing process for health care ethics consultation. Although they engage many salient points regarding the development and administration of two different written examination formats (i.e., single-answer multiple-choice and short answer/essay), we contend that they have overlooked or minimized several key issues, including the nature of the core competencies for consultants, the characterization of a “content expert” for exam construction, the difficulty of objectivity in scoring, and the costs of administering the exam. These issues present substantial barriers to creating an examination, and they must be addressed before moving toward implementing a certification exam.

CORE COMPETENCIES

The plausibility of achieving standardization in a written examination for health care ethics consultation (HCEC) seems remote when “specific competencies required of an individual who performs ethics consultation will depend on the specific roles and responsibilities of that individual” and when there is “no one model [that] is suited to the full range of ethics consultations an institution is likely to encounter” (ASBH 2011, 19). Granted, the ASBH identifies three categories of skills necessary for health care ethics consultation: ethical assessment and analysis skills, process skills, and interpersonal skills—all of which have both basic and advanced levels (2011, 22–25).
In addition, the ASBH highlights nine areas of core knowledge required for HCEC, some of which are specific to the patient and staff population with whom the consultant is working (2011, 26–31). White and colleagues (2014) note that a written examination need not be the only mechanism for assessing skills, especially when the skill sets of individual consultants may be varied. This seems right if we are looking to weigh complex competencies—cognitive, meta-cognitive, and social (see Dochy, Segers, and Sluijsmans 1999, 332)—in light of the “considerable divergence of opinion about the kinds of educational programs and professional credentials necessary to be a candidate for certification” (White et al. 2014, 6). Yet it is unclear how one would create legitimate uniformity and standardization out of this convolution, or why the attempt to do so is warranted.

CONTENT EXPERTISE AND EXAM DESIGN/ASSESSMENT

First and foremost, expertise in content is completely divorced from expertise in exam creation. Extensive training in the sciences of psychometrics and educational assessment is required to create exams that meet professional standards (Gullickson 2003). White and colleagues (2014) suggest that a multidisciplinary board of “thought leaders” in the field would be sufficient to ensure a quality assessment. The experience of many other professional fields, such as medicine, teacher education, and law, demonstrates the naiveté of such a belief. The Joint Committee on Standards for Educational Evaluation lists 28 standards that must be met in order to produce a high-quality assessment (Gullickson 2003). It is especially important within the field of ethics that we adhere to these standards, particularly those that relate to principles of fairness and appropriateness.

If examinees will in fact be held accountable for the broad foundational concepts referenced in the core knowledge areas required for HCEC in the ASBH’s Core Competencies guide, then content decisions about the exam should be made through a national benchmarking process involving both experts in consultation and experts in bioethics more generally. Administering any exam is appropriate only after the presentation of a curriculum that is aligned with the exam’s contents. Given the proliferation of certificate and master’s programs in bioethics, it is vital for those designing the credentialing exam to be in accord with the curriculum leaders of those training programs regarding the content of the exam.
Address correspondence to Toby Schonfeld, Emory University, Center for Ethics, 1531 Dickey Drive, Atlanta, GA 30322, USA. E-mail: [email protected]

Many students pursue advanced education as a way to formalize their expertise in bioethics generally and in consultation more specifically. Accordingly, failure to coordinate with relevant educational programs may result in a high failure rate on the exam. Making students bear the burden of such a lack of coherence between program curricula and the material tested on the exam is unethical. Consequently, students may be discouraged from taking the exam or may treat the graduate degree or certificate as their “credential.” Alternatively, if the exam is designed so that targeted graduate study is not required, educational programs may find themselves superfluous if students can pass the exam without obtaining specific training.

In many professional fields, licensure or certification is possible only by completing both an accredited educational program and a standardized exam. The institutions that provide training must participate with the entities that administer the exam and those that issue the certificate. For example, the four major entities of allopathic physician education collaborate on the accreditation of schools, the creation and administration of board exams, and the issuance of licenses to practice. These include the Association of American Medical Colleges, the American Medical Association, the National Board of Medical Examiners, and the Federation of State Medical Boards.

White and colleagues (2014) have jump-started the discussion of a certification exam. However, they appear to discount the psychometric complexity of creating a fair, valid, and reliable instrument, and they do not account for the political and social barriers to proposing and achieving the necessary compliance from those educational programs that are expected to train candidates for the exam.

OBJECTIVITY AND THE COST OF GRADING

To envisage that “each examination question posed should be objective, focus on a single concept or single concept grouping, and be clear and unambiguous” and that each question be vetted individually “so that any extraneous or irrelevant ideas and cultural bias are eliminated” (White et al.
2014, 9) is one thing; accomplishing all of this is improbable, in addition to being time-consuming and expensive. At the level of graduate or professional examinations, truly “objective” questions are illusory. At every point of multiple-choice question composition and exam construction, decisions are made that are not “objective” but are based on the judgment of the exam designer(s). Granted, multiple-choice questions and exams can measure high-level thinking and can conform to the aforementioned standards; however, we believe that White and colleagues overlooked the assumptions built into the multiple-choice format.

On the other hand, we wish to emphasize the potential of carefully crafted scoring rubrics for constructed-response formats. Constructed-response formats provide opportunities for examinees to demonstrate knowledge and skills that are consistent with the certification goals that White and colleagues are pursuing. A scoring rubric allows a team of graders to assess the written examination using a common, predetermined description that elucidates what counts as failing to meet, meeting, and exceeding content and process objectives. The team approach is important in itself to keep individual grader bias in check. More importantly, analysis and reflection by scorers on examinee responses lead to a collective understanding of how educational programs, curricula, and national boards must align in order to implement a certification program. Accepted practice within the educational measurement community for certifying exams is readily available (Clauser, Margolis, and Case 2006). Discussion of creating a certification exam should start from the principles detailed in that work.

Without a doubt, White and colleagues propose laudable goals. However, the barriers to be encountered and the resources necessary to accomplish these goals are not sufficiently evaluated in their manuscript. More important than expending the necessary resources is the question of the benefit obtained. All things considered, administering a fair, well-constructed exam is better than not doing so. However, we believe that White and colleagues’ appraisal of the effort and negotiation needed for the process, that is, the costs of development, is far too low. Likewise, they do not build a compelling case for how the field would be improved by the imposition of a certifying exam or for how those benefits would be manifested.

Overall, we agree with White and colleagues that “to advance the field of health care ethics consultation thought leaders should start to focus on the written examination possibilities, to date unaddressed carefully in the literature,” and that “examination models—both objective and written—must be explored as part of a viable strategy about how the field of health care ethics consultations can move toward professionalization” (2014, 12). The implications of this will be equally important for assessment in bioethics education and for encouraging pedagogues in the field to first come to terms with what kind of learner and learning they are trying to cultivate.
At the same time, we as practitioners in the field must work from a realistic set of assumptions and data about what such an endeavor entails.

REFERENCES

American Society for Bioethics and Humanities. 2011. Core competencies for healthcare ethics consultation, 2nd ed. Glenview, IL: ASBH.

Clauser, B. E., M. J. Margolis, and S. M. Case. 2006. Testing for licensure and certification in the professions. Educational Measurement 4: 701–731.

Dochy, F., M. Segers, and D. Sluijsmans. 1999. The use of self-, peer and co-assessment in higher education: A review. Studies in Higher Education 24(3): 331–350.

Gullickson, A. R., ed. 2003. The student evaluation standards: How to improve evaluations of students. Thousand Oaks, CA: Corwin Press.

White, B. D., J. B. Jankowski, and W. N. Shelton. 2014. Structuring a written examination to assess ASBH health care ethics consultation core knowledge competencies. American Journal of Bioethics 14(1): 5–17.


