
Developing and Successfully Implementing a Competency-Based Portfolio Assessment System in a Postgraduate Family Medicine Residency Program

Laura A. McEwen, PhD, Jane Griffiths, MD, CCFP, FCFP, and Karen Schultz, MD, CCFP, FCFP

Abstract

The use of portfolios in postgraduate medical residency education to support competency development is increasing; however, the processes by which these assessment systems are designed, implemented, and maintained are emergent. The authors describe the needs assessment, development, implementation, and continuing quality improvement processes that have shaped the Portfolio Assessment Support System (PASS) used by the postgraduate family medicine program at Queen’s University since 2009. Their description includes the impetus for change and contextual realities that guided the effort, plus the processes used for selecting assessment components and developing strategic supports. The authors discuss the identification of impact measures at the individual, programmatic, and institutional levels and the ways the department uses these to monitor how PASS supports competency development, scaffolds residents’ self-regulated learning skills, and promotes professional identity formation. They describe the “academic advisor” role and provide an appendix covering the portfolio elements. Reflection elements include learning plans, clinical question logs, confidence surveys, and reflections about continuity of care and significant incidents. Learning module elements cover the required, online bioethics, global health, and consult-request modules. Assessment elements cover each resident’s research project, clinical audits, presentations, objective structured clinical exam and simulated office oral exam results, field notes, entrustable professional activities, multisource feedback, and in-training evaluation reports. Document elements are the resident’s continuing medical education activities including procedures log, attendance log, and patient demographic summaries.

L.A. McEwen is director, Assessment and Evaluation, Postgraduate Medical Education, and assistant professor, Department of Pediatrics, Queen’s University, Kingston, Ontario, Canada.

J. Griffiths is assistant professor and assessment director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

K. Schultz is associate professor and program director, Department of Family Medicine, Queen’s University, Kingston, Ontario, Canada.

Correspondence should be directed to Laura A. McEwen, PGME, Queen’s University, Faculty of Health Sciences, 70 Barrie St., Kingston, ON K7L 3N6; telephone: (613) 533-6000, ext. 74918; e-mail: [email protected].

Acad Med. 2015;90:1515–1526. First published online May 18, 2015. doi: 10.1097/ACM.0000000000000754

The move towards competency-based models of postgraduate residency education has emphasized the role of assessment. Portfolios have been recognized as useful structures for collecting, organizing, and managing the large volume of assessment information necessary to support these educational models.1 The potential for portfolios to function as catalysts for learning by promoting residents’ active engagement in and responsibility for the learning process makes these tools even more appealing.2 The structural and functional advantages, along with the endorsement of regulating bodies, have spurred uptake.3 However, the processes by which these assessment systems are designed, implemented, and maintained are emergent.4–7

In this article, we describe the needs assessment, development, implementation, and continuing quality improvement processes that have shaped the Portfolio Assessment Support System (PASS) used in residency training in the Department of Family Medicine at Queen’s University (Kingston, Ontario, Canada). As we built PASS, we used our educational philosophy—that supporting high-quality resident learning and assessment is a purposeful, deliberate activity—to guide our decisions. PASS supports competency development, scaffolds the use of self-regulated learning skills, and promotes professional identity formation. We understand that individual PASS components and processes are not unique, but in sharing our overall experience and explaining PASS in toto, we hope to support others who are currently engaged in systematic portfolio design and/or who could potentially adapt aspects of PASS for their local programs.

Needs Assessment

Two events provided the impetus for major curricular reform and assessment system development in the family medicine department at Queen’s University. First, in 2009, the College of Family Physicians of Canada (CFPC), which is the accrediting body for Canadian family medicine training programs, launched the new competency-based Triple C Curriculum. The new curriculum mandated that programs provide training that (1) focuses on producing comprehensively trained family physicians; (2) incorporates continuity of patient care, curriculum, and supervision; and (3) centers that training in family medicine.


According to the CFPC, the new outcomes-oriented approach required “carefully designed curricular elements to achieve clearly stated desired outcomes” that would replace “traditional time-based educational strategies.”8

Second, the Queen’s University family medicine residency program grew from a single site with 100 residents centrally located in Kingston, Ontario, to a distributed program with 130 residents at four sites spread over a radius of 200 kilometers (125 miles). In 2008, 17 academic family medicine physicians supported the clinical training of the 100 residents. Now, approximately 20 salaried academic family physicians are affiliated with the original Kingston site, and 15 new part-time salaried physicians are affiliated with the three expansion sites. These physicians plan and implement programs, teach and assess residents, and provide individualized learner support and competency-based assessment by functioning as “academic advisors” or AAs (see Creating processes, below). An additional 750 to 1,000 community family physicians and specialists receive “sessional” payments for the teaching they provide, including supervising clinical care and/or facilitating academic sessions. Given our distributed program and our large number of residents and faculty, we felt that a Web-based assessment platform was necessary.

Operationalizing the Triple C Curriculum at Queen’s has resulted in three different two-year curriculum structures. The original Kingston site maintains a traditional, rotation-based structure of 26 four-week blocks over two years. There are 12 blocks of family medicine training, and the 14 remaining blocks provide experiences relevant to family medicine such as obstetrics and pediatrics. In this structure, residents move from rotation to rotation with a consequent change in learning environment. Two of our new training sites (Belleville and Oshawa) have adopted predominantly longitudinal structures with few rotations. In this structure, residents are situated in ongoing longitudinal family medicine training environments. Threaded throughout the program are other short off-service learning experiences, and residents bring information learned on off-service experiences back to be applied in the family medicine environment. Our third new training site (Peterborough) uses a hybrid model, combining rotation-based and longitudinal learning experiences.


The diversity in curriculum structure across our sites, coupled with the shift to a competency-based educational model, rendered conventional assessment processes in the form of rotation-based in-training evaluation reports (ITERs) inadequate. From an administrative perspective, the ITER system lacked the flexibility to accommodate the primarily longitudinal curriculum structure without rotations. Even for our more traditional rotation-based sites, the ITER system lacked the comprehensiveness to document residents’ emerging competence as it developed over time and across different rotations. Competence in caring for children, for example, builds over time through rotations in family medicine, pediatrics, emergency medicine, and psychiatry. The rotation-based ITER system simply could not capture information of the quality or granularity necessary to formulate judgments about family medicine residents’ growing competence. We required an assessment system that would allow us to sample, collate, and interpret resident performance more longitudinally.8–10

As the Queen’s family medicine residency program has expanded, other needs have arisen. Although Queen’s University sets family medicine program objectives centrally, individual teaching sites leverage local strengths and resources, ultimately enacting curriculum in unique ways. Further, some important program objectives (e.g., those related to global health) are not achievable at all sites. Consequently, we have developed centralized online learning resources and assessments for these objectives such that learners across all sites can access and use them. Finally, we have been mindful that, given the wide scope of practice in the field of family medicine, there is no way to predict residents’ future needs in terms of knowledge and skills. Therefore, we have prioritized both supporting the development of residents’ self-regulated learning skills and fostering their early identity formation as family physicians.

Our competency-based assessment needs are not unique. Medical education in many areas of the world is transitioning to competency-based educational models.11,12 Longitudinal programs that incorporate bounded learning environments and continuity of patient and supervisor relationships, both of which seem to increase authentic learning experiences, are increasingly popular.13–15 We believe, therefore, that assessment systems like PASS, which feature not only purposefully defined components for capturing learners’ longitudinal growth but also processes for compiling and interpreting this assessment information so as to inform declarations of competency, have widespread applicability.

Development

The development of the Queen’s family medicine program’s innovative postgraduate portfolio assessment system began in 2008. Anticipating the move to competency-based medical education (CBME), our assessment director (J.G.) conducted an extensive review of the growing literature on the topic. She noticed that, with respect to assessment, portfolios were a recurrent theme and that educators generally considered this system of capturing progress to align well with the tenets of CBME.1 According to the research literature, portfolios offer a useful structure for collecting, organizing, and managing the wide variety of assessment information required for competency-based assessment.16,17 They also have the potential to function as catalysts for learning by enhancing residents’ active engagement in and responsibility for their learning processes.2,18 The literature, however, cautions that the extent to which this catalytic benefit is realized in practice relates closely to how faculty facilitate and implement portfolio assessment processes.1,19 Specifically, the literature highlights the role of mentors who function as guides in supporting residents’ active reflection about their learning process.20,21 Overall, the literature emphasizes the purposeful inclusion of informed mentors as a critical aspect of successful portfolio design (Figure 1).

Considering context

As we began to blueprint assessment components and develop assessment processes, we were cognizant of our contextual constraints. Our assessment system had to accommodate the needs of multiple training sites—some of which were community based and some of which were hospital based.


Figure 1. Components of the Portfolio Assessment Support System (PASS) used by the Family Medicine Postgraduate Training Program at Queen’s University in Kingston, Ontario, Canada. The figure depicts the electronic portfolio components feeding into academic advisor (AA) meetings with the resident every 4 months. Electronic portfolio components: Reflections (e.g., learning plan, competency reflection, clinical question logs, continuity of care reflection); Learning modules; Assessments (e.g., EPA field notes, In-training Evaluation Reports, resident project, Multisource Feedback); Documents (e.g., procedure log, attendance log, patient summary). Roles of the AA: interpreting assessment data and making competency development decisions; planning individualized programs of study; fostering identity as a developing family doctor; coaching and supporting the resident’s self-regulated learning; monitoring completion of residency requirements; attending to resident well-being. EPA indicates entrustable professional activity.25

Ultimately, we needed a flexible system that was easily accessible for a diverse range of geographically distributed users. These requirements highlighted the need for a Web-based solution.

Selecting assessment components

Initially, the requirements of our accrediting college (CFPC) guided our assessment system blueprinting. The assessment director (J.G.) conducted a comprehensive analysis of the CFPC’s skills dimensions, the 99 priority topics,22 the “Phases of the Clinical Encounter,”9 and the CanMEDS–family medicine roles.10 Next, the assessment director consulted with the postgraduate assessment specialist (L.A.M.), which helped us consider the future practice demands of residents through the lens of self-regulated learning theory. On the basis of this analysis of the literature and consultation, we defined a preliminary list of components for inclusion in our portfolios. Each component mapped to an established need (e.g., addressing an objective; providing structured opportunities for residents to reflect, self-assess, and collaboratively plan their learning; responding to an accreditation requirement; or providing small biopsies of competency assessment, which, when collated, would assist AAs in formulating competency decisions). In addition to the mapping of specific needs to particular components, we purposefully elicited feedback from all stakeholders, including educational program leaders, clinical teachers, and residents. Stakeholders provided comments and critiques on a regular basis to ensure utility and feasibility and to foster community ownership of the system. These sessions raised practical concerns about issues regarding confidentiality and the possible legal implications of sharing personal writing or reflections. Such concerns prompted consultations with university legal services that, in turn, served to inform guidelines around confidentiality and professional responsibilities for reportable issues (e.g., portfolios can be viewed only by a select few to ensure privacy; exceptions include responding to concerns that fall under the Mandatory Reporting policy of the College of Physicians and Surgeons of Ontario). Finally, we also sought insights from assessment experts external to Queen’s. This comprehensive process served to shape a purposeful, thorough portfolio blueprint for Queen’s postgraduate family medicine program (see Appendix 1).

Creating processes


The program director (K.S.) recognized early that, because of the sheer number of learners, coupled with their geographical distribution, she would need to delegate responsibility for monitoring their growth and making decisions about their attainment of competency. We developed the AA role to assume these responsibilities. As the portfolio assessment system developed, scheduling regular AA–resident meetings became a key process issue. We conceptualized that AAs would function essentially as academic coaches.


They would review portfolios, interpret assessment data, coach and support residents’ self-regulated learning (SRL) processes, engage in deliberate mentoring, promote residents’ professional identity formation as family physicians, and ultimately make decisions about progress and advancement. We engaged faculty in discussions about these responsibilities early and sought feedback about the kinds of support they felt they would need to fulfill this critical role. In effect, we collaboratively defined the parameters of the academic advising task, which empowered the community and further fostered ownership of assessment processes.

Implementation

Phase 1

We strategically phased in the implementation of PASS over two academic years. In 2009 we initiated the portfolio components. We provided data sticks for residents to store their work and share it with AAs in advance of meetings. During this phase we formalized academic advising processes. Prior to implementing PASS, faculty advisors met with residents only twice a year to discuss career plans and general progress through the program. The lack of specific objectives to guide the process resulted in wide variation in the focus, quality, and usefulness of these meetings. We addressed this weakness by establishing specific objectives for academic advising meetings and by instituting a scheduling protocol (meetings are to occur at four-month intervals). We also clearly defined preparatory processes for residents (e.g., reflective writing tasks, collating paper-based field notes23 [see Appendix 1 for detailed information about field notes]), and we delineated AAs’ responsibilities for reviewing assessment information in preparation for meetings. This first phase served to orient AAs to their new role, focusing on assessment rather than strictly mentoring. It also familiarized residents and faculty with the main components of the portfolio.

Phase 2

Simultaneously, the assessment director (J.G.) worked with our Web designer to develop the electronic portfolio structure. Together, they created a Web-based solution tailored to meet our exact requirements and specifications. Again, we engaged residents and faculty in the development process to ensure that the interface was intuitive and met their needs. Their periodic involvement eased their transition to the electronic format in 2010 because they were familiar with the interface.


Further, users welcomed the move away from data sticks. After graduation, residents’ assessment information is removed from the active Web-based platform. Objective assessment data are kept in a digital format according to university policy; however, all subjective data are stored and kept in a deidentified format for program quality improvement and research.

Support

In addition to engaging in ongoing community consultation to help shape PASS and to stage implementation over two years, we have also developed an extensive network of user supports. We have identified “assessment leads” for each educational site who assume responsibility for orienting local users of the system. All residents receive a one-hour tutorial at the beginning of the program. Further, each portfolio component includes instructions, and we developed resource documents and videos for the “help” sections of PASS. Finally, we revise all of these annually to ensure that they remain current and useful.

In addition to the technical supports, the platform includes embedded meeting checklists, including regular progress check-ins and the requirements for progression from first to second year and for graduation from the program. These support features help focus meetings and ensure that residents stay on track. In addition, we developed review protocols for site and program directors whereby they are able to review all portfolios one month prior to residents’ completion of the program. Residents with incomplete portfolios, as well as their AAs, receive a reminder that a complete portfolio is a condition for graduation.

Initially, AAs attended several one-hour faculty development sessions covering topics such as the parameters of the AA role, writing field notes, and navigating PASS. Annual individualized AA development sessions provide ongoing support. In addition, AAs receive compensation not only for the time they spend meeting with residents but also for the time they spend preparing for those meetings (e.g., reading and commenting on reflective exercises, reviewing modules, and assessing performance data). Remuneration for preparatory effort is an institutionalized acknowledgment of the investment required by AAs to effectively facilitate meetings with residents and actively support their individualized growth.

Furthermore, remuneration makes explicit the value Queen’s University places on the AAs, who play a vital role in the assessment system by providing informed, professional judgments about residents’ progress and declaring them competent for independent practice.

Continuing Quality Improvement

We have several quality improvement mechanisms in place that function to identify and address system problems and weaknesses as they arise. Our assessment director (J.G.) has a dedicated 0.2 FTE (one day per week) for the continued development, support, and oversight of assessment processes for the entire family medicine program. Assessment leads at each site have just under a half-day of dedicated time per week to support their work in that role. Our full-time Web designer manages all technical troubleshooting in a timely manner.

In keeping with the community ownership model that we initiated during the design phase, we highly value, encourage, and swiftly address user feedback. The Resident Assessment Advisory Group meets with the assessment director on a regular basis to bring forward ideas and concerns. For example, when residents raised concerns about differences in the quality of AA meetings, we not only shared that feedback with faculty but also developed an AA evaluation system. Residents now complete evaluations of their AAs at regular intervals just as they do for their clinical preceptors. Residents’ feedback has been incorporated into the academic physician review processes along with field note completion rates (as compared with departmental averages).

Likewise, in response to feedback from faculty about the burden of scheduling AA meetings with residents, we have provided administrative support for scheduling these sessions. All meetings occur during regular working hours and are scheduled at the beginning of the academic year. If and when rescheduling is required, administrative support staff organize the new meeting, relieving AAs of this burden. Furthermore, there is an institutional expectation that physicians reserve a 30-minute time slot in every clinical day for directly observing, providing feedback to, and documenting the performance of residents under their purview.


Our educational philosophy is that supporting high-quality learning for and assessment of residents does not occur by happenstance; rather, it is a purposeful, deliberate activity that requires dedicated clinician time and attention.

Program Evaluation

We use multiple measures to monitor the impact of PASS at the individual, programmatic, and institutional levels. At the individual level, we gather and examine learners’ and faculty members’ perspectives about the impact and value of PASS in supporting CBME. At the programmatic level, we monitor field note completion rates and use cohort assessment data to inform curriculum development. Finally, at the institutional level, we monitor our effectiveness in identifying and remediating residents in difficulty. Ethical clearance for the use of interview quotes and survey data was granted by Queen’s Health Sciences Research Ethics Board, and an exemption was granted for the use of program evaluation statistics.

Individual level

Residents’ perspectives. In February 2012, the CFPC appointed a Working Group for Survey Development. The group’s mandate was to develop a series of surveys (Entry, Graduate, and Practice) as a component of a longitudinal outcomes-based program evaluation of the Triple C Curriculum.24 The purpose of these surveys was to gather residents’ perspectives about their preparation for and intentions to practice. Respondents are asked to indicate their level of agreement with statements about their residency experiences, including assessment, on a five-point Likert scale. The Graduate survey was piloted in the spring of 2013 at Queen’s, and 67% (32/48) of the graduating family medicine residents completed it. Queen’s residents’ responses to the assessment items are one measure of the impact of PASS.

The results of the 2013 survey indicate that 75% of Queen’s family medicine residents (24/32) agreed or strongly agreed that they understood programmatic expectations and were actively aware of their progress throughout the program. Furthermore, 66% (21/32) reported involvement in tailoring their learning when individual needs were identified, and the great majority (91%, n = 29) indicated that they were confident in their ability to identify personal learning needs.

Overall, residents have a solid grasp of the standards for performance and use those standards to self-monitor their development over the course of the residency program. The fact that many report both assuming an active role in directing their learning and feeling confident in their ability to do so going forward into practice suggests that our investment in academic advising is having the intended effect of supporting the development of learners’ self-regulated learning skills.

Faculty members’ perspectives. Interviews with AAs for an upcoming article suggest that they value the electronic portfolio as a one-stop shop or central repository for resident performance information. In particular, AAs value the capacity to sort field notes; as one AA relates, “You can look at the grid of how they did in different domains to get a big-picture view. And then you can drill down within to get a small-picture view.” AAs perceive this kind of access to assessment data as a means to facilitate the early identification of performance gaps and to support planning for individualized learning opportunities. One AA described how he, in collaboration with a resident, “made an assessment of [her] competency and then looked to see, okay, [what] are your areas that you need to continue to work on? What do you have coming up? Let’s see if we can tailor this a bit for what you need.”

AAs also recognize the value of the system beyond the accessibility and comprehensiveness of assessment information. In the words of one AA, regular meetings with residents “have drawn the academic advisors and their residents closer.” Another mentioned how reflective components of the portfolio supported her own self-reflection because “it’s a time for me to think—what do I do as a family doc?” Although AAs acknowledge the challenges associated with more intensive assessment processes, they also recognize the value of those same processes. To illustrate, one AA commented, “It’s not easy, but it’s an important thing to feel like you’re making a difference.” Another AA explicitly discussed his/her great sense of pride about being “acknowledged as being a leader in the area.”


Programmatic level

Field note completion rates. Another important measure we use to monitor the impact of PASS is field note completion rates. Prior to implementation of the electronic field note, residents received, on average, 8 paper field notes annually. In 2011, that number had risen to 14. By 2012, residents received an average of 41 field notes annually; in 2013 the average was 46, and in 2014 the average was 59. We now have more than 23,000 field notes written and compiled in our system.

Having successfully sustained the upward trend in field note completion rates over three years, our attention is shifting to quality. In keeping with our community ownership orientation, we have developed a three-pronged approach to the assessment of field note quality. Resident input is sought through a nomination process whereby residents share field notes they found particularly impactful, and field note functionality is being further enhanced to allow residents to indicate how a field note impacted their learning. We are also developing a field note audit tool that will be used by a trained administrative assistant to randomly monitor timeliness, tone, and focus. Finally, we plan to engage preceptors in a supported self-reflective process about the quality of feedback they provide in field notes. Preceptors will receive a bundle of 10 recently composed field notes along with the preceptor field note audit tool to support the review process. Our hope is to have this activity certified for the CFPC continuous professional development process as a means of formally acknowledging our preceptors’ contribution to educational quality.

Cohort assessment data. The utility of cohort assessment data to inform curriculum revision is another measure of the impact of PASS. Graduating residents complete a survey indicating their level of confidence in 99 key topic areas22 and clinical procedures. The curriculum committee reviews survey results annually to identify any gaps as perceived by graduates. When a substantial percentage of residents (10% or more) report lacking confidence in or not having had the opportunity to learn about a particular topic or procedure during residency, we consider curriculum revisions.


These revisions may take the form of additional didactic teaching sessions during academic days (e.g., adding a talk on seizures) or adding procedures to the simulation sessions (e.g., simulated toenail removal).

Institutional level

Identification and remediation of residents in difficulty is an important measure of the impact of PASS. We monitor this particular measure at the institutional level through the number of cases referred to our Educational Advisory Board (EAB). The EAB is a special committee, convened by the associate dean of postgraduate medical education, that is responsible for assisting with academic planning for residents in need. Of the 24 cases reviewed by the EAB since 2013, 11 were family medicine cases. In all 11 cases, the EAB commended the family medicine program for the extensive resident performance information made available and for the high-quality remediation plans proposed.

Outside acknowledgment

Although external acknowledgment of PASS is not officially part of its evaluation system, the recognition of other groups involved in postgraduate medical education is encouraging. Winning the Professional Association of Interns and Residents of Ontario’s prestigious Residency Program Excellence Award in 2012 (plus being nominated again in 2015) is evidence that Queen’s residents value the quality of learning and assessment that PASS enables. Additionally, the official acknowledgment and high praise that PASS received from the CFPC after its most recent accreditation visit affirm the value and quality of the system.

In Sum

In summary, the strategically designed components; the thoughtfully orchestrated processes grounded in educational theory; the dedicated, informed AAs; and the robust, customized, flexible electronic platform are the cornerstones of PASS. PASS has been in use for five years, and its processes, components, and platform functionality all continue to evolve so as to serve the needs of faculty and residents. In spite of the complexity of change that moving to PASS has involved, it has been strongly endorsed by family medicine faculty and residents—and by the Queen’s postgraduate medical education community more generally. Impact measures at the individual, programmatic, and institutional levels provide evidence that the program is realizing our initial intentions and goals.


Acknowledgments: The authors acknowledge editors at Academic Medicine, Elizabeth S. Karlin and Anne L. Farmakidis, for their patience and guidance. They also thank Ulemu Luhanga for her contribution in finalizing this manuscript.

Funding/Support: There was no external funding supporting this project. The Queen’s University Department of Family Medicine supported this work by funding dedicated time for the program director (K.S.) and assessment director (J.G.) and by funding a dedicated Web designer (Rachelle Porter). The Office of Postgraduate Medical Education at Queen’s University supported this work by funding dedicated time of the director, Assessment and Evaluation (L.A.M.).

Other disclosures: None reported.

Ethical approval: Ethical clearance for the use of interview quotes and survey data was granted by the Health Sciences and Affiliated Teaching Hospitals Research Ethics Board of Queen’s University, Kingston, Ontario, Canada. An exemption was also granted for the use of program evaluation statistics.

Previous presentations: 2010: Getting Started with Portfolio Evaluation Systems. Workshop at ICRE, Ottawa, Ontario, Canada; 2010: Portfolio Power: Triple C Mandate: Enhancing Education and Evaluation. Poster at CFPC Family Medicine Forum, Vancouver, British Columbia, Canada; 2011: The Triple C Mandate. Poster, CCME, Toronto, Ontario, Canada.

References
1 Tochel C, Haig A, Hesketh A, et al. The effectiveness of portfolios for post-graduate assessment and education: BEME guide no 12. Med Teach. 2009;31:299–318.
2 Van Tartwijk J, Driessen EW. Portfolios for assessment and learning: AMEE guide no. 45. Med Teach. 2009;31:790–801.
3 Donato AA, George DL. A blueprint for implementation of a structured portfolio in an internal medicine residency. Acad Med. 2012;87:185–191.
4 Nagler A, Andolsek K, Padmore JS. The unintended consequences of portfolios in graduate medical education. Acad Med. 2009;84:1522–1526.
5 Webb TP, Merkley TR, Wade TJ, Simpson D, Yudkowsky R, Harris I. Assessing competency in practice-based learning: A foundation for milestones in learning portfolio entries. J Surg Educ. 2014;71:472–479.
6 Cooney CM, Redett RJ 3rd, Dorafshar AH, Zarrabi B, Lifchez SD. Integrating the NAS milestones and handheld technology to improve residency training and assessment. J Surg Educ. 2014;71:39–42.
7 Hurtubise L, Roman B. Competency-based curricular design to encourage significant learning. Curr Probl Pediatr Adolesc Health Care. 2014;44:164–169.
8 Tannenbaum D, Kerr J, Konkin J, et al. Triple C Competency-Based Curriculum: Report of the Working Group on Postgraduate Curriculum Review—Part 1. Mississauga, Ontario, Canada: College of Family Physicians of Canada; March 2011. http://www.cfpc.ca/uploadedfiles/education/_pdfs/wgcr_triplec_report_english_final_18mar11.pdf. Accessed March 27, 2014.
9 Oandasan IF, Saucier D, eds. Triple C Competency-Based Curriculum Report—Part 2: Advancing Implementation. Mississauga, Ontario, Canada: College of Family Physicians of Canada; 2013. http://www.cfpc.ca/uploadedfiles/education/_pdfs/triplec_report_pt2.pdf. Accessed March 27, 2015.
10 Tannenbaum D, Konkin J, Parsons E, et al. CanMEDS—Family Medicine. Mississauga, Ontario, Canada: College of Family Physicians of Canada; October 2009. http://www.cfpc.ca/uploadedFiles/Education/CanMeds%20FM%20Eng.pdf. Accessed March 27, 2015.
11 Frank JR, Snell LS, ten Cate O, et al. Competency-based medical education: Theory to practice. Med Teach. 2010;32:638–645.
12 Iobst WF, Sherbino J, ten Cate O, et al. Competency-based medical education in postgraduate medical education. Med Teach. 2010;32:651–656.
13 Holmboe ES, Ward DS, Reznick RK, et al. Faculty development in assessment: The missing link in competency-based medical education. Acad Med. 2011;86:460–467.
14 Torbeck L, Wrightson AS. A method for defining competency-based promotion criteria for family medicine residents. Acad Med. 2005;80:832–839.
15 Litzelman DK, Cottingham AH. The new formal competency-based Indiana University School of Medicine: Overview and five-year analysis. Acad Med. 2007;82:410–421.
16 Mathers NJ, Challis MC, Howe AC, Field NJ. Portfolios in continuing medical education—Effective and efficient? Med Educ. 1999;33:521–530.
17 Carraccio C, Englander R. Evaluating competence using a portfolio: A literature review and Web-based application to the ACGME competencies. Teach Learn Med. 2004;16:381–387.
18 Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: Consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33:206–214.
19 Van Tartwijk J, Driessen E, Van Der Vleuten C, Stokking K. Factors influencing the successful introduction of portfolios. Qual High Educ. 2007;13:69–79.
20 Donato AA, Pangaro L, Smith C, et al. Evaluation of a novel assessment form for observing medical residents: A randomised, controlled trial. Med Educ. 2008;42:1234–1242.
21 Tigelaar DEH, Dolmans DHJM, Wolfhagen IHAP, van der Vleuten CPM. Using a conceptual framework and the opinions of portfolio experts to develop a teaching portfolio prototype. Stud Educ Eval. 2004;30:305–321.


22 Allen T, Bethune C, Brailovsky C, et al. Defining Competence for the Purposes of Certification by the College of Family Physicians of Canada: The Evaluation Objectives in Family Medicine. Mississauga, Ontario, Canada: College of Family Physicians of Canada; 2010. http://www.cfpc.ca/uploadedFiles/Education/Defining%20Competence%20Complete%20Document%20bookmarked.pdf. Accessed March 27, 2015.
23 Donoff MG. Field notes: Assisting achievement and documenting competence. Can Fam Physician. 2009;55:1260–1262, e100.

24 Oandasan IF, Archibald D, Authier L, et al. Giving curriculum planners an edge: Using entrance surveys to design family medicine education. Can Fam Physician. 2015;61:e204–210.

References cited in Figure 1 or Appendix 1
25 Ten Cate O. Nuts and bolts of entrustable professional activities. J Grad Med Educ. 2013;5:157–158.
26 Schultz K, Griffiths J, Lacasse M. The application of entrustable professional activities to inform competency decisions in a family medicine residency program [published online ahead of print February 23, 2015]. Acad Med. doi: 10.1097/ACM.0000000000000671.


27 McEwen LA, Luhanga U, Griffiths J, Schultz K, Spiller A, Acker A. Queen’s Multisource Feedback Rubrics: Operationalizing Frames of Reference for Raters and Residents. Presented at: 2013 AAMC Annual Meeting MedEdPORTAL Poster Session on “Excelling in Health Education Assessment.” MedEdPORTAL iCollaborative, Resource ID: 820. https://www.mededportal.org/icollaborative/resource/820. Accessed March 27, 2015.
28 The Evaluation Objectives in Family Medicine: Procedure Skills. Mississauga, Ontario, Canada: College of Family Physicians of Canada. http://www.cfpc.ca/uploadedFiles/Education/Procedure%20Skills.pdf. Accessed March 27, 2015.


Appendix 1

Queen’s University Family Medicine (FM) Postgraduate Training Program Portfolio Assessment and Support System Blueprint

Reflection elements

Learning plan
• Duration and/or scheduling: Iterative, 4-month intervals
• Action: Resident self-assesses learning needs based on portfolio review, formulates learning goals in relation to program objectives, develops a learning plan, and documents attainment of goals
• Purpose: To promote reflection and to foster development of self-regulated learning (SRL) skills
• Focus of assessment: Medical Expert, Manager, Scholar, Professional
• Educational benefits: Makes SRL process explicit, affording opportunities for support from the academic advisor (AA); widens scope of reflection to global competency development; enhances resident responsibility for and ownership of learning process; emphasizes importance and provides evidence of professional learning ability
• Possible challenges: Learning resources may not be readily available; establishing markers as thresholds of learning goal attainment; providing faculty development resources and garnering faculty buy-in

Clinical question log
• Duration and/or scheduling: Alternate academic coaching meetings
• Action: Resident reflects on a clinical problem experienced, identifies problem-specific learning needs, documents learning strategies employed, and summarizes learning gains
• Purpose: To promote reflection and effective use of learning resources, and to foster development of SRL skills
• Focus of assessment: Medical Expert, Manager, Scholar
• Educational benefits: Identifies personally relevant topics; promotes active reflection-on-action; exposes clinical reasoning and problem-solving strategies that can remain hidden in clinical environments; promotes discussion and mentoring by AA; fosters learning partnership between resident and AA
• Possible challenges: Identifying and accessing resources to address question(s)

Confidence survey
• Duration and/or scheduling: Three times over the course of the program
• Action: Resident documents self-evaluated confidence in managing the College of Family Physicians of Canada’s 99 priority topics,22 core procedures, and CanMEDS–FM roles10
• Purpose: To scaffold self-monitoring
• Focus of assessment: Medical Expert, Professional
• Educational benefits: Identifies gaps in competency development; informs exam preparation; identifies gaps in formal curriculum
• Possible challenges: Time-consuming; resident’s hesitancy to admit and/or document weak performance

Continuity-of-care reflection
• Duration and/or scheduling: Once, end of postgraduate year (PGY) 1
• Action: Resident reflects on benefits and challenges of continuity of care, a central aspect of FM
• Purpose: To promote reflection and to foster professional identity formation
• Focus of assessment: Medical Expert, Communicator, Health Advocate, Professional
• Educational benefits: Fosters development of communication skills; highlights benefits and challenges associated with long-term patient relationships; supports personal growth and formation of professional identity; promotes discussion and mentoring
• Possible challenges: Resident’s hesitancy to discuss some issues; lack of confidence about responding to residents’ reflective writing on the part of the AA

Significant incident reflection
• Duration and/or scheduling: Alternate academic coaching meetings
• Action: Resident writes about a significant incident encountered in the clinical environment and reflects on the cognitive and emotional impact on all people involved
• Purpose: To promote reflection and to create a springboard for discussion and mentoring around difficult situations
• Focus of assessment: Medical Expert, Communicator, Health Advocate, Professional
• Educational benefits: Supports personal growth and development; promotes discussion about ethics and other challenging issues with AA
• Possible challenges: Resident’s hesitancy to admit and/or document issues exposing weak performance; lack of confidence about responding to residents’ reflective writing on the part of the AA; potential for resident’s writing to expose patient care situations that were not well handled

Learning modules

Bioethics, Global Health, and Writing Consult Request Letters
• Duration and/or scheduling: Prior to completion of program
• Action: Resident completes each learning module independently or in groups
• Purpose: To promote reflection
• Focus of assessment: Dependent on module topic
• Educational benefits: Provides standardized learning opportunities for content that residents may have limited clinical exposure to; offers flexible learning opportunities; fosters valuable discussion when completed in groups
• Possible challenges: Completed outside of clinical hours; not a substitute for workplace learning; requires inferring competency development

Assessment

Research project
• Duration and/or scheduling: Once, mid-PGY2
• Action: Resident conducts a scholarly project, under the guidance of a faculty supervisor, involving one of the following: an in-depth critical review of the literature, a research project, an advocacy project, or an IT project
• Purpose: To provide exposure to research processes and evidence-based medicine
• Focus of assessment: Communicator, Manager, Scholar, Professional
• Educational benefits: Promotes development of research, critical appraisal, and time management skills; fosters development of specialized knowledge in a focused topic area
• Possible challenges: Significant time investment; lack of confidence about supporting resident’s research process on the part of the AA; providing resources to support resident’s work

Clinical practice audit
• Duration and/or scheduling: Once, end of PGY1
• Action: Resident completes a clinical practice audit on an aspect of primary care
• Purpose: To provide exposure to quality improvement processes
• Focus of assessment: Communicator, Manager, Scholar, Professional
• Educational benefits: Fosters development of quality improvement skills in the office setting; provides opportunity for residents to plan and apply implementation strategies; exposes impact of practice improvements on quality of care; provides opportunity for mentoring around ongoing practice improvement
• Possible challenges: Providing resources to support resident’s work

Presentation
• Duration and/or scheduling: 1 preceptor assessment required, and multiple audience member assessments encouraged
• Action: Resident prepares a presentation on the topic of his/her choice, is assessed by a preceptor, and collects and aggregates assessments from audience members
• Purpose: To provide an assessment of resident’s formal teaching skills by peers and preceptors
• Focus of assessment: Medical Expert, Communicator, Scholar
• Educational benefits: Fosters assessment-seeking behavior; provides structured feedback about teaching approach
• Possible challenges: Potential for peers to be less or overly critical

Objective structured clinical exam (OSCE)
• Duration and/or scheduling: Once, mid-PGY2
• Action: Resident completes a 5-station OSCE
• Purpose: To assess resident’s procedural and patient interaction skills through a standardized means
• Focus of assessment: Medical Expert, Communicator, Manager, Health Advocate
• Educational benefits: Provides opportunity for practice with access to feedback; allows flexibility about which competencies are assessed
• Possible challenges: Cost of development and delivery; rater training; loss of clinical time; administrative burden

Simulated office oral (SOO)
• Duration and/or scheduling: Multiple (>10) formal opportunities during program; informal practice encouraged
• Action: Resident completes a 15-minute simulated office-based exam evaluating assessment, management, and patient-centered medicine skills
• Purpose: To assess residents’ competencies in patient-centered care through a standardized means
• Focus of assessment: All CanMEDS–FM roles (dependent on context/focus)
• Educational benefits: Provides opportunity for practice with access to feedback; familiarizes resident with certification exam format; allows flexibility about which competencies are assessed
• Possible challenges: Faculty development required for reliable scoring

Field notes (FNs)23
• Explanation: A means of documenting verbal feedback given by the preceptor to the resident about his or her performance, ideally daily, usually based on direct observation in a clinical situation. FNs are coded for assessment frameworks, include an assessment of this small “biopsy” of performance, and include a free-text box to document narrative feedback.
• Duration and/or scheduling: 1 per half-day clinic
• Action: Resident performs clinical activity, and preceptor or allied health professional completes a Web-based FN, which may be flagged for follow-up should significant weaknesses emerge; resident may draft an FN on behalf of the preceptor, based on the verbal feedback given, and forward this to the preceptor for review and comment
• Purpose: To allow—through the Web-based FNs, which are collected over time, collated, and displayed within the portfolio—interpretation about patterns of performance and trajectory across multiple competencies and settings by multiple observers
• Focus of assessment: All CanMEDS–FM roles (dependent on context/focus)
• Educational benefits: Enables provision of feedback linked directly to clinical performance; affords opportunities to verify resident’s interpretation of feedback when a resident completes the FN him/herself; provides a window into resident’s ability to independently resolve learning issues when flagging function is activated; supports self-monitoring by resident and ongoing tracking of competency development by AA
• Possible challenges: Variable quality; narrowly focused on easily assessed competencies (procedure skills, greeting patients, things done well, etc.); negotiating standards of performance among assessors

Entrustable professional activities (EPAs)25,26
• Explanation: Core professional activities (e.g., well-baby care) that are the expression of multiple integrated competencies as they apply in the clinical setting. Thirty-five EPAs were written for FM training with benchmarking for 3 levels of performance.
• Duration and/or scheduling: 1 per half-day clinic
• Action: Resident performs clinical activity, and preceptor fills out an FN to document feedback given. The preceptor electronically codes the FN by choosing the appropriate EPA on its electronic interface, selects the phase of encounter (e.g., history, physical), determines the level of competence demonstrated (descriptions of 3 levels of performance are available), and enters qualitative feedback. These FNs collate to provide the documentation of competence for each EPA.
• Purpose: To document clinical competence across a range of clinical settings (this serves as the primary basis for competency decisions)
• Focus of assessment: All CanMEDS–FM roles (dependent on context/focus)
• Educational benefits: Makes standards of performance explicit for both resident and preceptor; directs focus of assessment to important aspects of performance; scaffolds preceptor’s provision of high-quality, focused feedback linked to clinical performance; supports self-monitoring by resident and ongoing tracking of competency development by AA; links EPAs with established FN framework, facilitating uptake with minimal faculty development
• Possible challenges: Yet to be identified

Multisource feedback (MSF) rubrics27
• Duration and/or scheduling: Once, end of PGY1
• Action: Resident sends MSF questionnaires to team nurse(s), nurse practitioner, receptionist, team administrative assistant, team social worker, 2 resident colleagues, 2 faculty members, and any other team member with whom the resident has a working relationship (total 8–15 people); MSF data are collated automatically upon submission of each rubric within the portfolio; resident also completes the same rubric prior to reviewing aggregated MSF data; patient feedback is collected electronically using touchscreen tablet computers in waiting rooms; patient data are collated automatically and displayed alongside the team MSF in the portfolio
• Purpose: To document clinical competence from multiple perspectives
• Focus of assessment: Communicator, Collaborator, Manager, Professional
• Educational benefits: Offers insights about resident’s abilities from a variety of perspectives; supports calibration of resident’s self-assessments when reviewed with guidance from AA; provides development opportunities for whole team with regard to the role of feedback in learning and the characteristics of constructive feedback
• Possible challenges: Allied health professionals’ apprehension about assuming assessor role

In-training evaluation reports (ITERs)
• Duration and/or scheduling: At 4-month intervals, minimally
• Action: Primary preceptors complete ITERs that summarize residents’ performance over the course of a specific rotation or longitudinal learning experience; the completion of ITERs is based on a review of FNs
• Purpose: To ensure ongoing progress and timely identification of residents in difficulty so as to ensure access to remedial support
• Focus of assessment: All CanMEDS–FM roles
• Educational benefits: Reveals patterns of performance over time based on multiple observations by multiple observers; triggers remediation processes when warranted
• Possible challenges: Failure to fail when appropriate; lack of standards of performance

Document

Procedure log28
• Duration and/or scheduling: Continuous
• Action: Resident logs exposure to, and level of confidence with, core and enhanced procedures
• Purpose: To document range of exposure to clinical procedures and self-assessed confidence
• Focus of assessment: Medical Expert, Professional
• Educational benefits: Promotes self-monitoring; identifies gaps in exposure to procedures for AA; makes explicit resident’s perceptions of confidence and competency development
• Possible challenges: Subject to falsification given self-reported nature of data; need for diligent recording

Attendance log
• Duration and/or scheduling: Continuous
• Action: Resident signs into all formal curriculum events; data are collated and uploaded by administrative staff
• Purpose: To document level of engagement in program
• Focus of assessment: Medical Expert, Scholar, Professional
• Educational benefits: Supports identification of resident’s learning needs; fosters discussions about level of engagement
• Possible challenges: Potential for falsification (e.g., peers signing in for others, early departure); administrative and IT burden for data tracking

Summary of demographics of patients on core FM rotation
• Duration and/or scheduling: End of PGY1
• Action: Administrative staff upload spreadsheet of information extracted from electronic medical records of resident’s patients
• Purpose: To document breadth of exposure to various diagnoses, patient profiles, and continuity of care
• Focus of assessment: Medical Expert, Scholar
• Educational benefits: Supports self-monitoring; supports identification of learning needs
• Possible challenges: Administrative and IT burden for data extraction
