MILITARY MEDICINE, 180, 4:97, 2015

Development and Initial Validation of a Program Director's Evaluation Form for Medical School Graduates

Ting Dong, PhD*; Steven J. Durning, MD, PhD*; COL William R. Gilliland, MC USA (Ret.)†; Kimberly A. Swygert, PhD‡; CDR Anthony R. Artino Jr., MSC USN*

*Department of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814.
†F. Edward Hebert School of Medicine, Uniformed Services University of the Health Sciences, 4301 Jones Bridge Road, Bethesda, MD 20814.
‡National Board of Medical Examiners, 3750 Market Street, Philadelphia, PA 19104.

The views expressed in this article are those of the authors and do not necessarily reflect the official policy or position of the Department of Defense or the U.S. Government.

doi: 10.7205/MILMED-D-14-00551

ABSTRACT  Background: In the early 1990s, our group of interdepartmental academicians at the Uniformed Services University (USU) developed a PGY-1 (postgraduate year 1) program director evaluation form. Recently, we revised it to better align with the core competencies established by the Accreditation Council for Graduate Medical Education. We also included items that reflected USU's military-unique context. Purpose: To collect feasibility, reliability, and validity evidence for our revised survey. Method: We collected PGY-1 data from program directors (PDs) who oversee the training of military medical trainees. The cohort of the present study consisted of USU students graduating in 2010 and 2011. We performed exploratory factor analysis (EFA) to examine the factorial validity of the survey scores and subjected each of the factors identified in the EFA to an internal consistency reliability analysis. We then performed correlation analysis to examine the relationships between PD ratings and students' medical school grade point averages (GPAs) and their performance on U.S. Medical Licensing Examination (USMLE) Step assessments. Results: Five factors emerged from the EFA: Medical Expertise, Military-unique Practice, Professionalism, Systems-based Practice, and Communication and Interpersonal Skills. The evaluation form also showed good reliability and feasibility. All five factors were more strongly associated with students' GPA in the initial clerkship year than with their GPA in the first 2 years. Further, these factors showed stronger correlations with students' performance on Step 3 than with the other Step examinations. Conclusions: The revised PD evaluation form appears to be a valid and reliable tool for gauging medical graduates' first-year internship performance.

INTRODUCTION

Medical educators are responsible for training competent physicians. This responsibility is not simply a goal; it is a fundamental obligation to society. To help fulfill this obligation, medical educators need to develop reliable and valid assessment measures that evaluate domains of competence across the spectrum of medical education.

Medical educators currently lack diverse, reliable, and valid measures to assess trainees at the transition between undergraduate and graduate medical education (GME).1 This gap is due, in part, to the fact that undergraduate medical education assessment tools typically differ from GME assessment tools. Furthermore, from a feasibility standpoint, it is difficult to track graduates across the wide spectrum of residency programs offered in the United States.

Another potential reason for this gap in assessment measures is that medical schools and GME programs use different evaluation frameworks. In GME, the emphasis of evaluation frameworks is on competencies including patient care, medical knowledge, professionalism, communication and interpersonal skills, systems-based practice, and practice-based learning and improvement.


These competencies are used in multiple GME settings, and evidence to support their reliability and validity has been gathered previously.2 Medical schools, until recently, have not adapted competencies from the Accreditation Council for Graduate Medical Education (ACGME) or other sources into their evaluation frameworks.

In the early 1990s, our group of interdepartmental academicians at the Uniformed Services University of the Health Sciences (USU) developed a program director evaluation form. In a prior study of 1,247 forms from eight specialties, with a response rate of 80%, an 18-item postgraduate year 1 (PGY-1) survey demonstrated reliability (high internal consistency) and validity, with the items organized into a two-factor structure: Expertise and Professionalism.3 In that study, Expertise displayed a modest correlation with students' medical school grade point average (GPA) and with U.S. Medical Licensing Examination (USMLE) Step 1 and Step 2 Clinical Knowledge (CK) scores.

We recently revised this program director (PD) evaluation form to better align the items with the competencies established by the ACGME. In the revised form, we also included items that reflected USU's military-unique context. The purpose of the present study was therefore to collect feasibility, reliability, and validity evidence for our revised survey. For the investigation of validity, we included students' performance on all USMLE Step assessments and their medical school GPAs. We hypothesized that PD evaluations would correlate more strongly with clinical GPA than with preclinical GPA, and would show a stronger association with Step 2 Clinical Skills (CS) and Step 3 than with Step 1 or Step 2 CK, because PD evaluations are based mainly on interns' clinical performance and are more proximate in time to clerkship GPA, Step 2 CS, and Step 3.



METHOD

Study Context

This study was part of the larger Long-Term Career Outcome Study conducted at the F. Edward Hebert School of Medicine, USU. As the United States' only federal medical school, USU matriculates approximately 170 medical students annually and, at the time of this study, offered a traditional 4-year curriculum: 2 years of basic science courses followed by 2 years of clinical rotations (clerkships).

Item Development

The PGY-1 survey developed in the present study was based on the 2005 survey and took into account the core competencies defined by the ACGME: (1) patient care, (2) medical and population health knowledge, (3) interpersonal and communication skills, (4) practice-based learning and improvement, (5) professionalism, and (6) systems-based practice (see Appendix).4 The survey included 58 items organized into six sections: Patient Care, Communication and Interpersonal Skills, Medical Knowledge, Professionalism, Systems-based Practice, and Military-unique Practice. The last two items were general ratings of "overall clinical competence" and whether "I would trust this trainee to care for me or a close family member." These two items were excluded from the subsequent exploratory factor analysis; the remaining 56 items are listed in Table I.

Participants and Procedures

We collect PGY-1 data annually from program directors who oversee the training of military medical trainees. Each spring, we identified the military treatment facilities (and some nonmilitary training programs) where our interns and residents were trained, and we mailed the evaluation forms for each trainee to the respective GME PDs. We asked GME directors to distribute the forms to their PDs and then return the completed forms in envelopes or via emailed attachment. Forms were logged upon receipt. The cohort of the present study consisted of USU students graduating in 2010 and 2011.

Measures and Statistical Analyses

First, we conducted an exploratory factor analysis (EFA) to examine the factorial validity of the survey scores. Next, we subjected each of the factors identified in the EFA to an internal consistency reliability analysis and computed a mean score for the items associated with each factor (i.e., the variables were unweighted composite scores). Finally, to investigate the construct validity of the survey scores, we performed correlation analysis to examine the relationships between PD ratings and students' medical school GPAs, including preclinical GPA, initial clerkship year GPA, and cumulative GPA. We also examined the correlations between PD ratings and students' performance on the USMLE Step assessments. Statistical significance was set at p < 0.01 because of the number of correlation coefficients being calculated.
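The analyses described here were run in SPSS (see Results), but the EFA step can be sketched in open-source terms. The following Python fragment is an illustrative sketch only, not the study's code: the file name, the frame of ratings, and its columns q1 through q56 are hypothetical stand-ins, and the factor_analyzer package's "principal" extraction is used here as an approximation of SPSS's principal axis factoring.

    # Illustrative sketch only; the study itself used SPSS 22.0.
    # "pgy1_surveys.csv" and columns q1..q56 are hypothetical stand-ins.
    import pandas as pd
    from factor_analyzer import FactorAnalyzer

    ratings = pd.read_csv("pgy1_surveys.csv")    # one row per returned PGY-1 form
    item_cols = [f"q{i}" for i in range(1, 57)]  # the 56 items entered into the EFA
    items = ratings[item_cols].mask(ratings[item_cols] == 0).dropna()  # 0 = unable to judge

    # Factor extraction with an oblique (Direct Oblimin) rotation.
    fa = FactorAnalyzer(n_factors=5, method="principal", rotation="oblimin")
    fa.fit(items)

    pattern = pd.DataFrame(fa.loadings_, index=item_cols)  # pattern coefficients
    communalities = pd.Series(fa.get_communalities(), index=item_cols)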


Preclinical GPA was calculated using course grades from the first 2 years of medical school. Initial clerkship year GPA is a composite of students' clerkship clinical points (a weighted summation of a student's clinical grade recommendations), objective structured clinical examination scores, and National Board of Medical Examiners subject examination scores. Cumulative GPA is the GPA calculated across the entire 4 years of medical school. All GPAs were on a common 4-point scale.

The USMLE is a single program consisting of four separate examinations designed to assess an examinee's understanding of, and ability to apply, concepts and principles that are important in health, disease, and effective patient care. We obtained study participants' USMLE Step 1 and Step 2 CK scores from USU's Registrar's Office, and Step 2 CS and Step 3 scores from the National Board of Medical Examiners. Step 2 CS is a standardized patient-based clinical skills assessment with a set of three conjunctive outcomes: Integrated Clinical Encounter (ICE), Communication and Interpersonal Skills (CIS), and Spoken English Proficiency (SEP). Because the SEP component shows little meaningful variability for native English speakers, as medical students in the United States overwhelmingly tend to be, the present study focused only on the ICE and CIS components.

All analyses were completed using SPSS 22.0 (IBM Corporation, New York, NY), and the study was approved by the USU Institutional Review Board.

RESULTS

Exploratory Factor Analysis

We had 293 surveys returned, for a response rate of 86.2% (293/340). We conducted a principal axis factor analysis with oblique rotation (Direct Oblimin). Evaluation of the correlation matrix indicated that it was factorable: the Kaiser-Meyer-Olkin Measure of Sampling Adequacy was 0.95, which is "marvelous" (>0.90) according to Kaiser's criteria.2 Bartlett's test of sphericity was significant (χ² = 18,521.47, df = 1,540, p < 0.001), indicating that the correlation matrix was not an identity matrix, and all measures of sampling adequacy were deemed sufficient (i.e., >0.60).5

We determined the number of factors to extract using several criteria, including parallel analysis, examination of the resulting scree plot, and eigenvalues greater than 1.0.6 All three criteria suggested a five-factor solution, with the five factors accounting for 87.0% of the total variance in the items. Inspection of the table of communalities revealed that all the items had high extracted communalities (i.e., >0.40; see Table I), indicating that much of the common variance in the items can be explained by the five extracted factors.5 We used several additional rules to determine the number of factors and the individual items to retain in the final solution: (1) factors needed to contain at least three items, (2) retained items needed a factor pattern coefficient (absolute value) >0.40 on at least one factor, and (3) items with factor pattern coefficients (absolute value) >0.30 on more than one factor were dropped (see recommendations in Pett et al).5
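Continuing the illustrative sketch above (same hypothetical items frame), the factorability checks and the parallel-analysis criterion could be reproduced as follows; parallel_analysis is a small helper written here for illustration, not a library function.

    # Factorability checks, continuing the hypothetical sketch above.
    import numpy as np
    from factor_analyzer.factor_analyzer import (
        calculate_bartlett_sphericity, calculate_kmo)

    chi_square, p_value = calculate_bartlett_sphericity(items)  # Bartlett's test
    kmo_per_item, kmo_overall = calculate_kmo(items)            # "marvelous" if > 0.90

    def parallel_analysis(data: np.ndarray, n_iter: int = 100, seed: int = 0) -> int:
        """Horn's parallel analysis: count observed eigenvalues that exceed
        the mean eigenvalues of same-shaped random normal data."""
        rng = np.random.default_rng(seed)
        n, k = data.shape
        observed = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
        simulated = np.zeros(k)
        for _ in range(n_iter):
            draw = rng.standard_normal((n, k))
            simulated += np.linalg.eigvalsh(np.corrcoef(draw, rowvar=False))[::-1]
        return int((observed > simulated / n_iter).sum())

    n_factors = parallel_analysis(items.to_numpy(dtype=float))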


TABLE I. Results From the Exploratory Factor Analysis With Oblique Rotation (Direct Oblimin) of the PGY-1 Survey (N = 293)

Item                                                                                                              Communality   Factor
Q1   Conducting Patient Histories                                                                                 0.967         1
Q2   Physical Examination Skills                                                                                  0.986         1
Q3   Conducting Daily Patient Evaluations                                                                         0.977         1
Q4   Performing Basic Technical Skills (e.g., Inserting IV, Suturing, and Inserting A-line)                       0.970         1
Q5   Performing Advanced Procedural Skills (e.g., Inserting Central Line, Performing Endoscopy,
     and Operating Room Performance)                                                                              0.974         1
Q6   Analysis of Clinical Data, Differential Diagnosis, and Selection/Interpretation of Tests                     0.961         1
Q7   Ability to Manage and/or Refer Patients With Life-Threatening Illness                                        0.979         1
Q8   Ability to Manage and/or Refer Patients With Complex Multisystem Illness                                     0.984         1
Q9   Knowledge and Selection of Treatment Options/Patient Management                                              0.962         1
Q10  Coordination and Continuity of Care                                                                          0.966         1
Q11  Ability to Appreciate a Patient's Illness in the Context of Their Life                                       0.975         Dropped
Q12  Oral Communication Skills                                                                                    0.922         Dropped
Q13  Written Communication Skills                                                                                 0.921         1
Q14  Relationships With Patients                                                                                  0.984         5
Q15  Relationships With Families                                                                                  0.982         5
Q16  Relationships With Peers, Staff, and Other Health Care Personnel                                             0.942         Dropped
Q17  Effectiveness as a Teacher                                                                                   0.916         Dropped
Q18  Sensitivity to Patient's Age and Gender                                                                      0.988         5
Q19  Sensitivity to Patient's Culture and Disabilities                                                            0.987         Dropped
Q20  Effectiveness as a Member of a Health Care Team                                                              0.955         Dropped
Q21  Effectiveness With End-of-Life Care Issues                                                                   0.965         5
Q22  Fund of Basic Science Knowledge                                                                              0.956         1
Q23  Fund of Clinical Science Knowledge                                                                           0.963         1
Q24  Clinical Judgment                                                                                            0.976         1
Q25  Ability to Apply Basic Science Knowledge to Patient Care                                                     0.979         1
Q26  Ability to Apply Clinical Science Knowledge to Patient Care                                                  0.980         1
Q27  Initiative and Motivation                                                                                    0.948         3
Q28  Conscientiousness                                                                                            0.952         3
Q29  Attitude                                                                                                     0.950         3
Q30  Maturity                                                                                                     0.869         3
Q31  Ethical Conduct                                                                                              0.943         3
Q32  Aware of Own Limitations                                                                                     0.942         3
Q33  Willingness to Admit an Error in Judgment                                                                    0.901         3
Q34  Self-Directed Learning Skills                                                                                0.956         1
Q35  Time Management Skills                                                                                       0.953         Dropped
Q36  Quality of Medical Records                                                                                   0.934         Dropped
Q37  Accesses and Critically Evaluates Current Medical Information and Scientific Evidence                        0.958         Dropped
Q38  Understanding of the Contexts and Systems in TRICARE                                                         0.924         4
Q39  Performance in Volunteerism, Social, or Humanitarian Clinical Activities                                     0.972         4
Q40  Elective Involvement in Research (Specify Type)                                                              0.941         4
Q41  Participation in Volunteerism, Social, or Humanitarian Clinical Activities                                   0.961         4
Q42  Consideration of Costs in Diagnosis and Management                                                           0.963         4
Q43  Adaptation to New Technology                                                                                 0.927         Dropped
Q44  Quality Assurance and Improvement Initiatives                                                                0.957         4
Q45  Military Leadership                                                                                          0.940         Dropped
Q46  Perceived Ability to Conduct Patient Care in Deployed Environment                                            0.983         2
Q47  Perceived Ability to Conduct Patient Care in Humanitarian Mission Environment                                0.984         2
Q48  Understanding of Psychosocial Impacts of Deployment on Service Members and Families                          0.976         2
Q49  Knowledge of Common Postdeployment Medical or Psychological Conditions (e.g., TBI and PTSD)                  0.972         2
Q50  Understanding of Patient Flow and Procedures in the Deployment or Humanitarian Mission Environment           0.973         2
Q51  Knowledge of Electronic Health Record Applications                                                           0.960         2
Q52  Knowledge of Electronic Health Records/Technology Used in Theater                                            0.963         2
Q53  Ability to Cope With the Stress of Military Medical Practice                                                 0.985         2
Q54  Adaptation to Unique Situations and Stressors in Military Medical Practice                                   0.985         2
Q55  Exemplifying Ideals of Military Medical Practice                                                             0.958         2
Q56  Motivation to Participate in Humanitarian Mission or Deployment                                              0.946         2

Factor denotes the factor on which the item was retained (1 = Medical Expertise, 2 = Military-unique Practice, 3 = Professionalism, 4 = Systems-based Practice, 5 = Communication and Interpersonal Skills). Items were retained when their pattern coefficient (absolute value) was >0.40 on at least one factor and >0.30 on only one factor; "Dropped" indicates items that did not meet these criteria.



TABLE II. Means, Standard Deviations, and Pearson Correlations Between the Five Factors of the PGY-1 Survey and Medical School GPAs and USMLE Step Assessments

Variable                                     Mean     SD      1      2      3      4      5      6      7      8      9      10     11     12
1.  Medical Expertise                        3.72     0.75    --
2.  Military-unique Practice                 3.67     0.77    0.82*  --
3.  Professionalism                          3.96     0.82    0.81*  0.80*  --
4.  Systems-based Practice                   3.53     0.71    0.79*  0.82*  0.71*  --
5.  Communication and Interpersonal Skills   3.75     0.74    0.84*  0.80*  0.78*  0.77*  --
6.  Preclinical GPA                          3.00     0.48    0.28*  0.17*  0.18*  0.21*  0.15*  --
7.  Initial Clerkship Year GPA               3.18     0.46    0.41*  0.29*  0.30*  0.30*  0.29*  0.66*  --
8.  Cumulative GPA                           3.15     0.37    0.38*  0.25*  0.26*  0.28*  0.24*  0.95*  0.85*  --
9.  Step 1                                   215.97   17.61   0.23*  0.13   0.06   0.15   0.10   0.73*  0.58*  0.73*  --
10. Step 2 CK                                221.87   18.15   0.28*  0.14   0.11   0.13   0.11   0.65*  0.64*  0.70*  0.74*  --
11. CIS of Step 2 CS                         20.06    1.01    0.13   0.05   0.09   0.08   0.17*  0.16*  0.27*  0.23*  0.09   0.31*  --
12. ICE of Step 2 CS                         0.31     0.70    0.15   0.04   0.11   0.08   0.07   0.32*  0.35*  0.37*  0.09   0.28*  0.70*  --
13. Step 3                                   213.93   14.41   0.31*  0.21*  0.16   0.23*  0.14   0.54*  0.57*  0.61*  0.21*  0.64*  0.09   0.27*

*p < 0.01.

The factor pattern and structure coefficients from the principal axis factor analysis are displayed in Table I. As shown in Table I, the first factor included 17 items: Q1 to Q10, Q13, Q22 to Q26, and Q34. The second factor included 11 items: Q46 to Q56. The third factor included seven items: Q27 to Q33. The fourth factor included six items: Q38 to Q42 and Q44. The fifth factor included four items: Q14, Q15, Q18, and Q21. The correlations among the factors are shown in Table II.
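Under the same hypothetical setup as the earlier sketches, the three retention rules could be applied to the pattern matrix roughly as follows; the thresholds come directly from the rules stated above, while the variable names remain illustrative.

    # Apply the item-retention rules to the hypothetical pattern matrix.
    abs_load = pattern.abs()
    salient = abs_load.max(axis=1) > 0.40       # rule 2: > 0.40 on at least one factor
    clean = (abs_load > 0.30).sum(axis=1) == 1  # rule 3: no cross-loading > 0.30
    retained = pattern[salient & clean]

    # Assign each retained item to its dominant factor, then enforce
    # rule 1: a factor must keep at least three items.
    assignment = retained.abs().idxmax(axis=1)
    counts = assignment.value_counts()
    assignment = assignment[assignment.map(counts) >= 3]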

Factor Labels and Reliability Analysis

Based on our EFA results, we named the five factors as follows: Factor 1 was labeled Medical Expertise, Factor 2 Military-unique Practice, Factor 3 Professionalism, Factor 4 Systems-based Practice, and Factor 5 Communication and Interpersonal Skills.

Next, we conducted a reliability analysis on the scores of the items retained in each of the five subscales. The Cronbach's alpha values were as follows: Medical Expertise (17 items) = 0.99; Military-unique Practice (11 items) = 0.96; Professionalism (7 items) = 0.97; Systems-based Practice (6 items) = 0.97; and Communication and Interpersonal Skills (4 items) = 0.96. These internal consistency reliabilities were all considered good to excellent (i.e., >0.90 is excellent and >0.80 is good; see guidelines in Gable and Wolfe).7 Finally, we created composite variables to be used in subsequent analyses by computing a mean score for the items associated with each factor.
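The reliability step admits a compact sketch. The subscale membership below is taken from the factor composition reported above; the helper and frame names remain hypothetical, and alpha is computed from the classical item-variance formula rather than through any particular package.

    # Cronbach's alpha per subscale, continuing the hypothetical sketch.
    def cronbach_alpha(subscale: pd.DataFrame) -> float:
        """Classical alpha: (k/(k-1)) * (1 - sum of item variances / variance of total)."""
        k = subscale.shape[1]
        item_var = subscale.var(axis=0, ddof=1).sum()
        total_var = subscale.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_var / total_var)

    subscales = {  # item lists per the EFA results reported above
        "medical_expertise": [f"q{i}" for i in [*range(1, 11), 13, *range(22, 27), 34]],
        "military_unique_practice": [f"q{i}" for i in range(46, 57)],
        "professionalism": [f"q{i}" for i in range(27, 34)],
        "systems_based_practice": [f"q{i}" for i in (38, 39, 40, 41, 42, 44)],
        "communication_interpersonal": ["q14", "q15", "q18", "q21"],
    }
    alphas = {name: cronbach_alpha(items[cols]) for name, cols in subscales.items()}

    # Unweighted composites: the mean of each factor's retained items.
    composites = pd.DataFrame(
        {name: items[cols].mean(axis=1) for name, cols in subscales.items()})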

Correlation Analysis

Table II presents the Pearson correlations among the five factors, which were all high, ranging from 0.71 to 0.84. In terms of correlations with medical school GPAs, all five factors showed stronger positive correlations with initial clerkship year GPA (0.30 < r < 0.41) than with preclinical GPA (0.16 < r < 0.29) or cumulative medical school GPA (0.25 < r < 0.38). All of these correlations were statistically significant (p < 0.01).

Also shown in Table II are the correlations between PD ratings and the Step examinations. Among the five factors of the PGY-1 evaluation, Medical Expertise was most closely associated with the Step examinations, showing significant positive correlations with Step 1 (r = 0.23, p < 0.01), Step 2 CK (r = 0.28, p < 0.01), and Step 3 (r = 0.31, p < 0.01). Military-unique Practice (r = 0.21, p < 0.01) and Systems-based Practice (r = 0.23, p < 0.01) were both significantly correlated with Step 3. Communication and Interpersonal Skills was significantly correlated with the CIS component of Step 2 CS (r = 0.17, p < 0.01).
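Finally, the construct-validity correlations summarized above could be generated as in the following sketch; outcomes is a hypothetical frame of GPAs and Step scores aligned row-for-row with the composites, and the p < 0.01 criterion is the one stated in the Method.

    # Pearson correlations between PD-rating composites and outcomes (illustrative).
    from scipy import stats

    # `outcomes`: hypothetical DataFrame aligned row-for-row with `composites`.
    outcome_cols = ["preclinical_gpa", "clerkship_gpa", "cumulative_gpa", "step1",
                    "step2_ck", "step2_cs_cis", "step2_cs_ice", "step3"]

    rows = []
    for factor in composites.columns:
        for outcome in outcome_cols:
            r, p = stats.pearsonr(composites[factor], outcomes[outcome])
            rows.append({"factor": factor, "outcome": outcome,
                         "r": round(r, 2), "sig": p < 0.01})
    correlation_table = pd.DataFrame(rows)  # analogous to Table II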


DISCUSSION

We evaluated the feasibility, reliability, and validity of a revised PGY-1 survey. The feasibility of this instrument was demonstrated by the satisfactory response rate (86.2%). This high response rate is impressive considering that the program directors came from multiple specialties spread throughout the continental United States, and it was achieved despite relatively high turnover among program directors in our military GME programs. This speaks to the feasibility of our evaluation system, which we believe could be adopted by other institutions.

Although the survey was designed with six sections to largely parallel the six ACGME competencies, only five factors emerged from the EFA: Medical Expertise, Military-unique Practice, Professionalism, Systems-based Practice, and Communication and Interpersonal Skills. The results suggest that it is difficult to tease knowledge apart from expertise, which is consistent with the findings of our previous study.3

The correlation analysis indicated that all five factors were more strongly associated with students' GPA in the initial clerkship year than with their GPA across the first 2 years.



Further, four of the factors (Medical Expertise, Military-unique Practice, Professionalism, and Systems-based Practice) showed stronger correlations with students' performance on Step 3 than with the other Step examinations. These results were consistent with our research hypotheses. Besides the fact that PD evaluations are based mainly on interns' clinical performance and are more proximate in time to clerkship GPA, Step 2 CS, and Step 3, another explanation for the stronger correlations between PD evaluations and Step 3 might be restriction of range: for those medical students who actually entered residency, the ranges of Step 1 and Step 2 CK scores have been narrowed. Not surprisingly, Communication and Interpersonal Skills showed the strongest correlation with the CIS component of Step 2 CS. This is another piece of validity evidence for the PGY-1 survey and confirmed our research hypotheses.

The findings from this study differ from our prior work, which uncovered only two factors.

In the present study, we reconfirmed these same two factors, Expertise (referred to as Medical Expertise in the current study) and Professionalism. Further, as expected, we identified military-unique practice elements as well as several of the other ACGME competencies. Considering USU's focus on military medicine, and our inclusion of several items addressing this focus, we were not surprised to find a stand-alone Military-unique Practice factor. Because local context is important, we would encourage other medical schools to employ our newly developed survey and to consider adding school-specific items that represent their local context and "signature experiences."

In conclusion, scores from our revised PGY-1 PD evaluation form have evidence of reliability and validity and, as such, can be used to gauge medical graduates' first-year internship performance. The development of this tool has helped to close the gap in assessment between medical schools and GME programs.

APPENDIX

Program Director's Evaluation Form (PGY-1)

Name of Resident: ______________________ (Please print)

Residency Type/Specialty: ______________________

Evaluator's Name/Title: ______________________ (Please print)

Please evaluate this individual's performance in comparison to ALL first-year residents you have trained over the years, not just in comparison to this individual's peers. Please use the following rating scale (write "0" if unable to judge):

1 = Unacceptable
2 = Significantly below most of their peers
3 = On par with most peers
4 = Better than most peers
5 = Consistently at least one level higher than almost all peers

NA = Not applicable

Patient Care
_____ Conducting patient histories
_____ Physical examination skills
_____ Conducting daily patient evaluations
_____ Performing basic technical skills (e.g., inserting IV, suturing, inserting a-line)
_____ Performing advanced procedural skills (e.g., inserting central line, performing endoscopy, operating room performance)
_____ Analysis of clinical data, differential diagnosis, and selection/interpretation of tests
_____ Ability to manage and/or refer patients with life-threatening illness
_____ Ability to manage patients with complex multisystem illness
_____ Knowledge and selection of treatment options/patient management
_____ Coordination and continuity of care
_____ Ability to appreciate a patient's illness in the context of their life

Communication and Interpersonal Skills
_____ Oral communication skills
_____ Written communication skills




_____ Relationships with patients
_____ Relationships with families
_____ Relationships with peers, staff, and other health care personnel
_____ Effectiveness as a teacher
_____ Sensitivity to patient's age and gender
_____ Sensitivity to patient's culture and disabilities
_____ Effectiveness as a member of a health care team
_____ Effectiveness with end-of-life care issues

Medical Knowledge
_____ Fund of basic science knowledge
_____ Fund of clinical science knowledge
_____ Clinical judgment
_____ Ability to apply basic science knowledge to patient care
_____ Ability to apply clinical science knowledge to patient care

Professionalism
_____ Initiative and motivation
_____ Conscientiousness
_____ Attitude
_____ Maturity
_____ Ethical conduct
_____ Aware of own limitations
_____ Willingness to admit an error in judgment

Systems-Based Practice & Practice-Based Learning and Improvement
_____ Self-directed learning skills
_____ Time management skills
_____ Quality of medical records
_____ Accesses and critically evaluates current medical information and scientific evidence
_____ Understanding of the contexts and systems in TRICARE
_____ Performance in volunteerism, social, or humanitarian clinical activities
_____ Elective involvement in research (specify type): ________________________
_____ Participation in volunteerism, social, or humanitarian clinical activities
_____ Consideration of costs in diagnosis and management
_____ Adaptation to new technology
_____ Quality assurance and improvement initiatives

Military Unique Practice, Deployments, and Humanitarian Missions
_____ Military leadership
_____ Perceived ability to conduct patient care in deployed environment
_____ Perceived ability to conduct patient care in humanitarian mission environment
_____ Understanding of psychosocial impacts of deployment on Service members and families
_____ Knowledge of common postdeployment medical or psychological conditions (i.e., TBI, PTSD)
_____ Understanding of patient flow and procedures in the deployment or humanitarian mission environment




_____ Knowledge of electronic health record applications
_____ Knowledge of electronic health record/technology used in theater
_____ Ability to cope with the stress of military medical practice
_____ Adaptation to unique situations and stressors in military medical practice
_____ Exemplifying ideals of military medical practice
_____ Motivation to participate in humanitarian mission or deployment

Overall clinical competence: _____

I would trust this trainee to care for me or a close family member. (circle)   Yes   No   Not Sure
Why or why not? (Please explain)

REFERENCES

1. Lurie SJ, Mooney CJ, Lyness JM: Measurement of the general competencies of the Accreditation Council for Graduate Medical Education: a systematic review. Acad Med 2009; 84: 301-9.
2. Kash KM, Leas BF, Clough J, et al: ACGME competencies in neurology: web-based objective simulated computerized clinical encounters. Neurology 2009; 72: 893-8.
3. Durning SJ, Pangaro LN, Lawrence LL, Waechter D, McManigle J, Jackson JL: The feasibility, reliability, and validity of a program director's (supervisor's) evaluation form for medical school graduates. Acad Med 2005; 80: 964-8.



4. ACGME core competencies: Educational Commission for Foreign Medical Graduates. Available at http://www.ecfmg.org/echo/acgme-core-competencies.html; last accessed October 1, 2014.
5. Pett MA, Lackey NR, Sullivan JJ: Making Sense of Factor Analysis: The Use of Factor Analysis for Instrument Development in Health Care Research. Thousand Oaks, CA, Sage Publications, 2003.
6. Hayton JC, Allen DG, Scarpello V: Factor retention decisions in exploratory factor analysis: a tutorial on parallel analysis. Organ Res Methods 2004; 7: 191-205.
7. Gable RK, Wolfe MB: Instrument Development in the Affective Domain: Measuring Attitudes and Values in Corporate and School Settings. Boston, MA, Kluwer Academic Publishers, 1993.

