AAMC Standardized Video Interview Evaluation Summary

The AAMC Standardized Video Interview (SVI) was introduced in 2016 as a research project and was administered as an operational pilot for emergency medicine residency selection during the ERAS® 2018, 2019, and 2020 seasons. The AAMC and the emergency medicine community designed a longitudinal and multi-pronged study to evaluate the SVI.

At the conclusion of the SVI pilot for the ERAS 2020 season, the AAMC reviewed data from the past four years. The AAMC stands confidently behind the SVI as a reliable, valid assessment of behavioral competencies that does not disadvantage individuals or groups. However, there is a lack of interest among the emergency medicine community in continuing to use and research the SVI, as well as operational challenges in scaling the SVI to the full applicant pool across multiple specialties. As such, the AAMC decided not to renew or expand the SVI pilot in emergency medicine for the ERAS 2021 application cycle.

This document provides an overview of the AAMC’s evaluation of the SVI. For more detailed information, refer to cited articles.

Evaluation Plan

The SVI evaluation plan included four broad areas: (1) psychometrics, (2) validity evidence, (3) fairness and preparation, and (4) program director and applicant reactions. Each broad area was composed of multiple research questions. As shown in the table below, data from multiple ERAS application cycles were used to evaluate the broad areas. We provided data in some areas (e.g., psychometrics) immediately following each SVI pilot year and are still waiting for data to accumulate in other areas (e.g., correlations with performance).

|                                                     | Research Year  | ERAS 2018       | ERAS 2019      | ERAS 2020      |
| --------------------------------------------------- | -------------- | --------------- | -------------- | -------------- |
| Psychometrics                                       | X              | X               | X              | In Progress    |
| Validity Evidence: Content                          | X              | X               | X              | X              |
| Validity Evidence: Correlations with selection data | X              | X               | X              | In Progress    |
| Validity Evidence: Correlations with performance    | Not Applicable | Available 2020* | Available 2020 | Available 2021 |
| Fairness and Preparation                            | X              | X               | X              | In Progress    |
| PD and Applicant Reactions                          | X              | X               | X              | In Progress    |

*While we have begun to collect performance data from participating applicants in the ERAS 2018 cycle, sample sizes are too small to draw any conclusions. Data from multiple years must be combined in order to report and interpret results.

Psychometrics

Raters were trained to use a standardized rating process and scoring rubrics. They participated in several rounds of practice and were given feedback prior to making operational ratings. We estimated reliability by examining the extent to which raters agreed with each other at the conclusion of rater training and found that rater reliability met industry standards.1 Findings were consistent across all four years. We also examined score distributions after each SVI administration. Results showed that SVI total scores were relatively normally distributed and there was variance in the ratings.2
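
As an illustration of the kind of agreement check described above, the sketch below correlates two raters' scores on the same set of responses. The ratings are hypothetical, and the agreement statistic is illustrative only; the summary does not specify which reliability index the AAMC used.

```python
# Illustrative sketch only: one simple way to gauge inter-rater reliability
# is the Pearson correlation between two raters' scores on the same set of
# responses. The ratings below are hypothetical, not SVI data.
from math import sqrt

def pearson(x, y):
    """Pearson correlation between two equal-length score lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical ratings from two trained raters on ten responses (1-5 scale).
rater_a = [3, 4, 2, 5, 3, 4, 1, 3, 5, 2]
rater_b = [3, 5, 2, 4, 3, 4, 2, 3, 5, 2]
print(round(pearson(rater_a, rater_b), 2))
```

In practice, multi-rater designs usually call for an intraclass correlation rather than a pairwise Pearson r, but the pairwise version conveys the idea with minimal machinery.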

Validity Evidence

Validity refers to the extent to which evidence and theory support the inferences drawn from scores based on their intended uses. The process of providing evidence of validity is ongoing and involves accumulating multiple sources of evidence.3 We established the validity of SVI scores using evidence based on content and relations with other variables. This approach aligns with best practices in the professional testing literature and legal guidelines.4-6

We provided evidence of validity based on content by working with subject matter experts in graduate medical education to develop SVI questions and the scoring rubric. The competencies assessed on the SVI are based on two of the Accreditation Council for Graduate Medical Education (ACGME) competencies,7 which have been identified as important and required for success in residency. Over the years, two different groups of subject matter experts (SMEs) in graduate medical education reviewed SVI questions and confirmed that each question assessed “Interpersonal Skills and Communication” or “Knowledge of Professional Behavior.” Only the questions that SMEs confirmed to be appropriate assessments of the target competencies were retained. In addition, two different sets of emergency medicine SMEs also confirmed the accuracy of the scoring rubric. They confirmed that each behavior example demonstrated the target competency and was an accurate representation of the proficiency level expected.

We also provided evidence of validity based on relations with other residency selection variables. Using data from participating applicants in the ERAS 2017, 2018, and 2019 cycles, we showed that there were no or very small correlations between SVI scores and academic-focused selection data, such as USMLE® Step 1 scores and Alpha Omega Alpha honor society membership. In contrast, there were small but stronger positive correlations with variables that purport to assess behavioral competencies and overlap somewhat with the SVI, such as USMLE Step 2 Clinical Skills scores and Electronic Standardized Letters of Evaluation (eSLOE) ratings. These findings suggest that, as intended, SVI scores provide different information than the academic-focused metrics typically used in residency selection. Moreover, the pattern of correlations supports the validity of the SVI because scores are more highly correlated with other measures of behavioral competencies than with academic metrics.1

To date, we have not been able to provide evidence of validity based on the relationship between SVI scores and intern performance data due to methodological limitations of the local validity study. We partnered with 17 emergency medicine programs to examine the correlations between SVI scores and ACGME Milestone ratings, end-of-shift ratings, and ratings from a research-only performance evaluation tool. During this work, we learned that programs interpret and use rating scales differently, so analyses must be conducted at the program level. Program-level sample sizes (n = 5 to 25) are too small to have adequate power.8 Additional years of data are needed to reach sufficient sample sizes and analyze the relations between SVI scores and intern performance.
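
To illustrate why samples of 5 to 25 interns are underpowered, the sketch below uses the standard Fisher z approximation to estimate the sample size needed to detect a given correlation with 80% power at alpha = .05. This is a generic textbook calculation, not the study's actual power analysis.

```python
# Illustrative sketch: approximate sample size needed to detect a
# correlation r with 80% power, alpha = .05 (two-sided), using the
# Fisher z transformation. Not the study's actual power analysis.
from math import log, ceil

def n_for_correlation(r, z_alpha=1.96, z_beta=0.84):
    """Approximate n to detect correlation r (two-sided test, 80% power)."""
    fisher_z = 0.5 * log((1 + r) / (1 - r))
    return ceil(((z_alpha + z_beta) / fisher_z) ** 2 + 3)

# Even a 'medium' correlation of .30 requires far more interns per
# program than the n = 5 to 25 available in the local validity study.
print(n_for_correlation(0.30))
```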

Diversity, Fairness, and Preparation

Unconscious bias training was provided to raters and made available to program directors. Across all four years of data, we consistently found that the average score differences between White, Black, Hispanic, and Asian applicants did not reach the threshold for a small effect. These group differences are substantially smaller than what is observed for standardized tests, which usually have large group differences for Black applicants and medium differences for Hispanic applicants.9 Group differences in SVI scores are also smaller than observed for the eSLOE, which has a small effect for Black applicants.10 The SVI is the only national assessment being used operationally in residency selection that does not result in group differences in scores by race/ethnicity. The inclusion of the SVI in the selection process has the potential to facilitate holistic review and broaden the pool of applicants invited to the in-person interview.
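
Group differences of the kind described above are conventionally summarized as Cohen's d, the standardized mean difference, with |d| < 0.2 counting as below the "small effect" threshold. The sketch below computes d from two hypothetical score lists; the data are made up for illustration, not drawn from the SVI.

```python
# Illustrative sketch: Cohen's d, the standardized mean difference used
# to judge whether a group difference reaches the 'small effect'
# threshold of |d| = 0.2. The scores below are hypothetical, not SVI data.
from math import sqrt

def cohens_d(group1, group2):
    """Cohen's d with a pooled standard deviation."""
    n1, n2 = len(group1), len(group2)
    m1, m2 = sum(group1) / n1, sum(group2) / n2
    v1 = sum((x - m1) ** 2 for x in group1) / (n1 - 1)
    v2 = sum((x - m2) ** 2 for x in group2) / (n2 - 1)
    pooled_sd = sqrt(((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2))
    return (m1 - m2) / pooled_sd

# Hypothetical total scores for two applicant groups.
a = [18, 20, 21, 19, 22, 20, 17, 21]
b = [19, 20, 20, 18, 21, 19, 18, 22]
# Does the difference stay below the small-effect threshold?
print(abs(cohens_d(a, b)) < 0.2)
```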

The AAMC has taken several steps to ensure equal access by providing free SVI preparation materials to applicants, including sample questions, Tips for SVI Applicants,11 the AAMC SVI Applicant Preparation Guide,12 and two free online practice interviews.13 Post-SVI survey results showed that the majority of applicants used at least one of the AAMC’s free resources.

In addition, we studied the extent to which preparation for the SVI and the location in which applicants took the SVI affected SVI scores. The vast majority of applicants reported spending less than 4 hours preparing for the SVI. Data from the SVI for ERAS 2019 pilot showed that applicants who did not prepare for the SVI had slightly lower scores than applicants who spent between 1 and 6 hours preparing. Seventy-five percent of applicants took the SVI at home, and the location in which applicants recorded their video responses did not affect SVI scores.14 On average, individuals who started or completed at least one of the AAMC’s free online SVI practice interviews had slightly higher SVI scores.15 Together, these data suggest that extensive preparation is not necessary to perform well on the SVI.

Program Director and Applicant Reactions

We surveyed program directors about their use of, and attitudes toward, the SVI in the ERAS 2018 and 2019 seasons.16,17 Results from both studies were largely consistent and showed that program directors used the SVI cautiously during the pilot. Only 42% of those surveyed in the ERAS 2019 season reported using SVI scores at some point in the process, and most of those reported that SVI scores were not important when deciding whom to invite to the in-person interview. Most program directors reported wanting more research on the value of SVI scores before incorporating them into their selection processes.17

During each season, we surveyed applicants immediately following their completion of the SVI. During the ERAS 2018 season, we also surveyed applicants after they received their scores. Results have consistently shown that applicants are satisfied with the instructions and policies that support the SVI. However, they are not satisfied with the SVI overall. They do not believe it will contribute to holistic review or that it is a good measure of their interpersonal and communication skills or professionalism.17,18

Scalability

Given the large volume of residency applicants across all specialties, the AAMC has been studying computer scoring of interviews. After several years of research, we do not believe computer scoring is ready for operational use.

Conclusion

The AAMC, in collaboration with specialty leadership on the Emergency Medicine Standardized Video Interview Working Group, established a comprehensive evaluation plan for the SVI that could serve as a framework for the AAMC, or others, to use when evaluating new assessments in the future. Based on the data collected to date, the AAMC concluded that the SVI is a reliable, valid assessment of behavioral competencies that does not disadvantage individuals or groups. The decision not to offer the SVI for the ERAS 2021 application cycle is based on a lack of interest among the emergency medicine community in continuing to use and research the SVI, as well as an assessment of the operational factors necessary for successful expansion of the program.

References

  1. Bird SB, Hern HG, Blomkalns A, et al. Innovation in residency selection: the AAMC standardized video interview. Acad Med. 2019;94(10):1489-97.
  2. The Association of American Medical Colleges. How the SVI is scored. https://students-residents.aamc.org/applying-residency/article/how-svi-scored/. Accessed October 3, 2019.
  3. American Educational Research Association, American Psychological Association, National Council on Measurement in Education. Standards for Educational and Psychological Testing. 3rd ed. Washington, DC: American Educational Research Association; 2014.
  4. Principles for the Validation and Use of Personnel Selection Procedures. 4th ed. Bowling Green, OH: Society for Industrial and Organizational Psychology, Inc; 2003. Available at: http://www.siop.org/_Principles/principles.pdf. Accessed Sep 13, 2017.
  5. Downing SM. Validity: on the meaningful interpretation of assessment data. Med Educ 2003;830–7.
  6. Guardians v. CSC (1980).
  7. Accreditation Council for Graduate Medical Education. Milestones by specialty. https://www.acgme.org/What-We-Do/Accreditation/Milestones/Milestones-by-Specialty. Accessed February 28, 2019.
  8. Cohen J. Statistical Power Analysis for the Behavioral Sciences. 2nd ed. Hillsdale, NJ: Lawrence Erlbaum Associates; 1988.
  9. Sackett PR, Shen W. Subgroup differences on cognitively loaded tests in contexts other than personnel selection. In Outtz JL, ed. Adverse Impact: Implications for Organizational and Staffing and High Stakes Selection. New York, NY: Taylor and Francis Group; 2010:323-346.
  10. Hopson LR, Regan L, Bond MC, et al. Comparison of the performance characteristics of the AAMC Standardized Video Interview (SVI) and the Electronic Standardized Letter of Evaluation (eSLOE) in emergency medicine. Acad Med. 2019;1513-1521.
  11. The Association of American Medical Colleges. Tips for SVI Applicants. https://students-residents.aamc.org/applying-residency/article/tips-svi-applicants/. Accessed October 3, 2019.
  12. The Association of American Medical Colleges. AAMC Standardized Video Interview: Applicant Preparation Guide. https://aamc-orange.global.ssl.fastly.net/production/media/filer_public/41/37/41376739-d8a1-4e38-85f5-479d99caa1a3/aamc_standardized_video_interview_applicant_prep_guide_22019.pdf. Published February 20, 2019. Accessed October 3, 2019.
  13. Practice AAMC Standardized Video Interviews. The Association of American Medical Colleges website. https://students-residents.aamc.org/applying-residency/article/svi-practice-interviews/. Accessed October 3, 2019.
  14. Jarou Z, Karl E, Alker A, et al. Factors affecting Standardized Video Interview performance: Preparation elements and the testing environment. EM Resident. April 17, 2018. https://www.emra.org/emresident/article/svi-study-results. Accessed October 3, 2019.
  15. Deiorio NM, Hopson LR, Fletcher L. Do use of Standardized Video Interview practice tests correlate with applicant score? Presented at: SAEM 2019; May 14-17; Las Vegas, NV.
  16. Gallahue FE, Hiller KM, Bird SB, et al. Emergency Medicine residency programs’ use of and reactions to the Association of American Medical Colleges Standardized Video Interview during the 2018 application cycle. Acad Med. 2019;94(10):1506-1512.
  17. Deiorio NM, Jarou ZJ, Alker A, et al. Applicant reactions to the AAMC Standardized Video Interview during the 2018 application cycle. Acad Med. 2019; 94(10):1498-505.
  18. The Association of American Medical Colleges. Results of the 2016 Program Directors Survey: Current Practices in Residency Selection. https://aamc-orange.global.ssl.fastly.net/production/media/filer_public/2c/91/2c919beb-e350-4ca3-a3c2-7b7528802899/2018-2019_svi_program_directors_survey_results_final.pdf. Accessed October 3, 2019.