Why does the verbal section of the MCAT correlate with USMLE performance?


ESPNdeportes

Member
7+ Year Member
15+ Year Member
20+ Year Member
Joined
Jul 7, 2003
Messages
146
Reaction score
0
Just wondering.......

Does it really correlate with performance?

 
Acad Med 1996 Feb;71(2):176-80

Prediction of students' USMLE step 2 performances based on premedical credentials related to verbal skills.

Roth KS, Riley WT, Brandt RB, Seibel HR.

PURPOSE. To examine the relationship between objective premedical credentials and performance on Step 2 of the United States Medical Licensing Examination (USMLE) for 480 students in three classes at the Virginia Commonwealth University Medical College of Virginia School of Medicine. The purpose of the study was to identify the selection criteria that might best predict performance on an examination designed to assess problem-solving skills, the essence of clinical medicine.

METHOD. Premedical data from two classes (1993, 1994) were analyzed, and a regression equation was used to calculate theoretical USMLE Step 2 scores for the students in the class of 1995, who had not yet taken this examination. The premedical variables were scores on the verbal and math sections of the Scholastic Aptitude Test (SAT), scores on the six sections of the pre-1991 Medical College Admission Test (MCAT), grade-point average (GPA) in science courses required of premedical students, and undergraduate major. Once the class of 1995 had taken the USMLE Step 2, the equation was cross-validated, and the theoretical and actual scores of the class of 1995 were correlated.

RESULTS. The correlation between theoretical and actual scores was r = .443. In the analysis for the classes of 1993 and 1994, the single variables most highly predictive of USMLE Step 2 performance were scores on the verbal section of the SAT (r = .317) and the Skills Analysis: Reading section of the MCAT (r = .331). However, the MCAT scores were excluded from the final regression analysis because the pre-1991 MCAT cannot be used to predict the performances of present medical school applicants. The resulting regression equation (using the SAT verbal section and premedical GPA) was able to account for 21.2% of the variance for the class of 1995.

CONCLUSION. The use of the verbal section of the SAT as a predictive factor is unique. It is significant that this variable was strongly related to premedical GPA, suggesting that high verbal aptitude serves one well, even when coping with complex scientific concepts.
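For reference, the share of variance a correlation explains is its square (r²), which puts the abstract's r values in perspective. A quick sketch, using the figures quoted above:

```python
# Convert the correlation coefficients (r) reported in the Roth et al.
# abstract into variance explained (r^2). The r values are from the
# abstract; the arithmetic is the only thing added here.
correlations = {
    "SAT verbal vs. Step 2": 0.317,
    "MCAT Skills Analysis: Reading vs. Step 2": 0.331,
    "theoretical vs. actual Step 2 (cross-validation)": 0.443,
}

for label, r in correlations.items():
    print(f"{label}: r = {r:.3f}, variance explained = {r**2:.1%}")
```

So r = .331 corresponds to roughly 11% of variance explained, and the cross-validated r = .443 to roughly 20% — consistent with the 21.2% the two-variable equation accounts for.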
 
Basco, W.T., Jr., Way, D.P., Gilbert, G.E., & Hudson, A. (2002). Undergraduate Institutional MCAT Scores as Predictors of USMLE Step 1 Performance. Academic Medicine, 77, S13-S16.

PURPOSE: The purpose of this study was to explore the use of institutional MCAT scores (or MCAT scores averaged across all students from an undergraduate institution) as a measure of selectivity in predicting medical school performance. Using data from two medical schools, this study tested the hypothesis that employing MCAT scores aggregated by undergraduate institution as a measure of selectivity improves the prediction of individual students' performances on the first sitting of the United States Medical Licensing Examination Step 1 (USMLE Step 1).

METHOD: Subjects consisted of the 1996-1998 matriculants of two publicly funded medical schools, one from the Southeast region of the United States and the other from the Midwest. There were 16,954 applicants and 933 matriculants in the combined data set. Independent variables were matriculants' undergraduate science grade-point averages (sciGPAs) and three MCAT scores (Physical Sciences, Biological Sciences, and Verbal Reasoning). The investigational variables were the average MCAT scores attained by all students from a particular undergraduate institution who sat for the exam between April 1996 and August 1999. Demographic variables that included medical school, year of medical school matriculation, gender, and minority status were employed as control variables. The dependent variable was the matriculants' scores from their first sittings for the USMLE Step 1. Multiple regression, multicollinearity, and cross-validation procedures were employed. Correlations with performance on the USMLE Step 1 were adjusted for restriction of range.

RESULTS: Bivariate analyses demonstrated moderate correlations between sciGPA and the individual MCAT scores and between sciGPA and USMLE Step 1 scores. There were substantial correlations between individual MCAT scores and USMLE Step 1 scores. Correlations between individual MCAT scores and USMLE Step 1 scores were slightly higher than those between institutional MCAT scores and Step 1 scores, in part due to adjustment for restriction in range. For the regression model without undergraduate selectivity measures, multicollinearity was observed in MCAT Physical Sciences (MCAT-PS) scores and MCAT Biological Sciences (MCAT-BS) scores. Undergraduate institutional Physical Sciences and Biological Sciences scores also demonstrated multicollinearity, in addition to URM status, MCAT-PS scores, and MCAT-BS scores, in the model with the selectivity measures. The base multiple regression model containing gender, URM status, and sciGPA accounted for 13.9% of the variation in USMLE Step 1 scores. When applicant MCAT scores were added to the model, the model explained 29.1% of the variation in USMLE Step 1 scores. Finally, when institutional MCAT scores were added to the predictive model, an additional 0.94% of the variation in USMLE Step 1 scores was explained.

CONCLUSION: Consistent with findings from previous studies, this study demonstrated that undergraduate science GPAs and MCAT scores are strong predictors of standardized test performance during medical school. In contrast to prior studies, this study employed institutional MCAT averages and demonstrated that their inclusion in regression models, as a measure of selectivity, can produce a small improvement in the prediction of a medical student's performance. Regardless of whether average institutional MCAT scores are interpreted as a measure of selectivity, academic rigor, or educational climate, this study shows them to be a useful addition to the traditional prediction model used for admission.
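The abstract's hierarchical approach — fit a base model, add predictors, and report the gain in R² — can be sketched with ordinary least squares. The data below are synthetic and the coefficients are invented; only the variable names mirror the abstract:

```python
import numpy as np

# Illustrative sketch of the hierarchical-regression logic in Basco et al.:
# fit a base model, then add predictors and measure the gain in R^2.
# Synthetic data; numbers are made up and do not reproduce the study.
rng = np.random.default_rng(0)
n = 933  # matriculants in the combined data set

sci_gpa = rng.normal(3.5, 0.3, n)
mcat_vr = rng.normal(9, 2, n)
mcat_bs = 0.5 * sci_gpa + rng.normal(9, 2, n)  # correlated with sciGPA
step1 = 200 + 8 * sci_gpa + 3 * mcat_vr + 2 * mcat_bs + rng.normal(0, 15, n)

def r_squared(predictors, y):
    """R^2 from an ordinary least-squares fit with an intercept."""
    X = np.column_stack([np.ones(len(y)), *predictors])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1 - resid.var() / y.var()

base = r_squared([sci_gpa], step1)
full = r_squared([sci_gpa, mcat_vr, mcat_bs], step1)
print(f"base model R^2      = {base:.3f}")
print(f"+ MCAT scores R^2   = {full:.3f} (gain = {full - base:.3f})")
```

Because the models are nested, R² can only go up when predictors are added; the question the study asks is whether the increment (e.g. the 0.94% from institutional MCAT scores) is large enough to matter.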
 
r = 0.331? that's a terrible correlation. is THIS where people get the premise that verbal scores correlate to performance/GPA/USMLE scores in med school? :rolleyes:
 
Originally posted by spumoni620
r = 0.331? that's a terrible correlation. is THIS where people get the premise that verbal scores correlate to performance/GPA/USMLE scores in med school? :rolleyes:

I couldn't agree more;)
 
The second study you posted is very interesting. It reminds me of all the painful regression analyses I underwent in college...:D . Again, I don't feel that the conclusion supports the results. First of all, they haven't provided the coefficients/t-statistics, so you don't know whether the variables are even significant! Second, while the increase upon addition of MCAT scores is supportive of the conclusion...it still only pushes the explained variance up to 30% at MOST. This indicates that the majority of the variance in USMLE scores comes from factors NOT related to MCAT, undergrad selectivity, demographic status/race/gender, and science GPA.

finally, i think there's an element of endogeneity b/t MCAT and undergrad GPA--it's plausible that while studying for the test, sciGPA might be affected as a result. This is potentially a flaw in their model.
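The endogeneity worry can be illustrated with a toy simulation (entirely invented numbers, not the study's data): if an unobserved factor drives both a predictor and the outcome, plain OLS attributes the hidden effect to the predictor.

```python
import numpy as np

# Toy simulation of endogeneity/omitted-variable bias: unobserved "effort"
# raises both sciGPA and Step 1 scores, so the OLS coefficient on sciGPA
# absorbs effort's effect and lands well above the true value of 5.
rng = np.random.default_rng(1)
n = 10_000

effort = rng.normal(0, 1, n)                       # unobserved confounder
sci_gpa = 3.3 + 0.2 * effort + rng.normal(0, 0.2, n)
step1 = 210 + 5 * sci_gpa + 10 * effort + rng.normal(0, 5, n)

X = np.column_stack([np.ones(n), sci_gpa])         # regress Step 1 on sciGPA only
beta, *_ = np.linalg.lstsq(X, step1, rcond=None)
print(f"true sciGPA effect = 5, OLS estimate = {beta[1]:.2f}")  # biased upward
```

This is the kind of bias an endogeneity test (or an instrumental variable) is meant to flag; it inflates the apparent predictive power of the contaminated variable.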

Oh well. just my $.02, i suppose. :D
 
Originally posted by spumoni620
The second study you posted is very interesting. It reminds me of all the painful regression analyses I underwent in college...:D . Again, I don't feel that the conclusion supports the results. First of all, they haven't provided the coefficients/t-statistics so you don't know whether the variables are even significant! Second, while the 7% increase upon addition of MCAT scores is supportive of the conclusion...it's still only a 7% jump, pushing it up to 30% at MOST. This indicates that the majority of factors motivating the USMLE score surrounds areas NOT related to MCAT, undergrad selectivity, demographic status/race/gender, and science gpa.

finally, i think there's an element of endogeneity b/t MCAT and undergrad GPA--it's viable to think that while studying for the test, sciGPA might be affected as a result. This potentially is a flaw in their model.

Oh well. just my $.02, i suppose. :D

I agree with your analysis and that the r values are crap. However I don't think the endogeneity argument is a very sound one. I don't see how only sciGPA would be affected by studying for the MCAT (overall GPA might be as well), and even then most students who dealt with both only put the MCAT ahead of their schoolwork during the last week or two. i agree with everything else though.
 
hey gleevec,
yeah, i see where you're coming from...i don't think the original analysis included overall gpa--i thought scigpa was the only RHS gpa variable that they used...it would definitely affect overall gpa as much as (or more than) the scigpa. and the endogeneity might not hold true for all the students, but to perform a good econometric regression they should at least run a test to rule out the possibility. for us, endogeneity was the apparition that precluded good results...so i'm sort of anal about it now :)

i think it might also have been interesting to include "ugrad major" as a variable--i.e. either science or nonscience...
 