The first link is a freaking blog post. But even he lays out the caveats clearly.
- These data are from one year only. Scores often fluctuate from year to year at each school by up to 2-3 points in the absence of any significant curriculum changes. Unfortunately, longitudinal data are not available for a more thorough/stable analysis.
- These data are probably self-reported by the schools to USNWR. Schools could lie, and there is rampant speculation on SDN that schools manipulate the statistics they present to applicants, since the NBME does not publish Step score data. A number of schools have admitted fudging their undergraduate data. For all we know, Baylor could be pulling BS on everyone. But I trust people. =)
- This covers only the top 20 schools (because I'm lazy). Below that, the picture could be different.
Conversely, Swanson4 derived a range of correlation coefficients between MCAT subtests and the USMLE Step 1 examination (r = 0.14 to 0.52) based on a large sample of 11,145 students, and in a study of 27,406 students, Julian5 found that total MCAT scores correlated moderately well with all three USMLE Step examinations (Step 1, r = 0.61; Step 2, r = 0.49; Step 3, r = 0.49).
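To put those coefficients in perspective, squaring r gives the share of Step score variance that MCAT scores account for. A quick sketch using the values quoted above (labels are mine, not the paper's):

```python
# Squaring a correlation gives the fraction of variance in Step scores
# that MCAT scores account for. Values are from the studies quoted above.
correlations = {
    "Julian, Step 1": 0.61,
    "Julian, Step 2": 0.49,
    "Julian, Step 3": 0.49,
}

for label, r in correlations.items():
    # e.g. r = 0.61 -> r^2 ~ 0.37, i.e. about 37% of variance explained
    print(f"{label}: r = {r:.2f}, variance explained = {r**2:.0%}")
```

So even the strongest of these correlations leaves roughly two-thirds of the variance in Step 1 scores unexplained by the MCAT.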
A random-effects model meta-analysis of weighted effect sizes (r) resulted in (1) a predictive validity coefficient for the MCAT in the preclinical years of r = 0.39 (95% confidence interval [CI], 0.21-0.54) and (2) on the USMLE Step 1 of r = 0.60.
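For anyone unfamiliar with what a random-effects meta-analysis of correlations actually does, here's a minimal sketch of the standard DerSimonian-Laird approach on Fisher z-transformed correlations. This is not the quoted paper's code, and the study inputs at the bottom are made up for illustration:

```python
import math

def fisher_z(r):
    # Fisher z-transform of a correlation; var(z) is approximately 1/(n-3)
    return 0.5 * math.log((1 + r) / (1 - r))

def pool_random_effects(studies):
    # DerSimonian-Laird random-effects pooling of correlations.
    # studies: list of (r, n) pairs; returns the pooled correlation.
    zs = [fisher_z(r) for r, n in studies]
    vs = [1.0 / (n - 3) for r, n in studies]
    w = [1.0 / v for v in vs]
    # fixed-effect (inverse-variance weighted) mean in z-space
    z_fixed = sum(wi * zi for wi, zi in zip(w, zs)) / sum(w)
    # Cochran's Q heterogeneity statistic
    q = sum(wi * (zi - z_fixed) ** 2 for wi, zi in zip(w, zs))
    df = len(studies) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)  # estimated between-study variance
    # re-weight each study by within-study + between-study variance
    w_re = [1.0 / (v + tau2) for v in vs]
    z_re = sum(wi * zi for wi, zi in zip(w_re, zs)) / sum(w_re)
    return math.tanh(z_re)  # back-transform from z to r

# hypothetical inputs: (correlation, sample size) per study
pooled = pool_random_effects([(0.52, 11145), (0.61, 27406), (0.45, 500)])
print(round(pooled, 2))
```

The point of the random-effects weighting is that large samples don't completely dominate the pooled estimate when the studies genuinely disagree; the between-study variance term pulls the weights closer together.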
All permutations of MCAT scores (first, last, highest, average) were weakly associated with GPA, Step 2 clinical knowledge scores, and Step 3 scores. MCAT scores were weakly to moderately associated with Step 1 scores.
The quotes above speak for themselves. If you want to play in the big leagues, it's not enough to post random links. You have to actually read the things you're pretending to cite (not just google the links) in order to make a coherent argument.