If it were really just "the last few years," I might agree with you. But if you look over those studies on the AAMC website (which are what the 2007 meta-analysis examined -- no new data), you will see that the most recent ones (published in 2005) actually covered people who started med school in 1992-93, meaning they took the MCAT a year or two before that. That's 15 years, which is an eternity if you're trying to hold this data out as still valid. That the "AAMC is still publishing those studies on their website" isn't exactly critical reasoning in support of their validity. The AAMC is responsible for the MCAT, so of course it has a vested interest in suggesting the test is of value. But if you read beyond the journal article titles, you will see that this is more of a historical perspective than something that ought to be extrapolated to the test and the student body in existence today. I have to wonder whether the lack of recent studies (as opposed to meta-analyses of old data) suggests that the AAMC doubts it will ever improve on the historical correlation it showed back in the early '90s.
We've been over this before:
aamc.org/students/mcat/research/bibliography/start.htm
Basco, W.T., Jr., Way, D.P., Gilbert, G.E., & Hudson, A. (2002). Undergraduate Institutional MCAT Scores as Predictors of USMLE Step 1 Performance. Academic Medicine, 77, S13-S16.
PURPOSE: The purpose of this study was to explore the use of institutional MCAT scores (or MCAT scores averaged across all students from an undergraduate institution) as a measure of selectivity in predicting medical school performance. Using data from two medical schools, this study tested the hypothesis that employing MCAT scores aggregated by undergraduate institution as a measure of selectivity improves the prediction of individual students' performances on the first sitting of the United States Medical Licensing Examination Step 1 (USMLE Step 1).
METHOD: Subjects consisted of the 1996-1998 matriculants of two publicly funded medical schools, one from the Southeast region of the United States and the other from the Midwest. There were 16,954 applicants and 933 matriculants in the combined data set. Independent variables were matriculants' undergraduate science grade-point averages (sciGPAs), and three MCAT scores (Physical Sciences, Biological Sciences, and Verbal Reasoning). The investigational variables were the average MCAT scores attained by all students from a particular undergraduate institution that sat for the exam between April 1996 and August 1999. Demographic variables that included medical school, year of medical school matriculation, gender, and minority status were employed as control variables. The dependent variable was the matriculants' scores from their first sittings for the USMLE Step 1. Multiple regression, multicollinearity, and cross-validation procedures were employed. Correlations with performance on the USMLE Step 1 were adjusted for restriction of range.
RESULTS: Bivariate analyses demonstrated moderate correlations between sciGPA and the individual MCAT scores and between sciGPA and USMLE Step 1 scores. There were substantial correlations between individual MCAT scores and USMLE Step 1 scores. Correlations between individual MCAT scores and the USMLE Step 1 scores were slightly higher than those for institutional MCAT scores, in part due to adjustment for restriction in range. For the regression model without undergraduate selectivity measures, multicollinearity was observed in MCAT Physical Sciences (MCAT-PS) scores and MCAT Biological Sciences (MCAT-BS) scores. Undergraduate institutional Physical Sciences and undergraduate Biological Sciences also demonstrated multicollinearity in addition to URM status, MCAT-PS scores, and MCAT-BS scores in the model with the selectivity measures. The base multiple regression model containing gender, URM status, and sciGPA accounted for 13.9% of the variation in USMLE Step 1. When applicant MCAT scores were added to the model, the model explained 29.1% of the variation in USMLE Step 1 scores. Finally, when institutional MCAT scores were added to the predictive model, an additional 0.94% of the variation in USMLE Step 1 scores was explained.
CONCLUSION: Consistent with findings from previous studies, this study demonstrated that undergraduate science GPAs and MCAT scores are strong predictors of standardized test performances during medical school. In contrast to prior studies, this study employed institutional MCAT averages and demonstrated that their inclusion in regression models, as a measure of selectivity, can produce a small improvement when used in a theoretical model in the prediction of a medical student's performance. Regardless of how the average institutional MCAT scores are interpreted as a measure of selectivity, a measure of academic rigor, or a measure of educational climate, this study shows it to be a useful addition to the traditional prediction model used for admission.
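For what it's worth, here is a rough sketch (not the authors' code, and with made-up numbers) of the kind of hierarchical regression the abstract is describing: fit the base block of predictors, add the MCAT block, and read off how much the R^2 goes up. The variable names and synthetic data are mine, purely for illustration; the range-restriction helper at the end uses Thorndike's Case 2 formula, which is one standard correction, though the abstract doesn't say exactly which adjustment the authors applied.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 933  # matriculants in the combined data set, per the abstract

# Synthetic stand-ins for the study's variables (all made up)
gender  = rng.integers(0, 2, n)
urm     = rng.integers(0, 2, n)
sci_gpa = rng.normal(3.5, 0.3, n)
mcat_vr = rng.normal(10, 2, n)
mcat_ps = rng.normal(10, 2, n)
mcat_bs = rng.normal(10, 2, n)
step1   = 200 + 5*sci_gpa + 2*(mcat_vr + mcat_ps + mcat_bs) + rng.normal(0, 15, n)

def r_squared(predictors):
    # Fit OLS of Step 1 on a block of predictors and return R^2
    X = sm.add_constant(np.column_stack(predictors))
    return sm.OLS(step1, X).fit().rsquared

r2_base = r_squared([gender, urm, sci_gpa])                             # cf. the reported 13.9%
r2_mcat = r_squared([gender, urm, sci_gpa, mcat_vr, mcat_ps, mcat_bs])  # cf. the reported 29.1%
print(f"base model R^2: {r2_base:.3f}")
print(f"+ MCAT scores:  {r2_mcat:.3f} (increment {r2_mcat - r2_base:.3f})")

def correct_range_restriction(r, sd_restricted, sd_unrestricted):
    # Thorndike Case 2 correction: estimate the applicant-pool correlation
    # from the correlation observed in the range-restricted matriculant sample
    u = sd_unrestricted / sd_restricted
    return r * u / np.sqrt(1 - r**2 + (r**2) * u**2)

Obviously the synthetic data won't reproduce their numbers; the point is just how "additional variance explained" is computed as blocks of predictors are added, and why correlations in matriculants get adjusted upward for restriction of range.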
This was published in 2002 using data from 1996-1998 matriculants and was not included in the 2007 meta-analysis. Again, I doubt that medical school student bodies have changed much since then, and even if they have, this study indicates that a reasonable MCAT/Step 1 correlation exists independent of demographics.
I'm basing my argument on published evidence, which, like any study in science or medicine, isn't perfect. You can conjecture about any study you want. While it's possible that there hasn't been a more recent study because the AAMC is incapable of critiquing its own exam, I think it is more likely that the changes to both the MCAT and Step 1 in the past few years have kept more recent studies from being done. I also suspect that the changes to the MCAT would make it a better, not a worse, predictor of Step 1, because the AAMC has a vested interest in selecting students who will perform well in its medical schools. But that is all just conjecture. The only evidence anyone can reasonably go on is what has been published, and that evidence consistently shows a moderate correlation between the MCAT and Step 1.