Originally posted by alosteostudent
Actually, I go to PCOM. I was just responding to some past posts about medical school rankings. I thought it was funny that no one ever mentions this school in West Virginia, yet apparently it is one of the best in the country. And by the way, I have received my PhD in biostatistics and epidemiology from Stanford; the rankings provided are very conclusive! Just thought you would like to know.
Sorry, we didn't know you were the God of biostatistics and epidemiology.
US News rankings are far from conclusive. Here is how they are determined (http://www.usnews.com/usnews/edu/grad/rankings/about/05med_meth_brief.php):
---------------------------------
Medicine Methodology
The 125 medical schools fully accredited by the Liaison Committee on Medical Education plus the 19 schools of osteopathic medicine fully accredited by the American Osteopathic Association were surveyed for the ranking of research medical schools; 119 schools provided the data needed to calculate the research rankings based on the indicators used in the research model. The same medical and osteopathic schools were surveyed for the primary-care ranking; 119 schools provided the data needed to calculate the primary-care ranking. Both rankings are based on a weighted average of seven indicators, six of them common to both models. The research model factors in research activity; the primary-care model adds a measure of the proportion of graduates entering primary-care specialties.
Quality assessment (weighted by .40): Peer assessment surveys were conducted in the fall of 2003, asking medical and osteopathic school deans, deans of academic affairs, and heads of internal medicine or the directors of admissions to rate program quality on a scale of "marginal" (1) to "outstanding" (5). Survey populations were asked to rate program quality for both research and primary-care programs separately on a single survey instrument. The response rate was 56 percent. A research school's average score is weighted .20; the average score in the primary-care model is weighted .25. Residency program directors were also asked to rate programs using the same 5-point scale on two separate survey instruments. One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in those fields. The response rate for those sent the research survey was 28 percent. The response rate for those sent the primary-care survey was also 28 percent. Residency directors' opinions are weighted .20 in the research model and .15 in primary care.
Research activity (.30 in research model only): Activity was measured as the total dollar amount of National Institutes of Health research grants awarded to the medical school and its affiliated hospitals, averaged for 2002 and 2003. An asterisk indicates schools that reported only research grants to their medical school in 2003.
Primary-care rate (.30 in primary-care model only): The percentage of M.D. school graduates entering primary-care residencies in the fields of family practice, pediatrics, and internal medicine was averaged over 2001, 2002, and 2003.
Student selectivity (.20 in research model, .15 in primary-care model): This includes three components, which describe the class entering in fall 2003: mean composite Medical College Admission Test score (65 percent), mean undergraduate grade-point average (30 percent), and proportion of applicants accepted (5 percent).
Faculty resources (.10 in research model, .15 in primary-care model): Resources were measured as the ratio of full-time science and clinical faculty to full-time M.D. students in 2003.
Overall rank: The research-activity indicator had significant outliers; to avoid distortion, it was transformed using a logarithmic function. Indicators were standardized about their means, and standardized scores were weighted, totaled, and rescaled so that the top school received 100; other schools received their percentage of the top score.
Specialty rankings: The rankings are based solely on ratings by medical school deans and senior faculty at peer schools. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here.
---------------------------------
These rankings rely heavily on the subjective opinions of medical school administrators and on counts of research dollars. Conclusive? I think not.
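To make the point concrete, here is a rough sketch (in Python) of the standardize-weight-rescale procedure the methodology describes, using made-up schools and indicator values; this is purely illustrative, not US News's actual code or data:

```python
import math

# Hypothetical per-school indicator values (illustrative only, NOT real data).
# "nih" is in dollars; the others are already on comparable rating scales.
schools = {
    "School A": {"peer": 4.5, "residency": 4.2, "nih": 350e6, "selectivity": 0.9, "faculty": 2.1},
    "School B": {"peer": 3.8, "residency": 3.9, "nih": 120e6, "selectivity": 0.7, "faculty": 1.4},
    "School C": {"peer": 3.1, "residency": 3.3, "nih": 40e6,  "selectivity": 0.5, "faculty": 0.9},
}

# Research-model weights quoted in the methodology above
weights = {"peer": 0.20, "residency": 0.20, "nih": 0.30,
           "selectivity": 0.20, "faculty": 0.10}

def standardize(values):
    """Standardize about the mean (z-scores), as the methodology describes."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return [(v - mean) / sd for v in values]

# Log-transform the NIH dollars to damp outliers, per the methodology
for s in schools.values():
    s["nih"] = math.log(s["nih"])

names = list(schools)
totals = {n: 0.0 for n in names}
for indicator, w in weights.items():
    zs = standardize([schools[n][indicator] for n in names])
    for n, z in zip(names, zs):
        totals[n] += w * z

# Rescale so the top school scores 100 and others get their percentage
# of the top score. (In this simple sketch, a school whose weighted
# z-score total is negative would get a negative rescaled score.)
top = max(totals.values())
scores = {n: round(100 * t / top) for n, t in totals.items()}
```

Note how much of the final number is driven by the .40 weight on survey opinions: small shifts in who answers the survey move the "objective" ranking.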
What kind of biostatistics did you learn at Stanford, anyway?
By the way, I have a PhD in biostatistics from Hopkins. The program I attended is ranked much higher than Stanford. That makes me the God of biostatistics. Grow up.
PH