When do U.S. news rankings come out this year?


HumbleMD
Joined Sep 22, 2006
When do U.S. news rankings come out this year? It would be interesting to see how schools have changed, and tell what direction they're headed (decline or on the up and up). Maybe it could help a smidgen on the decision process for many...

 
When do U.S. news rankings come out this year? It would be interesting to see how schools have changed, and tell what direction they're headed (decline or on the up and up). Maybe it could help a smidgen on the decision process for many...

March 30th. Pretty soon.
 
March 30th. Pretty soon.
Big day: UMich scholarship decisions, and UPenn decisions hopefully by then. Maybe we should start a USNews countdown thread:laugh: .
 
Good question....I'll be looking for this as well on the 30th!
 
Lol, I guess this would be considered a Match for pre-allos, with all the rankings being released and whatnot :)
 
If HumbleMD had a penny for every time an SDNer accused him of not being humble....

he'd have enough money to justify his unbridled hubris?

I'm sorry man--I couldn't pass up the opportunity. :smuggrin:
 
I thought they came out with rankings every few years - the only other year I could find other than 2006's was 2003's...there's no 2004 or 2005...anyway, if they are coming out with one this year, I'd bet there'd be little change within a year...
 
I thought they came out with rankings every few years - the only other year I could find other than 2006's was 2003's...there's no 2004 or 2005...anyway, if they are coming out with one this year, I'd bet there'd be little change within a year...

The current rankings are 2007
 
I thought they came out with rankings every few years - the only other year I could find other than 2006's was 2003's...there's no 2004 or 2005...anyway, if they are coming out with one this year, I'd bet there'd be little change within a year...

I'm not an expert on this topic, but I would assume they come out with rankings every year because it's a great money-making opportunity. You know how all the yuppies and starry-eyed parents buy into this crap
 
I'm not an expert on this topic, but I would assume they come out with rankings every year because it's a great money-making opportunity. You know how all the yuppies and starry-eyed parents buy into this crap

They come out with the medicine rankings every year. The rankings for graduate schools are only done every couple of years, like the previous poster suggested.
 
I'm not an expert on this topic, but I would assume they come out with rankings every year because it's a great money-making opportunity. You know how all the yuppies and starry-eyed parents buy into this crap

And what about us starry-eyed yuppies?
 
I don't think the US News rankings will change all that much, and the little change they do make shouldn't affect anyone's decision, because there's no telling what the ranking will be in 2009, or 2010, or 2011 when you graduate...
 
I don't think the US News rankings will change all that much, and the little change they do make shouldn't affect anyone's decision, because there's no telling what the ranking will be in 2009, or 2010, or 2011 when you graduate...

I can guarantee you that Harvard will still be (and will always be) #1....
 
I can guarantee you that Harvard will still be (and will always be) #1....

Well, yeah, b/c grant funding is such a huge portion of the ranking equation... but that, IMO, reflects little of the school's caliber (though it may reflect the school's research opps)
 
I can guarantee you that Harvard will still be (and will always be) #1....

Unless it drops to 12... that'll cause a stir:p

Why do we buy into this crap:p
 
Well, yeah, b/c grant funding is such a huge portion of the ranking equation... but that, IMO, reflects little of the school's caliber (though it may reflect the school's research opps)

Oh c'mon...it's HARVARD...haha.

I just hope my school will always be in the top 5 :p If it falls below 5 at any time I will just drop out of med school.
 
An old post of mine:

usnews rankings are stupid. look at the methodology.

Quality Assessment (weighted by .40)

* Peer Assessment Score (.20 for the research medical school model, .25 for the primary-care medical school model)
In the fall of 2005, medical and osteopathic school deans, deans of academic affairs, and heads of internal medicine or the directors of admissions were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Survey populations were asked to rate program quality for both research and primary-care programs separately on a single survey instrument. Those individuals who did not know enough about a school to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 54 percent of those surveyed responded.


* Assessment Score by Residency Directors (.20 for the research medical school model, .15 for the primary-care medical school model)
In the fall of 2005, residency program directors were asked to rate programs on two separate survey instruments. One survey dealt with research and was sent to a sample of residency program directors in fields outside primary care, including surgery, psychiatry, and radiology. The other survey involved primary care and was sent to residency directors in the fields of family practice, pediatrics, and internal medicine. Survey recipients were asked to rate programs on a scale from "marginal" (1) to "outstanding" (5). Those individuals who did not know enough about a program to evaluate it fairly were asked to mark "don't know." A school's score is the average of all the respondents who rated it. Responses of "don't know" counted neither for nor against a school. About 28 percent of those surveyed for research medical schools responded. Twenty-three percent responded for primary-care.

Research Activity (weighted by .30 in the research medical school model only)

* Total Research Activity (.20) measured by the total dollar amount of National Institutes of Health research grants
awarded to the medical school and its affiliated hospitals, averaged for 2004 and 2005. An asterisk indicates schools that reported only research grants to their medical school in 2005.


* Average Research Activity Per Faculty Member (.10) measured by the dollar amount of National Institutes of Health research grants awarded to the medical school and its affiliated hospitals per full-time faculty member, averaged over 2004 and 2005. Both full-time basic sciences and clinical faculty were used in the faculty count. An asterisk indicates schools that reported research grants only to their medical school in 2005.

Primary-Care Rate (.30 in the primary-care medical school model only) the percentage of M.D. or D.O. school graduates entering primary-care residencies in the fields of family practice, pediatrics, and internal medicine was averaged over 2003, 2004, and 2005.

Student Selectivity (.20 in the research medical school model, .15 in the primary-care medical school model)

* Mean MCAT Score (.13 in the research medical school model, .0975 in the primary-care medical school model) the mean composite Medical College Admission Test score of the 2005 entering class.


* Mean Undergraduate GPA (.06 in the research medical school model, .045 in the primary-care medical school model) the mean undergraduate grade-point average of the 2005 entering class.


* Acceptance Rate (.01 in the research medical school model, .0075 in the primary-care medical school model) the proportion of applicants to the 2005 entering class who were offered admission.

Faculty Resources (.10 in the research medical school model, .15 in the primary-care medical school model) resources were measured as the ratio of full-time science and clinical faculty to full-time M.D. or D.O. students in 2005.

Overall Rank: Indicators were standardized about their means, and standardized scores were weighted, totaled, and rescaled so that the top school received 100; other schools received their percentage of the top score.

Specialty Rankings: The rankings are based solely on ratings by medical school deans and senior faculty from the list of schools surveyed. They each identified up to 10 schools offering the best programs in each specialty area. Those receiving the most nominations appear here.


------------> Notice that the surveys are rarely collected (and they make up a staggering 40%). Notice also that many schools will have a hard time improving their rank, because it's not as if NIH grants can increase substantially year-to-year, or a school can hire significantly more doctors (40%). Schools do have some control over their rank... i.e., MCAT score, acceptance rate, etc. (20%).
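For the curious, the research-model scoring described above can be sketched as a toy calculation in Python. All the school data below is made up; only the weights and the standardize/weight/total/rescale procedure come from the methodology text, and the rescaling step (shifting totals positive before taking percentages of the top score) is my own assumption, since US News doesn't say how it handles negative standardized totals.

```python
# Toy reconstruction of the US News research-model scoring described above.
# All school data here is hypothetical; the weights come from the methodology text.
from statistics import mean, pstdev

schools = {
    "School A": {"peer": 4.8, "res_dir": 4.7, "nih_total": 1200, "nih_per_fac": 0.35,
                 "mcat": 11.8, "gpa": 3.85, "accept": 0.04, "fac_stu": 3.1},
    "School B": {"peer": 4.2, "res_dir": 4.3, "nih_total": 600, "nih_per_fac": 0.30,
                 "mcat": 11.2, "gpa": 3.78, "accept": 0.07, "fac_stu": 2.0},
    "School C": {"peer": 3.5, "res_dir": 3.6, "nih_total": 250, "nih_per_fac": 0.20,
                 "mcat": 10.5, "gpa": 3.65, "accept": 0.12, "fac_stu": 1.2},
}

# Research-model weights per the methodology text. Acceptance rate is
# "better" when lower, so it carries a negative sign here.
weights = {"peer": 0.20, "res_dir": 0.20, "nih_total": 0.20, "nih_per_fac": 0.10,
           "mcat": 0.13, "gpa": 0.06, "accept": -0.01, "fac_stu": 0.10}

def zscores(values):
    """Standardize a list of values about its mean (population std dev)."""
    m, s = mean(values), pstdev(values)
    return [(v - m) / s if s else 0.0 for v in values]

names = list(schools)
totals = {n: 0.0 for n in names}
for key, w in weights.items():
    for n, z in zip(names, zscores([schools[n][key] for n in names])):
        totals[n] += w * z

# Rescale so the top school gets 100 and the rest get their percentage of it.
# (Shift so all totals are non-negative first -- an assumption on my part.)
low = min(totals.values())
shifted = {n: t - low for n, t in totals.items()}
top = max(shifted.values())
final = {n: round(100 * s / top) for n, s in shifted.items()}
print(final)
```

With these made-up numbers, School A (best on every indicator) lands at 100 and the others fall in line behind it, which is the point: when the inputs barely move year to year, neither do the ranks.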
 
Maybe it's a good thing... but generally I find it tells you what you already knew.
 
An old post of mine:

usnews rankings are stupid. look at the methodology.

[methodology snipped - quoted in full above]

ok - but didn't you also get into a bunch of top 10s? my question is, if these rankings are totally bogus, then how did you happen to make the decision to apply to a bunch of schools that all happened to be ranked in the top 10?

I mean, surely there are some differences b/w schools and wouldn't some of these differences be reflected in the rankings? How do we know that Harvard and Hopkins are better schools than LLU or Albany?
 
ok - but didn't you also get into a bunch of top 10s? my question is, if these rankings are totally bogus, then how did you happen to make the decision to apply to a bunch of schools that all happened to be ranked in the top 10?

I mean, surely there are some differences b/w schools and wouldn't some of these differences be reflected in the rankings? How do we know that Harvard and Hopkins are better schools than LLU or Albany?

I've gotten into one top ten - Duke.

I'm leaning toward Vandy as of now (ranked well below Duke).

I want to go into academia, so going to a medical school that is a research powerhouse helps. That is, though the rankings can't be used very precisely IMO (a #5 school isn't necessarily ''better'' than a #15 school), they do generally give a good indication of the strength of research available at a school and its status in academia. I applied to places that I felt I would be competitive at (in terms of getting in) and that have strong reputations in academia. Those schools happen to be in the top 30 or so in the US News rankings.
 
If one insists on using the US News info, the breakdowns are probably more useful than the overall scores. For example, if you're more interested in having a high percentage of the faculty be top researchers, you might choose Hopkins over Harvard, since the per-faculty grant money is much higher there (and the student-to-faculty ratio is more favorable at Hopkins, so access should be better too).

Or vice versa if the school's total research volume is what matters to you. Or the residency director ratings, if those matter to you. The overall rank is simple and fun for Joe Schmo to throw around, but not all that useful to premeds.
 
I can guarantee you that Harvard will still be (and will always be) #1....

Well, you never know. It was #1 for undergrad forever (which is also based on the amount of money, etc.) and fell to #2, with Yale as #1, this past year I believe.
 
Just thought I'd mention that the only reason Harvard has such an enormous amount of NIH funding per the US News report is that they include the funding received by affiliated hospitals as well. Harvard has _numerous_ affiliated hospitals (12, I think?). If you look at funding received by the actual medical school per se, Harvard's rank is #26, and its peer institutions are no longer Hopkins and UCSF, but Einstein and U of Maryland.
http://grants.nih.gov/grants/award/rank/medttl05.htm

Some of you may remember that Baylor split with their main teaching hospital, Methodist, and Methodist subsequently affiliated with Weill Cornell. Question: Why would a NYC med school need an affiliate hospital in Texas? Answer: If Methodist affiliates with Cornell, the nominal association will allow Cornell to count Methodist's research revenue in the US News NIH dollars, even though the real NIH grant rankings reveal that Cornell is #37, only a hair ahead of the eminent UT Galveston.

I think that NIH funding is the best criterion on which to base Research rankings. However, US News should stop cooking the books and use the REAL NUMBERS. Of course, all hell would break loose if next year Harvard dropped out of the top 25. Harvard would probably use their 15 bajillion gazillion dollar endowment to train an army of ninjas to carry out a vendetta killing as a lesson to other meritocrats who might be plucky enough to toy with the idea of using legit NIH data for rankings.

Since I've bashed two schools, I'd like to give kudos to two schools now: Pitt and Case. Ranked 9th and 12th respectively according to NIH. Compare to US News where Pitt is ranked 16th and Case is ranked 22nd.
 
Does that mean the premium edition that I bought will be gone on March 30? Nooooo!:(
 
Just thought I'd mention that the only reason Harvard has such an enormous amount of NIH funding per the US News report is that they include the funding received by affiliated hospitals as well... [snip]

This is a great point... I've always looked at that NIH funding list with confusion. These weird affiliations really explain a lot.
 
Just thought I'd mention that the only reason Harvard has such an enormous amount of NIH funding per the US News report is that they include the funding received by affiliated hospitals as well... [snip]
This is true. However, as a medical student, you are free to work in ANY of the labs at ANY of the affiliated teaching hospitals. Plus, US News & World Report includes ALL affiliated hospitals when calculating research money for ALL medical schools. It's not Harvard's fault that it has 17 teaching hospitals...
 
The response rate for the US News peer assessment and residency director scores is not a problem here. I have no idea why people think it's such a problem. Their measure of faculty resources (faculty/student ratio) is, however, idiotic.
 
What is interesting is that they only rank to 63 (half of the US allos). Everybody else is tied for #64 to avoid hurting anyone's feelings.
 
this is true. however, as a medical school student, you are free to work in ANY of the labs at ANY of the affiliated teaching hospitals. and plus, US news world and report includes ALL affiliated hospitals when calculating research money for ALL medical schools. it's not harvard's fault that it has 17 teaching hospitals...

Sure... Harvard students can do research at any of the affiliated teaching hospitals. However, if you're going to include all affiliated academic hospitals, why not include other institutions that provide research opportunities for students?

UCSD students can do research at the Salk Institute, but you don't see Salk's research money counted in UCSD's favor. Likewise, UCSF students can do projects in conjunction with Berkeley professors, but you don't see Cal's research funding counted toward UCSF. Baylor students can do research at MD Anderson or Rice, but that doesn't count either.

See the problem?

NIH only counts grants toward a med school if the researcher's primary appointment is at said med school. Seems fair to me.
 
What is interesting is that they only rank to 63 (half of the US allos). Everybody else is tied for #64 to avoid hurting anyone's feelings.

Ha... and they only have like 70 schools that participate (including a few DO schools). It seems like those at #64 are really just the worst MD schools that were part of the rankings.
 
No, I think they rank more; you just have to pay for the premium version to see the rest, right?
 
Sure... Harvard students can do research at any of the affiliated teaching hospitals. However, if you're going to include all affiliated academic hospitals, why not include other institutions that provide research opportunities for students?

UCSD students can do research at the Salk Institute, but you don't see Salk's research money counted in UCSD's favor. Likewise, UCSF students can do projects in conjunction with Berkeley professors, but you don't see Cal's research funding counted toward UCSF. Baylor students can do research at MD Anderson or Rice, but that doesn't count either.

See the problem?

NIH only counts grants toward a med school if the researcher's primary appointment is at said med school. Seems fair to me.
Well... I think the line has to be drawn somewhere in regard to which institutions may be counted toward a medical school's research funding. If you were to count Caltech and the Salk Institute, then you would likewise have to count MIT and institutes such as the Broad for HMS, as many students do end up doing research there as part of the MD/PhD or HST programs. And if you were to combine the research funding of Harvard and MIT in the HMS count just because students do research there, I think it would be a little ridiculous...
 
Well... I think the line has to be drawn somewhere in regard to which institutions may be counted toward a medical school's research funding. If you were to count Caltech and the Salk Institute, then you would likewise have to count MIT and institutes such as the Broad for HMS, as many students do end up doing research there as part of the MD/PhD or HST programs. And if you were to combine the research funding of Harvard and MIT in the HMS count just because students do research there, I think it would be a little ridiculous...

Bingo. And if you draw the line where the NIH does, i.e. with Harvard faculty, then the big H drops to #26. By only counting NIH money toward a med school if the researcher has his/her primary appointment there, you can prevent places like Weill Cornell from pulling shenanigans like affiliating with a hospital halfway across the country to increase their funding on paper.

(Nothing personal against Harvard; it's a great school and deserves its legendary reputation... I'm just using it as an example.)
 
I still don't see why we even use research rankings.

Shouldn't they just have med school rankings period?
 
No, I think they rank more; you just have to pay for the premium version to see the rest, right?

I think that you pay to get 51-63. I've never seen USN rank their so-called bottom half. I don't know how they would anyway. I have a hard enough time finding differences in their second quartile.
 
I still don't see why we even use research rankings.

Shouldn't they just have med school rankings period?

I think they use research rankings because what else can they use? "Student satisfaction" is too subjective. Ranking by the school's match list is impossible, and ranking by the school's average Step 1/2 scores is pretty meaningless. There aren't any other measures that I can think of. Anyone else?

Until people realize that deciding which med school to attend is a very personal decision (*cough* that shouldn't be heavily influenced by USNWR rankings *cough*), and as long as enough people are convinced that med school reputation is a golden ticket into a great residency, then they'll keep buying these rank lists.
 
I think they use research rankings because what else can they use? "Student satisfaction" is too subjective. Ranking by the school's match list is impossible, and ranking by the school's average Step 1/2 scores is pretty meaningless. There aren't any other measures that I can think of. Anyone else?

Until people realize that deciding which med school to attend is a very personal decision (*cough* that shouldn't be heavily influenced by USNWR rankings *cough*), and as long as enough people are convinced that med school reputation is a golden ticket into a great residency, then they'll keep buying these rank lists.

Why not somehow make it multifactorial and utilize all these tidbits of info?

It's even more useless for rankings to be based off of total NIH funding (does it really make a difference for us as medical students? come on) versus something like a match list.

And what's wrong with making USMLE averages part of the ranking formula? They use average MCAT score and GPA already...

There is no perfect way to utilize all this info. I just feel that it's way too biased toward NIH funding (why not include other forms of funding?). It's not like you'll find extremely better research opportunities as a med student at a place like Harvard versus a school ranked in the 20s for NIH funding. As a med student, you'll just look for one lab or a few PIs for clinical research, if you're even interested in doing research. You won't be able to take advantage of the hundreds of millions of extra dollars at Harvard or elsewhere.
 
And what's wrong with making USMLE averages part of the ranking formula? They use average MCAT score and GPA already...

They would if they had the info. Schools don't publish this stuff except for whatever they tell you at the interview and gets repeated on SDN. I agree that it would be a lot more relevant to med students than NIH funding.
 
The US News rankings do one thing: attempt to rationalize what laypeople already think to be true.

Harvard and Hopkins have eminent reputations as medical institutions. US News has picked criteria that try to make those reputations look objective. In truth, the quality of your medical education has nothing to do with any of those statistical factors whatsoever. Just being at Hopkins doesn't ensure that you'll take advantage of the opportunities here, and just being at School X which is lower on the list doesn't mean you won't get involved/do big research/get a great residency slot in the field of your choice.

US News has just done essentially nothing (actually, they probably actively avoid rocking the boat by picking factors that ensure places like Harvard/Hopkins/etc. always appear at the top of the list).

IMHO, if you like what you're doing and where you're doing it, you don't need rankings from some third-tier magazine to validate your decision.
 
The US News rankings do one thing: attempt to rationalize what laypeople already think to be true... [snip]

:clap: :clap: :clap: :clap: :clap: :clap:

:thumbup: Trudat.
 
The little gunner inside each one of us wants to be able to say "top 10."
 
The US News rankings do one thing: attempt to rationalize what laypeople already think to be true... [snip]

Right-o. I've somewhat alluded to this argument in previous posts, but didn't have the cojones to say it straight out :) Well said.
 
Why not somehow make it multifactorial and utilize all these tidbits of info?

It's even more useless for rankings to be based off of total NIH funding (does it really make a difference for us as medical students? come on) versus something like a match list.

And what's wrong with making USMLE averages part of the ranking formula? They use average MCAT score and GPA already...

But if each tidbit of info is a subjective measure, then it's not very useful to make it multifactorial. I used the match list example because match lists are notoriously unreliable measures of the quality of med students that come out of a school. Fourth-year med students, in general, have much more pressing restrictions on where they can go after graduation (in terms of family, etc.) than the average college senior does.

I think USMLE averages aren't very useful because the only thing a lot of schools will say is that "we are above the national average." But so many schools say this that it doesn't really show anything helpful. And I think that, while Step 1 scores are very, very important, you could also argue that they don't prove you'll be a good or sought-after intern/resident. Clinical evals are just as important in that regard.

I just feel that it's way too biased toward NIH funding (why not include other forms of funding?). It's not like you'll find extremely better research opportunities as a med student at a place like Harvard versus a school ranked in the 20s for NIH funding. As a med student, you'll just look for one lab or a few PIs for clinical research, if you're even interested in doing research. You won't be able to take advantage of the hundreds of millions of extra dollars at Harvard or elsewhere.

I totally agree with you here.
 