Hopkins, MGH, B&W, Mayo... Are the US news hospital rankings a bunch of crap?


mTOR (Full Member, joined Jun 18, 2007):
Do you care?

Best Hospitals 2011-12: the Honor Roll

Rank. Hospital (points / specialties in which ranked)

1. Johns Hopkins Hospital, Baltimore (30 / 15)
2. Massachusetts General Hospital, Boston (29 / 15)
3. Mayo Clinic, Rochester, Minn. (28 / 15)
4. Cleveland Clinic (26 / 13)
5. Ronald Reagan UCLA Medical Center, Los Angeles (25 / 14)
6. New York-Presbyterian University Hospital of Columbia and Cornell, N.Y. (22 / 12)
7. UCSF Medical Center, San Francisco (20 / 11)
8. Brigham and Women's Hospital, Boston (18 / 12)
9. Duke University Medical Center, Durham, N.C. (18 / 10)
10. Hospital of the University of Pennsylvania, Philadelphia (17 / 12)
11. Barnes-Jewish Hospital/Washington University, St. Louis (16 / 11)
12. UPMC-University of Pittsburgh Medical Center (14 / 8)
13. University of Washington Medical Center, Seattle (13 / 9)
14. (tie) University of Michigan Hospitals and Health Centers, Ann Arbor (10 / 6)
14. (tie) Vanderbilt University Medical Center, Nashville (10 / 6)
16. Mount Sinai Medical Center, New York (8 / 6)
17. Stanford Hospital and Clinics, Stanford, Calif. (7 / 6)

Published in the Annals of Internal Medicine, 2010:

The Role of Reputation in U.S. News & World Report's Rankings of the Top 50 American Hospitals

Ashwini R. Sehgal, MD

For the past 20 years, the newsmagazine U.S. News & World Report has published annual rankings of the top 50 American hospitals in several specialties. These rankings generate widespread attention among the general public, health care providers, and policymakers. Other media outlets, such as newspapers, amplify the visibility of these rankings through their own stories about top-ranked hospitals. Hospital leaders generally believe that the U.S. News & World Report rankings are accurate and use them as marketing tools to attract patients (1, 2). In addition, political leaders and health policy researchers searching for models for health care reform often focus on U.S. News & World Report's top-ranked hospitals (3).

U.S. News & World Report's rankings combine 3 quality domains with approximately equal weighting: structure, process, and outcomes. Structure is the people and resources involved in providing health care, process refers to how care is delivered, and outcomes are the results of such care. Structure and outcomes are assessed by objective measures, including nurse staffing, patient volume, patient safety adverse events, and adjusted mortality. Process is assessed primarily by a single subjective measure: hospital reputation. The rankings have been criticized for placing too much emphasis on the subjective reputation of a few nationally prominent hospitals, but the role of reputation has not been quantified (1, 4).
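(To make the weighting concrete, here is a minimal sketch of an equally weighted three-domain composite; the function and the example values are hypothetical illustrations, not U.S. News's actual formula or data.)

```python
# Minimal sketch of an equally weighted composite quality index.
# The three domain names follow the article; the values are made up.

def composite_score(structure: float, reputation: float, outcomes: float) -> float:
    """Combine the three quality domains with equal (1/3) weights.

    In the method described above, the process domain is assessed
    almost entirely by the subjective reputation score.
    """
    return (structure + reputation + outcomes) / 3.0

# Example: equal weights do not guarantee equal influence when one
# component varies far more than the others (see the Discussion below).
print(composite_score(structure=6.0, reputation=40.0, outcomes=5.5))
```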

The goal of this study is to quantify the role of reputation in determining the relative standings of the top 50 hospitals in each of the 12 specialties evaluated (cancer; diabetes and endocrine disorders; digestive disorders; ear, nose, and throat; geriatric care; gynecology; heart and heart surgery; kidney disorders; neurology and neurosurgery; orthopedics; respiratory disorders; and urology). This study particularly attempts to determine whether a high degree of variation in the subjective reputation of hospitals results in a composite ranking that is dominated by subjective reputation. Understanding the role of reputation may help users interpret the rankings more knowledgeably and may inform efforts to improve the assessment of hospital quality.

[-- snip --]


Discussion

By combining several subjective and objective measures, U.S. News & World Report's rankings appear to be a rigorous, complex, and multidimensional index of hospital quality. However, the relative standings of the top 50 hospitals largely reflect the subjective reputation of those hospitals. This subjective reputation does not already capture objective measures of hospital quality; in fact, little relationship exists between reputation and objective measures in the top 50 hospitals. The validity of these findings is strengthened by their consistency across all 12 specialties and across multiple complementary analytic approaches.

The predominant role of reputation is caused by extremely high variation in reputation score, compared with the objective quality measures, among the 50 top-ranked hospitals in each specialty (Table 2). As a result, reputation score contributes disproportionately to variation in total U.S. News score and therefore to the relative standings of the top 50 hospitals. In addition, all 100 randomly selected unranked hospitals had reputation scores that were 0 or minimal. These data suggest that hospitals lacking national recognition are unlikely to be highly ranked by U.S. News & World Report.
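(As a toy illustration of this mechanism, here is a sketch with made-up numbers, not the study's data: when equally weighted components differ greatly in spread, the high-variance component effectively fixes the composite ranking.)

```python
# Toy simulation (invented numbers, not the study's data): equally
# weighted components with very different variances produce a composite
# ranking that is dominated by the high-variance component.
import random

random.seed(0)
n = 50  # e.g., the 50 top-ranked hospitals in one specialty

# Objective measures vary little among top hospitals; reputation varies a lot.
objective = [random.gauss(50, 2) for _ in range(n)]    # small spread
reputation = [random.gauss(20, 15) for _ in range(n)]  # large spread
total = [(o + r) / 2 for o, r in zip(objective, reputation)]  # equal weights

def top(xs, k=10):
    """Indices of the k highest values."""
    return set(sorted(range(len(xs)), key=lambda i: xs[i], reverse=True)[:k])

# Rankings by total score track rankings by reputation almost exactly.
overlap = len(top(total) & top(reputation))
print(f"{overlap}/10 of the top 10 by composite score are also top 10 by reputation")
```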

This study's findings have implications for users of the rankings and for efforts to improve the assessment of hospital quality. Because obtaining national data on process is difficult, U.S. News & World Report relies on reputation as a proxy measure (5). Because reputation score is determined by asking approximately 250 specialists to identify the 5 best hospitals in their specialty, only nationally recognized hospitals are likely to be named frequently. High rankings also may enhance reputation, which in turn sustains or enhances rankings in subsequent years. Users should understand that the relative standings of U.S. News & World Report's top 50 hospitals largely indicate national reputation, not objective measures of hospital quality.
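(A minimal sketch of how such a survey tally might work, assuming the reputation score is simply the percentage of respondents naming a hospital; the ballots and the scoring rule here are invented for illustration.)

```python
# Hypothetical tally of a "name the 5 best hospitals" reputation survey.
# Ballots are invented; percentage-of-mentions scoring is an assumption.
from collections import Counter

ballots = [
    ["Hopkins", "MGH", "Mayo", "Cleveland Clinic", "UCSF"],
    ["Mayo", "Hopkins", "Duke", "MGH", "Penn"],
    ["Hopkins", "Mayo", "MGH", "UCLA", "Cleveland Clinic"],
    # ... in the real survey, roughly 250 specialists per specialty
]

mentions = Counter(name for ballot in ballots for name in ballot)
reputation_score = {name: 100.0 * n / len(ballots) for name, n in mentions.items()}

# Only hospitals with national name recognition accumulate mentions;
# everyone else ends up at or near a reputation score of 0.
print(reputation_score)
```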

Because reputation can deteriorate quickly from a single negative event, even highly rated hospitals should be concerned about rankings based primarily on reputation. Moreover, reputation-based rankings may act as barriers to improving hospital quality, because even highly successful quality improvement programs are unlikely to enhance a hospital's ranking. Assessors of hospital quality should be wary of subjective measures and of combining measures with large and small degrees of variation within a single index. Otherwise, the high variation measure will dominate differences among hospitals even if all measures are weighted equally in calculating the index.

Previous investigations have focused on validating U.S. News & World Report's rankings by comparison with external measures. Highly ranked cardiology hospitals were found to have a lower adjusted 30-day mortality among elderly patients with acute myocardial infarction (9). However, many of these hospitals performed poorly in providing evidence-based care for patients with myocardial infarction or heart failure (10). Performance on Medicare's core measures related to myocardial infarction, congestive heart failure, and community-acquired pneumonia was also frequently discordant with U.S. News & World Report's rankings (11). The analyses presented here examined a broader range of measures internal to U.S. News & World Report's ranking system and found little relationship between rankings and objective quality measures for most specialties (Table 4).

This study's findings apply primarily to interpretations about the relative standings of U.S. News & World Report's 50 top-ranked hospitals in each specialty and not necessarily to the hundreds of unranked hospitals. This study focused on the top 50 hospitals for several reasons. First, only the top 50 hospitals in each specialty are publicized by U.S. News & World Report, other media outlets, and the hospitals themselves. Second, unranked hospitals do not receive numerical rankings, making it difficult to examine the role of reputation in determining rankings. Third, the data needed to examine unranked hospitals are not readily available. The total U.S. News scores and some, but not all, subjective and objective components of unranked hospitals are available on a Web site. However, the information must be accessed separately for each combination of hospital and specialty (a total of 1859 hospitals × 12 specialties = 22 308 Web pages) (7).

This study also did not address the validity of using reputation as a process measure; the representativeness of the physicians who rated hospital reputation; the accuracy of the objective measures; or the absence of other key factors, such as patient satisfaction. The study's findings may not apply to other hospital quality-rating systems, such as the Medicare Hospital Quality Compare in the United States or the National Health Service's Performance Ratings in the United Kingdom. Medicare provides quantitative data on several performance measures but does not create an overall quality measure, whereas the National Health Service uses a semiquantitative approach to categorize overall quality as excellent, good, fair, or weak (12, 13).

U.S. News & World Report's method report states that its rankings are "based largely on hard data" (5). A necessary step for this claim to be valid would be to dramatically reduce or eliminate the role of reputation. In addition, the data used to develop the rankings should be more readily available to review. The current rankings fall short of being an evidence-based system that data-conscious consumers, value-based purchasers, and reform-minded policymakers can rely on for health care decisions (1, 11).

http://www.annals.org/content/152/8/521.full

This thread is inspired by my disgust for the unquestioned, non-evidence-based traditionalism in the structure of our society, and particularly in supposedly scientifically minded disciplines. Personally, I shot up the U.S. News & World Report list quite a bit going from medical school to residency, but I must say, the experience has been sorta like getting a huge, extravagant, excessively wrapped empty box for Christmas. "You want to change/do what? But we've been doing it this way for years... Don't you know we're good for a reason/number X/highly ranked" (on a crappy, overrated, commercialized, faith-based list). And the "BRB, can't remember how to treat/perform/diagnose X, I mainly do research" attendings.

Folks, it's a facade.

There are legitimate attempts at more objective measures of hospital quality, and you can find them here and here. But I guess the question is: when choosing a residency, does it really matter? If both you and your future employers believe you were touched by King Midas, who cares if the reality is that you were actually molested by Cletus from down the street... right? Keep the lie alive if it keeps you alive.

 
I think a good name does make a difference coming out of residency and looking for work. Most of the hospitals at the top of that list also house the most competitive residency programs.

Oftentimes, leaders in the field work at those hospitals. Like it or not, there are large differences in the quality of doctors out there. It is just my personal belief that the quality at a ranked hospital is significantly better, having seen some private vs. university practices. This is more in regard to not generally ordering unnecessary tests, the amount of time spent with patients, etc. Students who travel around and do away rotations should be able to chime in.

Just like the quality of medical students at the "top 25" schools is eons beyond us lowly peons at state schools, right? :laugh:

I still love you, officedepot, and always will, no matter how elitist you are or may one day become!
 
Just like the quality of medical students at the "top 25" schools is eons beyond us lowly peons at state schools, right? :laugh:

I still love you, officedepot, and always will, no matter how elitist you are or may one day become!

Hey now, there are plenty of state schools in the top 25. :p Us state schoolers gotta represent and show the private kids how it's done outside the ivory tower... or something. :laugh:
 
I think a good name does make a difference coming out of residency and looking for work. Most of the hospitals at the top of that list also house the most competitive residency programs.

Oftentimes, leaders in the field work at those hospitals. Like it or not, there are large differences in the quality of doctors out there. It is just my personal belief that the quality at a ranked hospital is significantly better, having seen some private vs. university practices. This is more in regard to not generally ordering unnecessary tests, the amount of time spent with patients, etc. Students who travel around and do away rotations should be able to chime in.

I agree that the name = marketability, which only helps the graduate.

Then you say that ranked hospitals have better-quality doctors. That's an interesting yet typical conclusion, and it's exactly what the OP is getting at: how can we objectively measure the quality of the doctors? Only then do we know whether they're better or just more marketable/recognizable.

The truth lies between the extremes. The list is not crap, yet it's not reflective of the best healthcare outcomes either.
 
Pretty much everyone except scientists (or the science-minded) equates reputable hospitals with good care. No patient is going to dig through the literature trying to find the most objective data on their health care system. Marketability reigns over objectivity... that's just the world we live in.
 
Oh rankings, haha.

I just love watching and listening to people who put so much stock in them. Especially medical students. Hilarious.
 
The most important measures of a hospital are its size and quality of design, the number of services offered, and the breadth of its specialists. For someone coming into the ER, the big general and academic hospitals that offer every service provide a greater chance of medical success and less morbidity and death (Level I trauma centers are simply a different species from Level IVs).

For elective and continuity-of-care issues, finding the best doctor is what matters most, no matter where that doctor is located.
 
Oh rankings, haha.

I just love watching and listening to people who put so much stock in them. Especially medical students. Hilarious.

Indeed. Enjoy your residency training in Guam.
 
Reputation does play a part in a hospital having the newest, coolest things and a more solid financial security and outlook.

I remember a C*O from one of the ranked institutions talking about how his system thrives while a lot of other places are going belly up. One interesting point was just the power of reputation: when you have a brand name that the average health consumer recognizes, and preferably demands, that puts pressure on insurance companies. Places with a reputation are able to leverage much better contracts and agreements with insurers. The end of the story is that they bring in a lot more money, and you can throw a lot of money at objective quality.
 