Wednesday, August 03, 2011

US Rumor and Hospital Report

It has been almost four years since I commented on the annual hospital ranking prepared by US News and World Report.  I have to confess now that I was relatively gentle on the magazine back then.  After all, when you run a hospital, there is little to be gained by critiquing someone who publishes a ranking that is read by millions.  But now it is time to take off the gloves.

All I can say is, are you guys serious?  Let's look at the methodology used for the 2011-12 rankings:

In 12 of the 16 [specialty] areas, whether and how high a hospital is ranked depended largely on hard data, much of which comes from the federal government. Many categories of data went into the rankings. Some are self-evident, such as death rates. Others, such as the number of patients and the balance of nurses and patients, are less obvious. A survey of physicians, who are asked to name hospitals they consider tops in their specialty, produces a reputation score that is also factored in.

Here are the details:

Survival score (32.5 percent). A hospital's success at keeping patients alive was judged by comparing the number of Medicare inpatients with certain conditions who died within 30 days of admission in 2007, 2008, and 2009 with the number expected to die given the severity of illness. Hospitals were scored from 1 to 10, with 10 indicating the highest survival rate relative to other hospitals and 1 the lowest rate. Medicare Severity Grouper, a software program from 3M Health Information Systems used by many researchers in the field, made adjustments to take each patient's condition into account.
Patient safety score (5 percent). Harmful blunders occur at every hospital; this score reflects how hard a hospital works to prevent six of the most egregious types. A 3 puts a hospital among the 25 percent of those that were best in this regard, a 2 in the middle 50 percent, and a 1 in the lowest 25 percent. Examples of the six kinds of medical episodes factored in are deaths of patients whose conditions should not have put them at significant risk and surgical incisions that reopen.
Reputation (32.5 percent). Each year, 200 physicians per specialty are randomly selected and asked to list hospitals they consider to be the best in their specialty for complex or difficult cases. A hospital's reputational score is based on the total percentage of specialists in 2009, 2010, and 2011 who named the hospital. This year some physicians were asked to list up to five hospitals, the rest to list up to 10.
Other care-related indicators (30 percent). These include nurse staffing, technology, and other measures related to quality of care. The American Hospital Association's 2009 survey of all hospitals in the nation was the main source.
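The weights above imply a simple linear composite. Here is a minimal sketch of how such a weighted score behaves; the component names, the 0-100 scale, and the example scores are all hypothetical illustrations, since US News does not publish its aggregation in this form:

```python
# Hypothetical sketch of combining the published US News weights into one
# score. Component names, the 0-100 scaling, and the sample inputs below
# are assumptions for illustration, not the magazine's actual formula.

WEIGHTS = {
    "survival": 0.325,        # risk-adjusted 30-day mortality
    "patient_safety": 0.05,   # six adverse-event measures
    "reputation": 0.325,      # specialist survey
    "care_indicators": 0.30,  # nurse staffing, technology, etc.
}

def composite_score(components: dict) -> float:
    """Weighted sum of component scores, each assumed to be on 0-100."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9  # weights total 100%
    return sum(WEIGHTS[k] * components[k] for k in WEIGHTS)

# Because reputation carries 6.5x the weight of patient safety
# (0.325 / 0.05), a hospital with a stellar reputation but the worst
# possible safety score can still outrank one with superior safety:
famous = composite_score({"survival": 70, "patient_safety": 0,
                          "reputation": 95, "care_indicators": 70})
safe = composite_score({"survival": 70, "patient_safety": 100,
                        "reputation": 40, "care_indicators": 70})
print(famous, safe)  # 74.625 61.75 -- famous outranks safe
```

This is exactly the pattern visible in the pulmonology list discussed below: the weighting lets reputation swamp safety.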

Let's see how this pans out for one specialty, pulmonology. We see that the number 1 and 2 ranked hospitals have great reputations but the lowest score for patient safety.  The first hospital with “superior” safety rankings doesn’t appear until number 21.

The reputational data is opaque, as it has to be.  With great respect for the 200 pulmonologists who were surveyed, how much current data have they seen about the outcomes achieved by hundreds of hospitals and thousands of doctors around the country?  Answer:  None.  Why?  Because there is no current data published on such outcomes.  Likewise, there is no current data published about hospital-related infections, falls, medication errors, and other matters that could affect the treatment of a pulmonary patient, even if the pulmonologists are top-notch.

So, the reputational survey is likely to be based on the following type of "information":

Oh, I like Dr. Smith at ABC Hospital.  We were in residency together 25 years ago.  He was a great guy.  I still remember that amazing Christmas party in 1986.

Or, maybe:

That Dr. Jones at XYZ Hospital is terrific. I heard him give a paper at the last meeting of the ATS (or ACCP, or AABIP).  His PowerPoint presentation about his clinical successes (or research with mouse models) was gripping.

Or, maybe:

Dr. Pebble was trained by Dr. Stone, one of the best in the business in his day (40 years ago).  That's good enough for me.

Or, even:

I sent a really sick patient to Dr. Good at RST Hospital.  He saved her life.  It was a very tough case, and he deserves a lot of credit.

US News needs to stop relying on unsupported and unsupportable reputation, often influenced by anecdote, personal relationships and self-serving public appearances, and work on real -- and more recent -- data. Maybe that will also cause hospitals to be more willing to report their data so they can be named to the “Honor Roll.” As it is, you are better off keeping things opaque to protect your reputation.

I think it is time to acknowledge that this ranking offers very little in the way of valuable information.  It is mainly a vehicle for advertisements from the pharmaceutical industry, which knows that this issue of the magazine gets a lot of attention and high circulation.  As you flip through to each specialty, you are blasted with ads for drugs related to syndromes within that specialty.  Here's the top part of the pulmonology page:

Then, if you click through to "find resources about" a particular disease, you do get some nice content information, but you get sprayed with even more ads.

There would be no market for this magazine survey if the government or insurance companies did their job and displayed real-time clinical outcome data.  But those with the reputational advantage do not want that to happen.  And those who profit from the lack of data also have nothing to gain by a more open presentation of the actual record and qualifications of hospitals and doctors in each specialty.


Barry Carol said...

I wonder what percentage of medical as opposed to surgical inpatient admissions could be competently handled by most community hospitals and all teaching hospitals. I suspect the number is quite high. Surgical procedures lend themselves more readily to outcomes measurement. If the brand name hospitals with the big reputations are no better than most others when it comes to the relatively uncomplicated cases that probably account for most of the procedures, insurers should have an economic interest in making that information available to both the public and to referring doctors. It should be a golden opportunity to create some countervailing power against the famous hospitals that command high rates because of their market power and not their care quality.

@Namaste555 said...

From Twitter:

Amen Paul! It's about time someone shed light on this

Howard said...

From Google+:

Amen! Glad that you took the gloves off. These hospital rankings are virtually worthless. But it is probably a very popular issue, so we're bound to see it reappear over and over again.

Nick Dawson said...

Thank you!

Two things matter: outcomes and experience. We are still learning how to really measure outcomes, how to deal with the data, how to process, analyze, and share it.... Until we have a true clearinghouse for quality, aren't all of the rankings and reports snake oil?

Experience, of course, is another matter - although I'm hopeful social sites will shed light on experience the way amazon rankings reflect people's perceptions of product quality. At least they can be a driver for change inside organizations.

Thanks Paul - glad someone stood up and said this!

Mark Graban said...

Thanks so much for taking off the kid gloves.

I've always thought the 5% patient safety component was an embarrassment.

Basing a hospital's ranking on their reputation is such circular logic, it makes my head spin.

As I drive around the Dallas-Fort Worth area, there are different billboards from three (I think, maybe it's four) different hospitals that all trumpet how they are the best in North Texas, based on some different methodology that's likely just as flawed.

They're all the best hospital, but none of them are the best they can be.

Curious Cat said...

Ahem, thanks for being open and frank. The ratings of schools have long been criticized for similar (and other) reasons. People seem to like the rankings, which I guess is fine, but they have little more value than knowing which fashion designs are "in" today, or some such thing I consider nonsense (but some people enjoy following).

Bruce said...

From Facebook:

The data is flawed because no one has defined the processes of patient care (in systems science, that is the first step; otherwise the outcome measures are close to worthless, and as you point out, in this case they can be significantly biased). None of this is reality. Thank you for taking off the gloves; this kind of honesty is critical to our future.

shimon said...

Hi Paul,
Taking off the gloves to take on US News and World Report may make you feel better, but it is unlikely to change the value of the healthcare services currently delivered in US hospitals. There are hundreds of state, foundation, and for-profit web sites with hospital ratings, including patient safety information, available to consumers and patients. The challenge is to make healthcare institutions, especially those receiving public money, accountable. In my opinion, civic engagement of "citizens" with their local hospitals is the more effective way to go. It can start with "demanding" reliable outcome and safety information on hospital web sites (perhaps based on The Informed Patient Institute criteria). There are many challenges to this kind of approach; however, if more of us take off our gloves and direct our energies to where the problem is, we can influence what hospitals do in reporting their outcomes, with all the benefits to care and cost that will likely come of it.
Our group Citizens4health is embarking on such an effort. I invite you and your readers to consider this approach. We hope to launch our efforts in the next few weeks. I will be happy to keep you posted on our public reporting initiative. For an early, pre-public description of our initiative, check out our web site.

Theresa said...

US News, Health Grades, Castle Connolly, and others create these rankings as revenue-raising opportunities. They sell licensing rights to the hospitals and practitioners who make it onto their lists, giving them the right to publicize the fact that they have been highly ranked. The whole thing seems kind of, well, rank to me.

Anonymous said...

This should serve as an eye opener for hospital execs that often gloat over these rankings or overemphasize their value.
Recently a colleague and I were very surprised to find a hospital we have consulted with on the list.
This suburban Washington, DC hospital looks like a train wreck on the inside, both in its infrastructure (lacking proper equipment in critical service lines like the OR) and its management (zero accountability).
This hospital has been financially in the red, and yet it has retained its CFO, with a substantial bonus, for 5 years in a row.
The nurse-to-patient ratio is 1:7 or 1:8, and the working conditions for nurses on some units are atrocious. One of the physicians even called it a mom-and-pop shop when describing how it's run.

And yet they were on the US News and World Report list. It really surprised us. We then went about discussing how this whole report is a bit of a joke. Now, with your clinical dismantling of their methodology, we stand validated.

Hospital execs: Please take a lesson from this and stop focusing on publicizing your US News rankings. Instead, focus on the voices of your patients, staff, and physicians to really solve problems.

Anonymous said...


Your efforts are laudable, as it is important to find a way to engage the power of a critical mass of everyday citizens (past, present and future patients) in this effort.
However, I think an aspect of Paul's post that has not been addressed is the adverse effect these rankings have on shaping hospital behavior - that is, modifying specific practices and processes solely to achieve a better ranking.
To the extent that these meaningless rankings can be made less publicly influential, the more influence and attention will be rendered to efforts such as yours. So I see these two efforts as complementary rather than competing.

nonlocal MD

James said...

Paul - Thanks for taking the time to research and expose the reality of reporting at the level of US News. The truth is that at any level it is difficult to pull the trigger on another institution and not expect full disclosure in return. Even with the level of transparency which you tried to create, and did create, while CEO at BI, I'm sure there were and still are troubling issues which couldn't be brought to light.

Finally, there continue to be many technological roadblocks that prevent institutions from aggregating and sharing completely transparent data, but these are being broken down one by one.

Best wishes -

Anonymous said...

In follow up to nonlocalMD's comments:
Overheard at an academic medical center after the recent USN&WR rankings were released:

Strategic Planning Manager: "You guys did better this year, but we need to find a way to improve your reputation score."

Rank and File MD: "None of our patient safety scores are at the top level. Perhaps our attention should be on that instead of reputation?"

SPM: "I would agree with you. However, the issue is that patient safety accounts for 5% of the score while reputation accounts for 32.5%."

R&FMD: "So, maybe we should consider this an invalid measure and stop paying attention to it, since they consider reputation six and a half times more important than patient safety?"

Crickets: chirping

Eric and tim said...

From The Health Care Blog:

Eric Tremont says:

Excellent post. Anybody who closely examines the U.S. News methodology will realize that much of their survey rankings are based on dubious (“junk” might be a better word) science. Furthermore, I recall seeing at least a couple of peer reviewed published studies which found no statistical association between the U.S. News hospital rankings and rankings based on objective measures of health outcomes. I just wish the hospital industry would not cooperate with the U.S. News racket—it does not help when those hospitals that have received high rankings shamelessly plaster billboards all over their towns celebrating their ranking.

tim says:

Nobody in the hospital executive suites thinks those rankings mean anything. They play along because they have to. They spend huge dollars to splash their USNWR ranking in advertising not because they think it is meaningful, but because the public does, and a real comparative advantage — what you might honestly use to distinguish your product — is nearly impossible for hospitals to articulate. (Like banks.)

Every newspaper in the country has learned they can sell advertising to hot dog stands the week they rank the hot dog stands. We have a regional business newsletter that collects “data” on medical offices in our specialty and then ranks the group practices based on their one page survey. They also happen to offer discounts on advertising in that edition for medical group practices.

All of the writers for these articles know very well they are intentionally blurring the line between journalism and sales.

Anonymous said...

Actually, you last commented in April of 2010, not four years ago. At that time you wrote, "Given the importance attributed to the U.S. News ranking, this article is bound to raise concerns. I know that the folks at the magazine have worked hard over the years to make their rankings as objective as possible, and it will be interesting to see their response to Dr. Sehgal's critique."

I am concerned about your comment "little to be gained," as it suggests that one should speak one's mind or be transparent only when there is personal or professional benefit to be gained. Your comment that they "work hard" could seem self-serving, especially since your hospital was ranked relatively high in their listing at the time.

Paul Levy said...

Shows how insidious the whole thing is, right?

@ewidera said...

From Twitter:

Nicely done. I rank hospitals by # of twitter followers.

clsmt said...

Years ago they did a study asking lawyers to rank law schools and included Princeton Law School on the list. Princeton Law had a GREAT reputation in the study and was ranked high. But Princeton doesn't have a law school.

They call this the halo effect, where something takes on the positive attributes of things it is associated with, whether or not it actually is good itself (or even exists at all).

These kinds of surveys are bogus.

Anonymous said...

At least you have something to talk about, even if it is bitterly criticized. In Greece, there is no rating of hospitals whatsoever, so doctors have no data to decide whether it's better to work in one hospital or another, and patients have even less; they go into a hospital not knowing anything about its performance. I would like to have some kind of performance evaluation and rankings, even if not very accurate, that would serve as a first triage of HCPs.

Jacques said...

Hi... I suggest rating hospitals by end-user reviews. We are working in this way with our social media site, the first one where hospitals are fully integrated, so it is possible to post a comment or rate them through a widget. I invite you to take a look, and we await your comments.