Here's a quiz. Can you guess who posted the following messages on Twitter?
Any idea how many hospital execs' bonuses are tied to their institution's U.S. News rankings?
When execs confide this arrangement, they expect me to be impressed or flattered. Are you kidding? I'm deeply disturbed.
In my view it's a symptom the board has abdicated its responsibility to measure, monitor & incentivize quality improvement.
You might be surprised to learn that it was Ben Harder, @benharder, chief of health analysis at U.S. News & World Report, the magazine that publishes "data, rankings & tools to help consumers choose hospitals, doctors, health plans & more."
Probably more than anyone in the country, Ben understands the inherent limitations of any such rankings. More important, he understands that the rankings are designed to advise patients with complex medical conditions. They are not an indication of the general level of quality of care or safety in an institution.
He certainly knows that hospitals use the rankings in their marketing materials, but he understands that what makes marketing effective is different from what makes it possible for a hospital to deliver the highest level of care and to engage in ongoing clinical process improvement.
Bravo to Ben for putting this out there so clearly. I'm hoping board members take note.
1 comment:
!!! Thank you for this! That's an amazing development compared to the history of the magazine's rankings.
The utter non-transparency of their algorithms always made me take them with a grain of salt, but a few years back I learned about the appalling original ranking method for the college rankings - in short, they knew that Harvard, Princeton and Yale were the best, so they jiggered the formula if the data didn't produce that result. From a 2000 article in Washington Monthly:
"To Elfin, however, who has a Harvard master's diploma on his wall, there's a kind of circular logic to it all: The schools that the conventional wisdom of the meritocracy regards as the best, are in fact the best--as confirmed by the methodology, itself conclusively ratified by the presence of the most prestigious schools at the top of the list. In 1997, he told The New York Times: "We've produced a list that puts Harvard, Yale and Princeton, in whatever order, at the top. This is a nutty list? Something we pulled out of the sky?"
That was back in 2000, and it was about the college rankings, but it makes the point: if they don't show their work and the source data, we have no idea what we're being sold.
So, hooray to @BenHarder for being so clear about whose benefit the rankings are meant to serve. I very much hope the magazine will support him, and maybe even publicly discourage boards from misusing the rankings.