Tuesday, September 27, 2011

"A" for effort, but . . .

The Pennsylvania Health Care Cost Containment Council ("PHC4") has posted its latest annual report, entitled "Good Data Drives Good Decisions."  [Undated, but marked as "new" on the website.] It is so thoughtfully and clearly written that I would like to say, "Well done," but I cannot.

From the introduction:

Now more than ever, PHC4’s data on the cost and quality of health care services is needed to make informed decisions, to facilitate competition in the health care arena, and to critically evaluate the value Pennsylvanians receive in return for their health care dollars. In the coming years, good data will be needed to thoughtfully implement health care programs and to evaluate their effectiveness. Good data is also essential in identifying and eliminating significant cost drivers, such as preventable waste and error. The Council can serve as a valuable resource in providing this data.

The problem, as I noted in the past with regard to Massachusetts data and federally provided data, is timeliness.  Although the report is dated 2010, the numbers presented are much older.  The report discusses chronic health conditions and payments for them in 2007; hospital-specific information for 31 common procedures and treatments performed in Pennsylvania’s general acute care hospitals from October 1, 2008, to September 30, 2009; coronary artery bypass graft (CABG) and/or valve surgeries performed in Pennsylvania in 2007 and 2008; readmissions in 2008; hospital-acquired infection data from 2009; and financial results from FY 2009.

Not mentioned in the annual report, but available elsewhere on the website, are more recent financial reports, for FY 2010.  That is an improvement, but even those were not published until September 2011.

As I said with regard to Massachusetts, "Don't you think we deserve more timely information about the quality of our [care] than we can get about cars, airplanes, and commuter rail?"

And, "We all appreciate the steps the state is taking, but if we are going to be serious about transparency, let's improve what is posted so consumers have up-to-date and accurate information."

And, "While you cannot manage what you do not measure, trying to manage with data that are a year or two or more older is like trying to drive viewing the road through a rearview mirror."


"[The government] information reported needs to be a lot more up to date, said Carolyn Clancy, director of the Agency for Healthcare Research and Quality. "We're not so good at timely transparency," she said. "We must get to a place where we get data in something like real time."

Some will say that I am being too picky, but I just don't see how these PHC4 data or other such data from other states help "to make informed decisions, to facilitate competition in the health care arena, and to critically evaluate the value Pennsylvanians receive in return for their health care dollars."

Maybe some people from the state, including those members of the PHC4 board, will dispute this and give us all a better explanation.  Here's the list and a promise to print anything they post on this blog in reply:

4 comments:

Anonymous said...

While I support the need for timely data, it's fairly obvious that providers and payers are reluctant to submit data to state agencies in a timely fashion. The process that these stakeholders insist on for publicly reporting measures of payment and quality is also burdensome enough that a huge time lag becomes unavoidable. The good should not be sacrificed in favor of perfection, and it would help if providers in particular got on board with that.

Anonymous said...

No doubt every hospital administration knows its accounts receivable and payable, as well as inventory, down to the month (or less), as well as the price of every aspirin and syringe.

Preventable harm is not measured in a timely way, not because we do not know how to do it, nor because (tertiary academic, no less) institutions lack the capacity or experience. Adverse events (and their harbingers, near misses) aren't collected or reported in a systematic, comprehensive, and timely manner because it is not a priority. And the data are not shared publicly because, apparently, a life spared is someone else's income. How about a dashboard of protectionist morbidity and mortality? Or a time-lag analysis of similar disclosed and undisclosed events?

For example, in Boston we can chalk up very similar deaths, in different hospitals, that made front-page news. A root cause analysis would show that the first (that we know about) was due to a failed telemetry response. But every one since includes culpability by providers, hospitals, and medical schools that failed to share and seek out both the vulnerabilities and the improvements revealed by these events. It should not take a secret handshake to know how to save a life.

The time lag in the public data in your example reinforces the wasted time (and lives) involved. Interventions and quality-sensitive policies are hamstrung by this mild enthusiasm for reality checks. And every staff member and every provider knows that daily volume trumps this lazy caution. I'm guessing that if the public must wait so long, the hospital staff are clueless about current conditions as well.

Anonymous said...

I agree with the above commenters about the data. In today's world, this data is easily collectible in real time and could even be reviewed promptly in an automated fashion, so there is really no excuse.

However, the composition of the Board struck me as odd, given the way the allegiance of each member is specified. If one does some counting, Labor seems to be running the table, followed closely by business. Providers and payors have five representatives in total (all with different perspectives, of course), and 'consumers' have only one.

Must make for interesting votes.

nonlocal

Katherine Rowell said...

If I may offer a bit of a history lesson, or reminder: In the early 1980s, the insurance industry came together to develop a Universal Billing Form (UB82) for the submission and payment of health insurance claims by hospitals; this was followed by the 1500 form for providers. Prior to this, claims were filed in an unstructured manner on paper (it seems so quaint now).

What we now accept as a common and sensible thing was revolutionary back in the day. As a result, third-party payers amassed piles of data and leveraged it to their advantage. Researchers used it as well; consider the work of Dr. Jack Wennberg and the Dartmouth Atlas, which has essentially changed how we think about healthcare delivery across the board.

Now consider clinical data. The medical community and its leadership failed to grasp (and, in large part, continue to miss) the power of a common taxonomy for the collection and timely reporting of data in a universally accepted electronic format. Instead, they have "bull-headedly" held on to the "craft" model of medical records, which must be mined by rows and rows of clinical nurse reviewers or researchers to gather any data whatsoever. Top this off with the hubris that each medical society or research group knows "best" how to define clinical data, and you have nothing short of a colossal, nightmarish mess of limited-use data.

Consider, for example, the following surgical quality improvement programs (just the ones I can name off the top of my head):
the Society of Thoracic Surgeons Cardiac Registry, the National Surgical Quality Improvement Program, the Surgical Care Outcomes Program in Washington State, the Michigan Surgical Quality Improvement Program, the Society for Bariatric Surgery program (ASBS and ACS both have bariatric programs), trauma surgery registries, and the Northern New England Heart Study (not to be confused with the STS Cardiac registry).

All of these programs measure outcomes such as wound infections, pneumonia, and urinary tract infections, yet ALL of them use different data definitions and have varied (if any) audit and data validation policies. The data are abstracted out of medical records by reviewers once the medical record becomes available and then re-entered into the respective databases. Much of it is then "harvested" a couple of times a year, reviewed for problems, and sent back for correction before any analysis can be performed or any reports created. AND most of these programs have been developed with full knowledge of the existence of the others, but each group and society "knows best."
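To make the definition problem concrete, here is a minimal, purely illustrative Python sketch of what reconciling three registry-specific "wound infection" fields into one shared outcome taxonomy might look like. The registry names, field names, and thresholds below are all invented for the example; they are not drawn from STS, NSQIP, or any real program.

```python
# Purely illustrative: registry names, field names, and definitions are
# hypothetical, not taken from any real surgical quality program.

# Each "registry" reports the same clinical event, a surgical wound
# infection, under its own field name and its own coding convention.
raw_records = [
    {"registry": "RegistryA", "ssi_flag": "Y"},            # yes/no flag
    {"registry": "RegistryB", "wound_class": 3},           # numeric grade
    {"registry": "RegistryC", "infection": "superficial"}, # text category
]

def to_common_taxonomy(record):
    """Map a registry-specific record to a shared outcome code."""
    if record["registry"] == "RegistryA":
        infected = record["ssi_flag"] == "Y"
    elif record["registry"] == "RegistryB":
        infected = record["wound_class"] >= 2
    elif record["registry"] == "RegistryC":
        infected = record["infection"] in ("superficial", "deep", "organ-space")
    else:
        raise ValueError(f"Unknown registry: {record['registry']}")
    return {"outcome": "surgical_site_infection", "present": infected}

if __name__ == "__main__":
    for rec in raw_records:
        print(rec["registry"], "->", to_common_taxonomy(rec))
```

Each branch is a separate, hand-maintained definition that has to be written, audited, and kept in sync. A single shared taxonomy, agreed on up front, would make this entire mapping layer unnecessary.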

Okay, you get the point: this is lunacy, and I believe, with all due respect, that fingers are getting pointed in the wrong direction. Where are the medical societies and the providers in this conversation?

Well, perhaps there is hope. Meaningful Use standards that require clinical data to be captured using the same definitions and algorithms, and, get this, across the continuum of care, and to be entered and verified by clinicians, are a good (and long overdue) start. And imagine: the prototype has been right in front of us since the early 1980s.
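As a toy illustration of what "captured with shared definitions and verified at the point of entry" could mean in practice, here is a short Python sketch. The field names, sample code value, and validation rules are invented for this illustration and are not taken from any actual Meaningful Use measure.

```python
# Hypothetical sketch: field names, allowed types, and rules are invented
# for illustration, not drawn from any real Meaningful Use specification.

REQUIRED_FIELDS = {
    "patient_id": str,
    "encounter_date": str,   # e.g. "2011-09-27" in a shared date format
    "problem_code": str,     # a code from one agreed-upon code set
    "entered_by": str,       # the clinician verifying the entry
}

def validate_entry(entry):
    """Reject a record at capture time instead of months later at 'harvest'."""
    errors = []
    for field, expected_type in REQUIRED_FIELDS.items():
        if field not in entry:
            errors.append(f"missing field: {field}")
        elif not isinstance(entry[field], expected_type):
            errors.append(f"wrong type for {field}")
    return errors

entry = {"patient_id": "12345", "encounter_date": "2011-09-27",
         "problem_code": "I25.10", "entered_by": "Dr. Example"}
problems = validate_entry(entry)
print("accepted" if not problems else f"rejected: {problems}")
```

Checking the record when the clinician enters it, against one agreed-upon set of required fields, is what removes the months-long abstract, harvest, and correct cycle described above.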

This is how we will get timely, reliable, uniform, audited data. Kudos to David Blumenthal, MD, for his leadership, and to the Obama administration for making funds available for electronic health records.

If only the medical societies had shown leadership sooner. Will they ever learn? Ever?

And on a final note, I will say that this report (and all healthcare data reports) needs more data visualizations that follow the best practices of data display, but that, of course, is a different topic.