Comments on Not Running a Hospital: "A" for effort, but . . .
Paul Levy (http://www.blogger.com/profile/17065446378970179507)

If I may offer a bit of a history lesson as a reminder: in the early 1980s the insurance industry came together to develop a Universal Billing Form (UB82) for the submission and payment of health insurance claims by hospitals; this was followed by the 1500 form for individual providers. Before that, claims were filed on paper in an unstructured manner (it seems so quaint now).

What we now accept as common and sensible was revolutionary in its day. As a result, third-party payers amassed piles of data and leveraged them to their advantage. Researchers used the data as well; consider the work of Dr. Jack Wennberg and the Dartmouth Atlas, which has essentially changed how we think about healthcare delivery across the board.

Now consider clinical data. The medical community and its leadership failed to grasp (and in large part still fail to grasp) the power of a common taxonomy for the collection and timely reporting of data in a universally accepted electronic format. Instead they have bull-headedly held on to the "craft" model of medical records, which must be mined by rows and rows of clinical nurse reviewers or researchers to yield any data at all. Top this off with the hubris that each medical society or research group knows "best" how to define clinical data, and you have nothing short of a colossal, nightmarish mess of limited-use data.
Consider, for example, the following surgical quality improvement programs (just the ones I can easily name off the top of my head): the Society of Thoracic Surgeons Cardiac Registry, the National Surgical Quality Improvement Program, the Surgical Care Outcomes Assessment Program in Washington State, the Michigan Surgical Quality Improvement Program, the bariatric surgery programs (ASBS and ACS both have them), trauma surgery registries, and the Northern New England Heart Study (not to be confused with the STS Cardiac Registry). All of these programs measure outcomes such as wound infections, pneumonia, and urinary tract infections, yet ALL of them use different data definitions and have varied (if any) audit and data validation policies. The data are abstracted out of medical records by reviewers after the record becomes available, then re-entered into the respective databases. Much of it is then "harvested" a couple of times a year, reviewed for problems, and sent back for correction before any analysis can be performed or any reports created. AND most of these programs were developed with full knowledge of the others' existence, but each group and society "knows best."

Okay, you get the point: this is lunacy, and I believe, with all due respect, that fingers are being pointed in the wrong direction. Where are the medical societies and the providers in this conversation?

Well, perhaps there is hope. Meaningful Use standards, which require that clinical data be captured using the same definitions and algorithms, and, get this, ACROSS THE CONTINUUM OF CARE, and be entered and verified by clinicians, are a good (and LONG overdue) start. And imagine: the prototype has been right in front of us since the early 1980s.

This is how we will get timely, reliable, uniform, audited data.
Kudos to David Blumenthal, MD, for his leadership, and to the Obama administration for making funds available for electronic health records.

If only the medical societies had shown leadership sooner. Will they ever learn? Ever?

And on a final note: this report (and all healthcare data reports) needs more data visualizations that follow the best practices of data display, but that of course is a different topic.

- Katherine Rowell (http://www.ksrowell.com), 2011-10-10

I agree with the above commenters about the data. In today's world these data are easily collectible in real time and could even be reviewed promptly in automated fashion, so there is really no excuse.

However, the composition of the Board struck me as odd, in the way the allegiance of each member is specified. If one does some counting, Labor seems to be running the table, followed closely by business. Providers and payers have five representatives in total (all with different perspectives, of course), and "consumers" have only one.

Must make for interesting votes.

- nonlocal (Anonymous), 2011-09-29

No doubt every hospital administration knows its accounts receivable and payable, as well as its inventory, down to the month (or less), as well as the price of every aspirin and syringe.

Preventable harm is not measured in a timely way NOT because we do not know how to do it, nor because (tertiary academic, no less) institutions lack the capacity or experience.
Adverse events - and their harbingers, near misses - aren't collected or reported in a systematic, comprehensive, and timely manner because doing so is not a priority. And data are not shared publicly because, apparently, a life spared is someone else's income. How about a dashboard of protectionist morbidity and mortality? Time-lag analysis of similar disclosed and undisclosed events?

For example, in Boston we can chalk up very similar deaths in different hospitals that made front-page news. A root cause analysis would show that the first (that we know about) was due to a failed telemetry response. But every one since involves culpability by providers, hospitals, and medical schools that failed to share and seek out both the vulnerabilities and the improvements revealed by these events. It should not take a secret handshake to know how to save a life.

The time lag in the public data in your example reinforces the wasted time - and lives - involved. Interventions and quality-sensitive policies are hamstrung by this mild enthusiasm for reality checks. And every staff member and every provider knows that daily volume trumps this lazy caution. I'm guessing that if the public must wait this long, the hospital staff are clueless about current conditions as well.

- Anonymous, 2011-09-27

While I support the need for timely data, it's fairly obvious that providers and payers are reluctant to submit data to state agencies in a timely fashion. The process these stakeholders insist on for publicly reporting measures of payment and quality is also burdensome enough that a huge time lag becomes unavoidable. The good should not be sacrificed in favor of perfection, and it would help if providers in particular got on board with that.

- Anonymous, 2011-09-27