Making Use of Data

December 18th, 2013 / By Jeremy Zasowski

In his book Blink, Malcolm Gladwell writes about Dr. Brendan Reilly’s work at Cook County Hospital in Chicago in the late 1990s. At that time, the hospital was stretched thin, running low on resources and struggling to deal with roughly 250,000 patients coming through the Emergency Department every year. Patients routinely waited hours to be seen. One of the hospital’s key struggles was determining which patients coming into the Emergency Department with complaints of chest pain were actually having a heart attack and thus required expensive, resource-intensive care.

It’s an interesting case study if you get a chance to read it, but I’ll just give a brief summary here. Dr. Reilly used work done in the 1970s by a cardiologist named Lee Goldman. Goldman took the data from hundreds of cases and ran it all through a computer program to identify what kinds of symptoms and clinical findings actually predicted a heart attack. From this analysis, Goldman came up with a formula that homed in on the key evidence doctors should use to properly diagnose heart attacks. From the hundreds of cases and the multitude of clinical findings and symptoms each patient presented, Goldman found that along with the readings of an ECG, doctors only needed to focus on three other urgent risk factors: Was the angina stable or unstable? Was there fluid in the lungs? Was the patient’s systolic blood pressure below 100?

Goldman used his findings to create a decision tree that could help doctors better diagnose heart attacks. Reilly took that decision tree and implemented it at Cook County Hospital and found that when physicians followed this decision tree, the results were better than when physicians diagnosed heart attacks using their old methods. The decision tree was 70 percent better than the old method in determining which patients were not having heart attacks, and thus did not need more expensive treatment. Also, doctors on their own diagnosed the most serious patients correctly somewhere between 75 and 89 percent of the time. Using the formula-based decision tree, the diagnosis was correct more than 95 percent of the time.
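To make the idea concrete, the kind of decision tree described above can be sketched in a few lines of code. This is purely illustrative: the actual Goldman criteria involve more detailed ECG findings and risk tiers, so the `chest_pain_risk` function, its thresholds, and the risk labels below are hypothetical placeholders, not the published rule.

```python
# Illustrative sketch of a Goldman-style chest-pain decision rule.
# Inputs are the ECG finding plus the three urgent risk factors the
# article describes; the tiers and cutoffs are hypothetical.

def chest_pain_risk(ecg_ischemia: bool,
                    unstable_angina: bool,
                    fluid_in_lungs: bool,
                    systolic_bp_below_100: bool) -> str:
    """Stratify a chest-pain patient as 'high', 'intermediate', or 'low' risk."""
    risk_factors = sum([unstable_angina, fluid_in_lungs, systolic_bp_below_100])
    if ecg_ischemia and risk_factors >= 1:
        return "high"          # e.g. intensive cardiac care
    if ecg_ischemia or risk_factors >= 2:
        return "intermediate"  # e.g. monitored bed
    return "low"               # e.g. observation or discharge
```

The point of such a rule is exactly what the Cook County results suggest: a small, fixed set of inputs, applied consistently, can outperform ad hoc judgment over many more variables.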

I don’t bring up this real-life story to point out the same points that Malcolm Gladwell makes in his book, or to in any way criticize physicians and their approach to caring for patients. I tell this story because I believe it is an example that highlights what the promise of electronic data and advanced analytics can start to bring to healthcare.

As more and more healthcare data is digitized through the adoption of fully electronic health records, there is going to be a flood of data available to healthcare providers. Like the physicians at Cook County Hospital, providers routinely face huge amounts of patient data and must make increasingly complex decisions based on their analysis of it. The power of technology such as natural language processing (NLP) and the development of predictive analytical models from huge sets of clinical data will allow healthcare providers to identify better diagnosis and treatment pathways, but on a much broader set of patients than just those showing up in the ED with symptoms of a potential heart attack.

The key next step to realizing this potential comes down to normalizing and standardizing clinical data through the use of standard terminology and coding sets. This is where NLP and coding engines come into play. The upfront technical investment that many healthcare organizations are making today will pay huge dividends as more and more analytical power gets applied to this clinical data.
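As a toy sketch of what “normalizing clinical data through standard terminology” means in practice, consider mapping free-text clinical phrases to a single standard code. Real coding engines use NLP over full terminologies such as SNOMED CT or ICD rather than a lookup table, and the codes and synonyms below are hypothetical placeholders.

```python
# Toy sketch of terminology normalization: collapsing free-text synonyms
# onto one standard code so downstream analytics see consistent data.
# The code values and synonym list are hypothetical placeholders.
from typing import Optional

SYNONYMS_TO_CODE = {
    "heart attack": "MI-001",
    "myocardial infarction": "MI-001",
    "mi": "MI-001",
    "cardiac infarction": "MI-001",
}

def normalize_term(free_text: str) -> Optional[str]:
    """Return a standard code for a free-text phrase, or None if unmapped."""
    return SYNONYMS_TO_CODE.get(free_text.strip().lower())
```

Once “heart attack,” “MI,” and “myocardial infarction” all resolve to the same code, analytical models can count and compare those patients as one population instead of three.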

Jeremy Zasowski is the Marketing Manager for 3M Health Information System’s Emerging Business Team.