Data quality: does it impact the customer experience?

The short answer is yes. An organization simply cannot provide a positive customer experience if it lacks the focus and the controls necessary to ensure that its data are accurate at the point of capture and kept accurate throughout their useful life, and if it lacks awareness of how data quality affects the customer directly.

Putting patients first

Imagine a patient who is not alerted by the hospital to a worrisome lab result because their name was recorded as Robert J. Smith in one system and as Bob Smith in another, and the fact that they are one and the same person was not detected, or was detected too late to be of practical use. Or consider address or email data that are entered incorrectly, or accidentally left blank, at patient registration, making it impossible to reach a patient to inform them of a new service or procedure or of a surgery being rescheduled. Worse, what about data intentionally left blank which are later filled in by well-meaning staff who have no clear grasp of the reasoning behind the original decision? What about patient data that do not get transmitted in their entirety between poorly interfaced clinical and billing systems, causing delays and rework in claims processing? And what about the undue reliance healthcare providers place on data feeds from third parties, without a real understanding of the data quality checks those parties may or may not have in place? These are all very real occurrences in healthcare today, and they should give us pause, because putting the patient first starts with taking proper care of their data. Indeed, one can reasonably argue that without proper care of data, effective care of patients is only wishful thinking, and harm and waste become more likely.
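
The first scenario, recognizing that Robert J. Smith and Bob Smith are the same person, is a record linkage problem. As a minimal sketch of the idea, and emphatically not any particular hospital's matching logic, names can be normalized against a nickname table and compared with a string similarity measure; the nickname map and the threshold below are illustrative assumptions:

```python
from difflib import SequenceMatcher

# Illustrative nickname map; a real system would use a curated table.
NICKNAMES = {"bob": "robert", "rob": "robert", "bill": "william", "liz": "elizabeth"}

def normalize(name: str) -> str:
    """Lowercase, drop punctuation and middle initials, expand common nicknames."""
    parts = [p.strip(".,") for p in name.lower().split()]
    parts = [NICKNAMES.get(p, p) for p in parts if len(p) > 1]  # drops "J."
    return " ".join(parts)

def same_patient(name_a: str, name_b: str, threshold: float = 0.85) -> bool:
    """Flag two name strings as a likely match; the threshold is an assumption."""
    a, b = normalize(name_a), normalize(name_b)
    return SequenceMatcher(None, a, b).ratio() >= threshold

print(same_patient("Robert J. Smith", "Bob Smith"))  # True
```

Real patient matching would of course weigh additional identifiers such as date of birth and address, typically with probabilistic linkage rather than a single string score, but the point stands: this is a solvable problem when someone is paying attention to it.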

An effective approach to data quality must be multi-pronged, and it begins with developing a corporate awareness that data inaccuracies likely pervade the organization, period. Organizations that can accept this, and move past the tendency to blame someone else, should consider doing at least the following:

  • assessing the current state of data quality by establishing standards of accuracy and operational metrics, and by profiling data in source systems and over their lifespan (a minimal profiling sketch follows this list)
  • doing more and better staff training, but also, as I have pointed out before in this blog, mistake-proofing relevant data entry and transformation processes (see the entry-validation sketch after this list)
  • monitoring data quality on an ongoing basis to detect significant trends, or unexplained deviations from the levels of quality and integrity that have been measured as current and defined as desirable and achievable (see the monitoring sketch after this list)
  • collaborating across the enterprise, striving for standardized nomenclature and unified records, with a view to avoiding duplicated data and multiple, siloed entry points for the same data
  • developing a clear understanding of which raw data are used to generate derived data, how, and from what sources; this points to the concept of a corporate data registry, similar in purpose to an employee database, except that the registry should also include current data quality ratings (akin to employee performance reviews) and information about who uses the data and to what end
  • carrying out checks and audits on data that are generated outside the organization but used widely within it.
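
To make the first bullet concrete, here is a minimal profiling sketch in Python with pandas, covering completeness, validity, and uniqueness. The file name, column names, and the email pattern are illustrative assumptions, not a reference to any particular system:

```python
import pandas as pd

# Hypothetical extract from a registration system.
df = pd.read_csv("patient_registrations.csv")  # assumed file and columns

# Completeness: share of missing or blank values per column.
blank = df.apply(lambda col: col.isna() | (col.astype(str).str.strip() == ""))
print("Missing/blank rate per column:")
print(blank.mean().sort_values(ascending=False))

# Validity: a crude email shape check (illustrative, not RFC-complete).
email_ok = df["email"].astype(str).str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
print(f"Invalid-looking emails: {(~email_ok).mean():.1%}")

# Uniqueness: exact duplicates on fields that should identify a patient.
dupes = df.duplicated(subset=["last_name", "first_name", "dob"], keep=False)
print(f"Rows sharing name and date of birth: {dupes.sum()}")
```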
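
For the mistake-proofing bullet, the idea is to reject bad values at entry time rather than repair them downstream. A minimal sketch, with field names and rules that are illustrative assumptions:

```python
import re
from datetime import date

def validate_registration(record: dict) -> list[str]:
    """Return a list of entry-time errors; an empty list means the record passes.
    Field names and rules are illustrative, not a clinical standard."""
    errors = []
    for field in ("first_name", "last_name", "dob", "email", "address"):
        if not str(record.get(field, "")).strip():
            errors.append(f"{field} is required and cannot be blank")
    email = record.get("email", "")
    if email and not re.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", email):
        errors.append("email does not look valid")
    dob = record.get("dob")
    if isinstance(dob, date) and dob > date.today():
        errors.append("date of birth cannot be in the future")
    return errors

# The registration form should refuse to save until this comes back empty.
print(validate_registration({"first_name": "Bob", "email": "not-an-email"}))
```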
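
And for the monitoring bullet, one simple way to separate a significant deviation from routine noise is a control-chart style check against historical levels; the three-sigma limit below is a conventional choice, not a prescription:

```python
from statistics import mean, stdev

def deviates(history: list[float], today: float, sigmas: float = 3.0) -> bool:
    """Flag today's metric (e.g. share of records failing a check) when it
    falls outside +/- `sigmas` standard deviations of its history."""
    mu, sd = mean(history), stdev(history)
    return abs(today - mu) > sigmas * sd

# Daily share of registrations with a blank email, over recent weeks.
baseline = [0.021, 0.019, 0.024, 0.020, 0.022, 0.018, 0.023]
print(deviates(baseline, today=0.020))  # False: within normal variation
print(deviates(baseline, today=0.060))  # True: investigate the deviation
```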

The above are all the more important when trying to stand up an enterprise-wide data warehouse.

A white paper from Experian giving an overview of this topic, albeit from a financial services viewpoint, can be found here.
