Strategic Data Quality Approach Drives Business Quality, Part III: Remediation

In this era of the primacy of data and information to any insurer's strategic and operational efforts, it simply makes sense to manage such a valuable asset at least as carefully as any other strategic asset.


In Parts I and II of this three-part article series, the discussion centered on the need to create and implement a strategic data quality approach. The foundation for such an approach is a data quality management strategy that includes the governance and processes necessary to maintain effective data quality throughout the data lifecycle. That means focusing on the three major components required to manage data quality: detection, notification, and remediation. Part I focused on detection, Part II on event notification, and in this final part of the series, the focus turns to data quality remediation.

In the context of a data quality management program, remediation is, of course, the last line of defense and the bottom line. The defense is correcting data quality exceptions as they occur, and the bottom line is remediating data quality exceptions in such a way that they do not recur, at least not in an identical way. This can be tricky to accomplish.

A sound strategic data quality approach should enable the identification of data quality exception patterns—how exceptions occur may point to a process, configuration, or software error that can be addressed at a more macro level across the data ecosystem.

Some examples of data quality exception patterns include:

  • Value domain rule: a field or column value violates a business rule by falling outside its allowed range
  • Duplicate key rule: a key that should be unique appears more than once, so the source data violates the business conceptual model
  • Missing or late-arriving rule: a record cannot be resolved, typically because of missing reference data
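To make the three patterns concrete, here is a minimal sketch of how such detection checks might look in code. The field names, ranges, and reference values are illustrative assumptions, not part of any particular system.

```python
# Hypothetical sketch: one check per exception pattern described above.
# Field names, ranges, and reference data are illustrative assumptions.

def check_value_domain(record, field, allowed_min, allowed_max):
    """Value domain rule: flag a field value outside its allowed range."""
    value = record.get(field)
    if value is None or not (allowed_min <= value <= allowed_max):
        return f"value domain violation: {field}={value!r}"
    return None

def check_duplicate_keys(records, key_field):
    """Duplicate key rule: flag keys that appear more than once."""
    seen, exceptions = set(), []
    for record in records:
        key = record[key_field]
        if key in seen:
            exceptions.append(f"duplicate key: {key_field}={key!r}")
        seen.add(key)
    return exceptions

def check_reference_data(record, field, reference_values):
    """Missing/late-arriving rule: flag codes absent from reference data."""
    if record.get(field) not in reference_values:
        return f"unresolved reference: {field}={record.get(field)!r}"
    return None
```

In practice, checks like these would run during detection and feed the notification process described in Part II.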

These patterns should already be known based on the system data design and implementation. An organization’s data steward or its relevant subject matter expert (SME) should be consulted in designing and implementing the appropriate strategic detection method.

Similar to the medical definition, triage refers to determining degrees of urgency; in this case, it means determining the severity of data quality exceptions. Some exceptions may be of interest only for later analysis. These are typically assessed as informational exceptions that do not have a material effect on business reporting or other data usage. Other exceptions are of such a serious nature that they are considered fatal, meaning the business must be notified not to use the affected or related data until the problem is corrected. The third, and much more subjective, category is the warning exception.

The method of triage can be at least partially automated using rules established by a data steward or similar data expert. Ideally, such rules can be applied programmatically during detection and notification to lessen the burden of any manual triage. An exception categorized as a "warning"—whether the categorization is programmatic or manual—is typically where most manual remediation assessments will occur, so it is advisable to implement the system in a way that minimizes this category. These warnings must be evaluated for remediation, potentially along with exceptions in the other categories.

In Part I of this series, we distinguished between strategic and tactical data quality exceptions. It should be noted that detection and notification patterns apply mostly to the strategic variety. That said, tactical data quality exceptions will happen, and when they do, the triage is conceptually the same. Also, after a tactical exception is discovered, there is an opportunity to enhance the detection patterns to proactively manage this new type of exception—that is, the tactical exception has become a strategic one.

Data Steward
As mentioned previously, an organization’s data steward is the critical intersection where data governance and data quality meet. For most organizations, this will be a single individual, and this person will have, or have access to, expert data knowledge, along with the authority to develop and apply business rules in defined subject areas.

The goal of any data steward function is that every data quality exception is ultimately traceable to a designated data steward; there should never be a detected data quality exception that is not accounted for under the data steward function. In the context of data quality remediation, then, all data quality notifications are routed to the data steward, and all data quality remediation should be coordinated with the data steward.
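The routing described above can be sketched as a lookup from subject area to designated steward, with a fallback so that no exception goes unassigned. The registry, subject areas, and addresses are illustrative assumptions.

```python
# Hypothetical sketch: route every exception to a designated data steward
# by subject area, with a default steward so nothing goes unaccounted for.
# Registry entries and subject areas are illustrative assumptions.

STEWARD_REGISTRY = {
    "policy": "steward.policy@example.com",
    "claims": "steward.claims@example.com",
}
DEFAULT_STEWARD = "chief.data.steward@example.com"

def route_exception(exception):
    """Return a copy of the exception tagged with the responsible steward,
    falling back to a default steward for unmapped subject areas."""
    steward = STEWARD_REGISTRY.get(exception.get("subject_area"), DEFAULT_STEWARD)
    return {**exception, "assigned_to": steward}
```

The fallback is the key design choice: it guarantees the traceability goal that every detected exception is owned by someone under the data steward function.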

Data Governance
One of the key elements of any data quality remediation process is the establishment and oversight of a data governance function. This is typically an enterprise-level group that creates the standards and processes required to manage an organization’s data assets, procedures, and practices. At the enterprise level, it should be composed of representatives from the various business functions that use the data on a daily basis, and of representatives from the technology and security functions. At larger organizations, the governance function may be further broken down and distributed across geographic regions or downstream entities, but the overall responsibility and accountability for data governance still flow up to the enterprise-level data governance function.

No matter the structure, an effective and accountable data governance function is an essential part of any organization’s data quality management strategy and efforts.

The goal of this three-part article series was to focus on creating a new, or improving an existing, data quality management strategy and implementation approach that accounts for the elevated strategic importance of data quality management to any insurer.

Any sound data quality management strategy includes the key components of detection, notification, and remediation. Together, these components form the foundation for an effective and efficient data quality ecosystem.


Strategic Data Quality Approach Drives Business Quality, Part I: Detection

Strategic Data Quality Approach Drives Business Quality, Part II: Event Notification

Roy Pollack // Roy Pollack is a Data Solution Architect for X by 2, who assesses and implements complex data integration, conversion and optimizations that provide improved understanding, analytics and visualizations of enterprise information. Pollack seeks to understand the uniqueness of each organization and applies best-practice design patterns to expedite and achieve quality solutions by using the most optimal and pragmatic implementation approach. Over the past 25 years, Pollack has delivered data solutions across many industries, including insurance, finance, retail, and healthcare, as well as managing system conversions.
