Customer Master Data Quality is relative

Data is not fuel, not energy, not a life force. In itself, customer data holds no innate value to your business beyond the purpose for which it was acquired and is used.

That said, the decisions an enterprise takes based on insights from customer data can lead to very different outcomes, some of which can be detrimental to the health of the business. For this reason, quality may be the one thing you really want to focus on in relation to your customer master data overall.

In the absence of good data quality, data-driven initiatives are sub-optimal and may even be rendered useless. This is why ensuring that you have quality customer master data is foundational to avoiding missteps, reducing risk, and harvesting meaningful benefits from it.

Absolute data quality is probably an unattainable goal, and the value of pursuing it is relative to the nature of the data itself. Data sourced as zero-party data (0PD) and first-party data (1PD) should typically be of better quality, but the effectiveness of capturing good 0PD and 1PD depends on the data capture and collection methods and the control mechanisms put in place to provide data quality assurance.

One of the ways the Pretectum Customer Master Data Management (CMDM) platform helps is by letting you capture and edit new and existing records through interactive screens with built-in data validations. Another is to integrate the Pretectum CMDM APIs with your business applications to provide added data quality assurance at the time of customer master data capture and edit. Rules and data quality measures are all driven by the schema definitions that overarch a given dataset.
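
To make that idea concrete, here is a minimal sketch of schema-driven validation at the point of capture. It is a generic illustration using the open-source jsonschema library, not the Pretectum API; the schema fields, rules, and function names are hypothetical.

```python
# Illustrative only: generic schema-driven validation at capture time.
# The schema and field names are hypothetical, not Pretectum's actual definitions.
from jsonschema import validate, ValidationError

customer_schema = {
    "type": "object",
    "properties": {
        "customer_id": {"type": "string", "pattern": "^[A-Z0-9]{8}$"},
        "email": {"type": "string", "pattern": r"^[^@\s]+@[^@\s]+$"},
        "country": {"type": "string", "enum": ["US", "GB", "DE", "AU"]},
        "consent_given": {"type": "boolean"},
    },
    "required": ["customer_id", "email", "consent_given"],
}

def capture_customer(record: dict) -> dict:
    """Accept or reject a new record based on the schema that overarches the dataset."""
    try:
        validate(instance=record, schema=customer_schema)
    except ValidationError as err:
        # Surface the problem to the person or system entering the data
        # instead of letting a bad record into the master.
        raise ValueError(f"Rejected at capture: {err.message}") from err
    return record

# capture_customer({"customer_id": "AB12CD34", "email": "ann@example.com", "consent_given": True})  # accepted
# capture_customer({"customer_id": "AB12CD34", "email": "not-an-email", "consent_given": True})     # raises ValueError
```

In this sketch, the same schema definition could drive both interactive capture screens and API-level checks, so a record is validated the same way regardless of how it enters the system.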

The $1-$10-$100 Rule

Capturing the best possible data at the time of origination is the first prize in customer master data collection. According to Gartner and D&B, the cost of collecting a single record can be as much as US$1, not including other acquisition costs. If a record contains fatally bad data elements, the resolution costs, essentially investigations and workarounds, can reach US$10 per record, and correcting those records at the source can reach a whopping US$100 per record.

This is commonly known as the 1-10-100 rule, a rule-of-thumb model illustrating the hard costs to an organization of chasing mistakes. It reinforces the argument that failing to notice and correct mistakes early escalates costs the later they are discovered.

The counter-position is that a shared source avoids the time and cost of rekeying and verifying information entered into separate, disconnected systems.

A single source for customer master data also eliminates the costly and embarrassing mistakes that arise from disconnected systems lacking real-time or near real-time synchronization and integration.

All this comes down to a simple calculation: if your company holds 1M consumer records, it would potentially have cost US$1M to acquire them over their lifetime. With 10% inaccuracy in the dataset, that is 100,000 bad records, so at US$10 each you will spend just as much again on data issue triage during the lifetime of those records, and up to US$10M more on correcting those same records at the source to avoid the triage costs.
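
As a back-of-the-envelope sketch, the arithmetic behind that calculation looks like this, assuming the US$1, US$10, and US$100 figures above:

```python
# Back-of-the-envelope illustration of the 1-10-100 rule applied to 1M records.
RECORDS = 1_000_000
ERROR_RATE = 0.10  # 10% of records hold fatally bad data elements

acquisition_cost = RECORDS * 1                 # ~US$1 per record to collect
triage_cost = RECORDS * ERROR_RATE * 10        # ~US$10 per bad record to investigate and work around
correction_cost = RECORDS * ERROR_RATE * 100   # ~US$100 per bad record to correct at the source

print(f"Acquisition: ${acquisition_cost:,}")    # $1,000,000
print(f"Triage:      ${triage_cost:,.0f}")      # $1,000,000
print(f"Correction:  ${correction_cost:,.0f}")  # $10,000,000
```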

Studies cited in the MIT Sloan Management Review, based on research by Experian plc as well as consultants James Price of Experience Matters and Martin Spratt of Clear Strategic IT Partners Pty. Ltd., estimate the cost of bad data at 15% to 25% of revenue for most companies. That is astonishingly high, but it aligns with the high cost of triaging and remediating customer data alone.

Multifaceted data quality

What becomes clear the moment you start reviewing data quality across your systems estate is that data quality is multifaceted. Consistency, accuracy, recency, completeness, and de-duplication are obvious facets, but when you consider the typically siloed nature of systems, you quickly realize that consistency, for example, doesn’t carry the same weight for every business use case.
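
As an illustration of how two of those facets might be measured, here is a small, hypothetical profiling sketch; the records and field names are made up and the measures are deliberately naive:

```python
# Hypothetical profiling sketch: completeness and duplication measured over a tiny record set.
records = [
    {"customer_id": "C1", "email": "ann@example.com", "phone": "+1-555-0100"},
    {"customer_id": "C2", "email": "ann@example.com", "phone": None},  # possible duplicate of C1
    {"customer_id": "C3", "email": None, "phone": "+1-555-0102"},
]

def completeness(records, field):
    """Share of records with a non-empty value for the given field."""
    return sum(1 for r in records if r.get(field)) / len(records)

def duplicate_rate(records, field):
    """Share of populated values for the field that are not unique."""
    values = [r[field] for r in records if r.get(field)]
    return 1 - len(set(values)) / len(values) if values else 0.0

print(f"email completeness: {completeness(records, 'email'):.0%}")   # 67%
print(f"phone completeness: {completeness(records, 'phone'):.0%}")   # 67%
print(f"email duplication:  {duplicate_rate(records, 'email'):.0%}") # 50%
```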

Many organizations field ten or more systems of record. These range from ERP, CRM, CDP, POS, service, support, and warranty systems through to the many spreadsheets, Access databases, and other specialist systems of record that a given organization might have.

The data quality problem is further compounded when you examine data ownership: who is designated as the person most responsible for customer data, and which systems are considered the true authority for the customer master.

A customer master data management platform like the Pretectum CMDM lets an organization define what good customer master data should look like and then assess data added or loaded into the system for conformity with those customer master data quality definitions.
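
One generic way to express such an assessment, sketched here with hypothetical rules rather than Pretectum’s actual quality definitions, is to score each record by the share of rules it passes:

```python
# Sketch of a conformity assessment: each rule encodes part of a "what good looks like"
# definition; a record's score is the fraction of rules it satisfies. Rules are hypothetical.
import re

RULES = {
    "has_customer_id": lambda r: bool(r.get("customer_id")),
    "email_well_formed": lambda r: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", r.get("email") or "")),
    "country_is_iso2": lambda r: r.get("country") in {"US", "GB", "DE", "AU"},
}

def conformity(record: dict) -> float:
    """Fraction of quality rules the record satisfies."""
    return sum(1 for rule in RULES.values() if rule(record)) / len(RULES)

loaded = {"customer_id": "C0042", "email": "sam@example", "country": "GB"}
print(f"Conformity: {conformity(loaded):.0%}")  # 67% — the email fails the well-formed check
```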

Adopting a CMDM platform gives the average organization not just a centralized point for customer master collation and insights; it can also serve as the hub of a hub-and-spoke approach to serving customer master data to different systems with different needs and usage patterns.

Through a combination of manual data stewardship and automation, a CMDM like the Pretectum Customer Master Data Management platform can also reduce the cost of triage and remediation, depending on the implementation approach.

The relative quality of customer master data records can be assessed holistically and compared and contrasted with data sourced from other upstream data collection systems and repositories.
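
As a simple illustration of that kind of relative assessment, the sketch below scores the same customer as held by three hypothetical upstream sources and then converges on a single record using a naive most-recent-non-empty survivorship rule; the source names, fields, and rules are made up and are not Pretectum’s logic:

```python
# Illustrative comparison of the "same" customer as held by three hypothetical upstream systems.
from datetime import date

candidates = {
    "CRM": {"email": "lee@example.com", "phone": None, "updated": date(2024, 3, 1)},
    "ERP": {"email": "lee@example.com", "phone": "+1-555-0188", "updated": date(2023, 11, 20)},
    "POS": {"email": None, "phone": "+1-555-0188", "updated": date(2024, 5, 6)},
}

def score(record: dict) -> float:
    """Naive relative-quality score: completeness of the non-metadata fields."""
    fields = [v for k, v in record.items() if k != "updated"]
    return sum(1 for v in fields if v) / len(fields)

for source, record in candidates.items():
    print(f"{source}: score={score(record):.0%}, last updated {record['updated']}")

# One simple way to converge on a single record: take the most recent non-empty value per field.
golden = {
    field: max(
        (r for r in candidates.values() if r.get(field)),
        key=lambda r: r["updated"],
    )[field]
    for field in ("email", "phone")
}
print(golden)  # {'email': 'lee@example.com', 'phone': '+1-555-0188'}
```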

Most importantly, the records, even with variations in their content, can be consolidated to converge on a single source of truth. To learn more about how Pretectum can help with your Customer Master Data Management challenges, contact us today.