While the creation of master data is important, and its seamless dissemination to end users even more so, it is the accuracy and quality of that data that is crucial to the success of your master data management (MDM) strategy.
Yet few companies carefully consider data quality as they develop their MDM plans, and many fail to put proper validation mechanisms in place when executing those plans. That oversight can not only seriously hinder MDM success; it can also have a severe impact on core business operations.
Why are validation and quality control so vital? Because information is generated by many sources. There is application data, which is maintained in various back-end business systems, along with the metadata that describes its attributes. There is transaction data, which is created in the course of “live” events or automated messages, and the reference data that provides detail about it. Finally, there is master data, which links these together to create and centralize a single, consistent set of values across all sources.
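To make these layers concrete, here is a minimal Python sketch of how the record types might relate. All class and field names are illustrative assumptions, not taken from any particular MDM product:

```python
from dataclasses import dataclass, field

@dataclass
class CrmCustomer:
    """Application data: lives in a back-end CRM system."""
    crm_id: str
    name: str
    address: str

@dataclass
class PurchaseOrder:
    """Transaction data: created by a live B2B event."""
    po_number: str
    customer_ref: str   # resolved against reference data, not a master key
    ship_to: str

@dataclass
class MasterCustomer:
    """Master data: one consistent set of values, linked to every source."""
    master_id: str
    name: str
    address: str
    # Keys back into each contributing system, e.g. {"crm": "C-1001"}
    source_keys: dict = field(default_factory=dict)
```

The point of the master record is the `source_keys` mapping: it is what lets a change in any one system be reconciled against the single set of values every other system consumes.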
Take, for example, a client’s location. While a customer relationship management (CRM) system may display one address, an accounting package may show another. Yet a third address may be included in an electronic document, such as a purchase order, transferred during the course of a business-to-business transaction. These types of inconsistencies, if not detected and corrected in a timely manner, can cause major setbacks in MDM projects. In other words, bad data will ultimately lead to bad master data.
And when master data is poor, businesses won’t achieve the flexibility and agility they set out to reach, since they’ll be basing both tactical and strategic decisions on information of sub-par quality.
How does validation work? Automated validation can work in several ways. It can scan the environment across multiple data sets to uncover inaccuracies, such as those in the example above, and flag them for review. An IT staff member can then examine each flagged item and make any needed corrections, promoting accuracy throughout the business.
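As a sketch of this scan-and-flag approach, the Python below compares one field, the mailing address from the earlier example, across systems and queues disagreements for manual review. The data layout and the normalization step are assumptions made for illustration:

```python
def flag_address_mismatches(records_by_system: dict[str, dict[str, str]]) -> list:
    """Compare one customer's address across systems; flag disagreements.

    records_by_system maps a system name ("crm", "accounting", ...) to
    that system's {customer_id: address} table. Layout is illustrative.
    """
    flagged = []
    # Every customer id seen in any system.
    all_ids = set().union(*(table.keys() for table in records_by_system.values()))
    for cust_id in sorted(all_ids):
        # Light normalization so trivial case/whitespace differences don't flag.
        seen = {
            system: table[cust_id].strip().lower()
            for system, table in records_by_system.items()
            if cust_id in table
        }
        if len(set(seen.values())) > 1:
            flagged.append((cust_id, seen))  # queue for manual review
    return flagged

# Example: the CRM and the accounting package disagree on customer 42.
issues = flag_address_mismatches({
    "crm":        {"42": "10 Main St, Springfield"},
    "accounting": {"42": "10 Main Street, Springfield"},
})
for cust_id, versions in issues:
    print(f"Review customer {cust_id}: {versions}")
```

In practice the normalization would be far richer (postal standardization, fuzzy matching), but the shape of the job is the same: compare, flag, and hand off to a person.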
More advanced quality control techniques allow for the use of dynamic business rules. These rules can be applied proactively to back-end systems, ensuring that bad information never enters the environment in the first place. For example, a rule can prevent end users from entering client last names that include numbers, or mailing addresses that don’t have enough characters. Business rules can also be used to automatically “cleanse” bad data after the fact, instantly reformatting or altering it based on pre-set guidelines once it has been discovered.
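Here is a hedged sketch of both modes, proactive validation and after-the-fact cleansing, using the two example rules above; the rule format and field names are invented for illustration:

```python
import re

# Two hypothetical rules matching the examples above: reject last names
# containing digits, and reject implausibly short mailing addresses.
RULES = [
    ("last_name", lambda v: not re.search(r"\d", v),
     "last name must not contain numbers"),
    ("address", lambda v: len(v) >= 10,
     "mailing address is too short"),
]

def validate(record: dict) -> list:
    """Proactive mode: run before a record is written to a back-end system."""
    return [msg for fld, ok, msg in RULES
            if fld in record and not ok(record[fld])]

def cleanse(record: dict) -> dict:
    """Reactive mode: reformat bad data found after the fact, here by
    collapsing whitespace and normalizing name casing per pre-set rules."""
    cleaned = dict(record)
    if "last_name" in cleaned:
        cleaned["last_name"] = cleaned["last_name"].strip().title()
    if "address" in cleaned:
        cleaned["address"] = " ".join(cleaned["address"].split())
    return cleaned

print(validate({"last_name": "Sm1th", "address": "10 Main"}))
# -> ['last name must not contain numbers', 'mailing address is too short']
```

Keeping the rules as data rather than hard-coded checks is what makes them “dynamic”: they can be added or changed without touching the systems they guard.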
For an MDM initiative to deliver optimal returns, fully automated controls and validation must be put in place to ensure that master data is accurate and up to date at all times. Moreover, these controls must be broad in reach, governing not only how data is handled once it has been created, but also how it is generated and updated throughout its lifecycle.