Problem

Inaccurate, inconsistent, and duplicate versions of master data cause a multitude of problems for businesses.

Most businesses rely on a variety of systems to conduct business, which means the same master data may exist in more than one system. Customers may be defined in an ERP (Enterprise Resource Planning) system to enable the business to take customer orders, ship products, invoice customers, and process payments. A separate CRM (Customer Relationship Management) system might track leads, prospects, and customer interactions. And sales reps might use an SFA (Sales Force Automation) system to record customer orders and transmit them to the ERP system.

Without proper controls, there is virtually no way to provide a single source of the truth. A customer asks their sales rep to correct an incorrect address, which the sales rep does in the SFA system. But the incorrect address is still in the ERP and CRM systems, causing customer dissatisfaction when the order does not arrive and expense to the business to reship it to the correct address. There will also be the expense of researching and correcting the data.

A customer service rep at one branch creates a new customer record without realizing the customer was already created at another branch. The business now has duplicate customer records, making it difficult to aggregate sales for the customer. If a special marketing program is initiated for customers with a certain minimum level of sales, the customer may be inadvertently omitted because the sales are split across two different customer records.

Cross-selling and up-selling opportunities may be lost because the sales reps do not have complete visibility into what the customer is actually buying.

Preventing duplicate customers can be difficult when many systems are involved. If a customer service rep receives a request to add the customer “Board of Water and Light” and a name search does not find an existing record, they create a new one. Unbeknownst to them, the customer already exists, but the company name was entered as “BWL”.

Inaccurate master data can also negatively impact a business. When creating or changing a raw material, imagine the impact if an employee enters the proper unit of measure of ounces for the reorder point but enters a quantity representing the number of pounds. Substantial costs will be incurred when the raw material eventually has to be expedited or, worse yet, the line has to be shut down or production rescheduled.

The amount a business might invest in securing business with a company might be influenced by how successful that company is. But if inaccurate data portrays a company as far more successful than it actually is, the business may be wasting money better spent elsewhere.

Solution

First of all, let’s look at what the terms “single source of the truth”, “system of record”, and “golden copy” mean.

In theory, all of these address the MDM goal of having consistent data across the enterprise. When you look at a customer name, it should be the same whether you look in your ERP system or your CRM system. That does not necessarily mean the customer name should be stored in one and only one place. While keeping it in one place would provide a single source of the truth, that is generally not a practical approach because of the myriad of systems used by most enterprises. It does mean there must be controls in place regarding the maintenance of the customer name, and it must be synchronized across all systems that contain it.

To complicate matters just a bit, it is important to understand what one means by “the customer name”. There could in fact be several different types of customer names: the name under which the customer does business with us, a legal name that could be different, and perhaps an abbreviated name used on particular types of internal scheduling or other documents.

The legal name might be maintained in one system, the abbreviated name in a second system, and the “doing business as” name in a third system. The hub would pull down all three names. The “source of the truth” would be the source systems but the data hub would always have a copy. In this scenario, if a fourth system needed one of the customer names, the data hub would provide the required information.

Alternatively, customer name maintenance could be disabled in all three source systems and maintained only in the data hub. The hub would push out updates to the appropriate systems as they are made. The “source of the truth” would be the data hub but again the other three systems would always have a copy.

The point is, master data may be maintained in the source system or in the data hub, but not in both.

And it’s not just controlling in which system a customer can be created or changed – or retired or archived, for that matter. A robust Data Governance process (preferably workflow driven) must clearly spell out who may request a change, who has the authority to approve the request, and what they can approve. When a customer is added, different people or groups might be required to approve the customer name, credit limit, and assigned sales rep. It can get equally if not more complicated when creating Items.
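As a rough illustration of field-level approvals (the field names, roles, and mapping below are hypothetical, not drawn from any particular product), a governance workflow might derive the required sign-offs from the attributes a change request touches:

```python
# Hypothetical field-to-approver mapping; role and field names are
# illustrative assumptions, not part of any specific MDM tool.
APPROVERS = {
    "customer_name": "data_steward",
    "credit_limit": "finance",
    "sales_rep": "sales_manager",
}

def required_approvals(changed_fields):
    """Return the distinct roles that must sign off on a change request."""
    return sorted({APPROVERS[f] for f in changed_fields if f in APPROVERS})
```

A request changing both the customer name and the credit limit would then need sign-off from two separate roles before it is applied.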

Before the request is processed, the required Data Quality level and the rules for cleansing, consolidating, and harmonizing data should be spelled out.

Once you have established Data Governance and defined the scope, e.g. we are going to address Customers and Products first, you will need to identify the data sources for your customer and product master data. In addition, you need a method of mapping fields and reference codes from one system to another.

Sex might be stored in one system in the table CUSTOMER in the column SEX with a value of “M” for male and “F” for female. Sex might be stored in another system in the table CMPCM in the column CMSEX with a value of ‘1’ for male and ‘2’ for female. These inconsistencies can be corrected by programming the data transformation in a customer conversion program. Some MDM systems provide cross reference tables for this purpose. And yet other MDM tools simplify this process by providing pre-defined templates for both systems so you don’t have to manually map the data.
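The cross-reference approach can be sketched in a few lines of Python, using the table and column names from the example above; the canonical codes "MALE"/"FEMALE" are an assumed hub convention:

```python
# Cross-reference table keyed by (table, column): each entry maps a
# source system's codes to an assumed canonical value used in the hub.
SEX_XREF = {
    ("CUSTOMER", "SEX"): {"M": "MALE", "F": "FEMALE"},
    ("CMPCM", "CMSEX"): {"1": "MALE", "2": "FEMALE"},
}

def to_canonical(table, column, value):
    """Translate a source-system code into the hub's canonical value."""
    try:
        return SEX_XREF[(table, column)][value]
    except KeyError:
        raise ValueError(f"Unmapped code {value!r} in {table}.{column}")
```

Pre-defined templates in an MDM tool essentially ship such tables already populated for each supported application, so the mapping does not have to be built by hand.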

Many MDM tools predefine a data hub in which the master data from various systems will be housed. The data from the various source systems is loaded into the data hub in a common format. Then the work really begins.

Data Standardization would be done across applications to bring data into a common format, e.g. addresses.
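A minimal sketch of rule-based address standardization; the abbreviation list is illustrative, and commercial tools apply far richer postal-reference libraries:

```python
import re

# Illustrative street-type abbreviations; a real tool would use a
# full postal reference dataset.
ABBREVIATIONS = {"street": "St", "avenue": "Ave", "road": "Rd", "suite": "Ste"}

def standardize_address(raw):
    """Normalize whitespace, casing, and common street-type abbreviations."""
    tokens = re.sub(r"\s+", " ", raw.strip()).split(" ")
    out = []
    for tok in tokens:
        key = tok.lower().rstrip(".,")
        out.append(ABBREVIATIONS.get(key, tok.title()))
    return " ".join(out)
```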

Data Consolidation would be done on an ongoing basis to build rules to identify potential duplicates. A decision would be made to keep, merge, or eliminate the duplicates. Some MDM systems have complex algorithms to facilitate identification of potential duplicates.
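One simple way to flag potential duplicates is fuzzy string matching. This sketch uses Python's standard-library SequenceMatcher with an assumed similarity threshold; production matching algorithms are considerably more sophisticated (phonetic keys, token reordering, address and tax-ID corroboration):

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Case-insensitive string similarity in the range [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_potential_duplicates(names, threshold=0.85):
    """Return pairs of names whose similarity meets the threshold."""
    pairs = []
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if similarity(a, b) >= threshold:
                pairs.append((a, b))
    return pairs
```

Candidate pairs like these would then go to a data steward, who decides whether to keep, merge, or eliminate each one.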

Data Harmonization facilitates data consistency across multiple environments by cross referencing merged child data against parent data. This ensures child data does not get orphaned while merging or eliminating a duplicate master (parent) record.
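The re-parenting step can be illustrated as follows; the record shapes (customers keyed by ID, orders carrying a customer_id) are assumptions made for the example:

```python
def merge_customers(survivor_id, duplicate_id, customers, orders):
    """Merge a duplicate customer into a survivor, re-parenting child
    rows (orders here) so none are orphaned when the duplicate goes."""
    for order in orders:
        if order["customer_id"] == duplicate_id:
            order["customer_id"] = survivor_id
    customers.pop(duplicate_id, None)  # remove the duplicate parent
    return customers, orders
```

Re-pointing the child rows before dropping the duplicate parent is exactly the orphan-prevention check harmonization enforces.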

Data Profiling would analyze Master Data and provide statistical summaries, e.g. minimum, maximum, average, median, number of unique values, distribution of values, etc. If 99% of the values for a column fall within the range of 0-100 and less than 1% have a value in excess of 1,000,000, that is a pretty good indication the outlier data needs to be evaluated.
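A basic column profile is straightforward to compute; this sketch uses Python's statistics module and adds an out-of-range check like the one described above:

```python
from statistics import mean, median

def profile(values):
    """Simple column profile: counts, range, central tendency, distinct."""
    return {
        "count": len(values),
        "min": min(values),
        "max": max(values),
        "mean": mean(values),
        "median": median(values),
        "distinct": len(set(values)),
    }

def outliers(values, low, high):
    """Values outside the expected range - candidates for review."""
    return [v for v in values if not low <= v <= high]
```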

Data Cleansing would generally be based on user defined rules. Once MDM is set up, corrections could be made in the source systems and pulled into the data hub, or data could be corrected in the data hub and pushed back into the source system.
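User-defined cleansing rules might look like the following sketch; the specific rules (trim-and-title-case names, strip phone formatting) are illustrative assumptions:

```python
# Hypothetical user-defined cleansing rules, applied field by field.
RULES = [
    ("name", lambda v: v.strip().title()),                      # tidy casing
    ("phone", lambda v: "".join(c for c in v if c.isdigit())),  # digits only
]

def cleanse(record):
    """Return a cleaned copy of a record, leaving the original intact."""
    cleaned = dict(record)
    for field, rule in RULES:
        if field in cleaned and isinstance(cleaned[field], str):
            cleaned[field] = rule(cleaned[field])
    return cleaned
```

Whether the corrected values are then pushed back to the source systems or held only in the hub depends on which side is designated the source of the truth.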

Some legacy systems will not have data as robust as that of more modern systems. Data Enrichment may be required to add missing information deemed critical.

A Single Source System is designated as the “System of Record” for Master Data.

appMDM provides a “Central Data Hub” to process Master Data Maintenance transactions (Add, Change, Archive, Purge).

A “Master Data Governance” group controls the Master Data Management process and safeguards the data.

appMDM consolidates master data from multiple sources into a Central Data Hub (Data Mart) using Data Quality Management features. The Data Hub allows consolidated reporting and querying of master data, inventory, costs, account balances, etc.

Workflow functionality supports data governance and approval processes and procedures.

Control of privileges at organization/object/attribute level ensures data integrity.

Chain-Sys provides a configurable Platform for users to create their Data Model, User Interfaces, and Business Validations for the MDM Hubs. The Platform automates code creation for the MVC architecture. Data Architects can rapidly create complex Data Hubs using this platform.

appMDM provides complete flexibility on the Domains. It provides the following out-of-the-box domains for Products (Products, BOM Structures, Formulas, Routing, and Recipe), Contacts (Customers, Customer Sites, Customer Contacts, Suppliers, Supplier Sites, Supplier Contacts, Employees and Employee Contacts), and Operations (Fixed Assets and GL Code Combinations). Designing a custom domain is easy: users can create domain hubs for their specific needs with no programming effort, through simple configuration. Chain-Sys strongly believes in the template-based approach for creating Hubs. appMDM provides templates for all their standard domains against major applications such as SAP, Oracle EBS, Oracle Fusion, JD Edwards, PeopleSoft, Microsoft, and Chain-Sys ERP.

“DQM is a prime mover of a successful SAP Data migration project. It is a mandatory process and cannot be wished away.”

Chain-Sys provides a state-of-the-art ETL tool for configuring Migration and Integration objects. Chain-Sys offers the industry gold standard appMIGRATE and appINTERFACE tools for performing the Data Migration and Integration for MDM efforts.

“For a large document management company we reduced Item Master data size to about 5% of the original size.”

Chain-Sys provides multiple roles for MDM users, such as Operations, Data Owner, Data Steward, Data Guardian, and Data Architect. All these roles are used in configuring the workflow for any new master data introduced into the hub, and also for processing inbound master data interfaced from other consumer systems.

Establish a centralized Data Governance Organization to:

  • Design and maintain the enterprise data model that supports desired business application landscape and capabilities
  • Ensure that Data Management efforts (Business or IT) are accelerated and consistent with enterprise core process performance
  • Be accountable for data quality; resolve information and cross-process issues
  • Enforce global data rules and standards, monitoring master data quality metrics and initiating corrective action; correct pre-validation errors
  • Maintain an enterprise-wide consolidated “data mart” for reporting, query, and analysis

appMDM’s Data Quality Management comprises Data Consolidation, Data Standardization, and Data Harmonization activities. Consolidating the data into the Hubs is a critical activity. The system collects data from multiple source systems and sends it to target MDM Hubs to perform Consolidation. Hadoop technology is used extensively. appMDM presents the “Matches”, and the Data Stewards or Data Owners can classify them as “False Positive” or “False Negative” and perform merge, drop, or migrate actions.

“Major Japanese consumer electronics firm’s European Division built over 200 Coexistence integration objects using appINTERFACE Module of appMIGRATE and saved over 70% in cost and time. The integrations were built in record times where the traditional ETL approaches had failed.”

Most ERP and other enterprise systems cause great confusion and frustration for users who have to wade through screen after screen of fields just to find the two or three fields for which they are responsible. Hundreds of master data attributes are maintained by multiple departments within a company. Adequate checks and controls may not be in place in all screens, and training becomes much more difficult with complex screens that have multiple views and dependencies on other modules.

The Chain-Sys Platform Suite’s appMDM component comes to your rescue with its ability to rapidly build simple, customized, user-role-based Master Data screens. Its adapters (3,000+) across 200+ applications let you build simplified master screens in a matter of hours or even minutes. All the checks of the target applications are carried out upfront as pre-validations, and (Master Data) transactions are posted correctly to one or more target systems.

Along with the built-in Workflow and Governance of appMDM, you can ensure that incorrect, inconsistent, or duplicate data is not created in the first place. The same concepts can also be applied to transactional data, such as invoice creation. If multiple people have to collaborate to create an invoice and supervisors have to cross-check and approve it, appMDM provides the right features to perform your tasks and transactions correctly.