Master Data Management

In business, master data management (MDM) comprises the processes, governance, policies, standards and tools that consistently define and manage the critical data of an organization to provide a single point of reference.

The data that is mastered may include:

• reference data – the business objects for transactions, and the dimensions for analysis

• analytical data – supports decision making

In computing, a master data management tool can be used to support master data management by removing duplicates, standardizing data (mass maintaining), and incorporating rules that prevent incorrect data from entering the system, in order to create an authoritative source of master data. Master data are the products, accounts and parties for which business transactions are completed. The root cause of the problem is business-unit and product-line segmentation, in which the same customer is serviced by different product lines, with redundant data being entered about the customer (the party in the role of customer) and the account in order to process the transaction. The redundancy of party and account data is compounded in the front-to-back-office life cycle, where an authoritative single source for party, account and product data is needed but is often, once again, redundantly entered or augmented.
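
As a rough illustration of the kind of rules such a tool enforces, the following Python sketch standardizes an incoming record and rejects clearly invalid data before it reaches the authoritative master store. The field names and rules are illustrative assumptions, not any particular product's behaviour.

```python
import re

# Hypothetical standardization and validation rules an MDM tool might apply
# before a record is accepted into the authoritative master data store.

def standardize(record: dict) -> dict:
    """Normalize free-form fields into a consistent format (mass maintenance)."""
    out = dict(record)
    out["name"] = " ".join(record["name"].split()).title()       # trim and case-fold names
    out["country"] = record["country"].strip().upper()           # e.g. "us " -> "US"
    out["phone"] = re.sub(r"\D", "", record.get("phone", ""))    # keep digits only
    return out

def validate(record: dict) -> list[str]:
    """Return a list of rule violations; an empty list means the record may enter."""
    errors = []
    if not record["name"]:
        errors.append("name is required")
    if not re.fullmatch(r"[A-Z]{2}", record["country"]):
        errors.append("country must be a two-letter code")
    if record["phone"] and len(record["phone"]) < 7:
        errors.append("phone number looks incomplete")
    return errors

incoming = {"name": "  acme   corp ", "country": "us", "phone": "+1 (555) 010-2030"}
clean = standardize(incoming)
problems = validate(clean)
if problems:
    print("rejected:", problems)   # incorrect data never enters the master store
else:
    print("accepted:", clean)
```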

Master data management has the objective of providing processes for collecting, aggregating, matching, consolidating, quality-assuring, persisting and distributing such data throughout an organization to ensure consistency and control in the ongoing maintenance and application use of this information.

The term recalls the concept of a master file from an earlier computing era.

Definition

Master data management (MDM) is a comprehensive method of enabling an enterprise to link all of its critical data to one file, called a master file, that provides a common point of reference. When properly done, master data management streamlines data sharing among personnel and departments. In addition, master data management can facilitate computing in multiple system architectures, platforms and applications.

At its core, master data management (MDM) can be viewed as a “discipline for specialized quality improvement” defined by the policies and procedures put in place by a data governance organization. Its ultimate goal is to provide the end-user community with a “trusted single version of the truth” on which to base decisions.

Issues

At a basic level, master data management seeks to ensure that an organization does not use multiple (potentially inconsistent) versions of the same master data in different parts of its operations, which can occur in large organizations. A typical example of poor master data management is the scenario of a bank at which a customer has taken out a mortgage and the bank begins to send mortgage solicitations to that customer, ignoring the fact that the person already has a mortgage account relationship with the bank. This happens because the customer information used by the marketing section within the bank lacks integration with the customer information used by the customer services section of the bank. Thus the two groups remain unaware that an existing customer is also considered a sales lead. The process of record linkage is used to associate different records that correspond to the same entity, in this case the same person.
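
A minimal sketch of record linkage under simple assumptions: two hypothetical customer records held by the marketing and customer-services systems are normalized and compared on name and postcode to decide whether they describe the same person. Real MDM tools use far richer matching rules; this only shows the idea.

```python
from difflib import SequenceMatcher

# Hypothetical customer records held independently by two departments.
marketing = {"name": "Jon A. Smith", "postcode": "SW1A 1AA", "product": "mortgage offer"}
servicing = {"name": "John Smith",   "postcode": "sw1a1aa",  "product": "mortgage account"}

def normalize(rec: dict) -> tuple[str, str]:
    """Reduce name and postcode to a comparable canonical form."""
    name = "".join(c for c in rec["name"].lower() if c.isalpha() or c == " ")
    name = " ".join(name.split())
    postcode = rec["postcode"].replace(" ", "").upper()
    return name, postcode

def same_entity(a: dict, b: dict, threshold: float = 0.8) -> bool:
    """Link two records if postcodes match exactly and names are similar enough."""
    name_a, pc_a = normalize(a)
    name_b, pc_b = normalize(b)
    name_score = SequenceMatcher(None, name_a, name_b).ratio()
    return pc_a == pc_b and name_score >= threshold

if same_entity(marketing, servicing):
    # Linking the records tells marketing that this "lead" is already a mortgage customer.
    print("same person: suppress the mortgage solicitation")
```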

Other problems include issues with data quality, consistent classification and identification of data, and data reconciliation. Master data management of disparate data systems requires data transformations: data extracted from a disparate source system is transformed and loaded into the master data management hub, and to synchronize the sources, the managed master data extracted from the hub is again transformed and loaded into each disparate source system whenever the master data is updated. As with other extract-transform-load (ETL) based data movement, these processes are expensive and inefficient to develop and maintain, which greatly reduces the return on investment for the master data management product.
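
The following sketch shows the shape of such an ETL-style movement, with a hypothetical source schema and field mapping; synchronizing back to the sources is the same pattern run in the opposite direction.

```python
# A minimal extract-transform-load sketch for moving data from a source system
# into an MDM hub; the schemas and mapping rules are hypothetical.

SOURCE_TO_HUB = {"cust_nm": "name", "cntry_cd": "country", "acct_no": "account_id"}

def extract(source_rows: list[dict]) -> list[dict]:
    """In practice this would query the source system; here it just returns the rows."""
    return source_rows

def transform(row: dict) -> dict:
    """Map source field names onto the hub schema and normalize values."""
    hub_row = {hub_field: row[src_field] for src_field, hub_field in SOURCE_TO_HUB.items()}
    hub_row["country"] = hub_row["country"].strip().upper()
    return hub_row

def load(hub: dict, row: dict) -> None:
    """Upsert into the hub keyed on the account identifier."""
    hub[row["account_id"]] = row

hub_store: dict = {}
source = [{"cust_nm": "Acme Corp", "cntry_cd": "us ", "acct_no": "A-100"}]
for r in extract(source):
    load(hub_store, transform(r))

# Synchronizing back is the same pattern in reverse: the hub's managed master data
# is transformed into each source system's schema and loaded there when it changes.
print(hub_store)
```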

One of the most common reasons some large corporations experience massive issues with master data management is growth through mergers or acquisitions. Any organizations which merge will typically create an entity with duplicate master data (since each likely had at least one master database of its own prior to the merger). Ideally, database administrators resolve this problem through deduplication of the master data as part of the merger. In practice, however, reconciling several master data systems can present difficulties because of the dependencies that existing applications have on the master databases. As a result, more often than not the two systems do not fully merge, but remain separate, with a special reconciliation process defined that ensures consistency between the data stored in the two systems. Over time, however, as further mergers and acquisitions occur, the problem multiplies: more and more master databases appear, and data-reconciliation processes become extremely complex, and consequently unmanageable and unreliable. Because of this trend, one can find organizations with 10, 15, or even as many as 100 separate, poorly integrated master databases, which can cause serious operational problems in the areas of customer satisfaction, operational efficiency, decision support, and regulatory compliance.

Solutions

Processes commonly seen in master data management include source identification, data collection, data transformation, normalization, rule administration, error detection and correction, data consolidation, data storage, data distribution, data classification, taxonomy services, item master creation, schema mapping, product codification, data enrichment and data governance.

The selection of entities considered for master data management depends somewhat on the nature of an organization. In the common case of commercial enterprises, master data management may apply to such entities as customer (customer data integration), product (product information management), employee, and vendor. Master data management processes identify the sources from which to collect descriptions of these entities. In the course of transformation and normalization, administrators adapt descriptions to conform to standard formats and data domains, making it possible to remove duplicate instances of any entity. Such processes generally result in an organizational master data management repository, from which all requests for a certain entity instance produce the same description, irrespective of the originating sources and the requesting destination.
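
A minimal sketch of that consolidation step, assuming records have already been matched onto a common key: duplicates are merged into one description per entity using a simple survivorship rule (prefer the most complete value). The keys, fields and rule are hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of consolidating duplicate entity descriptions into a single
# "golden" description per entity in the master data repository.

records = [
    {"key": "acme|US", "name": "ACME Corp",  "phone": "",           "source": "crm"},
    {"key": "acme|US", "name": "Acme Corp.", "phone": "5550102030", "source": "billing"},
    {"key": "bolt|DE", "name": "Bolt GmbH",  "phone": "4930123456", "source": "crm"},
]

def consolidate(rows: list[dict]) -> dict:
    """Group duplicates by match key and keep the most complete value per field."""
    groups: dict = defaultdict(list)
    for row in rows:
        groups[row["key"]].append(row)

    golden = {}
    for key, dupes in groups.items():
        merged = {}
        for field in ("name", "phone"):
            # Simple survivorship rule: prefer the longest non-empty value.
            merged[field] = max((d[field] for d in dupes), key=len)
        golden[key] = merged
    return golden

repository = consolidate(records)
# Every request for "acme|US" now yields the same description, whatever the source.
print(repository["acme|US"])
```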

The tools include data networks, file systems, a data warehouse, data marts, an operational data store, data mining, data analysis, data visualization, data federation and data virtualization. One of the newest tools, virtual master data management, utilizes data virtualization and a persistent metadata server to implement a multi-level, automated master data management hierarchy.

Transmission of Master Data

There are several ways in which master data may be collated and distributed to other systems. These include:

• Data consolidation – The process of capturing master data from multiple sources and integrating it into a single hub (operational data store) for replication to other destination systems.

• Data federation – The process of providing a single virtual view of master data from one or more sources to one or more destination systems (see the sketch after this list).

• Data propagation – The process of copying master data from one system to another, typically through point-to-point interfaces in legacy systems.
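
To make the federation pattern concrete, the sketch below assembles a single virtual customer view from two hypothetical source systems on demand, without copying the data into a hub; consolidation and propagation, by contrast, physically move the data.

```python
# Hypothetical sketch of data federation: instead of replicating records into a hub,
# a virtual view assembles master data from the underlying sources on each request.

crm_system = {"C-1": {"name": "Acme Corp", "segment": "enterprise"}}
billing_system = {"C-1": {"name": "Acme Corp.", "billing_country": "US"}}

def federated_view(customer_id: str) -> dict:
    """Build a single virtual customer view from one or more sources on demand."""
    view = {"customer_id": customer_id}
    view.update(crm_system.get(customer_id, {}))
    view.update(billing_system.get(customer_id, {}))   # later sources win on conflicts
    return view

# Nothing is replicated: destination systems call the view instead of holding copies.
print(federated_view("C-1"))
```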
