Triple R IT Power, Precision and Control in Data Integration and Aggregation

Master Data Management

Master Data Management (MDM) comprises a set of processes and tools that consistently defines and manages the non-transactional data entities of an organization (which may also include reference data).

The objective of MDM is to provide processes for the collection, aggregation, quality assurance, persistence, and distribution of such data throughout an organization, ensuring consistency and control in the usage and maintenance of this information.

MDM seeks to ensure that an organization does not use multiple (potentially inconsistent) versions of the same master data in different parts of its operations, as can easily occur in large organizations.

The selection of entities considered for MDM depends on the nature of an organization. In the common case of commercial enterprises, MDM may apply to entities related to customers (Customer Data Integration), products (Product Information Management), employees, and suppliers. MDM processes identify the sources from which to collect descriptions of these entities. In the course of transformation and normalization, administrators adapt descriptions to conform to standard formats and data domains, making it possible to remove duplicate instances of any entity.
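The transformation and normalization step described above can be illustrated with a minimal sketch. The record fields, source names, and matching rule below are hypothetical, chosen only to show the principle of normalizing to a standard format and then removing duplicate entity instances:

```python
# Hypothetical illustration: normalize customer records from two source
# systems to a common format, then remove duplicate instances of an entity.

def normalize(record):
    """Adapt a raw record to a standard format and data domain."""
    return {
        "name": record["name"].strip().title(),
        "country": record["country"].strip().upper()[:2],  # ISO-style code
    }

def deduplicate(records):
    """Keep the first occurrence of each normalized entity."""
    seen, unique = set(), []
    for rec in map(normalize, records):
        key = (rec["name"], rec["country"])
        if key not in seen:
            seen.add(key)
            unique.append(rec)
    return unique

crm_source = [{"name": "acme corp ", "country": "nl"}]
erp_source = [{"name": "ACME CORP", "country": "NL"}]
master = deduplicate(crm_source + erp_source)
print(master)  # one consolidated record for the entity
```

After normalization, the two source records share the same match key and collapse into a single master record.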

The DataSource Integrator uses abstraction layers in data modeling to create an MDM metadata catalog, without actually moving or consolidating the data from the sources. Acting as a substitute for data warehouse storage repositories, the DataSource Integrator processes requests dynamically at run time. Through its virtual abstraction layer and multi-entity approach to object modeling, the DataSource Integrator can dynamically process incoming ad-hoc requests from business analysis departments, or from applications related to BI, CRM, ERP, SCM, and the like.
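The idea of a virtual abstraction layer can be sketched in a few lines. This is not the product's actual API; the class and source names are illustrative, showing only how a metadata catalog can map logical entities to sources and resolve queries at request time instead of copying data into a warehouse:

```python
# Simplified sketch of a virtual abstraction layer: a catalog records where
# each logical entity lives; data is fetched from the owning source only
# when a request arrives. All names here are illustrative.

class DataSourceCatalog:
    def __init__(self):
        self.sources = {}  # entity name -> callable returning rows

    def register(self, entity, fetch):
        """Policy-free metadata entry: no data is copied at registration."""
        self.sources[entity] = fetch

    def query(self, entity, predicate=lambda row: True):
        """Resolve an ad-hoc request dynamically at run time."""
        return [row for row in self.sources[entity]() if predicate(row)]

catalog = DataSourceCatalog()
catalog.register("customer", lambda: [{"id": 1, "region": "EU"},
                                      {"id": 2, "region": "US"}])
eu_customers = catalog.query("customer", lambda r: r["region"] == "EU")
print(eu_customers)
```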


Business Intelligence

Business Intelligence (BI) aims to support better business decision-making. BI uses the data generated by business processes and external sources to analyze the current state of the business. By using BI practices, one can get historical, current, and predictive views of the state of business operations. Common functions of BI are reporting, online analytical processing (OLAP), data mining and business performance management.

Ideally, BI is used to continuously query the organization's data for KPI values. These KPI values can in turn be used to make informed decisions about company operations.
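A KPI query of this kind can be sketched as follows. The metric (on-time delivery), the sample data, and the target threshold are all made up for illustration:

```python
# Illustrative KPI computation: derive an on-time-delivery rate from order
# data and compare it against a target. Data and threshold are hypothetical.

orders = [
    {"id": 101, "on_time": True},
    {"id": 102, "on_time": True},
    {"id": 103, "on_time": False},
    {"id": 104, "on_time": True},
]

def on_time_delivery_kpi(rows):
    """Fraction of orders delivered on time."""
    return sum(r["on_time"] for r in rows) / len(rows)

kpi = on_time_delivery_kpi(orders)
target = 0.90
print(f"On-time delivery: {kpi:.0%} (target {target:.0%})")
if kpi < target:
    print("KPI below target: review the delivery process")
```

Run continuously against live data, a check like this is what turns raw operational records into an actionable signal.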

However, organizations typically run various systems from different vendors alongside each other. To query these systems, an abstraction layer is often placed on top of them. Commonly, this layer needs to be extended (adapted) for each system and for every change to it. Developing and maintaining such a layer therefore requires a significant amount of time from your staff.

The DataSource Integrator provides an easy-to-use interface that allows you to integrate and aggregate the information from your systems directly, i.e. without the need for an abstraction layer. This way, KPIs can be generated in near real time, enabling you to proactively manage your organization's resources.

Business Process Management and Reengineering

Business Process Management (BPM) is a customer-focused management approach that aligns all company processes by applying a cross-functional perspective to process optimization. It is a holistic management approach that promotes business effectiveness and efficiency while striving for innovation, flexibility, and integration with technology. Business process management attempts to improve processes continuously; it could therefore be described as a "process optimization process".

Business Process Reengineering (BPR) is the analysis and design of workflows and processes within an organization. A business process is a set of logically related tasks performed to achieve a defined business outcome. Reengineering focuses on redesigning the process as a whole in order to achieve the greatest possible benefit to the organization and its customers.

BPM and BPR are closely related; the difference between them becomes clear when comparing their scopes. While BPR focuses solely on redesigning the planning and organizational part of a process, BPM encapsulates the complete feedback cycle of process planning, execution, monitoring, and control.

Because the DataSource Integrator facilitates both data integration (the process execution step) and data aggregation (used for monitoring and controlling processes), it is an excellent tool for supporting both BPM and BPR practices.



Enterprise Reporting

With the dramatic expansion of IT and the desire for increased competitiveness within corporations, there has been an increase in the use of computing power to produce unified reports that join different views of the enterprise in a single place.

Termed Enterprise Reporting, this process involves querying data sources with different logical models to produce a human-readable report. For example, a user might query the Human Resources database and the Capital Improvements database to report office-space efficiency across the entire corporation.
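The office-space example can be sketched in code. The office names, headcounts, floor-space figures, and the efficiency metric (square metres per employee) are all hypothetical:

```python
# Sketch of an enterprise report joining two logically separate sources:
# HR headcount per office and facilities floor space per office.
# All figures and names are made up for illustration.

hr_headcount = {"Amsterdam": 120, "Rotterdam": 45}
floor_space_m2 = {"Amsterdam": 2400, "Rotterdam": 1350}

def office_space_report(headcount, space):
    """Join the two views on office name and compute m2 per employee."""
    report = {}
    for office in headcount.keys() & space.keys():  # offices in both sources
        report[office] = space[office] / headcount[office]
    return report

for office, m2_per_head in sorted(
        office_space_report(hr_headcount, floor_space_m2).items()):
    print(f"{office}: {m2_per_head:.1f} m2 per employee")
```

The essential point is the join across sources with different logical models; in practice each dictionary would be the result of a query against a separate system.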

In most organizations, departments, business units, or individual employees create spreadsheets alongside the organization's systems, usually for efficiency reasons. The problem with these spreadsheets, however, is their lack of structure and governance.

Have you ever based your conclusions on a spreadsheet containing outdated information? The DataSource Integrator allows for the efficient creation of continuous data integration and aggregation processes. This removes the need for ad-hoc spreadsheets, thereby reducing the proliferation of data within the organization.



Access Control

As BPM and data integration processes progress within your organization, special care must be taken to safeguard your data against unintended usage. This concerns not only usage outside the organization; internal use of data sources might also need to be restricted as a result of BPM practices.

Access control in computer systems and networks relies on access policies. The access control process can be divided into two phases: the policy definition phase, and the policy enforcement phase.

Authorization is the function of the policy definition phase. It precedes the policy enforcement phase, in which access requests are granted or rejected based on the previously defined authorizations.
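The two phases can be sketched minimally. The roles, resources, and actions below are illustrative, not a real policy model:

```python
# Minimal two-phase access control sketch: authorizations are defined first
# (policy definition), then each request is granted or rejected against them
# (policy enforcement). Roles, resources, and actions are hypothetical.

# Phase 1: policy definition -- who may perform what on which data source.
policy = {
    ("analyst", "sales_db"): {"read"},
    ("admin", "sales_db"): {"read", "write"},
}

# Phase 2: policy enforcement -- evaluate each access request.
def is_allowed(role, resource, action):
    return action in policy.get((role, resource), set())

print(is_allowed("analyst", "sales_db", "read"))   # True
print(is_allowed("analyst", "sales_db", "write"))  # False
```

Note that any request not explicitly authorized in phase 1 is rejected by default in phase 2.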

Even when access is controlled through a combination of authentication and access control lists, maintaining the authorization data is not trivial, and it often represents as much of an administrative burden as managing authentication credentials.

The DataSource Integrator provides a comprehensive yet granular mechanism to control data access, ensuring your data is used and modified in the way you intended.
