A rapidly growing engineering and manufacturing organization headquartered in the Midwest, with annual revenue of a little over $4 billion.
The organization is engaged in designing, building, and servicing infrastructure for data centers, communication towers, and commercial/industrial facilities.
The organization has over twenty thousand employees worldwide, operating in more than twenty-five countries.
The scope of the project was to migrate all enterprise data from multiple SAP ECC instances to Oracle Cloud. The target landscape included Oracle Cloud applications across Supply Chain, Planning, Financials, Projects, and HR.
The implementation was to be carried out in multiple waves based on business units. The applications were grouped by business process into Quote To Delivery (QTD), Plan To Manufacture (PTM), Purchase To Receipt (PTR), Project To Close (PTC), Accounts To Report (ATR), and HR. HR data was to be migrated first and carried over to all instances so that users could test throughout the implementation life cycle.
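The wave and track sequencing described above, with HR always loaded first, can be sketched roughly as follows. This is a hypothetical illustration only; the wave names and ordering logic are stand-ins, not the actual project plan.

```python
# Hypothetical sketch of the wave/track sequencing; names are illustrative.
TRACKS = ["QTD", "PTM", "PTR", "PTC", "ATR", "HR"]

def migration_order(waves):
    """Yield (wave, track) pairs, always loading HR before the other tracks."""
    plan = []
    for wave in waves:
        plan.append((wave, "HR"))            # HR migrated first in every wave
        for track in TRACKS:
            if track != "HR":
                plan.append((wave, track))   # remaining business-process tracks
    return plan

plan = migration_order(["Wave1", "Wave2"])
```

Sequencing the plan as explicit (wave, track) pairs makes it easy to audit which loads ran in which order during each iteration.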
The Customer and Product master data were cleansed and governed with the ChainSys dataZen data management tool and formed the basis of validation for all data migration.
Historical data were archived from SAP ECC to Hadoop Data Lake for reporting and analytics.
There were more than 1.8 million items and 8 million BOM rows, and around 90 business transactions, each with multiple objects. The transformation program ran in multiple waves, and each wave comprised four iterations: Demo, SIT, UAT, and PROD.
Data were to be extracted based on extraction criteria set by the business. The Lists of Values (LoVs) of SAP ECC and Oracle Cloud were not uniform, so the LoVs had to be synchronized to create cross-references. Some SAP transactions were broken or incomplete, and some older transactions did not have their status updated; all of these led to orphan records. Some critical business values were hidden, concatenated, and stored as long texts. The scope of business data included all open transactions from Order Management, Inventory, Logistics, Planning, Production, Procurement, Projects, Costing, AP, AR, Billing, Fixed Assets, and GL, plus all HR data.
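The LoV synchronization and orphan handling described above can be sketched as a simple cross-reference lookup. This is an illustrative example, not ChainSys code; the unit-of-measure mappings and field names are hypothetical.

```python
# Hypothetical LoV cross-reference between SAP ECC codes and Oracle Cloud values.
SAP_TO_ORACLE_UOM = {
    "ST": "Each",
    "KG": "Kilogram",
    "M":  "Meter",
}

def transform_row(row, xref):
    """Map a SAP LoV code to its Oracle value; None marks an orphan/unmapped row."""
    target = xref.get(row["uom"])
    if target is None:
        return None                       # orphan: route to a remediation queue
    return {**row, "uom": target}

rows = [{"item": "A1", "uom": "ST"}, {"item": "A2", "uom": "XX"}]
mapped = [transform_row(r, SAP_TO_ORACLE_UOM) for r in rows]
orphans = [r for r, m in zip(rows, mapped) if m is None]
```

Flagging unmapped rows rather than dropping them silently lets the business review and fix the source data or the cross-reference before the next iteration.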
All the cloud applications had to be migrated simultaneously in each wave.
Every attribute of SAP ECC was mapped to the corresponding Oracle Cloud attribute. The mapping was carried out in three passes and multiple iterations. ChainSys pre-built templates helped speed up the whole mapping process, and all the cloud applications were migrated simultaneously to maintain data integrity.
Extraction from SAP, validation against master data, enrichment and transformation of selected attributes, pre-validation, and loading of data were carried out in multiple cycles for each track, in each iteration, and in each wave.
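The per-cycle pipeline above can be sketched as a sequence of filter and transform steps. This is a minimal hypothetical sketch under assumed field names; the function bodies are stand-ins, not the dataZap implementation.

```python
# Hypothetical per-cycle pipeline: extract -> validate against master data
# -> enrich/transform -> pre-validate -> load. All steps are stubbed.
def run_cycle(rows, master_items):
    extracted = [r for r in rows if r.get("status") == "OPEN"]        # extract open transactions
    validated = [r for r in extracted if r["item"] in master_items]   # validate vs. master data
    enriched  = [{**r, "source": "SAP_ECC"} for r in validated]       # enrichment/transformation
    prevalid  = [r for r in enriched if r.get("qty", 0) > 0]          # pre-load checks
    loaded    = len(prevalid)                                         # load step (stubbed)
    return prevalid, loaded

rows = [
    {"item": "A1", "status": "OPEN",   "qty": 5},
    {"item": "A1", "status": "CLOSED", "qty": 2},
    {"item": "ZZ", "status": "OPEN",   "qty": 3},
]
good, count = run_cycle(rows, master_items={"A1"})
```

Because each step is a pure function over row sets, the same cycle can be re-run unchanged for every track, iteration, and wave.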
The data structure and standard load interface differed for each business transaction, so ChainSys pre-built templates were used to speed up the operation. Pre-built adapters and BAPIs were used for extracting data from SAP. The ChainSys dataZap tool was used for data extraction, transformation, and loading, while dataZense was used for all validations and for reconciliation dashboards and reports. Apart from migrating the data, Oracle setup corrections and fine-tuning of the LoVs were carried out in each data migration iteration, including updates to item attributes, inventory org assignments, BOMs, and routings.
Historical data were required for reference and were archived to the data warehouse, a Hadoop Data Lake.
dataZap’s standard templates were used for mapping SAP attributes to Oracle Cloud, standard FBDI / web services were used for data loading, and adapters were used for extracting data from SAP. The templates were extended to include custom attributes from both the SAP and Oracle applications.
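Oracle's FBDI approach loads data from CSV files laid out according to a predefined template. As a rough illustration of the final rendering step, the sketch below writes transformed rows into a CSV with a fixed column order; the column names are hypothetical, not the actual FBDI template layout.

```python
import csv
import io

# Hypothetical column layout; real FBDI templates define the exact order.
FBDI_COLUMNS = ["ITEM_NUMBER", "ORGANIZATION_CODE", "PRIMARY_UOM", "ITEM_DESCRIPTION"]

def to_fbdi_csv(rows):
    """Render row dicts as CSV text in the fixed column order, ignoring extras."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=FBDI_COLUMNS, extrasaction="ignore")
    writer.writeheader()  # header row included here for readability of the sketch
    for row in rows:
        writer.writerow(row)
    return buf.getvalue()

csv_text = to_fbdi_csv([
    {"ITEM_NUMBER": "A1", "ORGANIZATION_CODE": "M1",
     "PRIMARY_UOM": "Each", "ITEM_DESCRIPTION": "Bracket"},
])
```

Pinning the column order in one place makes it straightforward to extend the template with custom attributes, as was done in this project.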
Business rules for extraction, transformation, and loading were configured for reuse. Pre-load and post-load data validations were carried out, and post-load reconciliation reports were provided to the business for verification and validation. Functional checks were performed on the completeness of the transactions, and open transactions in each business process were tested for data integrity and completeness. The loading sequence of transactions was maintained in each cycle so that end-to-end testing could be carried out. ChainSys dataZap and dataZense were used for ETL, quality improvements, and validations.
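A post-load reconciliation of the kind described above typically compares record counts and value totals between the source extract and the loaded target. The sketch below is a hypothetical illustration with assumed field names, not a dataZense report.

```python
# Hypothetical reconciliation: compare counts and amount totals between the
# SAP extract (source) and the records loaded into Oracle Cloud (target).
def reconcile(source_rows, target_rows, key="id", amount="amount"):
    src_keys = {r[key] for r in source_rows}
    tgt_keys = {r[key] for r in target_rows}
    return {
        "source_count": len(source_rows),
        "target_count": len(target_rows),
        "missing_in_target": sorted(src_keys - tgt_keys),   # records that failed to load
        "amount_diff": round(sum(r[amount] for r in source_rows)
                             - sum(r[amount] for r in target_rows), 2),
    }

report = reconcile(
    [{"id": 1, "amount": 100.0}, {"id": 2, "amount": 50.0}],
    [{"id": 1, "amount": 100.0}],
)
```

Reporting both a key-level gap list and an aggregate amount difference lets the business pinpoint failed records while also confirming financial totals tie out.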
A key benefit to the project was the use of the pre-built templates delivered by ChainSys for the extraction, transformation, and load (ETL) processes, rather than building them from scratch. Data was successfully migrated and tested in iterative rounds/sprints, with data validation and data management best practices applied throughout.
Other key benefits included:
All business functions were able to continue operating normally after the successful data migration.
End-to-end business processes with consolidated reporting and analytics were possible because of the data integrity of all the migrated transactions.
Considerable savings in time and effort, as the ChainSys tools were reused without reconfiguration across all waves and test cycles during the implementation life cycle.
All meaningful, essential, and critical business data was captured in the Oracle Cloud applications. Uniform LoVs across the applications improved user experience and productivity.
dataZap - Pre-Configured Templates & Migration Engine to Extract, Transform, Pre-Validate, Load, Reconcile & Report.
dataZen - To 'Get Clean' and 'Stay Clean', and Introduce Master Data Governance.