From a business (client) perspective, data migration is a numbers game. How many records were migrated? How many records failed? How long did it take?
One hurdle is reporting this information accurately, especially when transformation mappings exist between source and target, whether driven by data quality issues or by agreed-upon business rules.
For example, I recently worked on a project that required us to migrate OSS inventory data, in particular location data. The source held a single record for each property, which included the Country, State, Suburb and Street address details, whereas the target system modeled a Location hierarchy, resulting in one record for each location type. If one location in the target hierarchy fails migration for some reason, what should happen to the other locations in that hierarchy? And how is the single source record tracked through the lifecycle of the migration to multiple records (perhaps objects) in the target system?
As a result, one record in the source became 4 or 5 records in the target. Even in a simple scenario like this, reconciliation is far more intricately woven than a simple one-to-one mapping. With that complexity comes the responsibility to report to the business, clearly and correctly, exactly what was migrated from source to target.
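The tracking problem above can be sketched in code. This is a minimal illustration, not anything from the project itself: all the names (`SourceRecord`, `TargetResult`, `reconcile`, the "all targets must succeed" rule) are assumptions made for the example, and the policy for partial hierarchies is exactly the kind of business rule that would need to be agreed up front.

```python
# Hypothetical sketch: tracking one source record through a one-to-many
# migration so reconciliation can be reported at source-record granularity,
# the level the business recognises. All names here are illustrative.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TargetResult:
    location_type: str          # e.g. "Country", "State", "Suburb", "Street"
    migrated: bool
    error: Optional[str] = None

@dataclass
class SourceRecord:
    source_id: str
    targets: list = field(default_factory=list)

    @property
    def status(self) -> str:
        # Assumed business rule: a source record only counts as migrated
        # if every record in its target hierarchy succeeded.
        if all(t.migrated for t in self.targets):
            return "MIGRATED"
        if any(t.migrated for t in self.targets):
            return "PARTIAL"
        return "FAILED"

def reconcile(records) -> dict:
    """Roll the multi-record target results back up to source-record counts."""
    counts = {"MIGRATED": 0, "PARTIAL": 0, "FAILED": 0}
    for rec in records:
        counts[rec.status] += 1
    return counts

# One property record becomes four target records; the Suburb record fails.
rec = SourceRecord("PROP-001", [
    TargetResult("Country", True),
    TargetResult("State", True),
    TargetResult("Suburb", False, "duplicate key"),
    TargetResult("Street", True),
])
print(reconcile([rec]))  # {'MIGRATED': 0, 'PARTIAL': 1, 'FAILED': 0}
```

The key design point is that the per-target outcomes are kept, but the headline numbers are rolled up to the source record, so the business sees one count per legacy record rather than four or five target-object counts.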
Of course, migration of OSS network inventory generally requires far more complex reconciliation than the example above.
I would like to hear about anyone else's experience of reporting on reconciliation in scenarios like this, where the business (client) wants to see the numbers in the terms they are used to, namely those of the legacy systems, whereas the comparison against the target is multi-dimensional.
I have experienced this problem on a number of transformation projects. To reinforce the point made by Monty, data transformation is almost never a 1:1 mapping between legacy objects and target objects. This complicates your ability to track the success rate of your transformation project.
My recommendation is to identify the reporting metrics during the design and analysis phase of the project. These designs should be reviewed by both the development and test teams so that they share a common view of what data is being reported. Defects are often raised against the test team's report metrics, and being able to relate those back to the metrics communicated to the business is very valuable.
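One way to keep the two audiences aligned is to derive both sets of metrics from the same underlying result data. The sketch below is an assumption about how that might look, not a description of any actual project's tooling: the same `(source_id, target_type, migrated)` tuples feed a business view in legacy-record terms and a test-team view at the target-object level where defects are raised.

```python
# Hypothetical sketch: two views of one reconciliation data set.
# The tuple layout and all identifiers are illustrative assumptions.
from collections import Counter

# (source_id, target_type, migrated) outcomes from a migration run.
results = [
    ("PROP-001", "Country", True),
    ("PROP-001", "State", True),
    ("PROP-001", "Suburb", False),
    ("PROP-001", "Street", True),
    ("PROP-002", "Country", True),
    ("PROP-002", "State", True),
    ("PROP-002", "Suburb", True),
    ("PROP-002", "Street", True),
]

# Business view: one number per legacy record; a record counts as migrated
# only if its whole target hierarchy succeeded (an assumed rule).
by_source = {}
for source_id, _, ok in results:
    by_source[source_id] = by_source.get(source_id, True) and ok
business_view = Counter(
    "migrated" if ok else "failed" for ok in by_source.values()
)

# Test-team view: failure counts per target object type, the level at
# which defects are typically raised.
test_view = Counter(t for _, t, ok in results if not ok)

print(dict(business_view))  # {'failed': 1, 'migrated': 1}
print(dict(test_view))      # {'Suburb': 1}
```

Because both views are computed from the same data, a defect raised against, say, Suburb records can be traced directly to its effect on the legacy-record numbers the business is watching.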