Joined: 28 Sep 2005 Posts: 210 Location: St Katherine's Dock London
Hi,
Currently my shop is going through a data migration from an open-system database to the existing mainframe + Siebel CRM, where the mainframe data is stored in VSAM and DB2. The incoming data has to adapt to the existing legacy functionality, so the data only needs to be extracted, transformed, and loaded into the legacy database.
I am the guy who takes care of the VSAM part, and as part of this migration I need to analyse the age-old database (a huge number of KSDS files with an enormous number of data fields) and pave the way toward data mapping between the source and target databases.
The task is very unexciting (or so it seems, because of so many years of changes to the data layout, base code, and application design), but at the same time very critical, because data integrity must be preserved between the source and target databases; that is ultimately what we are being paid for.
I am in desperate need of advice and motivation from people who have done data migration in the past. Please share your experiences for the benefit of a brain overwhelmed by reengineering.
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Yup, been involved with data conversion/migration for very many years. And yup, mapping can be tedious. Hopefully there are people around/available who have what I call institutional knowledge (they know the data and maybe even how it evolved to its current state). These people may be part of the information technology group or the "power users" who are most familiar with the data in their systems.
One good thing is that going from the server to the mainframe, you shouldn't have the issue of how to handle repeating groups and redefines. Those probably don't exist on the "open system" databases.
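To make the repeating-groups point concrete, here is a minimal sketch of the mapping going the other way: what is a parent table plus a child table on the relational side often becomes one record with a fixed OCCURS-style array on the VSAM side. All field names, widths, and the OCCURS count below are invented for illustration; they are not from either actual schema.

```python
# Hypothetical sketch: a relational parent row plus its child rows flattened
# into one fixed-length VSAM-style record with an OCCURS-like array of slots.

MAX_PHONES = 3  # assumed OCCURS count in an imagined target copybook

def to_vsam_record(customer_id: str, name: str, phones: list[str]) -> str:
    """Flatten a parent row plus child rows into one fixed-length record."""
    if len(phones) > MAX_PHONES:
        raise ValueError("more child rows than the OCCURS clause allows")
    # Pad the repeating group out to its fixed occurrence count.
    slots = (phones + [""] * MAX_PHONES)[:MAX_PHONES]
    # Fixed-width fields: 10-char id, 20-char name, three 12-char phone slots.
    return customer_id.ljust(10) + name.ljust(20) + "".join(p.ljust(12) for p in slots)

rec = to_vsam_record("C001", "SMITH", ["5551234", "5559876"])
print(len(rec))  # 10 + 20 + 3*12 = 66
```

The awkward decisions in real life are the ones this sketch dodges: what to do when a source parent has more children than the target OCCURS allows, and how unused slots should be initialized so downstream legacy code does not misread them.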
How was the open system built originally? Was there a conversion/migration from the mainframe VSAM or databases? Reversing that process might be worth looking into.
How much data (volume) must be moved to the mainframe? Something I'd recommend paying attention to early on is how long the "real" conversion will take. Several systems I've been invited to participate in were near the "live" conversion when "full volume" testing led to the discovery that the physical conversion would take longer than there was clock time to do it.
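The clock-time check above is simple arithmetic once a full-volume test has given you a measured throughput. A back-of-envelope sketch (every number below is a placeholder; plug in your own figures):

```python
# Conversion-window sanity check: compare projected run time against the
# clock time available for the live cutover. All figures are placeholders.
rows = 250_000_000          # rows to migrate
rows_per_second = 5_000     # measured throughput from a full-volume test run
window_hours = 8            # clock time available during the live cutover

needed_hours = rows / rows_per_second / 3600
print(f"needed: {needed_hours:.1f} h, available: {window_hours} h")
if needed_hours > window_hours:
    print("conversion will not fit the window; split it or speed it up")
```

The point of the poster's warning is to run this calculation early, with measured (not hoped-for) throughput, while there is still time to redesign the conversion into phases or parallel streams.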
Joined: 28 Sep 2005 Posts: 210 Location: St Katherine's Dock London
Thanks d.sch. To get more views, I will keep adding my doubts/thoughts to this thread as the work progresses.
I feel that, behind the facade, a data migration is really a business migration: the database to be migrated carries not only data but also a long history of business changes, things outdated, things added on.
Quote:
How was the open system built originally?
The source database is Oracle on Solaris, and the front end is built using Oracle Forms. The application is a replica of what happens on the mainframe (only for a few countries) but incurs a good amount of maintenance and operational cost every year. Even so, the way entities are defined in this system is quite different from the way they are defined on the mainframe.
Quote:
How much data (volume) must be moved to the mainframe?
Currently we are doing the volumetric analysis in order to gauge the space required to hold the incoming data on the mainframe.
The main challenge right now, I feel, is understanding the data models of the two systems and building transformation rules for the incoming data. Since the source system is Oracle, it is easy to get ER diagrams etc. to understand its data model. But on the VSAM side it is all a manual process, and there are hardly any up-to-date documents on the mainframe data model.
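One way to keep the transformation rules reviewable as they accumulate is to hold the source-to-target mapping in a plain table of (source column, target field, transform) entries, rather than burying the rules in conversion code. A minimal sketch; every column name, field name, and rule below is invented for illustration, not from either real schema:

```python
# Hypothetical mapping table: Oracle column -> VSAM field, with a transform
# per entry. Keeping rules in one table makes them easy to review with the
# people who hold the institutional knowledge.
from datetime import date

def to_yyyymmdd(d: date) -> str:
    """Render a date the way an imagined legacy field expects it."""
    return d.strftime("%Y%m%d")

MAPPING = [
    # (oracle_column, vsam_field, transform)
    ("CUST_ID",   "CUST-KEY",  lambda v: str(v).zfill(10)),
    ("CUST_NAME", "CUST-NAME", lambda v: v.upper()[:20].ljust(20)),
    ("OPEN_DATE", "OPEN-DATE", to_yyyymmdd),
]

row = {"CUST_ID": 42, "CUST_NAME": "Smith", "OPEN_DATE": date(2007, 3, 1)}
record = {tgt: fn(row[src]) for src, tgt, fn in MAPPING}
print(record)
```

A table like this can double as the data-mapping document itself: each row is one rule that a business user or a mainframe veteran can sign off on without reading the conversion program.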