Manshadi
New User
Joined: 31 Aug 2005 Posts: 82
Hi,
We are trying to move a table space from the DSN3 DB2 subsystem to DSN2.
After renaming the related linear VSAM file and repairing the LEVELID, we tried to replace the PSID on page 0 with the following command:
REPAIR LOCATE TABLESPACE DUMPDB.TSACCM PAGE X'00000000'
VERIFY OFFSET 12 DATA X'01080029'
REPLACE OFFSET 12 DATA X'01050006'
But the job is unsuccessful, with the following result:
DSNUCBRL - LOCATE TABLESPACE DUMPDB.TSACCM PAGE X'00000000'
DSNUCBRP - VERIFY OFFSET 12 DATA X'01080029'
DSNUCBRR - VERIFY OPERATION SUCCESSFUL
DSNUCBRP - REPLACE OFFSET 12 DATA X'01050006'
DSNUCBRR - REPLACE OPERATION SUCCESSFUL, DATA WAS X'01080029'
DSNUGBAC - UTILITY DATA BASE SERVICES MEMORY EXECUTION ABENDED, REASON=X'00C200C1'
Please let me know if you have any experience with this matter.
Usually some of our friends in this forum use casual DB2 messages and codes to reply to my questions, while I really need your experiences that led to a successful result.
Best regards
Manshadi
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
Usually some of our friends in this forum use casual DB2 messages and codes to reply to my questions, while I really need your experiences that led to a successful result.
I have no idea what "casual" means here... The "messages and codes" are the formal information from IBM about what is wrong, and they should be read and understood.
I also have no idea why you are trying to work with the underlying LDS files. The typical way to do what you describe (take something from one subsystem and install it in another) is to unload the data, create the corresponding objects in the target subsystem, load the data into the target subsystem, and verify that everything is correct. Then drop the objects and data from the original subsystem.
I suggest backups be taken of both subsystems before and after the relocation.
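The unload/load sequence can be sketched with DB2 utility control statements like the following (the table name MYSCHEMA.MYTABLE and the DD names are made up for illustration; the rest comes from the table space named in this thread):

```
-- On DSN3 (source): unload the whole table space and punch
-- generated LOAD statements for reuse on the target subsystem
UNLOAD TABLESPACE DUMPDB.TSACCM
  PUNCHDDN SYSPUNCH UNLDDN SYSREC

-- On DSN2 (target), after the database, table space, and
-- tables have been created with DDL: load the unloaded data
LOAD DATA INDDN SYSREC LOG NO REPLACE
  INTO TABLE MYSCHEMA.MYTABLE
```

In practice the punched SYSPUNCH output already contains the INTO TABLE ... field specifications matching the unload format, so it is usually edited and used as the LOAD SYSIN rather than hand-written.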
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
Or maybe you could use DSN1COPY, which does this altering/repairing for you.
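For what it's worth, a DSN1COPY job with OBID translation might look something like the sketch below. The data set name qualifiers, the STEPLIB, and the table OBID pair (114,57) are assumptions; the DBID/PSID pairs are just the decimal equivalents of the hex values shown in the failing REPAIR job (X'0108'=264 and X'0029'=41 on the source, X'0105'=261 and X'0006'=6 on the target):

```
//XLAT     EXEC PGM=DSN1COPY,PARM='OBIDXLAT,RESET'
//STEPLIB  DD DSN=DSN2.SDSNLOAD,DISP=SHR
//SYSPRINT DD SYSOUT=*
//* SYSUT1: source LDS copied over from DSN3
//* SYSUT2: pre-defined target LDS on DSN2
//SYSUT1   DD DSN=DSN3CAT.DSNDBC.DUMPDB.TSACCM.I0001.A001,DISP=SHR
//SYSUT2   DD DSN=DSN2CAT.DSNDBC.DUMPDB.TSACCM.I0001.A001,DISP=OLD
//SYSXLAT  DD *
264,261
41,6
114,57
/*
```

The first SYSXLAT record translates the DBIDs, the second the PSID, and each further record a table OBID (the real OBIDs come from SYSIBM.SYSTABLES on each subsystem). The RESET option also resets the log RBAs and the LEVELID, which is exactly the repair you were attempting by hand.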
Manshadi
New User
Joined: 31 Aug 2005 Posts: 82
Hello Dick,
FYI, the documented procedures for moving data are:
Method 1: Unload or reload
Method 2: DSN1COPY with OBID translation
Method 3: Use DB2 VSAM data sets from the source system in the target system
I am trying to use method 3 because:
Because this method does not require copying the data, it is the fastest of the three methods, in terms of performance. It is, however, the most complicated. Another advantage of this method is that it requires less disk space than the others; it is generally used for very large table or index spaces.
We have a big database, so unloading and reloading to the other system takes around 7 hours, which is not good for us.
We are doing it that way at the moment, and I am trying to find a better solution.
Please let me know if you have one.
I am also trying to use FlashCopy, XRC, or PPRC-XD.
Best regards
Manshadi
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
I am trying to use method 3 because:
Because this method does not require copying the data, it is the fastest of the three methods, in terms of performance.
Basically, you just want to avoid as much personal effort as possible.
We (multiple times) have used the unload/load to get data from one subsystem to another. We've handled several million rows in just a few minutes. . .
Also, if there is a need to work with the data prior to the reload, the unloaded data is quite easy to work with.
Consider also that if some database objects are currently in some environment, there is a good chance they will be needed there again (or in a third) . . . I'd recommend having the jobs already set up and simply run them when/if needed.
PersonalOpinion On:
Well-supported environments have these kinds of jobstreams set up when the tables are first created (when I'm the DBA, I generate them). When something like this is needed, they are simply customized (if necessary) and run.
PersonalOpinion Off