deepak_munjal
New User
Joined: 30 May 2008 Posts: 43 Location: Mumbai
Hi,
I have a file that I need to load into a table in such a way that:
> Data is inserted if no duplicate is found (we can use the RESUME option in the load card).
> Duplicate records should be overridden by the load, i.e. the existing duplicate row should be updated. (If a row with the same primary key already exists in the table, it should be updated with the new data from the file.)
I need to do this using the load card.
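To illustrate, a plain DB2 LOAD card with RESUME (the table, columns and positions below are just placeholders) looks roughly like this; on its own it only appends rows, and duplicate-key rows go to the discard data set instead of updating the existing row:
Code:
LOAD DATA INDDN(SYSREC) LOG NO RESUME YES
  INTO TABLE MYSCHEMA.MYTABLE
    (CUST_ID   POSITION(1:10)  CHAR(10),
     CUST_NAME POSITION(11:40) CHAR(30))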
Appreciate your help on this.
(Sorry, I originally posted this in the ICETOOL forum.)
Thanks,
Deepak
Escapa
Senior Member
Joined: 16 Feb 2007 Posts: 1399 Location: IL, USA
First of all, you have posted this in the wrong forum.
Secondly, you need to tell us which utility you are using to load the data into the table.
Also, what about rows that are in the table but not in the file? Should they be kept or deleted?
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Moved where it belongs.
deepak_munjal
New User
Joined: 30 May 2008 Posts: 43 Location: Mumbai
It's BMC's LOADPLUS for DB2 load.
Utility used: D2BTLD0E.
Thanks, enrico!
Existing records should be kept in the table.
No full REPLACE; only duplicate records should be replaced.
Escapa
Senior Member
Joined: 16 Feb 2007 Posts: 1399 Location: IL, USA
The best way is to look in the BMC LOADPLUS utility manual.
Since not all shops have this product, you may need to wait for somebody who has worked with it to get the answer.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Or implement some other solution . . .
Many systems have the same requirement and do not have the BMC tools available, yet they still get this done with minimal effort.
deepak_munjal
New User
Joined: 30 May 2008 Posts: 43 Location: Mumbai
I can't look at another solution because I need to do this across 7 load jobs.
Those 7 load jobs were implemented long ago using the RESUME option, without taking care of replacing duplicate rows.
Dick, can you please let us know if there is any other way to implement this without using a COBOL program?
It would be appreciated if anyone from the moderator group could look into this and suggest an option in the load card with which we can achieve our goal.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
Can you please let us know if there is any other way to implement this without using a COBOL program?
How many rows are in these tables?
If the volume is not too large, the tables could be unloaded to QSAM files, the "update" data merged with the unloaded data removing duplicates, and the output from the merges loaded back into the tables.
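Something along these lines would handle the merge step with DFSORT (the key position/length and data set names are placeholders for your actual primary key and files). With EQUALS, SUM FIELDS=NONE keeps the first record of each key in input order, so putting the update file ahead of the unloaded data means the new row wins whenever the keys collide:
Code:
//MERGE    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//*  New/changed rows first, existing (unloaded) rows second
//SORTIN   DD DISP=SHR,DSN=YOUR.UPDATE.FILE
//         DD DISP=SHR,DSN=YOUR.UNLOADED.TABLE
//SORTOUT  DD DSN=YOUR.MERGED.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSIN    DD *
* KEY ASSUMED IN COLUMNS 1-10 - CHANGE TO MATCH YOUR PRIMARY KEY
  SORT FIELDS=(1,10,CH,A),EQUALS
  SUM FIELDS=NONE
/*
The merged output then goes back into the table with a LOAD REPLACE.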
deepak_munjal
New User
Joined: 30 May 2008 Posts: 43 Location: Mumbai
Thanks, Dick!
Good idea, but I can't use this because each table has more than a million records to process for the daily run.
Is there any other utility for loading the data with which we could achieve this?
I am looking to complete this in one JCL job without using a COBOL program.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Rather than worrying about processing a million rows, I'd suggest looking at how long it takes to unload and reload these . . .
We regularly do this with a (small for us) 2.7 million row table and it takes only a few minutes. Several things I work with are measured in terabytes, and no, those would not be good candidates for a daily unload/reload . . .
What % of the data is updated? If it is only a thousand rows (or some other small number), it makes no sense to rule out a simple coded program. The wasted system resources would be required "forever" to save a day or less of actual coding . . .