Help-Me-Out
New User
Joined: 09 Dec 2006 Posts: 56 Location: Pune
Hi,
I have a requirement where I need to unload data from a table, driven by an input file that contains account numbers. I am exploring the best option to get it done, either through REXX or Platinum unload; my preference is Platinum unload.
Requirement -
I am trying to retrieve rows from the table with some conditions. I have the list of accounts in a flat file, and I want to unload only the rows whose accounts match the file. The file has around 11 million records.
I am thinking of achieving this with an IN clause. Will a SELECT query with an IN clause cause a performance issue?
I also want to know whether this can be achieved through Platinum unload (performance-wise), since the accounts are huge in number. The table can have multiple rows per account as well. This is a one-time activity.
Are there any other suggestions on this?
Pandora-Box
Global Moderator
Joined: 07 Sep 2006 Posts: 1592 Location: Andromeda Galaxy
Since you can weigh the options, why not take the extra step and actually try them?
Also, Platinum is a third-party product; what happens if your shop decides to stop using it?
Pandora-Box
Global Moderator
Joined: 07 Sep 2006 Posts: 1592 Location: Andromeda Galaxy
Also, why don't you try DSNTIAUL? You can split the query across different files, concatenate the results, and execute.
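For reference, a minimal DSNTIAUL sketch along these lines. This is only an assumed skeleton: the subsystem id, plan name, load library, dataset names, and the SYSIN query are all placeholders that vary by shop and DB2 version.

```jcl
//UNLOAD   EXEC PGM=IKJEFT01
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
 DSN SYSTEM(DSN1)
 RUN PROGRAM(DSNTIAUL) PLAN(DSNTIBXX) -
     LIB('DSN.RUNLIB.LOAD') PARMS('SQL')
 END
//SYSPRINT DD SYSOUT=*
//SYSREC00 DD DSN=MY.UNLOAD.DATA,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSPUNCH DD DSN=MY.UNLOAD.CNTL,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(TRK,(1,1))
//SYSIN    DD *
  SELECT * FROM MY.ACCOUNT_TABLE
  WHERE ACCT_NO IN ('A0000001','A0000002');
/*
```

With PARMS('SQL'), SYSIN takes full SQL statements, so several such jobs (each with a slice of the account list in the IN clause) could write to separate SYSREC datasets that are concatenated afterwards.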
gylbharat
Active Member
Joined: 31 Jul 2009 Posts: 565 Location: Bangalore
I think if you put everything in the IN clause, it may exceed the SQL statement limit.
How big is the table? I mean, how many rows does the table have?
If you are unloading around 80% of the data, then it's better to unload the entire table and use a sort join to filter your records.
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
Unload plus sort merge is an option.
Another option (if available) is to upload your file into a personal/temporary table and unload from a join.
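A sketch of that temporary-table approach, assuming DB2 for z/OS declared global temporary tables; the table, column names, and key length are placeholders:

```sql
-- Sketch: stage the account file in a temporary table, then unload the join.
-- SESSION.ACCT_LIST, MY.ACCOUNT_TABLE and ACCT_NO are placeholder names.
DECLARE GLOBAL TEMPORARY TABLE SESSION.ACCT_LIST
    (ACCT_NO CHAR(10) NOT NULL)
    ON COMMIT PRESERVE ROWS;

-- Populate SESSION.ACCT_LIST with INSERTs from a small program
-- (the LOAD utility does not operate on declared temporary tables),
-- then unload the matching rows:
SELECT T.*
  FROM MY.ACCOUNT_TABLE T
  JOIN SESSION.ACCT_LIST A
    ON T.ACCT_NO = A.ACCT_NO;
```

With 11 million accounts the staging inserts are themselves a sizable job, which is the resource cost Dick mentions below.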
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
If there are to be 11 million "matches" and you choose to do this within DB2, plan on it running for a while . . . If you upload the account number file and then join, that will also add a considerable load on the DB2 resources used.
I'd suggest you unload the entire table and then, using your sort product, match the table rows against the account number file (look at JOINKEYS).
Do you really want the entire row, or only selected columns? The less volume to be passed (maybe multiple times), the better the performance will be. When JOINing the files, you can output some or all of the columns, depending on what you need to do with the extracted data.
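A DFSORT JOINKEYS sketch of that match step. Dataset names, record length, and the key position/length are placeholder assumptions (a 10-byte account key at column 1 of both files):

```jcl
//MATCH    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//ACCTS    DD DSN=MY.ACCOUNT.LIST,DISP=SHR
//TABLE    DD DSN=MY.UNLOAD.DATA,DISP=SHR
//SORTOUT  DD DSN=MY.MATCHED.ROWS,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSIN    DD *
* F1 = account list, F2 = unloaded table rows; key in cols 1-10 of both
  JOINKEYS F1=ACCTS,FIELDS=(1,10,A)
  JOINKEYS F2=TABLE,FIELDS=(1,10,A)
* Keep only the table row (first 80 bytes here - adjust to your LRECL)
  REFORMAT FIELDS=(F2:1,80)
  OPTION COPY
/*
```

By default JOINKEYS writes only paired records, which is exactly the "matching accounts" result; the REFORMAT statement is also where you would keep selected columns only, cutting the volume as suggested above.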
Help-Me-Out
New User
Joined: 09 Dec 2006 Posts: 56 Location: Pune
Big thanks to all for your replies. I have now decided to go with a COBOL-DB2 program; that is the approach that suits me best for this query.
Thanks a lot!
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
You're welcome.
If there are performance problems with the cobol/db2 process, you may want to reconsider the JOINKEYS approach.
d |
Escapa
Senior Member
Joined: 16 Feb 2007 Posts: 1399 Location: IL, USA
I would go with GuyC's suggestion.