Rijit (Active User; Joined: 15 Apr 2010; Posts: 168; Location: Pune):
My COBOL program loads a full table into an array using a cursor. The purpose is to read an input file, then search the array to get the values and write them to the output file. But the size of the table is increasing day by day, so the job is abending due to the array size limit. Currently the array is limited to 300,000 entries.
One way I can think of is to use a simple SELECT instead of the huge internal array: for each record read from the input file, do a SELECT against the table. Please let me know any possible disadvantages of this method.
Can anyone advise a better way of handling this logic?
Thanks
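That singleton-SELECT approach would look roughly like the sketch below. Table, column, host-variable and paragraph names are all illustrative, not taken from the actual program:

```cobol
      *    One row fetched per input record instead of a preloaded
      *    array. MY_TABLE, KEY_COL, COL1-COL3 and the PERFORMed
      *    paragraphs are hypothetical names.
           EXEC SQL
               SELECT COL1, COL2, COL3
                 INTO :WS-COL1, :WS-COL2, :WS-COL3
                 FROM MY_TABLE
                WHERE KEY_COL = :WS-INPUT-KEY
           END-EXEC
           EVALUATE SQLCODE
               WHEN 0      PERFORM 2000-WRITE-OUTPUT
               WHEN +100   PERFORM 2100-KEY-NOT-FOUND
               WHEN OTHER  PERFORM 9999-SQL-ERROR
           END-EVALUATE
```

The trade-off the poster is asking about is one SQL call per input record versus one big fetch up front; the later replies address how to cut the number of calls down.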
Bill O'Boyle (CICS Moderator; Joined: 14 Jan 2008; Posts: 2501; Location: Atlanta, Georgia, USA):
Not so fast.
Is this a QSAM flat-file or a VSAM KSDS or ESDS that's being loaded into the internal COBOL array?
Bill
Rijit:
It is a QSAM simple flat file as input. The logic is to read the input file sequentially, get some fields from it, and match them against the array elements with a search operation. The array contains the fully loaded table. My main objective is to remove the array from the program and use some alternative, more efficient way to handle this condition.
Bill O'Boyle:
REPRO the flat-file into a VSAM ESDS (or KSDS if applicable) in the step prior to executing the program.
In the next step and within the COBOL program, call an Assembler utility program, which OPENS the VSAM file, issues a SHOWCB Macro, loads the file's Number-of-Records (and if needed, the LRECL) into passed-parms, CLOSES the VSAM file and returns to the caller.
You now have the number of records to dynamically calculate the amount of storage that you'll need.
Move the array to LINKAGE (with an OCCURS DEPENDING ON) and Call LE Callable Service routine "CEEGTST" to obtain the amount of storage needed.
There's a little more to this, but does this sound feasible or is management scared to death of Assembler?
Bill
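The storage-acquisition part of this approach might be sketched as below. All data names are illustrative, WS-REC-COUNT is assumed to have been filled in by the Assembler utility, and the 60-byte entry length matches only this hypothetical layout. CEEGTST takes a heap ID (0 = the initial heap), a size in bytes, and returns a pointer plus a 12-byte feedback code:

```cobol
       WORKING-STORAGE SECTION.
       01  WS-HEAP-ID     PIC S9(9) COMP VALUE 0.
       01  WS-GET-SIZE    PIC S9(9) COMP.
       01  WS-REC-COUNT   PIC S9(9) COMP.
       01  WS-TABLE-PTR   POINTER.
       01  WS-FEEDBACK    PIC X(12).
       LINKAGE SECTION.
       01  LS-TABLE.
           05  LS-ENTRY   OCCURS 1 TO 999999 TIMES
                          DEPENDING ON WS-REC-COUNT.
               10  LS-KEY     PIC X(10).
               10  LS-DATA    PIC X(50).
      * ...
      *    Once the record count is known, obtain heap storage
      *    (60 bytes per entry in this illustrative layout) and
      *    map the LINKAGE array onto it.
           COMPUTE WS-GET-SIZE = WS-REC-COUNT * 60
           CALL 'CEEGTST' USING WS-HEAP-ID, WS-GET-SIZE,
                                WS-TABLE-PTR, WS-FEEDBACK
           SET ADDRESS OF LS-TABLE TO WS-TABLE-PTR
```

The upper bound on the OCCURS DEPENDING ON clause still has to be large enough for any future row count, which is one of the limitations of this technique.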
dick scherrer (Moderator Emeritus; Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix):
Hello,
If one of the keys of the table is the same key being matched against this internal array, it should be trivial to eliminate the array. . . If not, a key may need to be added to the table to support this.
Then make sure the sequential file is in order by this key when looking in the table for a match, and do not repeat the lookup for duplicate keys - the answer is already known.
dick scherrer:
Hello,
Unless I misunderstand, the input "file" is already sequential and the other data is in a database table. . .
Bill O'Boyle:
You can handle a maximum of 300,000 array entries. What's the size of each entry?
Bill
Rijit:
dick scherrer wrote:
    Hello,
    Unless I misunderstand, the input "file" is already sequential and the other data is in a database table. . .

You are 100% right. The input file is already a sequential file, sorted on the key field that is used to read the table. At present the whole table is loaded into the array at the start of the program; then the input file is read sequentially and, for each record, a search is done in the array on that key field. I hope I am communicating my thoughts clearly.
You said it is trivial to eliminate the array; how can we accomplish that? Please suggest.
Thanks
Rijit:
Bill O'Boyle wrote:
    You can handle a maximum of 300,000 array entries. What's the size of each entry?
    Bill

The array consists of 3 variables, and its size is 300,000 (3 lakh) entries. This is causing issues in production because the table now has more than 300,000 rows, so the job fails frequently.
dick scherrer:
Hello,
Instead of loading/searching the array, use the key to read the data from the database table.
When "this" input key is the same as the "previous" input key, there is no need to even access the database - that has already been done for this key.
Rijit:
Bill,
The VSAM concept is good. I am wondering whether it is wise to unload the required fields (including the key) from the table and load them into a KSDS in the step prior to the one that calls my COBOL program, then read the KSDS in the program using the key obtained from the input flat file. But the table has more than 300,000 (3 lakh) rows, so the VSAM file will also become bulky, and there will be the overhead of the DELETE/DEFINE every time the job runs. What are your thoughts on this?
Rijit:
dick scherrer wrote:
    Hello,
    Instead of loading/searching the array, use the key to read the data from the database table.
    When "this" input key is the same as the "previous" input key, no need to even access the database - this has already been done for this key.

Wow, that makes sense! I noticed that there are duplicates in the input flat file when I sorted it on the key field. So what you mean is: read the input flat file sequentially and hit the database with a SELECT only when the key field changes; the rest of the time, just use the values saved in the working-storage fields to write the output file. Correct me if my understanding is wrong!
But Dick, I have a question here: won't we run the risk of hitting the DB2 database each time we do a SELECT for a record read from the input file? Is that good coding practice?
Thanks
dick scherrer:
Hello,
Quote:
    read the input flat file sequentially and hit the database with a select query only when the key field changes. .

Yup, that's it.
Quote:
    won't we run the risk of hitting the DB2 database each time we do a SELECT for a record read from the input file? Is that good coding practice?

Well, one reason data is stored in a database is so that it may be directly accessed when needed. . . I believe reading the database rows only as needed is the better alternative, compared with the internal array or an unload/load to get the data into a VSAM file.
Rijit:
Yes, I am convinced. And for my SELECT, can I use isolation level UR or CS?
I guess UR has disadvantages as well as advantages, right? I find this part very confusing.
Rijit:
The original cursor declared to load the internal array had a FOR FETCH ONLY WITH UR clause.
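If the process stays read-only, the singleton SELECT can carry the same isolation clause the cursor used. A minimal sketch, with illustrative table, column and host-variable names:

```cobol
      *    Singleton SELECT with uncommitted-read isolation, matching
      *    the WITH UR used on the original cursor. MY_TABLE, KEY_COL
      *    and COL1 are hypothetical names.
           EXEC SQL
               SELECT COL1
                 INTO :WS-COL1
                 FROM MY_TABLE
                WHERE KEY_COL = :WS-INPUT-KEY
                WITH UR
           END-EXEC
```

WITH UR avoids waiting on locks held by updaters, at the cost of possibly seeing uncommitted data - the trade-off discussed in the next replies.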
Bill O'Boyle:
I think I interpreted your question "BASS ACKWARDS".
Bill
dick scherrer:
Hello,
Quote:
    I guess UR has disadvantages as well as advantages, right?

Depends on specifics. If I understand this, your process will not update the table, so you shouldn't need to lock anything. If you read a row and it will change 10 seconds from now, does this matter?
Be careful that something rather straightforward does not become unnecessarily complex/convoluted. . .
@Bill,
Must be the time of year. I've done the same a couple of times lately. . .
As I said there - Aaaaaargh. . .
don.leahy (Active Member; Joined: 06 Jul 2010; Posts: 765; Location: Whitby, ON, Canada):
I wonder if the process was using SEARCH or SEARCH ALL to do its table lookups.
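The distinction matters because a serial SEARCH scans entries one at a time, while SEARCH ALL does a binary search - but only if the table is defined with an ASCENDING (or DESCENDING) KEY and loaded in that order. A sketch with illustrative names:

```cobol
      *    Table defined for binary search; entries must be loaded
      *    in ascending order of WS-KEY. Names are hypothetical.
       01  WS-TABLE.
           05  WS-ENTRY  OCCURS 300000 TIMES
                         ASCENDING KEY IS WS-KEY
                         INDEXED BY WS-IDX.
               10  WS-KEY   PIC X(10).
               10  WS-DATA  PIC X(50).
      * ...
           SEARCH ALL WS-ENTRY
               AT END
                   PERFORM 2100-KEY-NOT-FOUND
               WHEN WS-KEY (WS-IDX) = WS-INPUT-KEY
                   PERFORM 2000-WRITE-OUTPUT
           END-SEARCH
```

On a table of 300,000 entries a binary search does at most about 19 comparisons per lookup, so a program stuck with the array for capacity reasons would still be far faster with SEARCH ALL than with serial SEARCH.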
GuyC (Senior Member; Joined: 11 Aug 2009; Posts: 1281; Location: Belgium):
If I understand correctly, you already have program logic to loop and FETCH the rows of the DB2 table sequentially.
Why not keep that and, instead of searching the working-storage table, read (or not read) the next record(s) in the file?
It's a pretty standard coding technique: it has long been done for merging two sequential files, and there is not much difference between a DB2 cursor and a sequential file.
Dick even has an example program posted as a sticky.
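That two-way match might be sketched as below, assuming the file and the cursor (with ORDER BY on the same key) both arrive in ascending key order, and that end-of-data on either side moves HIGH-VALUES to the corresponding key field so the loop drains the other side. All names are illustrative:

```cobol
      *    Classic balanced-line match between a sorted file and an
      *    ordered cursor. 1000-READ-FILE and 1100-FETCH-CURSOR are
      *    hypothetical paragraphs that also move HIGH-VALUES to
      *    their key field at end-of-data.
           PERFORM 1000-READ-FILE
           PERFORM 1100-FETCH-CURSOR
           PERFORM UNTIL WS-FILE-KEY = HIGH-VALUES
                     AND WS-CURSOR-KEY = HIGH-VALUES
               EVALUATE TRUE
                   WHEN WS-FILE-KEY < WS-CURSOR-KEY
                       PERFORM 2100-KEY-NOT-ON-TABLE
                       PERFORM 1000-READ-FILE
                   WHEN WS-FILE-KEY > WS-CURSOR-KEY
                       PERFORM 1100-FETCH-CURSOR
                   WHEN OTHER
                       PERFORM 2000-WRITE-OUTPUT
                       PERFORM 1000-READ-FILE
               END-EVALUATE
           END-PERFORM
```

This fetches each table row at most once and never materialises the table in storage, which addresses the original array-size problem directly.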