pravinj
New User
Joined: 10 Jun 2008 Posts: 24 Location: india
Hey, I have an internal working-storage array that holds all the records of a DB2 table. The array is currently declared with a fixed size of X entries, but the number of rows in the DB2 table has now grown beyond X.
Is manually increasing the array size in the COBOL module the only remedy?
Can you recommend any way to automate growing the internal table declared in the program as the number of rows in the DB2 table increases? Thanks in advance.
William Thompson
Global Moderator
Joined: 18 Nov 2006 Posts: 3156 Location: Tucson AZ
Quote:
Is manually increasing the array size in the COBOL module the only remedy?
Yes, unless you want to cobble together a GETMAIN function and work in LINKAGE instead of WORKING-STORAGE....
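To illustrate the idea (a sketch only — the data names, sizes, and the DSN8.EMP table are invented stand-ins, and the LE callable service CEEGTST plays the GETMAIN role here): declare the table in the LINKAGE SECTION with OCCURS DEPENDING ON, count the rows first, then acquire exactly enough storage at run time.

```cobol
       WORKING-STORAGE SECTION.
       01  WS-ROW-COUNT     PIC S9(9) COMP-5.
       01  WS-HEAP-ID       PIC S9(9) COMP-5 VALUE 0.
       01  WS-GET-SIZE      PIC S9(9) COMP-5.
       01  WS-TBL-PTR       POINTER.
       01  WS-FEEDBACK      PIC X(12).

       LINKAGE SECTION.
      *    Table sized at run time instead of a fixed X in
      *    WORKING-STORAGE (hypothetical row layout).
       01  LS-EMP-TABLE.
           05  LS-EMP-ROW OCCURS 1 TO 10000000 TIMES
                          DEPENDING ON WS-ROW-COUNT.
               10  LS-EMPNO     PIC X(6).
               10  LS-LASTNAME  PIC X(15).

       PROCEDURE DIVISION.
      *    Count the rows first, then obtain just enough storage
      *    from the LE heap and address the table through it.
           EXEC SQL
               SELECT COUNT(*) INTO :WS-ROW-COUNT
               FROM   DSN8.EMP
           END-EXEC
           COMPUTE WS-GET-SIZE =
                   WS-ROW-COUNT * LENGTH OF LS-EMP-ROW
           CALL 'CEEGTST' USING WS-HEAP-ID, WS-GET-SIZE,
                                WS-TBL-PTR, WS-FEEDBACK
           SET ADDRESS OF LS-EMP-TABLE TO WS-TBL-PTR
```

The upper bound on the OCCURS clause still has to be some fixed maximum, but the storage actually obtained tracks the real row count, so the program no longer reserves (or overflows) a hard-coded X.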
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Suggest you come up with a new design. At some point the table will probably become too large to fit into an internal array. And then it will be a crisis. . . You and your management and users will surely be less than happy. . .
It is almost always rather poor design to load an entire table into memory. . .
Terry Heinze
JCL Moderator
Joined: 14 Jul 2008 Posts: 1249 Location: Richfield, MN, USA
Dick's advice is good, but I have some questions. Are you using ALL of the DB2 columns, or are you loading unused columns into your table? Will the DB2 table usually increase, or will it ever decrease?
pravinj
New User
Joined: 10 Jun 2008 Posts: 24 Location: india
Thanks.
The module I am working on is from the vintage days; I cannot alter its design much.
It is a COBOL batch module that uses multi-row FETCH to load the DB2 records into an internal array, and the values in that array are then used at several points in the module. The problem now is that the array holding the DB2 records overflows, which leaves vital information behind.
The instant solution is to increase the array size to some new X, but the situation may recur. Is there a design approach that would prevent it for good?
Thompson,
Can you elaborate on that? I am a new entrant to this domain!
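For readers new to the technique, the multi-row (rowset) fetch the post describes usually looks something like this (a sketch — the cursor, columns, rowset size, and DSN8.EMP are invented placeholders):

```cobol
       WORKING-STORAGE SECTION.
      *    Host-variable arrays; one entry per row in the rowset.
       01  WS-EMPNO-ARR     PIC X(6)  OCCURS 100 TIMES.
       01  WS-NAME-ARR      PIC X(15) OCCURS 100 TIMES.
       01  WS-ROWS-FETCHED  PIC S9(9) COMP-5.

       PROCEDURE DIVISION.
           EXEC SQL
               DECLARE CSR1 CURSOR WITH ROWSET POSITIONING FOR
               SELECT EMPNO, LASTNAME FROM DSN8.EMP
           END-EXEC
           EXEC SQL OPEN CSR1 END-EXEC
           PERFORM WITH TEST AFTER UNTIL SQLCODE NOT = 0
               EXEC SQL
                   FETCH NEXT ROWSET FROM CSR1
                   FOR 100 ROWS
                   INTO :WS-EMPNO-ARR, :WS-NAME-ARR
               END-EXEC
      *        SQLERRD(3) holds the rows returned in this rowset;
      *        process (or save) them here before the next fetch.
               MOVE SQLERRD(3) TO WS-ROWS-FETCHED
           END-PERFORM
           EXEC SQL CLOSE CSR1 END-EXEC
```

Note that if each rowset is processed as it arrives, rather than accumulated into one big array, the overflow problem never occurs at all.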
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
Is there a design approach that would prevent it for good?
You might consider using a temporary DB2 table or two for this process. Instead of loading an in-core array, you would load this process-specific table and use it for the duration of the process. When the process completes, the temporary table(s) would be dropped.
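A sketch of that approach using a declared global temporary table (the names are invented, and DSN8.EMP again stands in for the real table):

```cobol
      *    Declare a per-thread work table shaped like the source
      *    table; rows survive commits for the life of the process.
           EXEC SQL
               DECLARE GLOBAL TEMPORARY TABLE SESSION.EMP_WORK
                   LIKE DSN8.EMP
                   ON COMMIT PRESERVE ROWS
           END-EXEC
      *    Populate it once, in place of loading an in-core array.
           EXEC SQL
               INSERT INTO SESSION.EMP_WORK
                   SELECT * FROM DSN8.EMP
           END-EXEC
      *    Later steps read SESSION.EMP_WORK as often as needed.
      *    The table disappears when the thread ends, or can be
      *    dropped explicitly when the process completes:
           EXEC SQL
               DROP TABLE SESSION.EMP_WORK
           END-EXEC
```

Because the temporary table lives in DB2 rather than in the program's storage, it grows with the source table and no array declaration ever needs to be resized.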