Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
In a current program, here's what we do -
Code:
01 QUEUE-DEFINITION-TABLE.
05 QUEUE-DEF-MAX PIC S9(08) COMP-5.
05 QUEUE-DEF-TBL OCCURS 1 TO 10000 TIMES
This array is loaded from a VSAM file, with the record count obtained from an Assembler subprogram that issues a SHOWCB macro. The current 10000 seems to be a good threshold.
We dynamically calculate the maximum number of entries the array can hold by dividing THE LENGTH OF QUEUE-DEFINITION-TABLE (minus THE LENGTH OF QUEUE-DEF-MAX) by THE LENGTH OF QUEUE-DEF-TBL (1). If the record count returned from the subprogram is greater than that calculated maximum, then we have a problem.
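The same capacity arithmetic can be sketched in C for illustration (the struct mirrors the COBOL group item; the 80-byte entry length is an assumption, since the actual entry layout isn't shown above):

```c
#include <assert.h>
#include <stddef.h>
#include <stdint.h>

#define MAX_ENTRIES 10000
#define ENTRY_LEN   80   /* hypothetical entry length, for illustration only */

/* Mirrors the COBOL group item QUEUE-DEFINITION-TABLE */
struct queue_definition_table {
    int32_t queue_def_max;                          /* QUEUE-DEF-MAX: S9(08) COMP-5 */
    char    queue_def_tbl[MAX_ENTRIES][ENTRY_LEN];  /* QUEUE-DEF-TBL OCCURS ...     */
};

/* (LENGTH OF QUEUE-DEFINITION-TABLE - LENGTH OF QUEUE-DEF-MAX)
   / LENGTH OF QUEUE-DEF-TBL (1)                                 */
static size_t calculated_max(void)
{
    return (sizeof(struct queue_definition_table) - sizeof(int32_t))
           / ENTRY_LEN;
}
```

With this layout `calculated_max()` comes out to 10000, so any record count from the subprogram above that figure signals trouble.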
Now, in COBOL 5.1, we have the UNBOUNDED keyword, which replaces the hard-coded maximum of 10000.
But when UNBOUNDED is used, THE LENGTH OF QUEUE-DEFINITION-TABLE no longer resolves to a length; the compiler listing shows "0CL*" instead, which basically says "I don't know the length because you're using UNBOUNDED".
I was told (by Tom Ross) that UNBOUNDED has a maximum based upon the PICTURE clause and definition, which in this case is a signed binary fullword.
So, if I wanted a stand-in for THE LENGTH OF QUEUE-DEFINITION-TABLE, I'd use X'7FFFFFFF', which is the maximum value for a signed binary fullword.
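Just to spell out that X'7FFFFFFF' figure, here's a one-liner in C (an S9(08) COMP-5 item occupies a fullword, which behaves like a signed 32-bit integer):

```c
#include <stdint.h>

/* The largest value a signed fullword (32-bit) field can hold:
   X'7FFFFFFF' = 2,147,483,647 = INT32_MAX */
static int32_t fullword_max(void)
{
    return INT32_MAX;
}
```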
I'd like to get some feedback on this approach. Otherwise, I'll leave it at 10000 and that will be that.
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
The maximum value your ODO item can hold would be the maximum number of entries in the table.
Depending on the maxima for LINKAGE SECTION items in V5.1, which I haven't checked, that would be it.
However, it would require that the ASM program return the value in a field that is longer, or has a higher maximum (i.e. unsigned), than the ODO item; otherwise your over-the-limit test can never be true, and either other data within the table or something outside the table will probably get overwritten.
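That pitfall can be sketched in C (field names are illustrative; the point is purely about field width, since a count held in a field no wider than the ODO item can never exceed the ODO item's own maximum):

```c
#include <stdbool.h>
#include <stdint.h>

/* Assumed table limit: the maximum of a signed fullword ODO item */
#define ODO_MAX INT32_MAX

/* If the ASM program returns the count in a field the same width and
   signedness as the ODO item, this test is tautologically false --
   a 32-bit signed count can never hold a value above INT32_MAX. */
static bool over_limit_32(int32_t count)
{
    return count > ODO_MAX;
}

/* A longer return field (a doubleword here) can actually carry a
   value above the ODO maximum, so the test can fire. */
static bool over_limit_64(int64_t count)
{
    return count > (int64_t)ODO_MAX;
}
```

In other words, the over-the-top check only protects you if the count field outranges the ODO item.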
On the other hand, if the figure from the ASM can never be bigger than the maximum of the ODO item, you don't even need to test it...
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
I've decided to leave it at 10000, as switching would cause changes to the program that (for the most part) seem unnecessary. Besides, the new method in 5.1 of calculating BLLs in roughly 512K chunks has certainly sped up large-table addressability.
All in all, 5.1 looks like the compiler folks did a good job....