arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Hi All,
I want to create a 1-D variable-length array in COBOL using the DEPENDING ON clause. I want to know how the array length will vary dynamically in the COBOL program if I go with this.
Regards
CICS Guy
Senior Member
Joined: 18 Jul 2007 Posts: 2146 Location: At my coffee table
Code:
01 data-area.
   05 data-size pic s9(4) comp.
   05 data-array occurs (maximum) times depending on data-size pic x(?).
move (number greater than zero and not greater than maximum) to data-size.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
If the variable with the OCCURS DEPENDING ON clause is in WORKING-STORAGE, the array length will not vary. If you explain more about what you want to do, we can provide guidance.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
Are you using a FILE to populate the array? If so, is this a VSAM file?
Bill
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Hi,
Below is the 1-D variable-length array that I have created. I load this array by reading a flat file. The problem with the input file is that the number of records it contains varies from day to day (it is part of a daily job). The array is then used to update a second, sorted file wherever a match is found. My aim is to reduce the CPU-time fluctuation caused by this variation in input records. The array declaration is as follows:
Code:
01 CONFIG-TABLE.
05 CFG-ENTRIES OCCURS 1 TO 100000 TIMES DEPENDING
ON MAX-VALUE ASCENDING KEY IS
CFG-TBL-A
CFG-TBL-B
CFG-TBL-C
INDEXED BY CFG-INDX.
10 CFG-TBL-A PIC X(4).
10 CFG-TBL-B PIC X(8).
10 CFG-TBL-C PIC X(3).
10 CFG-TBL-D PIC X(1).
10 CFG-TBL-E PIC X(2).
10 CFG-TBL-F PIC X(2).
10 CFG-TBL-G PIC X(1).
Also, I am using a binary search when looking for a match in the above array.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
Are you anticipating up to a maximum of 100,000 records on the flat file at any given time? Is the flat file in the sorted order you're expecting?
Would you know the actual number of records on the file before the program is invoked?
Bill
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Hi Bill,
I am using 100,000 as a maximum limit just to be on the safe side; normally the count varies between 1,000 and 1,500 records. And yes, the input file is sorted on the three fields (A, B, and C) mentioned above. I can get the total number of input records before the program is invoked, if that helps in any way.
Regards
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Quote:
I want to know how the array length will vary dynamically in the cobol program
Length/size: COBOL does not have 'dynamic' arrays. If the COBOL internal table (array) is defined in WORKING-STORAGE with the ODO (OCCURS DEPENDING ON) phrase, it will always be mapped at its maximum potential size.
The main purpose of ODO is to increase the efficiency of the binary search.
The only way to have a 'dynamic' array is to acquire storage (via the LE equivalent of a GETMAIN) at runtime, basing your size requirement on the number of items you will have.
Even 100,000 items at 21 bytes each is only about a 2 MB table, which isn't that big - but it is a lot bigger than 2,000 items at 21 bytes each (42,000 bytes).
Though I am not a big proponent of dynamically allocating storage at run time (the program stops if not enough memory is available, as opposed to the job waiting to start until enough memory is available), the point is moot since most jobs have more than one step.
I would spend more time analyzing your process.
If both files are sorted, a simple 2-file match/merge would allow your program to be smaller, and the repeated I/O would suspend your program (making it I/O bound),
thus being more friendly to other jobs in the system - in both size and interrupts.
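A 2-file match along the lines dbz describes can be sketched as follows. This is only an outline, with all paragraph, file, and field names made up; the two read paragraphs are assumed to move HIGH-VALUES to their key field at end of file, so the loop drains both files:

```cobol
      * Sketch of a sequential 2-file match (all names hypothetical).
      * 1100-READ-A and 1200-READ-B are assumed to move HIGH-VALUES
      * to KEY-A / KEY-B at end of file.
           PERFORM 1100-READ-A
           PERFORM 1200-READ-B
           PERFORM UNTIL KEY-A = HIGH-VALUES AND KEY-B = HIGH-VALUES
               EVALUATE TRUE
                   WHEN KEY-A < KEY-B
      *                Input record with no master match - skip it
                       PERFORM 1100-READ-A
                   WHEN KEY-A > KEY-B
      *                Unmatched master record - copy it unchanged
                       WRITE OUT-REC FROM MASTER-REC
                       PERFORM 1200-READ-B
                   WHEN OTHER
      *                Keys match - apply the update and copy
                       PERFORM 2000-APPLY-UPDATE
                       WRITE OUT-REC FROM MASTER-REC
                       PERFORM 1200-READ-B
               END-EVALUATE
           END-PERFORM
```

Because both files are in key order, each record is read exactly once and no in-storage table (and hence no overflow limit) is needed.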
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Actually, the second file referred to above contains approximately 40 million records, and I need to compare each of those records with the input file (usually 1,500-2,000 records); through this comparison I need to reduce the number of transactions. I believe a binary search will give me the fastest lookup. Please let me know how I should proceed.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Thanks Bill,
I will see if I can get something out of this for my use.
Regards
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Arun Nehra,
It appears that you have decided on the commands that you will use and are just looking for justification for your process.
I ask again:
will any record in your 2nd INPUT file be matched with more than one record of your 1st input file?
If so, yes, you need to table your first input file.
If not, then you should go to the trouble of writing a 2-file match and process,
which will make your program smaller and more friendly to the rest of the system.
CICS Guy
Senior Member
Joined: 18 Jul 2007 Posts: 2146 Location: At my coffee table
arun nehra wrote:
I am putting '100000' as a maximum limit to be just on safer side. Normally this varies between 1000-1500 records.
No matter how high you set the max, you will want to check the current index against the max while loading the array and abort before exceeding it.
Personally, if 2,000 were the probable max, I'd set the max to 3,000 and start issuing warning messages as the filled array approached something like 2,700. Warnings would allow the max to be raised in a timely manner without affecting production.
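A load routine with the overflow check and warning threshold described above might look like this sketch. The limits, paragraph name, and INPUT-REC are hypothetical; CFG-ENTRIES and MAX-VALUE are taken from the declaration posted earlier in the thread:

```cobol
      * Hypothetical limits: hard maximum 3000, warning at 2700
       77  WS-MAX-ENTRIES   PIC S9(8) COMP VALUE 3000.
       77  WS-WARN-LIMIT    PIC S9(8) COMP VALUE 2700.
      *
      * Performed once for each input record read:
       2100-LOAD-ENTRY.
           IF MAX-VALUE >= WS-MAX-ENTRIES
               DISPLAY 'CONFIG TABLE OVERFLOW - ABORTING'
               MOVE 16 TO RETURN-CODE
               GOBACK
           END-IF
           ADD 1 TO MAX-VALUE
           MOVE INPUT-REC TO CFG-ENTRIES (MAX-VALUE)
           IF MAX-VALUE > WS-WARN-LIMIT
               DISPLAY 'WARNING: CONFIG TABLE AT ' MAX-VALUE
                       ' OF ' WS-MAX-ENTRIES ' ENTRIES'
           END-IF.
```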
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
I am still unclear about the concept of the DEPENDING ON clause and the MAX value used in a variable-length array declaration.
As suggested above, I reduced the MAX record limit in the OCCURS clause to 3000 and ran my program expecting it to reduce my CPU/elapsed time significantly, but to my surprise there was no significant reduction in CPU time and the elapsed time was almost the same.
As can be seen above, the records in the two files are just 21 bytes long, and I need to compare all records of File A (1,500-2,000 records) against each record of File B (40 million records) and update File B where a match is found.
Please help.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
CPU time and elapsed time do not depend upon the maximum number of occurrences in an array; they depend upon how many elements of the array you are actually using.
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Hi Robert,
05 CFG-ENTRIES OCCURS 1 TO 100000 TIMES DEPENDING
ON MAX-VALUE ASCENDING KEY IS
Does that mean (referring to the above) that changing 100000 to 3000 will make no difference when I do a binary search on the array after loading?
Also, how will MAX-VALUE impact the loading of the array?
Regards
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Changing the number of occurrences will change the amount of memory required by the program. It will have no impact on CPU time used, nor on the elapsed time -- other than the small amount required to load the program from the load library. The COBOL Language Reference manual in section 6.2.32.3 states that
Quote:
To ensure correct execution of a SEARCH statement for a variable-length table, make sure the object of the OCCURS DEPENDING ON clause (data-name-1) contains a value that specifies the current length of the table.
so it does not appear that the extra table elements would be searched -- hence no impact on CPU time or elapsed time.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
As an experiment, remove the DEPENDING ON and change it to a hardcoded OCCURS 3000 (or whatever maximum you choose).
Before populating the array, initialize it to all HIGH-VALUES and then load it. By initializing to HIGH-VALUES, the SEARCH ALL will work for you.
This is just a suggestion, and I'm unsure whether CPU time will be increased or reduced using a hardcoded OCCURS as opposed to a DEPENDING ON.
However, a COBOL SEARCH ALL always causes a BALR (CALL) to a COBOL run-time routine, regardless.
YMMV....
Bill
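As a sketch of that experiment (names shortened, and the three key fields from the CONFIG-TABLE posted earlier collapsed into a single 15-byte key for brevity): initializing to HIGH-VALUES keeps the unused slots sorted above every real key, so the ASCENDING KEY order still holds for SEARCH ALL.

```cobol
       01  CONFIG-TABLE.
           05  CFG-ENTRIES OCCURS 3000 TIMES
                   ASCENDING KEY IS CFG-TBL-KEY
                   INDEXED BY CFG-INDX.
               10  CFG-TBL-KEY    PIC X(15).
               10  CFG-TBL-DATA   PIC X(6).
      *
      * Before loading:
           MOVE HIGH-VALUES TO CONFIG-TABLE
      *
      * After loading, look up one key:
           SEARCH ALL CFG-ENTRIES
               AT END
                   MOVE 'N' TO WS-MATCH-FLAG
               WHEN CFG-TBL-KEY (CFG-INDX) = WS-SEARCH-KEY
                   MOVE 'Y' TO WS-MATCH-FLAG
           END-SEARCH
```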
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Thanks Bill for your suggestion.
Another thing that I am currently looking into, as per Dick in a previous post:
Quote:
length / size . COBOL does not have 'dynamic' arrays. If the COBOL Internal Table (array) is defined in Working-Storage with the ODO (occurs depending on) phrase, it will always be mapped to max potential size.
The main purpose of ODO is to increase the efficiency of the Binary Search.
The only way to have a 'dynamic' array, is to acquire storage (via the LE equivalent of a getmain) at runtime basing your size requirement on the number of items you will have.
Thus, even though I am writing the DEPENDING ON clause above, the table will still be mapped at its maximum size because it is in the WORKING-STORAGE section. How should I declare my array (I don't know the syntax) to get the array size dynamically allocated, as Dick has suggested?
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
Yes, as others (including Dick) have said, if you have an array with a fixed number of occurrences or an ODO (both defined in WS), then the amount of storage occupied is the same.
Only an array in LINKAGE is a true ODO. If you define this array as an ODO of 1 to 3000 and you determine you only have 100 entries, then only the actual storage will be allocated.
Example: (100 * entry-length) + 4. The four is the binary fullword (this is my own preference) that contains the high-water mark for the ODO (a value of 100).
FWIW (and IMHO), a SEARCH ALL is not worth it if the number of entries does not exceed 128. You'd be better off with either a sequential SEARCH or your own in-line PERFORM.
Note that in some cases, a sequential SEARCH can cause the compiler to BALR to a COBOL run-time routine.
Bill
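The LINKAGE layout Bill describes might be declared like this sketch (names hypothetical). With 21-byte entries, 100 occupied entries would need (100 * 21) + 4 = 2104 bytes of acquired storage:

```cobol
       LINKAGE SECTION.
       01  LS-DYNAMIC-TABLE.
      *    Binary fullword holding the ODO high-water mark
           05  LS-ENTRY-COUNT   PIC S9(8) COMP.
           05  LS-ENTRY         OCCURS 1 TO 3000 TIMES
                                DEPENDING ON LS-ENTRY-COUNT
                                PIC X(21).
```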
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
If your second file is sorted on the same key(s) as the first, then before you perform a search you should check whether the last key found is equal to the new record's key.
But, again, if both files are sorted the same (and there is no reason they shouldn't be), then you can accomplish the requirement with a simple match of the two files
and not have to deal with the problem of table overflow due to a very large number of file-1 records.
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Hi Bill,
In my case the number of records normally stays above 128, so I guess I can go with what you have suggested above. Also, I went through one of your old posts where you gave the syntax for the array declaration, but that was for a CICS program:
www.ibmmainframes.com/about43934.html
I need it for my program, which is a normal COBOL module. Thanks in advance.
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Arun Nehra wrote:
I need it for my program which is a normal COBOL module
And the difference in syntax of an ARRAY in a CICS COBOL module and a normal
(what would be an abnormal COBOL module?)
COBOL module would be what?
arun nehra
New User
Joined: 29 Nov 2008 Posts: 62 Location: mumbai
Specifically:
What would be the equivalent of the code below in batch COBOL for getting storage via a GETMAIN?
EXEC CICS GETMAIN
SET (WS-DYNAMIC-POINTER)
FLENGTH(WS-DYNAMIC-FLENGTH)
INITIMG(WS-DYNAMIC-INITIMG)
NOHANDLE
END-EXEC.
IF EIBRESP NOT = DFHRESP(NORMAL)
PERFORM GETMAIN-ERROR-RTN
GO TO END-OF-PROGRAM
END-IF.
SET ADDRESS OF LS-DYNAMIC-TABLE-REC TO WS-DYNAMIC-POINTER.
MOVE WS-DYNAMIC-NBR-RECS TO LS-DYNAMIC-NBR-RECS.
And also the size of the LINKAGE SECTION variable:
LS-DYNAMIC-REC PIC X(16777211).
as there was some confusion on this at that time. What exact value do I need to use (I remember it was 128 M), similar to the figure Bill mentioned in the case above?
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
I realize that you don't like reading my posts, as I tend to force people to think about things and to drop their preconceived notions about how to do things.
As well as the fact that your question was about syntax with respect to defining an ARRAY, not about acquisition of storage:
LE Dynamic storage callable services
Some other links that may be helpful:
LE Overview
All LE Documents
You still did not answer the question about abnormal COBOL.
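In a batch (non-CICS) LE program, the GETMAIN equivalent is the CEEGTST callable service covered by the first link above. A rough sketch, with all data names hypothetical and WS-REC-COUNT assumed to already hold the record count obtained before the table is built:

```cobol
       WORKING-STORAGE SECTION.
       77  WS-HEAP-ID     PIC S9(9) COMP VALUE 0.
       77  WS-GET-SIZE    PIC S9(9) COMP.
       77  WS-REC-COUNT   PIC S9(9) COMP.
       77  WS-TABLE-PTR   POINTER.
       01  WS-FC.
      *    LE condition token; severity 0 means success
           05  WS-FC-SEV  PIC S9(4) COMP.
           05  WS-FC-MSG  PIC S9(4) COMP.
           05  FILLER     PIC X(8).
       LINKAGE SECTION.
       01  LS-DYNAMIC-TABLE.
           05  LS-NBR-RECS  PIC S9(8) COMP.
           05  LS-ENTRY     OCCURS 1 TO 100000 TIMES
                            DEPENDING ON LS-NBR-RECS
                            PIC X(21).
       PROCEDURE DIVISION.
      *    Fullword counter plus one 21-byte entry per record
           COMPUTE WS-GET-SIZE = 4 + (WS-REC-COUNT * 21)
           CALL 'CEEGTST' USING WS-HEAP-ID WS-GET-SIZE
                                WS-TABLE-PTR WS-FC
           IF WS-FC-SEV NOT = ZERO
               DISPLAY 'CEEGTST FAILED, SEVERITY ' WS-FC-SEV
               MOVE 16 TO RETURN-CODE
               GOBACK
           END-IF
           SET ADDRESS OF LS-DYNAMIC-TABLE TO WS-TABLE-PTR
           MOVE WS-REC-COUNT TO LS-NBR-RECS
```

Storage acquired this way is released with CEEFRST; a heap ID of 0 requests the default user heap.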