Handling Linkage section data more than 134217727 bytes


IBM Mainframe Forums -> COBOL Programming
GOWTHAM PAINENI

New User


Joined: 16 Aug 2011
Posts: 7
Location: USA

PostPosted: Thu Oct 27, 2011 4:59 am

Hi,
I have two batch COBOL modules, Program A and Program B. Program A makes a call to Program B using a Linkage Section copybook called XYZ.

The high-level structure of my Linkage copybook (XYZ) is shown below. At a high level:
1) Program A makes a call to Program B with DATE as input; Program B will
return all Parents and their corresponding Children that meet certain business logic for that DATE. (FYI, there is a lot of business logic in both programs and many other data fields; I have simplified things here to come straight to the point where I have a problem.)
-------------------------------------------------------------------------------
01 LK-LINKAGE.
   05 INPUT-DATA.
      10 DATE PIC X(10).
   05 OUTPUT-DATA.
      10 DETAIL-INFO.
         15 PARENT-ACCTS-ARRAY OCCURS 9999 TIMES.
            20 PARENT-ACCT#
            20 PARENT-NAME
            ......
            ......
            20 PARENT-ACCT-HOURLY-ARRAY OCCURS 24 TIMES.
               25 HR-START-TIME
               25 HR-END-TIME
               25 HR-QTY
               ..........
               ..........
            20 CHILD-ACCT-ARRAY OCCURS 25 TIMES.
               25 CHILD-ACCT#
               25 CHILD-NAME
               ..........
               ..........
               25 CHILD-ACCT-HOURLY-ARRAY OCCURS 24 TIMES.
                  30 CHLD-HR-START-TIME
                  30 CHLD-HR-END-TIME
                  30 CHLD-HR-QTY
                  ............
                  ............
-------------------------------------------------------------------------------------

The total length of ONE data record at the PARENT level (PARENT-ACCTS-ARRAY) in the OUTPUT data section is 34,405 bytes for one occurrence of the array.
My requirement is that I need to support a maximum of 9999 occurrences of PARENT-ACCTS-ARRAY. When I use this number, Program B fails during compilation with the following error message.
-----------------------------------------------------------------
Error:-
-------------------------------------------------------------------
A data item exceeded the maximum supported length 134217727 for an item in the "WORKING-STORAGE SECTION",
"LOCAL-STORAGE SECTION" or "LINKAGE SECTION". The item was truncated to 134217727 characters.

The size of the "LINKAGE SECTION" exceeded the compiler limit of 128 megabytes of memory. Execution results
are unpredictable.

The length of a table exceeded the maximum supported length of 134217727 characters for a data record in the
"WORKING-STORAGE SECTION", "LOCAL-STORAGE SECTION" or "LINKAGE SECTION". The length of the table was
truncated to 134217727 characters.
--------------------------------------------------------------------------
If I change the OCCURS of PARENT-ACCTS-ARRAY from 9999 to 3500, it compiles properly, because the total size of the output data comes to around 120,417,500 bytes (3500 * 34405), which is less than the maximum limit (134,217,727 bytes).

After doing some research on the internet, I tried inserting the two compiler options below before the IDENTIFICATION DIVISION of the program.
000100 CBL TRUNC(OPT)
000200 CBL NOSSRANGE

However it didn't work.

Can anybody help me with any other alternate solution/workaround/trick :-) to fix this size-limit issue? I need to support a maximum of 9999 occurrences of the PARENT-ACCTS-ARRAY.
Thanks in advance for your help !!
Bill O'Boyle

CICS Moderator


Joined: 14 Jan 2008
Posts: 2501
Location: Atlanta, Georgia, USA

PostPosted: Thu Oct 27, 2011 7:11 am

How are you defining your numeric fields: display-numeric or packed-decimal (COMP-3)?

For example, if you have an account-number defined as PIC 9(16), you can save 7-Bytes by defining it as PIC 9(17) COMP-3, which is 9-Bytes. Defining it as 9(16) COMP-3 is a waste and causes the compiler to adjust addressability of the high-order nibble of the first-byte; it's still 9-Bytes, regardless.

Unless you REALLY need a signed COMP-3 field, you should define your COMP-3 fields as unsigned (as illustrated). The resulting sign-nibble will be an X'F' as opposed to an X'C'. The compiler uses additional packed-decimal instructions to ensure the integrity of packed-signed data.

Try this COMP-3 prototype approach and see what you come up with.

I would also highly recommend assigning INDICES to a given array as opposed to using subscripts. If you do decide to use subscripts, define them as unsigned binary fullwords, PIC 9(08) COMP/COMP-4/BINARY, with a TRUNC option of OPT (if at all possible). The TRUNC option only affects binary data. If your compiler supports COMP-5 (native binary), use it instead of the previous suggestions. It behaves essentially like binary with TRUNC(BIN), but is more efficient (BIN is a dog with fleas). The TRUNC option has no effect on COMP-5 variables.

As a last resort, COMP/COMP-4/BINARY might be an alternative for SOME fields (other than subscripts), as Display-Numeric substitution, but again, this should be your last choice.
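As a rough illustration of the packed-decimal savings (the field names here are hypothetical, not from the original copybook):

```cobol
      *> Display-numeric: one byte per digit = 16 bytes.
       05 ACCT-NUM-DISPLAY     PIC 9(16).
      *> Unsigned packed-decimal: (digits + 1) / 2 bytes, rounded up.
      *> PIC 9(17) COMP-3 occupies (17 + 1) / 2 = 9 bytes, and the odd
      *> digit count keeps the high-order nibble fully used.
      *> (PIC 9(16) COMP-3 is also 9 bytes, wasting the top nibble.)
       05 ACCT-NUM-PACKED      PIC 9(17) COMP-3.
```

Applied across thousands of occurrences, a 7-byte saving per field adds up quickly, though on its own it may not bring a roughly 344 MB table (9999 * 34,405 bytes) under the 128 MB limit.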

I also think that this post needs to be moved to the Mainframe COBOL forum.

Mr. Bill
Bill Woodger

Moderator Emeritus


Joined: 09 Mar 2011
Posts: 7309
Location: Inside the Matrix

PostPosted: Thu Oct 27, 2011 11:17 am

You are more than twice, nearly three times, over the limit.

I think you have to go back to the design. Why take everything out of the DB, rather than reference it as needed? Do you need all the data items? Can you live with 2,000 for the OCCURS (to allow for some business expansion, so you have time to come up with something else)?

You've hit a limit. There are no compiler options which are going to help.

You have a problem defining the data, in Prog A, as well as referencing the data in Prog B.

There are some ways "around" the limits, but I wouldn't even consider putting them in a complex program. Every time there is a problem with one of those programs, you'd have a nagging feeling that it was to do with the get-around. And it could be. Since the Cobol compiler knows the limits won't be broken, there might be code generated that doesn't work if they are.

Really, get the design sorted out. I'm sure we're full of suggestions if you need them.

If you want to look at the redesign the topic is probably best staying here. If you really want to tie a rope around your neck and stand on top of a cliff in a high wind, waiting for the inevitable which may never come, then it is better in the Cobol forum.
GuyC

Senior Member


Joined: 11 Aug 2009
Posts: 1281
Location: Belgium

PostPosted: Thu Oct 27, 2011 12:50 pm

DB2?
Back to top
View user's profile Send private message
Bill Woodger

Moderator Emeritus


Joined: 09 Mar 2011
Posts: 7309
Location: Inside the Matrix

PostPosted: Thu Oct 27, 2011 1:17 pm

GuyC wrote:
DB2?


TBC. Profile says DB2, forum selected is DB2; terminology, not.
Akatsukami

Global Moderator


Joined: 03 Oct 2009
Posts: 1788
Location: Bloomington, IL

PostPosted: Thu Oct 27, 2011 7:13 pm

Allocate all your storage in A. Pass B the address of the level 01 variable.
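One way to sketch that idea (all names are hypothetical): Program B declares only a single-entry layout in its LINKAGE SECTION, so its definitions stay well under the limit, and it steps a pointer through the table that Program A allocated:

```cobol
      *> Program B maps ONE parent entry at a time; the caller passes
      *> the address of the first entry plus the entry count.
       LINKAGE SECTION.
       01  LK-ENTRY-PTR            USAGE POINTER.
       01  LK-ENTRY-COUNT          PIC 9(08) COMP.
       01  LK-PARENT-ENTRY         PIC X(34405).
       PROCEDURE DIVISION USING LK-ENTRY-PTR, LK-ENTRY-COUNT.
           PERFORM LK-ENTRY-COUNT TIMES
      *>       Map the linkage record onto the current entry.
               SET ADDRESS OF LK-PARENT-ENTRY TO LK-ENTRY-PTR
      *>       ... process one parent entry here ...
      *>       Advance to the next entry in the caller's table.
               SET LK-ENTRY-PTR UP BY LENGTH OF LK-PARENT-ENTRY
           END-PERFORM
           GOBACK.
```

Note the 128 MB per-item limit still applies to any single LINKAGE SECTION definition, which is why B maps one entry at a time rather than declaring the whole table.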
GOWTHAM PAINENI

New User


Joined: 16 Aug 2011
Posts: 7
Location: USA

PostPosted: Thu Oct 27, 2011 10:36 pm

Bill O'Boyle:-
1) YES, wherever I have the possibility of using COMP-3 variables, I declared them as COMP-3, signed/unsigned based on the data type. That part has been taken care of...

2) It's my bad I posted my query under DB2 rather than COBOL, I realized it after hitting Submit button :-).


Bill Woodger:-
Whatever data items I could exclude from the OUTPUT have been removed. All the data fields being extracted are mandatory fields as per business requirements... (FYI, the output of my Program A is a CSV data file, which is a source of data for another application.)

And I cannot limit the number of occurrences to 2,000 for the OCCURS clause, because my current application can have a maximum of 9000 PARENT entries for a particular day. That is the reason we thought of making the limit 9999.
Akatsukami

Global Moderator


Joined: 03 Oct 2009
Posts: 1788
Location: Bloomington, IL

PostPosted: Thu Oct 27, 2011 11:04 pm

GOWTHAM PAINENI wrote:
Bill Woodger:-
Whatever data items I could exclude from the OUTPUT have been removed. All the data fields being extracted are mandatory fields as per business requirements... (FYI, the output of my Program A is a CSV data file, which is a source of data for another application.)

If I understand Sr. Woodger correctly, he asked why you need to have thousands of occurrences in memory at a given time. Are you seriously doing some operation across all of them? Or is this the execrable modern excuse for design that assumes that hardware, software, and wetware are all available in infinite amounts at zero cost?

Incidentally, on a similar topic (unknown number of occurrences) the suggestion was made of creating a linked list, with a dynamically allocated array of pointers, each containing the address of a node. That is overkill in your case, but it does show that there's more than one way to flay this feline.
Bill Woodger

Moderator Emeritus


Joined: 09 Mar 2011
Posts: 7309
Location: Inside the Matrix

PostPosted: Fri Oct 28, 2011 12:09 am

That is indeed what I am wondering.

The thing is, Gowtham, your problem is not new. Cobol doesn't (really) do dynamic tables. You have to define a limit for the number of entries in any Cobol table. You are applying a current limit, extending it slightly (10% or so does not allow much space for business growth) and then run into the problem of the "width" of the table.

You have already, it seems, spent time (presumably after the first compile) cutting down everything you can and excluding everything you can, but each table element is 34,000 or so bytes. That, along with wanting 9,999 occurrences, takes you well over the limits Cobol applies to storage areas.

My original underlying point is that this should not have got this far. The design is wrong. You are manufacturing and delivering a Grand Piano and have decided, before the ink is dry on the plans, that you are going to deliver it through the Bathroom window. You have arrived outside the bathroom and proceeded to unscrew everything that you can, setting the pieces neatly beside you to use for kindling. The piano is still massively oversized for delivery, but you have no option...

Look at it another way. You are massively oversized already, are you able to tell the business people to slow down so they don't break your table very soon? Are you going to tell people, no, you can't have that data in your CSV because we have nowhere to store it in our massive table? As well as wanting to break the Cobol limits, you are also imposing limits on your system, needlessly reducing its flexibility.

It's a tough one for you, but the short of it is the design does not warrant the use of the term - it is something not worthy of the word "design".

There are many approaches which can be considered, at the design stage, to overcome the I-don't-really-know-how-big-it-can-get and the we-need-far-too-much-to-hold-in-a-table problems - I suppose even if one is caused by the other and vice versa, as in your case.

Possible solutions include: putting the data on a keyed file, pretty flexible; on a database, even more flexible; on flat files in an order that is the most use to you; selecting and outputting the data to file(s) and post-processing to complete the business requirement; etc.

As has already been mentioned by Mr. Akatsukami, there are things that can be used which can be especially useful if you have no reasonable maximum to the number of occurrences if the data were to be represented as a table ("there are mostly only two or three items, but we have one store that can have up to 300,000 or more").

So, Gowtham, do you have time/resources for a redesign now/later? Or are you in a deep hole, with System Testing due to start on Monday and you can't get a clean compile?

The more you can tell us, the better we can suggest. You don't have to take any of our advice. At the end of the day, if it needs to go through that bathroom window, it can be done, but it will forever look like it has been through a bathroom window as well as being a "beast" program, one which (nearly) the whole department fears to touch.
GOWTHAM PAINENI

New User


Joined: 16 Aug 2011
Posts: 7
Location: USA

PostPosted: Fri Oct 28, 2011 5:50 am

Thanks for your responses... !!

I changed my design:
1) First, fetch the list of all PARENT accounts in Program A in a CURSOR for the given input date.
2) Then use a simple PERFORM loop that makes a call to Program B, which fetches the data for each Parent and its corresponding Child accounts and returns it back to Program A.

This PERFORM ends once it reaches the end of the cursor.

I know it will slow down the performance of the request (at most, Program A makes 9999 calls to Program B), but since it's a batch process my business users are NOT worried about the turnaround time.
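In outline, the revised Program A flow looks something like this (table, column, host-variable, and program names are all hypothetical):

```cobol
      *> Declare a cursor over the parent accounts for the input date.
           EXEC SQL
               DECLARE PARENT-CSR CURSOR FOR
               SELECT PARENT_ACCT
               FROM   PARENT_TABLE
               WHERE  BUS_DATE = :WS-INPUT-DATE
           END-EXEC.
      *>  ...
           EXEC SQL OPEN PARENT-CSR END-EXEC
           PERFORM UNTIL SQLCODE NOT = 0
               EXEC SQL
                   FETCH PARENT-CSR INTO :WS-PARENT-ACCT
               END-EXEC
               IF SQLCODE = 0
      *>           One parent per call keeps the linkage area small.
                   CALL 'PROGB' USING WS-PARENT-ACCT, WS-PARENT-DETAIL
      *>           ... write this parent and its children to the CSV ...
               END-IF
           END-PERFORM
           EXEC SQL CLOSE PARENT-CSR END-EXEC
```

The linkage area now only needs to hold one parent (about 34 KB) instead of 9999, which is comfortably inside the compiler limit.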

Thanks to you all who participated in this discussion . Thanks !!
Bill Woodger

Moderator Emeritus


Joined: 09 Mar 2011
Posts: 7309
Location: Inside the Matrix

PostPosted: Fri Oct 28, 2011 12:10 pm

I'm glad you have come to a solution that works for you.

You could even continue the process, at each stage removing part of the table and replacing with database access. Then you get your full flexibility back.

Quote:
I know it will slow down the Performance of the request (at max Program A makes 9999 calls to Program B), since it's a Batch process my business users are NOT worried about the turn around time.


Be careful about what you know. Against this perceived less-performant solution, you had a solution which wouldn't even compile :-)

Let's assume you somehow got it in below the 128 meg limit. For each call, the compiler would generate code to load 32,000 BLL cells to make your entire table individually addressable. If you are now down to 34,000 bytes, that'll be eight BLL cells.

Now, unless you dummy-up each proposed solution (db only, 128m storage, 34k storage) on your machine, you don't know which one is "faster". As long as the performance is acceptable, I'd go with massively more flexible over faster any day.

If the aim of this design was performance, read, and re-read, Robert Sample's signature on this forum.