meenakshi_forum
Joined: 27 May 2008  Posts: 121  Location: India
I have a space issue in my job, and it is abending with an SB37 abend code.
Whatever amount of space is supplied, the next time it runs it wants more, so the job has become unpredictable and abends at different steps at different times.
Please suggest the best way to provide maximum space to any step in a job; I would appreciate it if code could be given.
murmohk1
Senior Member
Joined: 29 Jun 2006  Posts: 1436  Location: Bangalore, India
Meenakshi,
Unlike an SE37, where you increase the space, an SB37 means end of volume. To resolve this, either make your dataset multivolume or ask your storage team to increase the free space on that volume (by whatever means).
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006  Posts: 19243  Location: Inside the Matrix
Hello,
There is no "best way to provide maximum space. . .". Your JCL should allocate the amount of space actually needed (not always an easy thing to determine).
Has your job ever run successfully, or does it abend each time it is tried with a larger allocation? If it has never completed successfully, it may be in a loop.
Does the volume of data to be written increase with each run? Have you calculated how much space should be needed (number of records * LRECL) and converted this to cylinders? If not, you need to do so. There are multiple topics in the forum that show how to calculate space for various DASD devices.
Have you talked with your storage management people to make sure you are using the proper storage device/class for your output file(s)?
expat
Global Moderator
Joined: 14 Mar 2007  Posts: 8796  Location: Welsh Wales
Is this a cyclic trend, where data is appended during a given cycle, say monthly, and at some stage the dataset is cleared down for the next cycle, or just a plain old ever-growing dataset?
Do you know the maximum record count expected for the file? If so, you can perform a realistic calculation to estimate the space requirements; I have posted various examples on this site.
meenakshi_forum
Joined: 27 May 2008  Posts: 121  Location: India
It runs successfully at night and in the early morning, but not during business hours (especially peak business hours).
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006  Posts: 19243  Location: Inside the Matrix
Hello,
Have you explained this to the storage management people?
It sounds like you are simply running into "traffic" during the peak time of the day. You may need to consider changing when the job runs, changing the DASD used for your problem file(s), changing how you specify space in your JCL, etc. All of these would be best addressed with your storage management people.
meenakshi_forum
Joined: 27 May 2008  Posts: 121  Location: India
The storage management people increase the space, but the next time the job runs even that space is not enough, and it abends at some other step.
Terry Heinze
JCL Moderator
Joined: 14 Jul 2008  Posts: 1248  Location: Richfield, MN, USA
Please post the DD statement that's getting the SB37 and the exact error message you're getting.
meenakshi_forum
Joined: 27 May 2008  Posts: 121  Location: India
Thanks, all.
How do we determine cylinders? That is, how is (number of records * LRECL) converted to cylinders?
Please let me know.
Robert Sample
Global Moderator
Joined: 06 Jun 2008  Posts: 8700  Location: Dubuque, Iowa, USA
To determine cylinders, you need the record format (RECFM: FB or VB), LRECL, BLKSIZE, and the number of records.
For variable-length files, (BLKSIZE - 4) / LRECL gives the minimum number of records per block, while (BLKSIZE - 4) / (average LRECL + 4) yields the average records per block.
For fixed-length files, BLKSIZE / LRECL gives records per block.
Divide the number of records by records per block (or average records per block) to get the number of blocks. For disk, assuming half-track blocking (BLKSIZE as close to 27998 as you can get), dividing the number of blocks by 2 gives the number of tracks, and dividing the number of tracks by 15 gives the number of cylinders. For other than half-track blocking, you need to find the 3390 block size table and use it to convert block size to blocks per track.
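A minimal sketch of this arithmetic, assuming a 3390 with half-track blocking (two blocks per track, 15 tracks per cylinder); the function name and defaults are illustrative, not from any site standard:

```python
import math

def cylinders_needed(num_records, lrecl, blksize=27998, recfm="FB",
                     avg_lrecl=None):
    """Estimate 3390 cylinders for a dataset, assuming half-track blocking."""
    if recfm == "FB":
        recs_per_block = blksize // lrecl            # fractions are discarded
    else:  # VB: 4-byte BDW per block, 4-byte RDW per record
        avg = avg_lrecl if avg_lrecl is not None else lrecl
        recs_per_block = (blksize - 4) // (avg + 4)
    blocks = math.ceil(num_records / recs_per_block)
    tracks = math.ceil(blocks / 2)                   # 2 half-track blocks/track
    return math.ceil(tracks / 15)                    # 15 tracks per cylinder
```

For the file discussed later in this thread, `cylinders_needed(7803523, 1023)` gives 9634 cylinders.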
meenakshi_forum
Joined: 27 May 2008  Posts: 121  Location: India
I have done the calculation as follows:
Record length: 1023
Block size: 27998
Number of records: 7,803,523
Records per block = 27998 / 1023 = 27.3685
Number of blocks = 7803523 / 27.3685 = 285127.9025
Tracks = 2 * 285127.9025 = 570255.805
Cylinders = tracks * 15 = 8,553,837.075
Please correct me if I am wrong.
Earlier I had given a space override of SPACE=(CYL,(1000,1000),RLSE), and it was not accepted by the storage department; they say they can't afford to give that much space.
Please advise what should be done.
Craq Giegerich
Senior Member
Joined: 19 May 2007  Posts: 1512  Location: Virginia, USA
It should be: cylinders = tracks / 15.
Terry Heinze
JCL Moderator
Joined: 14 Jul 2008  Posts: 1248  Location: Richfield, MN, USA
That many records will require 9634 cylinders on a 3390 (assuming FB records). That's a lot. I can see why your storage folks object, unless you can justify the business requirement.
Robert Sample
Global Moderator
Joined: 06 Jun 2008  Posts: 8700  Location: Dubuque, Iowa, USA
I didn't clearly specify the formula earlier. First, any fraction is discarded when calculating records per block, so 27998 / 1023 gives 27 records per block, not 27 and a fraction. 27 records per block * 2 blocks per track = 54 records per track. 54 * 15 = 810 records per cylinder. 7,803,523 records divided by 810 records per cylinder gives 9633.97 cylinders, which is rounded up (you cannot allocate .97 of a cylinder; either you get one or you don't), giving 9634 cylinders for your dataset.
This dataset will require approximately three full 3390-3 disk packs; if you're having problems getting 1000 cylinders, you're not going to get it on disk. If your storage management people can't give you this much space, your most likely alternative is to put it on tape.
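The corrected arithmetic can be checked step by step; this is a sketch of the calculation above (the 3339-cylinder capacity of a 3390-3 is the standard figure for that model):

```python
import math

recs_per_block = 27998 // 1023          # 27 -- the fraction is discarded
recs_per_track = recs_per_block * 2     # 54, two half-track blocks per track
recs_per_cyl = recs_per_track * 15      # 810, 15 tracks per cylinder
cylinders = math.ceil(7_803_523 / recs_per_cyl)
print(cylinders)                        # 9634
volumes = cylinders / 3339              # a 3390-3 holds 3339 cylinders
print(round(volumes, 2))                # ~2.89, i.e. about three packs
```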