How to Calculate Control Interval Size


IBM Mainframe Forums -> JCL & VSAM
n459760

New User


Joined: 13 Oct 2009
Posts: 8
Location: Albany,NY

PostPosted: Fri Mar 05, 2010 12:31 am

Hi,

I did some research and read through VSAM Demystified again before posting this, but I am still not getting consistent results.

How do I calculate, or what should be, the optimum control interval sizes for the data and index components of a VSAM file with an LRECL of 94, assuming a record count of 500,000,000 (0.5 billion) and a monthly increase of 3% in the number of records?

I checked some of the formulas, but I am getting conflicting results. Your thoughts and inputs are really appreciated.
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8696
Location: Dubuque, Iowa, USA

PostPosted: Fri Mar 05, 2010 12:38 am

Will the file be accessed randomly, sequentially, or a mixture of both?

How often will the file be purged (3% monthly is 36% annually if not compounded or 180 million extra records per year -- you cannot keep that growth rate up forever)?

What is the key length? Index CI size depends at least partly on key length.

I'm assuming you want a KSDS here, although you don't specify which of the several types of VSAM files you're dealing with.

In general, for sequential access an 18432-byte data CI size works well for almost any conceivable record size. However, if you're planning on accessing the file online and randomly, the large CI size may cause issues with locking and buffer space. The answers you're looking for will involve trade-offs between the various factors.
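A minimal sketch of the arithmetic behind those CI-size choices, assuming fixed-length records and the usual 10 bytes of control information per data CI (one 4-byte CIDF plus two 3-byte RDFs); the Python below is illustrative only, not part of any IBM utility:

Code:
def records_per_ci(ci_size, lrecl, overhead=10):
    # usable space = CI size minus CIDF/RDF control information
    usable = ci_size - overhead
    records = usable // lrecl
    unused = usable - records * lrecl
    return records, unused

# 94-byte records, a few candidate data CI sizes
for ci in (4096, 18432, 26624, 32768):
    recs, waste = records_per_ci(ci, 94)
    print(f"CI {ci:>5}: {recs:>3} records per CI, {waste:>3} bytes left over")

For the 94-byte records in this thread, an 18432-byte data CI holds 195 records with 92 bytes left over per CI.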
n459760

New User


Joined: 13 Oct 2009
Posts: 8
Location: Albany,NY

PostPosted: Fri Mar 05, 2010 12:57 am

Hi Robert,

Thank you for your quick reply:

Quote:
How often will the file be purged (3% monthly is 36% annually if not compounded or 180 million extra records per year -- you cannot keep that growth rate up forever)?

- The file will be deleted and defined daily with the data getting loaded from a BDAM file.

Quote:
What is the key length? Index CI size depends at least partly on key length.

- The key length is 12.

Yes, it's a KSDS; sorry I did not specify this in the original post. The file will be accessed randomly and then sequentially to get all records for that key.

I cannot recall or find the reference, but I think I read somewhere that when defining the file we must take the growth of the file into account so as to allow for CI/CA splits, and that the control interval size definition should also take these increases into account.

I was trying to get a consensus here on any rules of thumb, approximations, or generalizations that may exist when defining a KSDS file, especially with respect to the CISZ.

I did use the 18432 data CI, but I am getting VSAM error 28. I must also mention that this is an extended-format VSAM file.
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8696
Location: Dubuque, Iowa, USA

PostPosted: Fri Mar 05, 2010 2:03 am

Quote:
The file will be accessed randomly and then sequentially to get all records of that key.
Unless you are talking about alternate indexes, you do need to be aware that a primary key must be unique in a VSAM KSDS -- there will be no other "records of that key". If you were talking about a compound key, where you're doing generic retrievals, I could see it, but you haven't told us much about the file.

CI/CA splits will be irrelevant if you are defining the file daily -- unless you've got some very unusual update patterns. On the other hand, deleting, defining, and loading this file every day is going to take quite some time.

Are you working with your site support group to define this file? 500 million 94-byte records is going to run about 40 gigabytes, or 14 to 15 3390-3 disk packs, which would usually require advance planning with the storage management people. Since the limit for a VSAM file is 4 GB unless it is extended format, you didn't really tell us anything by mentioning that it is extended format -- we assumed that.

How many volumes are you defining for the data component? Are they in a storage pool of their own, or shared with other files? Are you having problems getting enough space -- a 28 return code can mean you've run out of disk packs (or extents) and the file still needs to grow some.

My handy little Excel VSAM tuner tells me 26624 is probably your best data CI size (as long as there's an online pool big enough for that CI size). Index CI size won't matter since the rule of thumb says even a 512 byte CI size will hold all the data pointers.

When I said "purged" in my earlier post, I meant records removed -- permanently -- from this file, not just a delete / define of it. 180 million new records a year adds roughly 16 gigabytes of data, or 6 new 3390-3 disk packs, each year. Sooner or later you're going to hit the 59 disk pack limit for a file unless records are removed from it. But that is an application design issue, not a technical issue.
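A quick back-of-the-envelope version of the capacity arithmetic above, counting raw record bytes only and assuming roughly 2.83 GB usable per 3390-3 volume (an approximation; actual DASD usage also depends on CI size, CA free space, and track geometry, so the rounded figures in the post land in the same ballpark):

Code:
LRECL = 94
RECORDS = 500_000_000
GB = 1_000_000_000
VOL_GB = 2.83                          # assumed usable capacity of one 3390-3

total_gb = RECORDS * LRECL / GB        # raw data bytes today
yearly_new = int(RECORDS * 0.03 * 12)  # ~180 million new records, uncompounded
growth_gb = yearly_new * LRECL / GB

print(f"current raw data: {total_gb:5.1f} GB (~{total_gb / VOL_GB:.0f} 3390-3 volumes)")
print(f"yearly growth   : {yearly_new:,} records, {growth_gb:4.1f} GB "
      f"(~{growth_gb / VOL_GB:.0f} more volumes per year)")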
raghavmcs

Active User


Joined: 14 Jul 2005
Posts: 105

PostPosted: Fri Mar 05, 2010 3:31 am

Hello Robert,

I usually visit this forum for the expert answers, and I really liked this post of yours, backed up with statistics and experience.
I am wondering if you could share your handy Excel sheet on this forum. Thanks.
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8696
Location: Dubuque, Iowa, USA

PostPosted: Fri Mar 05, 2010 3:43 am

I'll look into it. I've actually got two that I've written over the years -- the one I used today takes a record size as input and gives the unused bytes and percentage for each possible CI size, to help with tuning. The other one, which I don't seem to have on this machine, estimates file sizes based on record size, key length, free space figures, and number of records -- for each possible CI size.
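This is not the actual spreadsheet, just a minimal Python sketch of the first kind of calculation it describes, assuming fixed-length records and 10 bytes of control information (one 4-byte CIDF plus two 3-byte RDFs) per data CI:

Code:
def valid_data_ci_sizes():
    # Data CI sizes VSAM accepts: 512-8192 in 512-byte steps,
    # then 10240-32768 in 2048-byte steps.
    return list(range(512, 8193, 512)) + list(range(10240, 32769, 2048))

def ci_waste_table(lrecl, overhead=10):
    # overhead: one 4-byte CIDF plus two 3-byte RDFs (fixed-length records)
    for ci in valid_data_ci_sizes():
        usable = ci - overhead
        if usable < lrecl:
            continue                   # record does not fit in this CI
        recs = usable // lrecl
        unused = usable - recs * lrecl
        print(f"CI {ci:>5}: {recs:>3} records/CI, "
              f"{unused:>4} bytes unused ({unused / ci:6.2%})")

ci_waste_table(94)

For 94-byte records, this kind of table shows 26624 leaving only 12 unused bytes per CI (about 0.05%), which is how a figure like the 26624 recommended earlier falls out.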
n459760

New User


Joined: 13 Oct 2009
Posts: 8
Location: Albany,NY

PostPosted: Fri Mar 05, 2010 7:16 pm

I would also be greatly indebted if you could share how you arrived at 26624 for the data CI. Thank you.
enrico-sorichetti

Superior Member


Joined: 14 Mar 2007
Posts: 10873
Location: italy

PostPosted: Fri Mar 05, 2010 7:26 pm

1024 * 26 -- but I wonder why not 27648 (1024 * 27), the best half-track approximation?