PS file storage capacity


IBM Mainframe Forums -> JCL & VSAM

depakjan (New User; Joined: 14 Apr 2008; Posts: 7; Location: Chennai)
Posted: Mon Apr 06, 2009 10:29 am

How many million records can a PS file contain?

I have a requirement to merge 32 files of around 5 million records each, and each record in turn has a length of 3500 bytes. Is this possible in a single step? Also, is the storage capacity of a single PS file mainframe dependent? In other words, will the size limit differ with the configuration of the mainframe?

himanshu7 (Active User; Joined: 28 Aug 2007; Posts: 131; Location: At Desk)
Posted: Mon Apr 06, 2009 11:09 am

Hi,

Quote:
I have a requirement to merge 32 files of around 5 million records each, and each record in turn has a length of 3500 bytes. Is this possible in a single step?

Why don't you try it and see whether merging 32 files is possible or not?

I believe this is possible with the IEBCOPY utility, though I am not really sure. Give it a shot.

Quote:
In other words, will the size limit differ with the configuration of the mainframe?

This would certainly depend upon the type of DASD used, for example 3390, etc.
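
IEBCOPY is normally used for partitioned data sets rather than PS files; a common single-step approach for sequential files is to concatenate the 32 inputs on one DD and copy them with DFSORT (OPTION COPY) or IEBGENER. Below is a minimal sketch, assuming all inputs are cataloged with identical attributes (RECFM=FB, LRECL=3500) and using purely hypothetical data set names; if a true keyed merge of already-sorted files were needed, a MERGE statement with SORTIN01-SORTIN32 DDs would be used instead of the concatenation:

Code:
//MERGE32  JOB (ACCT),'COMBINE PS FILES',CLASS=A,MSGCLASS=X
//*------------------------------------------------------------------
//* Copy a concatenation of 32 sequential files to one output file.
//* All data set names are hypothetical, and the SORTOUT allocation
//* is only a placeholder - see the capacity discussion later in
//* this thread before putting an output this size on DASD.
//*------------------------------------------------------------------
//STEP010  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=PROD.INPUT.FILE001
//         DD DISP=SHR,DSN=PROD.INPUT.FILE002
//*          ... FILE003 through FILE031 concatenated the same way ...
//         DD DISP=SHR,DSN=PROD.INPUT.FILE032
//SORTOUT  DD DSN=PROD.MERGED.GROUP1,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(5000,1000),RLSE),
//            DCB=(RECFM=FB,LRECL=3500,BLKSIZE=0)
//SYSIN    DD *
  OPTION COPY
/*

The same pattern repeated three times, either as three steps in one job or as three separate jobs, would cover the three-group requirement that comes up later in the thread.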

depakjan (New User; Joined: 14 Apr 2008; Posts: 7; Location: Chennai)
Posted: Mon Apr 06, 2009 11:24 am

Yes, I could have, but I don't have a sample file; that's my problem. I just want to know whether any of you have done such a merge.

Also, which would be the better approach?

I need to merge three groups of 32 files and get three output files, one per group. Should I run the three merge steps sequentially in a single JCL, or run them as three jobs at the same time?

himanshu7 (Active User; Joined: 28 Aug 2007; Posts: 131; Location: At Desk)
Posted: Mon Apr 06, 2009 12:01 pm

Quote:
I need to merge three groups of 32 files and get three output files, one per group. Should I run the three merge steps sequentially in a single JCL, or run them as three jobs at the same time?

You can do either of these. What I would suggest is creating three different JCLs for the merges, because of the amount of I/O these merges will do on the mainframe.

Quote:
but I don't have a sample file

What sample file are you looking for? For a sample of the merge itself, you can search the forum for the IEBCOPY utility, which should help with merging the files.

Anuj Dhawan (Superior Member; Joined: 22 Apr 2006; Posts: 6250; Location: Mumbai, India)
Posted: Mon Apr 06, 2009 12:12 pm

Hi,

depakjan wrote:
I have a requirement to merge 32 files of around 5 million records each, and each record in turn has a length of 3500 bytes. Is this possible in a single step?

Once it is done, how will this file be used?

Quote:
Is the storage capacity of a single PS file mainframe dependent? In other words, will the size limit differ with the configuration of the mainframe?

I'm not very sure what is being asked here. Do you want to create the merged file on one mainframe and then, once "created", use it on another?

Robert Sample (Global Moderator; Joined: 06 Jun 2008; Posts: 8696; Location: Dubuque, Iowa, USA)
Posted: Mon Apr 06, 2009 5:05 pm

32 files times 5 million records times 3,500 bytes per record equals 560 billion bytes.

A sequential file on DASD (up through z/OS 1.9) is limited to 65,535 tracks per volume and 59 volumes. 65,535 tracks * [27998/3500] records per block * 2 blocks per track * 3,500 bytes per record * 59 volumes yields just under 190 billion bytes (27,998 is the largest block size that fits twice on a 3390 track, and the [] denotes truncation). Your file will not fit on DASD under z/OS unless you're running z/OS 1.10 with Extended Address Volumes. Even if you are, putting a 560-billion-byte file on disk means consulting your site support group to make sure the DASD space is available and to work out how the file should be allocated; a special storage group may need to be defined for it.

If you're not running z/OS 1.10, you'll have to put the file on tape, or give up the idea of creating a single file.

depakjan (New User; Joined: 14 Apr 2008; Posts: 7; Location: Chennai)
Posted: Tue Apr 07, 2009 9:22 am

Thanks for the great replies, guys. And if I store that single file on tape, can I still do some manipulations on it using ICETOOL comparisons? I'm guessing that would take a hell of a lot of time.
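
For what it's worth, an ICETOOL step reads a tape data set through an ordinary DD statement, so the manipulations themselves are coded the same way as for DASD input; only the elapsed time and the serial nature of tape change. A minimal, hypothetical sketch that simply counts the records on such a file (the data set name is invented):

Code:
//COUNTIT  EXEC PGM=ICETOOL
//TOOLMSG  DD SYSOUT=*
//DFSMSG   DD SYSOUT=*
//* BIGFILE is a hypothetical cataloged tape data set
//BIGFILE  DD DISP=OLD,DSN=PROD.MERGED.GROUP1
//TOOLIN   DD *
  COUNT FROM(BIGFILE)
/*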

dick scherrer (Moderator Emeritus; Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix)
Posted: Tue Apr 07, 2009 9:53 am

Hello,

Depending on your exact requirements, you may want to extract the actual needed data from the whole file. I suspect that of 3500 bytes per record, only a few fields are actually needed for this process.

A lazy way to code is to pass all of the bytes from all of the records.

Some time spent designing the process to reduce the sheer volume of data to be processed is in order (if not required).

If you post more details of your processing requirement, someone may have suggestions on how to implement it more efficiently.
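
To illustrate that idea in DFSORT terms: if, say, only a 10-byte key at position 1 and an 8-byte amount at position 501 were actually needed, one extract pass could cut each 3,500-byte record down to 18 bytes before any heavy processing is done. The field positions and data set names below are invented purely for illustration:

Code:
//EXTRACT  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=OLD,DSN=PROD.MERGED.GROUP1
//SORTOUT  DD DSN=PROD.MERGED.GROUP1.SLIM,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(1500,500),RLSE),
//            DCB=(RECFM=FB,LRECL=18,BLKSIZE=0)
//SYSIN    DD *
  OPTION COPY
* Keep only the hypothetical key (cols 1-10) and amount (cols 501-508)
  INREC BUILD=(1,10,501,8)
/*

Whether an extra copy pass like this pays for itself depends on how many later passes over the data the slimmed-down file saves.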

Robert Sample (Global Moderator; Joined: 06 Jun 2008; Posts: 8696; Location: Dubuque, Iowa, USA)
Posted: Tue Apr 07, 2009 5:28 pm

Be aware there's a limit of 255 volumes for one file, so each tape needs to hold at least 2.2 billion bytes to get the 560 billion bytes onto 255 volumes. Otherwise, you'll have to extract data (either fields, to reduce the size of each record, or records, to reduce the record count, or both) to get the file down to a manageable size.

And processing half a terabyte would still take one hell of a long time. I'm seeing FICON transfer rates of 100 MB per second, which is about 360 GB per hour (actually less due to channel contention and two-way traffic, so let's assume 300 GB per hour). 560 GB divided by 300 GB per hour yields an elapsed time of just under two hours to read the file once. ESCON is 17 MB per second, so if you're on an ESCON channel a single read of the file is roughly eleven hours.
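
On the JCL side, that volume count is requested on the output DD statement. A hypothetical sketch of what the tape output DD (the SORTOUT or SYSUT2 DD of whichever step writes the merged file) might look like, assuming fixed 3,500-byte records; the unit name and data set name are site-specific placeholders:

Code:
//*------------------------------------------------------------------
//* Hypothetical tape output DD; TAPE is a site-defined esoteric unit
//* name, and VOL asks for up to the 255-volume maximum.
//*------------------------------------------------------------------
//MERGEOUT DD DSN=PROD.MERGED.GROUP1,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=(TAPE,,DEFER),
//            VOL=(,,,255),
//            LABEL=(1,SL),
//            DCB=(RECFM=FB,LRECL=3500,BLKSIZE=0)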

Terry Heinze (JCL Moderator; Joined: 14 Jul 2008; Posts: 1249; Location: Richfield, MN, USA)
Posted: Tue Apr 07, 2009 10:17 pm

Quote:
560 GB divided by 300 GB per hour yields an elapsed time of just under two hours to read the file once. ESCON is 17 MB per second, so if you're on an ESCON channel a single read of the file is roughly eleven hours.

Not good news if the job runs daily. :lol:

Robert Sample (Global Moderator; Joined: 06 Jun 2008; Posts: 8696; Location: Dubuque, Iowa, USA)
Posted: Tue Apr 07, 2009 10:39 pm

Back when I worked for a software vendor, one of our customers was getting ready to implement our system into production but backed off when they realized the projected daily batch job run time was 30 hours. That required a bit of redesign to resolve ....

dick scherrer (Moderator Emeritus; Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix)
Posted: Tue Apr 07, 2009 11:29 pm

If it takes more than 24 hours to run the "daily", how long will it take to run the "weekly" . . .?

Heaven help us at year-end. . . .

:razz:

Robert Sample (Global Moderator; Joined: 06 Jun 2008; Posts: 8696; Location: Dubuque, Iowa, USA)
Posted: Tue Apr 07, 2009 11:33 pm

We weren't running any weekly jobs, but the month-end job was not fun.