depakjan
New User
Joined: 14 Apr 2008 Posts: 7 Location: Chennai
How many million records can a PS file contain?
I have a requirement to merge 32 files of around 5 million records each, and each record is 3,500 bytes long. Is this possible in a single step? Also, is the storage capacity of a single PS file mainframe-dependent? I mean, will the size limit differ with the configuration of the mainframe?
himanshu7
Active User
Joined: 28 Aug 2007 Posts: 131 Location: At Desk
Hi,
Quote:
    I have a requirement to merge 32 files of around 5 million records each, and each record is 3,500 bytes long. Is this possible in a single step?
Why don't you try it and see whether merging 32 files is possible?
I believe this is possible with DFSORT's MERGE function, though I'm not really sure. Give it a shot.
Quote:
    I mean, will the size limit differ with the configuration of the mainframe?
This would certainly depend on the type of DASD used, for example 3390, etc.
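Something along these lines ought to do it in one step (a sketch only: the data set names and the 10-byte key at position 1 are made up, and all 32 inputs must already be sorted on that key for a MERGE; where an output this size can live is a separate question, covered further down the thread):
Code:
//*  One-step merge of 32 pre-sorted files with DFSORT
//MERGE32  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN01 DD DISP=SHR,DSN=YOUR.INPUT.FILE01
//SORTIN02 DD DISP=SHR,DSN=YOUR.INPUT.FILE02
//*  ... SORTIN03 through SORTIN31 follow the same pattern ...
//SORTIN32 DD DISP=SHR,DSN=YOUR.INPUT.FILE32
//SORTOUT  DD DSN=YOUR.MERGED.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(5000,500),RLSE)
//SYSIN    DD *
  MERGE FIELDS=(1,10,CH,A)
/*
If the inputs are not already in key order, it becomes a SORT of the 32 files concatenated under a single SORTIN DD instead.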
depakjan
New User
Joined: 14 Apr 2008 Posts: 7 Location: Chennai
Yeah, I could have, but I don't have a sample file; that's my problem. I just want to know whether any of you have done this kind of merge.
Also, which would be the better approach?
I need to merge three groups of 32 files and get three output files, one per group. Should I run the three merge steps sequentially in a single JCL, or run them as three jobs at the same time?
himanshu7
Active User
Joined: 28 Aug 2007 Posts: 131 Location: At Desk
Quote:
    I need to merge three groups of 32 files and get three output files, one per group. Should I run the three merge steps sequentially in a single JCL, or run them as three jobs at the same time?
You can do either. What I would suggest is to create three different JCLs, one merge per job, because of the amount of I/O these merges will drive on the mainframe.
Quote:
    but I don't have a sample file
What sample file are you looking for?
For a sample merge job, you can search the forum for DFSORT's MERGE operator, which is what will do the merging of the files.
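For what it's worth, the single-job version is just three SORT steps back to back, something like this (a sketch with made-up names; only two of the 32 inputs are shown per group to keep it short):
Code:
//*  Three merges run serially as three steps of one job
//GROUP1   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN01 DD DISP=SHR,DSN=YOUR.GROUP1.FILE01
//SORTIN02 DD DISP=SHR,DSN=YOUR.GROUP1.FILE02
//SORTOUT  DD DSN=YOUR.GROUP1.MERGED,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  MERGE FIELDS=(1,10,CH,A)
/*
//GROUP2   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN01 DD DISP=SHR,DSN=YOUR.GROUP2.FILE01
//SORTIN02 DD DISP=SHR,DSN=YOUR.GROUP2.FILE02
//SORTOUT  DD DSN=YOUR.GROUP2.MERGED,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  MERGE FIELDS=(1,10,CH,A)
/*
//GROUP3   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN01 DD DISP=SHR,DSN=YOUR.GROUP3.FILE01
//SORTIN02 DD DISP=SHR,DSN=YOUR.GROUP3.FILE02
//SORTOUT  DD DSN=YOUR.GROUP3.MERGED,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  MERGE FIELDS=(1,10,CH,A)
/*
Splitting the same three steps into three separate jobs lets them run at the same time if enough initiators and channel bandwidth are available; the serial single-job version keeps the three merges from competing with each other for I/O.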
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6248 Location: Mumbai, India
Hi,
depakjan wrote:
    I have a requirement to merge 32 files of around 5 million records each, and each record is 3,500 bytes long. Is this possible in a single step?
Once it's done, how will this file be used?
Quote:
    Is the storage capacity of a single PS file mainframe-dependent? I mean, will the size limit differ with the configuration of the mainframe?
I'm not sure what is being asked here. Do you want to create the merged file on one mainframe and then, once "created", use it on another?
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
32 files times 5 million records times 3500 bytes per record equals 560 billion bytes.
A sequential file on DASD (up through z/OS 1.9) is limited to 65,535 tracks per volume and 59 volumes. 65535 tracks * [27998/3500] records per block * 2 blocks per track * 3500 bytes per record * 59 volumes yields just under 190 billion bytes (the [] indicates truncation). Your file will not fit on DASD under z/OS unless you're running z/OS 1.10 with Extended Address Volumes. Even if you are, putting a 560-billion-byte file on disk means consulting your site support group to confirm the DASD space is available and to work out how to get the file allocated; a special storage group may need to be defined for it.
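Spelled out, using standard 3390 geometry (a 56,664-byte track holds two blocks of at most 27,998 bytes each):
    27998 / 3500 = 7 records per block (truncated)
    7 records x 2 blocks = 14 records per track
    14 x 3500 = 49,000 bytes per track
    49,000 x 65,535 tracks = 3,211,215,000 bytes per volume
    3,211,215,000 x 59 volumes = 189,461,685,000 bytes -- just under 190 billion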
If you're not running z/OS 1.10, you'll have to put the file on tape, or give up the idea of creating a single file.
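Going to tape is just a matter of where SORTOUT points, along these lines (a sketch; the esoteric unit name TAPE and the data set name are made up, and the volume count ties into the 255-volume limit mentioned below):
Code:
//SORTOUT  DD DSN=YOUR.MERGED.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=(TAPE,,DEFER),VOL=(,,,255)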
depakjan
New User
Joined: 14 Apr 2008 Posts: 7 Location: Chennai
Thanks for the great replies, guys. And if I store that single file on tape, I can still do some manipulation with ICETOOL comparisons, right? I'm guessing it would take a hell of a lot of time.
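Something like this is what I have in mind, for example pulling a subset of records off the tape copy (a sketch only; the data set names and the INCLUDE condition are made up):
Code:
//TOOL     EXEC PGM=ICETOOL
//TOOLMSG  DD SYSOUT=*
//DFSMSG   DD SYSOUT=*
//IN       DD DISP=SHR,DSN=YOUR.MERGED.FILE
//OUT      DD DSN=YOUR.SUBSET.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//TOOLIN   DD *
  COPY FROM(IN) TO(OUT) USING(CTL1)
/*
//CTL1CNTL DD *
  INCLUDE COND=(1,3,CH,EQ,C'ABC')
/*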
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Depending on your exact requirements, you may want to extract only the data actually needed from the whole file. I suspect that of the 3,500 bytes per record, only a few fields are actually needed for this process.
A lazy way to code is to pass all of the bytes from all of the records.
Some time spent designing the process to reduce the sheer volume of data to be processed is in order (if not required).
If you post more details of your processing requirement, someone may have suggestions on how to implement it more efficiently.
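An extraction like that is a plain DFSORT copy with OUTREC, something like this (a sketch; the field positions and lengths are invented for illustration):
Code:
//EXTRACT  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=YOUR.MERGED.FILE
//SORTOUT  DD DSN=YOUR.SLIM.FILE,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  OPTION COPY
* Keep only a 10-byte key and a 25-byte field; LRECL drops from 3500 to 35
  OUTREC BUILD=(1,10,201,25)
/*
Cutting 3,500-byte records down to 35 bytes would shrink a 560-billion-byte problem to about 5.6 billion bytes, which changes the whole discussion below.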
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
Be aware there's a limit of 255 volumes for one file, so each tape needs to hold at least 2.2 billion bytes to fit the 560 billion bytes onto 255 volumes. Otherwise, you'll have to extract data (fields, to reduce the size of each record; records, to reduce the record count; or both) to get the file down to a manageable size.
And processing half a terabyte would definitely take a long time. I'm seeing FICON transfer rates of 100 MB per second, which is 360 GB per hour at best (actually less due to channel contention and two-way traffic, so let's assume 300 GB per hour). 521 GB divided by 300 GB per hour yields an elapsed time of roughly 1.7 hours just to read the file once, and a merge has to write all of it back out as well. ESCON is 17 MB per second, about 61 GB per hour, so on an ESCON channel a single read pass over the file takes roughly 8.5 hours.
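In round numbers:
    100 MB/s x 3,600 s = 360 GB per hour (FICON, best case; call it 300 effective)
    521 GB / 300 GB per hour = roughly 1.7 hours per read pass
    17 MB/s x 3,600 s = about 61 GB per hour (ESCON)
    521 GB / 61 GB per hour = roughly 8.5 hours per read pass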
Terry Heinze
JCL Moderator
Joined: 14 Jul 2008 Posts: 1248 Location: Richfield, MN, USA
Quote:
    ESCON is 17 MB per second, about 61 GB per hour, so on an ESCON channel a single read pass over the file takes roughly 8.5 hours.
Not good news if the job runs daily.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
Back when I worked for a software vendor, one of our customers was getting ready to put our system into production but backed off when they realized the projected daily batch run time was 30 hours. That required a bit of redesign to resolve....
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
If it takes more than 24 hours to run the "daily", how long will it take to run the "weekly"...?
Heaven help us at year-end....
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
We weren't running any weekly jobs, but the month-end job was not fun.