aryanpa1
New User
Joined: 26 May 2007 Posts: 45 Location: Chennai
Hi Team,
One of my new tasks is creating millions of records in a single DASD file. I want to know the maximum number of records I can place in this file; the count may sometimes reach 50 million.
Record length is 300 bytes.
Can you please let me know the maximum number of records I can store in this file, and how I can calculate it?
Thank you,
Pavan
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Suggest you create a file with 5 million records using the RECFM and BLKSIZE your "real" file will use. Then all you need to do is multiply.
One of my transaction files has around 8 million records that are 1052 bytes long, and it typically requires about one 3390 volume.
If you search the forum, there have been a few topics on space calculation.
I suspect that the largest possible file would be far more than you could afford to process . . .
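The "create a sample and multiply" approach can also be cross-checked arithmetically. A minimal sketch, assuming half-track blocking on a 3390 (two blocks of at most 27,998 bytes per track) and a 3390 model 3 volume (3,339 cylinders of 15 tracks); your site's device models and blocking may differ:

```python
import math

# Assumed 3390 geometry and half-track blocking; verify against your site.
LRECL = 300                  # record length from the original question
HALF_TRACK = 27_998          # largest block that still fits twice on a 3390 track
TRACKS_PER_CYL = 15
CYLS_MOD3 = 3_339            # 3390 model 3

recs_per_block = HALF_TRACK // LRECL                 # records in one block
recs_per_track = 2 * recs_per_block                  # two blocks per track
recs_per_volume = recs_per_track * TRACKS_PER_CYL * CYLS_MOD3

# How many 3390-3 volumes would 50 million such records need?
volumes_needed = math.ceil(50_000_000 / recs_per_volume)

print(recs_per_block, recs_per_track, recs_per_volume, volumes_needed)
```

Under these assumptions a 3390-3 holds roughly 9.3 million 300-byte records, so 50 million records would span about six such volumes (fewer on larger models).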
aryanpa1
New User
Joined: 26 May 2007 Posts: 45 Location: Chennai
Quote:
I suspect that the largest possible file would be far more than you could afford to process . . .
Thanks Dick. Can a 50-million-record file be processed easily?
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
If I did the math correctly, and you're not running large or extended sequential files under z/OS 1.10, you can fit about 719 million 300-byte records on a z?OS system.
Quote:
Is 50 million records file can be processed easily
Another of those questions that can be answered "yes, no, maybe," because it depends. What do you consider "easily"? Has the JCL been optimized for buffers? Are there any constraints on the processing (for example, it is a daily process and you discover it's taking 32 hours)?
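The "about 719 million" figure above can be reproduced under some assumptions: 300-byte records with half-track blocking on 3390 (186 records per track), a 65,535-track limit per volume for a basic (non-extended) sequential data set, and the 59-volume maximum. These limits are assumptions about the non-extended case; extended-format data sets can go larger.

```python
# Assumed limits for a basic (non-extended) sequential data set; verify
# against current DFSMS documentation for your z/OS level.
RECS_PER_TRACK = 2 * (27_998 // 300)   # half-track blocking: 186 records/track
MAX_TRACKS_PER_VOL = 65_535            # basic sequential per-volume track limit
MAX_VOLUMES = 59                       # maximum volumes for the data set

max_records = RECS_PER_TRACK * MAX_TRACKS_PER_VOL * MAX_VOLUMES
print(f"{max_records:,}")              # roughly 719 million
```

So 50 million records is well inside the theoretical limit; the practical question is processing time, as discussed below.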
aryanpa1
New User
Joined: 26 May 2007 Posts: 45 Location: Chennai
By "easily" I mean the time it is going to take for a simple read of the file.
CICS Guy
Senior Member
Joined: 18 Jul 2007 Posts: 2146 Location: At my coffee table
Robert Sample wrote:
z?OS system.
Typo or subtle hint?
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Flat file, sequential read -- easy enough. How long it's going to take (in elapsed time) will depend upon the site. Factors that have a bearing include (but are not limited to) Workload Manager policy, number of active address spaces in the system, channel contention, disk contention, I/O speed, CPU speed, program processing, buffering, and block size. The only way to know for sure is to run a significant test (say, 10-100% of the file) during the normal expected window.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
CICS Guy ... typo, definitely typo. Sigh. Although a subtle hint could work, too. Now, back to the LPAR I toasted this morning, to see if it can be recovered or if it's time to restore from backup ...