Nileshkul
New User
Joined: 09 May 2016 Posts: 43 Location: India
Dear All,
I have a VSAM file whose volume is fairly constant - it holds a sort of control data. It has around 200 K records and is read via keys in batch modules. If I reduce the record count to 20 K (i.e. 10% of the original), will the keyed reads perform faster in batch? And if so, will that be a considerable improvement in CPU time?
Thanks,
Nilesh.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8697 Location: Dubuque, Iowa, USA
Have you read the VSAM Demystified Redbook? If not, you need to. Direct (keyed) reads use the index component. A LISTCAT will tell you how many levels the index has, and hence how many accesses will be needed for each record retrieval. However, you will NOT be reducing CPU usage much by this change.
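For reference, a minimal IDCAMS sketch of the LISTCAT Robert mentions - the data set name is a placeholder for your own KSDS:
Code:
//LISTSTEP EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  LISTCAT ENTRIES(YOUR.VSAM.KSDS) ALL
/*
In the SYSPRINT output, look for LEVELS under the INDEX component's ATTRIBUTES group: a direct read costs roughly that many index accesses plus one data CI access, less whatever is already sitting in the buffers.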
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8697 Location: Dubuque, Iowa, USA
Follow-up: have you talked to your site support group? If the data set (only Unix System Services has files on a z/OS system) is read only by key, BLSR should provide some performance improvement -- as long as BLSR is installed and functional at your site. Whether or not it is, only someone at your site can tell you.
And is your goal to reduce CPU usage? If so, reducing the number of records in a VSAM data set that is directly read anyway should be just about the LAST place you should look for CPU reductions!
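If your site does have BLSR, the usual JCL arrangement is a pair of DD statements - a sketch only, assuming the subsystem was installed under the default name BLSR, that the program opens DD name INPUT, and that the renamed DD and the buffer counts shown are placeholders for your site support group to tune:
Code:
//* Real data set, renamed so only BLSR refers to it directly
//INPUT1   DD DSN=YOUR.VSAM.KSDS,DISP=SHR
//* DD name the program opens, now routed through the BLSR subsystem
//INPUT    DD SUBSYS=(BLSR,'DDNAME=INPUT1','BUFNI=8','BUFND=20')
BLSR mainly saves I/O by keeping index and data CIs in storage; as noted above, the CPU saving from this is usually modest.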
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
Welcome!!
Quote:
It has around 200 K records and is read via keys in batch modules. If I reduce the record count to 20 K (i.e. 10% of the original), will the keyed reads perform faster in batch? And if so, will that be a considerable improvement in CPU time?
1. Why would you reduce the records in the first place? Would you not lose data by doing so? If not, explain why, and tell us what these 200 K records are.
2. If the file is read by keys, why do you think there is a performance issue?
3. Why not try what you propose and let us know the result?
4. What is the current CPU usage for 200 K records, and what improvement do you expect to see after your change?
5. Get some help here as well as here, and check the VSAM definition.
6. Last but not least, look at Robert's signature; that should settle the discussion.
Answers to the above questions will let others give you specific and much better suggestions.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
Have you determined for certain that the keyed access is a performance problem? If not, then you need to do so before proceeding.
If so, there are lots of questions about the number of reads, the order of the reads and things like that. Index levels, the number of index buffers, CI size, data buffers - all of these can affect performance. If you are doing a lot of reads, you can expect significant savings by doing things differently. With 20,000 records, you can consider storing the keys and data you need in a COBOL table, in key sequence, and using SEARCH ALL or a hand-coded "binary chop" - see the sketch below.
Lots of things you can do, but exactly what depends on the data, and exactly what you are doing with it.
If it is one of those situations where "I can't change the program", reducing the size of the file and looking to see whether you can use BLSR as Robert suggested can help performance, but not necessarily with a significant saving in CPU usage.
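A minimal sketch of the table-plus-SEARCH ALL idea, assuming a 10-byte key and 80-byte records - every name and PIC here is invented, so map them onto your own copybook. Load the table once by reading the KSDS sequentially (which delivers the records in ascending key order, as SEARCH ALL requires), then do all lookups in storage:
Code:
       WORKING-STORAGE SECTION.
       01  WS-ENTRY-COUNT      PIC 9(5) COMP-3 VALUE ZERO.
       01  WS-CONTROL-TABLE.
           05  WS-ENTRY OCCURS 1 TO 20000 TIMES
                   DEPENDING ON WS-ENTRY-COUNT
                   ASCENDING KEY IS WS-KEY
                   INDEXED BY WS-IDX.
               10  WS-KEY      PIC X(10).
               10  WS-DATA     PIC X(70).
       01  WS-SEARCH-KEY       PIC X(10).
       01  WS-RESULT           PIC X(70).
       PROCEDURE DIVISION.
      *    (load WS-ENTRY 1..WS-ENTRY-COUNT from the KSDS here)
      *    One in-storage lookup, replacing a keyed READ of the KSDS
           SEARCH ALL WS-ENTRY
               AT END
                   PERFORM 9000-KEY-NOT-FOUND
               WHEN WS-KEY (WS-IDX) = WS-SEARCH-KEY
                   MOVE WS-DATA (WS-IDX) TO WS-RESULT
           END-SEARCH
SEARCH ALL does a binary search, so a hit in a 20,000-entry table costs at most about 15 compares and no I/O at all once the table is loaded.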
Nileshkul
New User
Joined: 09 May 2016 Posts: 43 Location: India
Thanks, all - we have BLSR. Thanks for all the help.