ashim prodhan
New User
Joined: 28 Apr 2013 Posts: 7 Location: india
Hello,

I need your suggestions/comments on increasing a VSAM file's record length. The present scenario is as follows.

In our life insurance admin application we have a master file with 16,000-byte records. The file is allocated to CICS and all updating is done online; only a few report jobs use it in the batch window. It is a trailer/segment-based file. Every year a new trailer is added, and the records have now reached the maximum record length, so policy processing is failing. The number of affected policies is increasing day by day.

This application was developed as part of a product. We thought of writing the new trailers to another file and changing the processing logic accordingly, but the way the I/O processing has been coded is very difficult to understand, as no documentation is available. There are also many long-running background transactions that update the master file. We tried a proof of concept along those lines, but had no luck.

So we are thinking of increasing the record length to 35,000 bytes. Although it would be a change across almost all the programs, that is doable from our side. We have no idea how much performance would degrade as a result. What should we consider as far as performance is concerned, and are there any other aspects to keep in mind? We have approximately 150,000 records, and no new records will be inserted in the future. Your comments/guidance are highly appreciated. Thank you!

Ashim
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Have you talked to your site support group? There are many issues -- for example, if the VSAM data set is accessed via an LSR pool in CICS, increasing the record length may change which LSR pool can be used with the data set. Only someone in your site support group can determine whether or not the data set uses LSR and, if so, which LSR pool to use after the change.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
A length of 35000 bytes? You need to re-think this strategy....
Marso
REXX Moderator
Joined: 13 Mar 2006 Posts: 1353 Location: Israel
Bill O'Boyle wrote:
A length of 35000 bytes? You need to re-think this strategy....

Couldn't agree more with Bill.

Can you lose the oldest year? For example, if you have:

Code:
|fixed data|1988 trailer|1989 trailer|...|2012 trailer|2013 trailer|

a simple program could shift everything one year to the left:

Code:
|fixed data|1989 trailer|1990 trailer|...|2013 trailer|avail for 2014|

Advantages: the record size stays unchanged, the programs stay unchanged, and this will work for at least the next 50 years.
Disadvantage: losing one year each year.
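To illustrate the shift-left idea, here is a minimal sketch in Python (the real program would of course be COBOL under CICS or batch). The layout assumed here -- a fixed-data portion followed by fixed-length yearly trailers -- and the lengths FIXED_LEN and TRAILER_LEN are made-up values for the example, not the actual master-file layout:

```python
FIXED_LEN = 10    # assumed length of the fixed-data portion
TRAILER_LEN = 4   # assumed length of one yearly trailer

def shift_oldest_trailer(record: bytes) -> bytes:
    """Drop the oldest trailer; pad the end so the record length is unchanged."""
    fixed = record[:FIXED_LEN]
    trailers = record[FIXED_LEN:]
    # Discard the first (oldest) trailer and pad the tail with blanks,
    # leaving room for next year's trailer.
    return fixed + trailers[TRAILER_LEN:] + b" " * TRAILER_LEN

# Tiny demonstration: header plus trailers for 1988, 1989 and 2013.
old = b"FIXED-DATA" + b"1988" + b"1989" + b"2013"
new = shift_oldest_trailer(old)
# 1988 is gone; four blank bytes are now available at the end for 2014.
```

The point of the sketch is that the record length never changes, so no file redefinition and no copybook changes are needed.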
ashim prodhan
New User
Joined: 28 Apr 2013 Posts: 7 Location: india
Hi Marso, Bill and Robert,

Thank you all for your responses!

@Marso - We thought of that approach initially, but after a discussion with the business they did not agree to delete the oldest year's data. It is the TAX trailer data that is added every year, and we need to retain the tax data for every year. Shifting the oldest trailer to a different file was another option, but after doing a proof of concept we did not get the expected result. So we are thinking of expanding this file. Thank you!
steve-myers
Active Member
Joined: 30 Nov 2013 Posts: 917 Location: The Universe
Your proposed record size is too large. See the discussion of the RECORDSIZE parameter in the "DEFINE CLUSTER" chapter of the DFSMS Access Method Services for Catalogs manual for your z/OS release.

It is very difficult to quantify the performance impact of this proposed change; it depends very much on how the data is referenced. If the data is referenced as a single record, by the record's key, it is unlikely that any performance change will be observed. However, if the data is processed sequentially, and all the records have been expanded, the time required to process the data set will increase substantially, as will its storage requirements. If you can make a business case for this change, the increased processing time and storage cost will be accepted. Note, too, that probably all programs that process this data will have to be changed; the cost of those changes must be part of the presentation justifying the business case.
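For context on why 35,000 bytes is a problem (my understanding; check the manual cited above for your release): a non-spanned VSAM record cannot be longer than the maximum control interval size of 32,768 bytes less VSAM control information, so a 35,000-byte record would need the SPANNED attribute on the cluster. A rough sketch of such a definition follows; the data set name, key length, and space values are placeholders, not your site's definitions:

```
//DEFCLUS  EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE CLUSTER (NAME(HLQ.POLICY.MASTER)  -
                  INDEXED                  -
                  KEYS(20 0)               -
                  RECORDSIZE(35000 35000)  -
                  SPANNED                  -
                  CYLINDERS(500 50))
/*
```

Spanned records carry their own overhead (a record occupies whole control intervals, and sequential I/O touches more CIs per record), which feeds directly into the performance and storage concerns above.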