
Increasing the record length of a VSAM file and impact on performance


IBM Mainframe Forums -> JCL & VSAM
ashim prodhan

New User


Joined: 28 Apr 2013
Posts: 7
Location: india

PostPosted: Sat Jun 21, 2014 10:05 pm

Hello
I need your suggestions/comments on increasing a VSAM file's record length. The present scenario is as follows:
In our life insurance admin application we have a master file with 16000-byte records. The file is allocated to CICS and all updates are done online only; only a few report jobs read it in the batch window. It is a trailer/segment-based file. Every year a new trailer is added to the records, and some have now reached the maximum record length, so those policies fail to process. The number of failing policies is increasing day by day.
This application was developed as part of a product. We thought of writing the new trailers to another file and changing the processing logic accordingly, but the way the I/O processing has been coded is very difficult to understand and no documentation is available. There are also many long-running background transactions that update the master file. We tried a proof of concept on that approach but had no luck.
So we are thinking of increasing the record length to 35000 bytes. Although it will be a change across almost all the programs, that is doable from our side. We have no idea how much performance will degrade because of this. We would like to know what we need to consider as far as performance is concerned, and whether there are any other aspects to consider. We have approximately 150,000 records, and no new records will be inserted in the future. Your comments/guidance is highly appreciated. Thank you!
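For reference, this is roughly how we look at the cluster's current attributes (record lengths, CI size, free space, CI/CA split counts) before sizing the change. A minimal IDCAMS LISTCAT sketch; the data set name is made up:
Code:
//LISTC    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  /* Made-up cluster name - LISTCAT ALL shows the average/maximum  */
  /* record lengths, CI size, free space and CI/CA split counts.   */
  LISTCAT ENTRIES(PROD.POLICY.MASTER) ALL
/*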

Ashim
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8696
Location: Dubuque, Iowa, USA

PostPosted: Sat Jun 21, 2014 10:49 pm

Have you talked to your site support group? There are many issues -- for example, if the VSAM data set is accessed via LSR pool in CICS, increasing the record length may change which LSR pool can be used with the data set. Only someone in your site support group can determine whether or not the data set uses LSR and if so which LSR pool to use after the change.
Bill O'Boyle

CICS Moderator


Joined: 14 Jan 2008
Posts: 2501
Location: Atlanta, Georgia, USA

PostPosted: Sun Jun 22, 2014 12:18 am

A length of 35000 bytes? You need to re-think this strategy....
Marso

REXX Moderator


Joined: 13 Mar 2006
Posts: 1353
Location: Israel

PostPosted: Mon Jun 23, 2014 2:38 pm

Bill O'Boyle wrote:
A length of 35000 bytes? You need to re-think this strategy....
Couldn't agree more with Bill.

Can you lose the oldest year?
For example, if you have:
Code:
|fixed data|1988 trailer|1989 trailer|...|2012 trailer|2013 trailer|

a simple program could shift one year to the left:
Code:
|fixed data|1989 trailer|1990 trailer|...|2013 trailer|avail for 2014|

Advantages: record size stays unchanged, programs stay unchanged, will work for at least the next 50 years.
Disadvantage: losing one year each year.
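If the layout really is that regular, the shift could even be sketched as a batch DFSORT copy. This is only a sketch under assumptions: every record is exactly 16000 bytes, with a hypothetical 400-byte fixed part followed by 26 yearly trailers of 600 bytes each, and the data set names are made up:
Code:
//SHIFT    EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=PROD.POLICY.MASTER
//SORTOUT  DD DISP=OLD,DSN=PROD.POLICY.MASTER.NEW
//SYSIN    DD *
* Keep the 400-byte fixed part, drop the oldest trailer (401-1000),
* move the remaining 25 trailers left, blank-fill the freed last slot
* (400 + 25*600 + 600 blanks = 16000, so the record length is unchanged).
  RECORD TYPE=F,LENGTH=(16000)
  OPTION COPY
  OUTREC BUILD=(1,400,1001,15000,600X)
/*

SORTOUT would be an empty cluster defined with the same attributes; since the copy reads the KSDS in key order, the records are loaded in key sequence.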
ashim prodhan

New User


Joined: 28 Apr 2013
Posts: 7
Location: india

PostPosted: Mon Jun 23, 2014 3:35 pm

Hi Marso, Bill and Robert,

Thank you all for your responses!

@Marso - We thought of that approach initially, but after a discussion with the business they did not agree to deleting the oldest year's data. It is the TAX trailer data that is added every year.

We need to keep the tax data for every year. Shifting the oldest trailer to a different file was another option, but after doing a POC we did not get the expected result. So we are thinking of expanding this file. Thank you!
steve-myers

Active Member


Joined: 30 Nov 2013
Posts: 917
Location: The Universe

PostPosted: Mon Jun 23, 2014 4:39 pm

Your proposed record size is too large. See the discussion of the RECORDSIZE parameter in the "DEFINE CLUSTER" chapter in the DFSMS AMS for Catalogs manual for your z/OS release.

It is very difficult to quantify performance issues for this proposed change; it depends very much on how the data is referenced. If the data is referenced as single records, by key, it is unlikely that any performance change will be observed. However, if the data is processed sequentially, and all the records have been expanded, the time required to process the data set will increase substantially, as will the storage required for the data.

That said, if you can make a business case for the change, the increased processing time will be accepted, as will the cost to store the data. Note, too, that probably all programs that process this data will have to be changed; the cost of those changes must be part of the presentation justifying the business case.
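For orientation on the size limit: a non-spanned VSAM record cannot be longer than the data CI size minus 7 bytes of control information, and the largest CI size is 32768 bytes, so a 35000-byte record would force SPANNED records (verify whether your CICS release tolerates those at all before going that way). Also, the maximum RECORDSIZE of a loaded cluster generally cannot just be altered in place, so the usual route is to define a new cluster with the larger limit and REPRO the data into it. A rough IDCAMS sketch; the names, key, sizes and space values are placeholders to be taken from a LISTCAT of the real cluster:
Code:
//REDEF    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//OLDMAST  DD DISP=SHR,DSN=PROD.POLICY.MASTER
//SYSIN    DD *
  /* Placeholder names, key and sizes - copy the real values from   */
  /* a LISTCAT of the existing cluster.                             */
  DEFINE CLUSTER (NAME(PROD.POLICY.MASTER.NEW) -
                  INDEXED -
                  KEYS(20 0) -
                  RECORDSIZE(8000 32000) -
                  SHAREOPTIONS(2 3) -
                  CYLINDERS(500 50)) -
         DATA    (NAME(PROD.POLICY.MASTER.NEW.DATA) -
                  CONTROLINTERVALSIZE(32768) -
                  FREESPACE(10 10)) -
         INDEX   (NAME(PROD.POLICY.MASTER.NEW.INDEX))
  /* Copy the roughly 150,000 existing records across */
  REPRO INFILE(OLDMAST) OUTDATASET(PROD.POLICY.MASTER.NEW)
/*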