Job Logic Dependent On Total Number Of GDG Versions Existing


IBM Mainframe Forums -> JCL & VSAM
shitij

New User


Joined: 09 Sep 2005
Posts: 31
Location: Delhi

PostPosted: Sat Sep 29, 2007 6:29 am

Hi,

I have the following requirement:

I have a step in my job that should run when the number of GDG generations reaches an alarming level, say when the generation count has reached 240 (very close to the limit of 255). This step will delete the GDG base and redefine it, so that new generations can be written afresh.


The crux is that we should make sure the maximum limit is never reached.

So the logic I am currently following is this:

Code:
//STEP0001 EXEC PGM=IDCAMS                   
//SYSPRINT DD DSN=IDCAMS.GDG.LIMIT.REPORT,DISP=SHR
//SYSIN    DD   *                           
  LISTCAT LVL(ACDM.INDCHNGE) ALL       
/*                                           
//*


My dataset IDCAMS.GDG.LIMIT.REPORT will hold the LISTCAT report for the GDG ACDM.INDCHNGE, describing the number of generations currently existing.

In IDCAMS.GDG.LIMIT.REPORT I see a line that reads:

Code:
TOTAL ----------------22


This line starts at the 20th position and the count is written out to the 41st position, preceded by dashes (so a count of 22 appears as -22).

I am planning to filter out this line, read the generation count, and decide what to do from there.
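Something along these lines is what I have in mind. This is only a rough sketch: the step name, the 240 threshold and the column positions are guesses on my part, and it assumes a DFSORT/ICETOOL level that allows UFF fields in INCLUDE.

Code:
//CHKLIMIT EXEC PGM=ICETOOL
//TOOLMSG  DD SYSOUT=*
//DFSMSG   DD SYSOUT=*
//RPT      DD DSN=IDCAMS.GDG.LIMIT.REPORT,DISP=SHR
//TOOLIN   DD *
* SET RC=12 WHEN THE FILTERED TOTAL LINE IS PRESENT
  COUNT FROM(RPT) USING(CTL1) NOTEMPTY
/*
//CTL1CNTL DD *
* KEEP THE TOTAL LINE ONLY WHEN ITS COUNT IS ABOVE THE THRESHOLD
* (UFF PICKS UP JUST THE DIGITS AND IGNORES THE DASHES)
  INCLUDE COND=(1,60,SS,EQ,C'TOTAL',AND,25,20,UFF,GT,240)
/*

A later step could then test this with IF (CHKLIMIT.RC = 12) THEN to decide whether the delete/define has to run.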

I was wondering whether there is a better way to do this, or if my current logic is good enough.


Thanks,
Shitij
murmohk1

Senior Member


Joined: 29 Jun 2006
Posts: 1436
Location: Bangalore,India

PostPosted: Sat Sep 29, 2007 7:15 am

Shitij,

First, is this HOMEWORK?

Secondly -

Quote:
say when the generation count has reached 240 (very close to the limit of 255). This step will delete the GDG base and redefine it, so that new generations can be written afresh.

Did you go through the GDG declaration syntax? Generation maintenance is done automatically. Why do you want to delete and redefine the GDG base?

And one more question: do you really want your GDG generation number to stay below the specified limit (say G0255V00) always? What happens if the generation number goes beyond that limit?

What's wrong with letting MVS do the maintenance for you?

If it is a "real" situation, could we know your shop's limitations?

Quote:
I was wondering whether there is a better way to do this, or if my current logic is good enough.

Nay....
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Sat Sep 29, 2007 7:51 am

Hello,

How often are new generations created? What kind of process is served by accumulating some large number of generations, only to delete all of them when some high-water mark is reached?

Rather than trying to prevent "problems" by "watching" the versions, why not define some number of generations to be kept (say 100) so that as new ones are added, old ones "fall off"? This is the way GDG data is usually handled.
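For example, something like this would set it up (only a sketch; the limit of 100 is just an illustration, using the gdg name from your post):

Code:
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE GENERATIONDATAGROUP -
         (NAME(ACDM.INDCHNGE) -
          LIMIT(100) -
          NOEMPTY -
          SCRATCH)
/*

With NOEMPTY and SCRATCH, when generation 101 is cataloged the oldest generation is rolled off and scratched automatically - no monitoring step is needed.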

You may have some confusion with the 255 - you can only have 255 relative generations. Relative generations let you refer to a file by the relative number of the dataset (i.e. -1).

If you have questions, someone will be here.
shitij

New User


Joined: 09 Sep 2005
Posts: 31
Location: Delhi

PostPosted: Sat Sep 29, 2007 9:53 am

Hi Murmohk1,


Quote:
Shitij,

First, is this HOMEWORK?


Aaaooooooucchhhhhh!!!! - That Hurt...:-)

No, it's certainly not homework. It's supposedly serious office work (but yes, I agree, I am still a kid).

********

Here's the issue we have:

We are working on a system where loads of feeds come in daily, and we are planning to write the details of each feed received into a new GDG generation; we expect around 300 feeds/hour.

All these feeds get concatenated and a report is made on an hourly basis. I would not like to overwrite GDG generations (as in, I don't want the 256th feed to overwrite my 1st generation, as it contains feed information), so I wrote:

Quote:
I have a step in my job that should run when the number of GDG generations reaches an alarming level, say when the generation count has reached 240 (very close to the limit of 255). This step will delete the GDG base and redefine it, so that new generations can be written afresh.


So we will make an early report for the feeds (in case the GDG limit is running close), rather than the usual hourly report. Once the report is written, the GDG stack is deleted and redefined, and we start again.

Of course we will take a backup of the full GDG before deleting it.
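Roughly, I picture the redefine being wrapped in a condition like this. Only a sketch: CHKLIMIT is the checking step from my earlier post, the backup step is left out, and the LIMIT value is a placeholder.

Code:
//* RUN ONLY WHEN THE CHECKING STEP FLAGGED THE GDG AS NEARLY FULL
// IF (CHKLIMIT.RC = 12) THEN
//*  (BACKUP STEP FOR THE FULL GDG WOULD GO HERE)
//REDEF    EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE ACDM.INDCHNGE GENERATIONDATAGROUP FORCE
  DEFINE GENERATIONDATAGROUP -
         (NAME(ACDM.INDCHNGE) -
          LIMIT(255) -
          NOEMPTY -
          NOSCRATCH)
/*
// ENDIF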
Devzee

Active Member


Joined: 20 Jan 2007
Posts: 684
Location: Hollywood

PostPosted: Sat Sep 29, 2007 10:02 am

Quote:
the details of each feed received are written into a new GDG generation; we expect around 300 feeds/hour.

I'm just wondering: do you really need to create a new GDG generation just to store the feed info?
How about using DISP=MOD and appending all the feeds to one dataset? When creating the report, select the feed info, and after that take a backup of the feeds and delete the file.
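Something like this in each feed job (the dataset names are only examples):

Code:
//APPEND   EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DSN=ACDM.INDCHNGE.FEEDIN,DISP=SHR     INCOMING FEED (EXAMPLE)
//SYSUT2   DD DSN=ACDM.INDCHNGE.FEEDS,DISP=MOD      SINGLE DATASET, APPENDED TO

Keep in mind DISP=MOD takes exclusive control of the dataset (like OLD), so jobs arriving at the same time will queue on it.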
shitij

New User


Joined: 09 Sep 2005
Posts: 31
Location: Delhi

PostPosted: Sat Sep 29, 2007 10:10 am

Hi Devzee,

Well, actually the problem with DISP=MOD would be contention issues, and we would certainly not like jobs failing because of contention.

I am pretty sure we would have 2-3 feeds coming in at the same instant, all wanting to write to the same file, so appending the records would not be feasible that way.

Thanks,
Shitij
murmohk1

Senior Member


Joined: 29 Jun 2006
Posts: 1436
Location: Bangalore,India

PostPosted: Sat Sep 29, 2007 11:21 am

Shitij,

Quote:
we expect around 300 feeds/hour.


Quote:
I am pretty sure we would have 2-3 feeds coming in at the same instant

What is the size of each feed? You can't create further generations (actually, the jobs go into waiting mode) while a +1 generation is being created, so think about this as well.
Devzee

Active Member


Joined: 20 Jan 2007
Posts: 684
Location: Hollywood

PostPosted: Sat Sep 29, 2007 12:14 pm

Do you have the same job or a different job for each feed?

If different jobs, then each job should create generations under its own GDG naming convention, correct? If all the different jobs are creating generations of the same GDG, there will be contention.
If you have the same job for all feeds, you still cannot create multiple generations at the same time.
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Sat Sep 29, 2007 9:16 pm

Hello,

As a caution, if you use MOD, you may have data consistency issues if anything abends in the middle of a run.

With your arrival rate (300 an hour), you might want to consider routing the incoming data to a queue and then pulling it from the queue.

If you want to continue the way the current process runs, you could cut the interval to 3/4 or 1/2 hour. . . .