Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Hi,
I have a requirement where my regular job runs multiple times a day (roughly every 5-10 minutes) and processes an output file. Is there a way I can take a backup of that output file on a daily basis?
To rephrase: I need a single daily backup of the output file from a job that runs multiple times a day.
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
Let the output data set have a disposition of MOD,CATLG in the regular job; have the back-up job delete it.
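A minimal sketch of that idea; the program and dataset names here are placeholders, not from the actual jobs:
Code:
//* Regular job: each run appends to the same dataset
//OUTSTEP  EXEC PGM=MYPGM
//OUTFILE  DD DSN=MY.DAILY.OUTPUT,DISP=(MOD,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE),
//            DCB=(RECFM=FB,LRECL=80)
//* Daily back-up job: copy the accumulated data, then delete it
//COPY     EXEC PGM=IEBGENER
//SYSUT1   DD DSN=MY.DAILY.OUTPUT,DISP=(OLD,DELETE)
//SYSUT2   DD DSN=MY.DAILY.BACKUP(+1),DISP=(,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(10,10),RLSE)
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
With DISP=(OLD,DELETE) on SYSUT1, the accumulated file is removed after the copy, so the next regular-job run re-creates it via MOD,CATLG.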
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
If we keep DISP=MOD, the output file keeps growing until someone, or a new job, deletes it. Is there a way to back up the output file once per day without adding any new jobs?
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Without adding any new jobs, I want to take the backup in my regular job itself.
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
Why do you want to write a job step that will only do something useful less than 1% of the time?
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
My user does not want to create a new job for this; he is fine with a new step. Moreover, in the regular job the output file is created as a GDG.
With a separate job we would not be sure which GDG generations were created that day; the count may vary.
If we switch the regular job from a GDG to a flat file with DISP=MOD, then we have to create a new job to delete and refresh it every day.
Is it possible to have a flat file named something like MY.DATA.D"130422" with DISP=MOD, so that the output file rolls over automatically every day?
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
Try something like this:
Code:
//STEP1    EXEC PGM=RORA
//FOO      DD DSN=A.B.C,DISP=(MOD,CATLG),etc.
//STEP2    EXEC PGM=IKJEFT1B,PARM='%BAR'
//CNTLFILE DD DSN=D.E.F,DISP=OLD
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSTSIN  DD DUMMY
//SYSIN    DD DUMMY
The BAR exec:
Code:
/* Rexx */
today = Date('S')                    /* today as yyyymmdd            */
"EXECIO 1 DISKRU CNTLFILE"           /* read the last-run date and   */
                                     /* hold the record for update   */
parse pull record
lastrun = Strip(record)
if today > lastrun then do
  /* Your back-up process here */
  "DELETE 'A.B.C'"
  record = today
  push record                        /* rewrite the control record   */
  "EXECIO 1 DISKW CNTLFILE (FINIS"   /* with today's date and close  */
end
else
  "EXECIO 0 DISKR CNTLFILE (FINIS"   /* nothing to do; just close    */
exit 0
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Sure, let me try this tomorrow!
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Are you currently creating a new output file on each execution, or just adding to the first one of the day? What happens to this output file after the last one has been created? Will the backup contain all the data for the day, or just the data from the last run? How do you know which is the last run?
If each run creates a +1 GDG generation (starting with G0001V00), then you can run an extra job (or extra steps in the last execution) to read the GDG base with some copy program, e.g. IEBGENER, and then delete the generations. Of course, if the individual output datasets are required for downstream processing, this approach may not be entirely feasible, but you have given no details as to how all this fits into the overall scheme of things.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
Quote:
How do you know which is the last run?
This was the first question that came to mind when I read this thread in the morning, and I'm still curious...
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
@Anuj: We will not know which is the first or last run of the day. We can go by calendar time, 00:00 to 23:59.
@Nic: In the regular job, each run creates a new GDG generation. Data that has already been processed does not appear in the next generation, so each generation holds unique processed data. We do not know how many generations are created per day; it varies. All the individual generations are needed for downstream processing.
We need this backup because when any data in our regular job has issues, it takes a lot of manual time to find the right generation. Since we are not sure how many generations were created that day, finding the right one is even harder. Processed data is not duplicated on the same date; we might be able to reprocess it the next day, but not on the same day.
My user does not want to create a new job for the daily backup if it is possible as a separate step in the same job.
Hope I have answered your questions.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
Well, I'm not sure if you're on the 'solution side' or the 'problem side'.
OTOH, from what you've posted so far I've got a vague idea; see if it works for you:
1. Create another GDG base; call it HLQ.SECOND.GDG.
2. You already have your GDG base in question; call it HLQ.FIRST.GDG.
3. Add a first step that copies HLQ.FIRST.GDG(+1) to HLQ.SECOND.GDG(+1).
4. Add another step to take a back-up of the second GDG base:
Code:
//STEP00   EXEC PGM=EZACFSM1
//SYSIN    DD DATA,DLM=@@
//HLQB     JOB ,'BKUP',CLASS=0,MSGCLASS=1,
//             NOTIFY=&SYSUID,REGION=4096K
//STEP010  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=HLQ.SECOND.GDG,DISP=SHR
//SORTOUT  DD DSN=HLQ.SECOND.GDG.BKUP.D&LYYMMDD,
//            DISP=(,CATLG,DELETE),SPACE=(CYL,(10,10),RLSE),
//            UNIT=DISK
//SYSIN    DD *
  OPTION COPY
//*
@@
//SYSOUT   DD SYSOUT=(*,INTRDR),LRECL=80,BLKSIZE=80,BUFNO=1,RECFM=F
D&LYYMMDD resolves to a new date every day, so each day you get a new back-up file. And change HLQ to whatever is permitted at your site.
HTH...
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Anuj: Thanks, it is working fine. I have a few questions to clarify before I proceed further.
1. Why do we need to take the backup of the first GDG into a second GDG?
2. Why do we need to copy the GDG base?
3. Does HLQ.SECOND.GDG.BKUP.D&LYYMMDD change the date automatically, so that a new dataset is built for each date?
To confirm:
If I just add your step to my regular job, will it submit an inbuilt job that creates the backup dataset?
Instead of giving the second GDG base to copy, can I give the generation created in my regular job, so that whenever my regular job runs it updates the backup dataset and creates a new file for the next day?
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
@Anuj, thanks in advance!
Please confirm whether my understanding in the post above is right.
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
Karthikeyan Subbarayan wrote:
@Anuj, thanks in advance! Please confirm whether my understanding in the post above is right.
IIRC, it is 01:00 Wednesday for Anuj; you will likely have to exercise patience.
Gary McDowell
Active User
Joined: 15 Oct 2012 Posts: 139 Location: USA
If I read your original topic requirement correctly:
1. Process the regular job's output.file as you do now.
2. In the same job, copy output.file to output.file.GDG(+1) [IDCAMS DEFINE GDG LIMIT(255)].
3. In an end-of-day job / after the last regular job, copy all generations of output.file.GDG to output.file.DAILY.GDG(+1) [IDCAMS DEFINE GDG LIMIT(???)].
4. Delete all generations of output.file.GDG to get ready for the next day [IDCAMS DELETE output.file.GDG.*].
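The one-time IDCAMS definitions referred to above could look roughly like this (the daily-GDG LIMIT is left as an example value, since the required retention was not stated):
Code:
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE GDG (NAME(OUTPUT.FILE.GDG)       LIMIT(255) SCRATCH)
  DEFINE GDG (NAME(OUTPUT.FILE.DAILY.GDG) LIMIT(30)  SCRATCH)
/*
SCRATCH makes rolled-off generations get deleted rather than merely uncataloged.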
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
@Gary: I am already using a GDG in my regular job. Also, we will not know how many times the regular job runs in a day.
Having a new GDG would require scheduling a new job to run at end of day, and my user does not want a new job.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
Quote:
1. Why do we need to take the backup of the first GDG into a second GDG?
You said that the first GDG will be used in downstream jobs, so I picked a second GDG in order not to 'disturb' the original one. And when you want to look back at the back-ups, the number of second-GDG generations tells you how many have been backed up. There was another reason why I initially chose a second GDG... but I've forgotten it now. Am I getting old!?
Quote:
2. Why do we need to copy the GDG base?
Because you said that you don't know how many generations are attached to the base when the last execution of the job is done.
Quote:
3. Does HLQ.SECOND.GDG.BKUP.D&LYYMMDD change the date automatically, so that a new dataset is built for each date?
Yes, EZACFSM1 takes care of that.
Quote:
If I just add your step to my regular job, will it submit an inbuilt job that creates the backup dataset?
Yes, and I hope that does not break the 'rules of the game'; but as long as it 'works' for you, all is well.
Quote:
Instead of giving the second GDG base to copy, can I give the generation created in my regular job?
I'd say: experiment around it.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
And for the record: it's generation(s), not version(s)!
Gary McDowell
Active User
Joined: 15 Oct 2012 Posts: 139 Location: USA
Karthikeyan Subbarayan wrote:
@Gary: I am already using a GDG in my regular job. Also, we will not know how many times the regular job runs in a day. Having a new GDG would require scheduling a new job to run at end of day, and my user does not want a new job.
You do not need a new job for a new output.file.DAILY.GDG, just a new step in your current job. The new step can have a condition code [i.e. COND=(0,LE)] so that it is bypassed on "regular job" runs; then take the condition code out on the last run of the day (or as close to midnight as possible).
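A sketch of such a step, with example names: since every return code satisfies 0 LE RC, the COND shown bypasses the step on every normal run; removing it (manually or via the scheduler) on the day's last run lets the back-up execute.
Code:
//* Bypassed on normal runs; remove COND on the last run of the day
//DAILYBK  EXEC PGM=IEBGENER,COND=(0,LE)
//SYSUT1   DD DSN=OUTPUT.FILE.GDG,DISP=SHR
//SYSUT2   DD DSN=OUTPUT.FILE.DAILY.GDG(+1),DISP=(,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(10,10),RLSE)
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY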
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Thanks, all! I will experiment and see which approach suits me best.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
Good luck, and keep us posted!
Karthikeyan Subbarayan
New User
Joined: 24 Feb 2008 Posts: 62 Location: Boston
Thanks a lot, Anuj. I am currently checking the licensing for the EZACFSM1 utility. Once approval is confirmed, I will proceed with using EZACFSM1.
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
EZACFSM1 is part of Communications Server.