vinu78
Active User
Joined: 02 Oct 2008 Posts: 179 Location: India
Hi All,
My requirement is as follows.
I will manually submit JOB1, that automatically submits JOB2 (FTP step) in DEV region. Then we need to wait for approximately 10 min to get a flat dataset populated with data. Once the dataset has some data, we need to have JOB3 run automatically.
I have used SUBMIT job concept at the end of JOB1 that automatically submits JOB2. However I am not sure, how to make JCL wait or check for the dataset populated with data again and then automatically submit JOB3.
We can check empty dataset using IDCAMS, but please help me to understand, how to queue JCL to look for existence of data in dataset in DEV region.
Thanks
Vinu
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Unless you have a scheduler available in your DEV region, what you want to do cannot be done -- period. Batch jobs waiting are a VERY BAD IDEA, and there is no (easy) way to determine when a data set has data placed in it.
steve-myers
Active Member
Joined: 30 Nov 2013 Posts: 917 Location: The Universe
Robert Sample wrote: |
Unless you have a scheduler available in your DEV region, what you want to do cannot be done -- period. Batch jobs waiting are a VERY BAD IDEA, and there is no (easy) way to determine when a data set has data placed in it. |
"Batch jobs waiting are a VERY BAD IDEA" -- Agreed.
"there is no (easy) way to determine when a data set has data placed in it" -- Yes and no, though some agent has to be waiting and periodically checking:
- Check DS1LSTAR in the format 1 DSCB.
- Check DS1REFD in the format 1 DSCB.
- Intercept SMF writes. Assume the data set has been modified when a type 15 record appears for the data set.
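If the polling agent is a REXX exec, a rough approximation of the DS1LSTAR check is available through the TSO/E LISTDSI function, whose SYSUSED variable (space used, in SYSUNITS units) is derived from the format-1 DSCB. This is only a sketch: 'CHECK.DATASET' is a placeholder name, and whether a newly created, never-written data set reports 0 used units should be verified at your own site before relying on it.

```
/* ROUGH SKETCH - APPROXIMATE THE DS1LSTAR CHECK VIA LISTDSI.    */
/* 'CHECK.DATASET' IS A PLACEHOLDER NAME. VERIFY THAT AN         */
/* UNWRITTEN DATA SET REPORTS SYSUSED = 0 AT YOUR SITE.          */
SYSUSED = 0                          /* IN CASE LISTDSI FAILS    */
LRC = LISTDSI("'CHECK.DATASET'")
IF LRC = 0 & SYSUSED > 0 THEN
  SAY 'DATA SET APPEARS TO CONTAIN DATA'
ELSE
  SAY 'DATA SET APPEARS EMPTY (LISTDSI RC='LRC')'
```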
PeterHolland
Global Moderator
Joined: 27 Oct 2009 Posts: 2481 Location: Netherlands, Amstelveen
Quote: |
there is no (easy) way to determine when a data set has data placed in it.
Yes and no, though some agent has to be waiting and periodically checking
Check DS1LSTAR in the format 1 DSCB.
Check DS1REFD in the format 1 DSCB
Intercept SMF writes. Assume the data set has been modified when a type 15 record appears for the data set.
|
And that (using SMF/JES2) is exactly what a scheduler does.
Arun Raj
Moderator
Joined: 17 Oct 2006 Posts: 2481 Location: @my desk
vinu78 wrote: |
Then we need to wait for approximately 10 min to get a flat dataset populated with data |
Just curious -- what happens during these 10 minutes, and how did you arrive at that number?
vinu78
Active User
Joined: 02 Oct 2008 Posts: 179 Location: India
Thanks all for the information.
The data set normally gets populated with data within about 5-6 minutes after JOB2 has run. So another idea that came to mind (I will also check the DS1LSTAR and DS1REFD options) is whether we can start JOB3 exactly 10 minutes after running JOB2.
Please let me know whether it is possible to delay JOB3 by 10 minutes, even though JOB2 triggers it automatically.
Thanks
Vinu
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
How is the data set being populated?
vinu78
Active User
Joined: 02 Oct 2008 Posts: 179 Location: India
The data set gets populated by a 3rd-party system.
Our intention is that once the data set gets data (within 5-6 minutes), we run JOB3, whose input is this data set.
In short, I will submit only JOB1. It automatically submits JOB2, and JOB2 then submits JOB3. However, JOB3 should run exactly 10 minutes after it is submitted.
UmeySan
Active Member
Joined: 22 Aug 2006 Posts: 771 Location: Germany
Perhaps read about "event-triggered tracking" (ETT).
Regards, UmeySan
Arun Raj
Moderator
Joined: 17 Oct 2006 Posts: 2481 Location: @my desk
Since manual submission is required anyway, why not just submit JOB1 (and possibly have it submit the other two jobs) after you know the '3rd party data set' has arrived? (Assuming you don't have a scheduler available in DEV.)
daveporcelan
Active Member
Joined: 01 Dec 2006 Posts: 792 Location: Pennsylvania
I agree with everyone above: having a batch job wait is a bad idea.
However, it would be fairly easy to do with a REXX exec.
Assuming the data set in question has been created empty, the exec below would work.
Run it as an added step in your JCL, and use its return code to decide whether to run the submit step for JOB3.
If you instead allocate the check data set as DD CHECKFIL in your JCL, the ALLOC and FREE commands can be removed.
Code: |
/* REXX EXEC TO CHECK FOR DATA EXISTENCE                */
/* MAXRC = 0 - DATA FOUND                               */
/* MAXRC = 4 - DATA NOT FOUND, MAX_SECS REACHED         */
/*             (THE LIMIT PREVENTS AN INFINITE LOOP)    */
INREC. = ''
TOTAL_SECS = 0
SLEEP_SECS = 3
MAX_SECS = 20
MAXRC = 0
CHECK_FILE = 'CHECK.DATASET'
CALL SYSCALLS 'ON'          /* ENABLE Z/OS UNIX SYSCALLS FOR SLEEP */
"ALLOC DD(CHECKFIL) DA('"CHECK_FILE"') SHR REUSE"
"EXECIO 1 DISKR CHECKFIL (STEM INREC. OPEN FINIS)"
"FREE DD(CHECKFIL)"
SAY 'CHECKING' CHECK_FILE 'AFTER' TOTAL_SECS 'SECONDS'
IF INREC.1 = '' THEN DO FOREVER
  ADDRESS SYSCALL 'SLEEP' SLEEP_SECS   /* PAUSE BETWEEN CHECKS */
  TOTAL_SECS = TOTAL_SECS + SLEEP_SECS
  SAY 'CHECKING' CHECK_FILE 'AFTER' TOTAL_SECS 'SECONDS'
  "ALLOC DD(CHECKFIL) DA('"CHECK_FILE"') SHR REUSE"
  "EXECIO 1 DISKR CHECKFIL (STEM INREC. OPEN FINIS)"
  "FREE DD(CHECKFIL)"
  IF INREC.1 \= '' THEN DO
    MAXRC = 0
    LEAVE
  END
  ELSE IF TOTAL_SECS > MAX_SECS THEN DO
    MAXRC = 4
    LEAVE
  END
END
SAY 'CHKDATA COMPLETED MAXRC =' MAXRC
ZISPFRC = MAXRC             /* REQUIRES ISPF IN THE BATCH TSO STEP */
"ISPEXEC VPUT (ZISPFRC) SHARED"
EXIT (ZISPFRC)
|
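One way to wire such an exec into JOB2 is sketched below; the library, exec, and member names are placeholders of mine, not anything established in this thread. IKJEFT1B is used rather than IKJEFT01 because it passes the command's return code back as the step return code, so the IEBGENER step can submit JOB3 through the internal reader only when the check found data:

```
//CHKDATA  EXEC PGM=IKJEFT1B,DYNAMNBR=30
//SYSEXEC  DD DSN=MY.REXX.LIB,DISP=SHR         PLACEHOLDER LIBRARY
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
 %CHKDATA
/*
//SUBJOB3  EXEC PGM=IEBGENER,COND=(0,NE,CHKDATA)
//SYSUT1   DD DSN=MY.JCL.LIB(JOB3),DISP=SHR    PLACEHOLDER MEMBER
//SYSUT2   DD SYSOUT=(*,INTRDR)
//SYSIN    DD DUMMY
//SYSPRINT DD SYSOUT=*
```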
vinu78
Active User
Joined: 02 Oct 2008 Posts: 179 Location: India
Not sure whether I have explained it correctly.
JOB2 will FTP some data to the 3rd-party system, and only then does the 3rd-party system populate the data set. After that we need to run JOB3.
So at the time JOB2 is submitted, we can't check for data in the data set -- it will still be empty. The data set only gets data 5-6 minutes after JOB2 has run.
Dave -- thanks for the REXX code. Should the above-mentioned REXX code go at the end of JOB2?
daveporcelan
Active Member
Joined: 01 Dec 2006 Posts: 792 Location: Pennsylvania
I got it.
After the FTP step, add a step to execute this REXX exec.
Set your sleep seconds to 300 (5 minutes) and max seconds to 1200, so that checking stops after twenty minutes.
You should be able to take it from here (or not).
Terry Heinze
JCL Moderator
Joined: 14 Jul 2008 Posts: 1249 Location: Richfield, MN, USA
Won't the REXX step in JOB2 tie up the initiator until data arrives from the 3rd party?
daveporcelan
Active Member
Joined: 01 Dec 2006 Posts: 792 Location: Pennsylvania
Yep, it sure will. I am not saying it is a good idea -- it is not.
I was just showing that it is not very difficult to do.
What should be done is to use your scheduler to recognize the arrival of the file from the external site,
then have JOB3 submitted by the scheduler.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Another option would be to have the process that puts the data into the file use FTP, once the transfer completes, to submit JOB3 into JES (which can be done, depending upon the site options); no waits are required with this approach.
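As a sketch of that idea (the host name, credentials, and file names below are invented placeholders): the z/OS FTP server's SITE FILETYPE=JES option routes a subsequent PUT into the JES internal reader, so the sending process can transfer the data and then submit JOB3 in the same session:

```
ftp mvshost
user MYUSER MYPASS
put datafile 'PROD.INPUT.DATA'
quote site filetype=jes
put job3.jcl
quit
```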
vinu78
Active User
Joined: 02 Oct 2008 Posts: 179 Location: India
Thanks, Dave, for the clarification, and everybody for pouring in ideas.
I will try the REXX code. I also found some code like the one below:
Code: |
//STEP010 EXEC PGM=AOPBATCH,PARM='/sh/bin/sleep 600'
//STDOUT DD SYSOUT=*
//STDERR DD SYSOUT=*
//STDIN DD DUMMY |
Does this work? If yes, will having this step at the end of JOB2 hold resources for the specified 10 minutes?
daveporcelan
Active Member
Joined: 01 Dec 2006 Posts: 792 Location: Pennsylvania
Robert, that is an excellent approach. We use it for internal transfers from in-house servers.
Our shop would not allow it for transfers from an external site.
Vinu78, AOPBATCH lets you run a 'program' called sleep, found in a UNIX directory. It is much the same as what I wrote, but without the check for data: just sleep for 10 minutes and hope the file has been populated.
It will not use many resources, but it will tie up the initiator, preventing anything else from running there.
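For reference, the same unconditional delay is often coded with BPXBATCH instead of AOPBATCH (a sketch only; it carries the same drawback of holding the initiator for the whole wait):

```
//SLEEP10  EXEC PGM=BPXBATCH,PARM='SH sleep 600'
//STDOUT   DD SYSOUT=*
//STDERR   DD SYSOUT=*
```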