kiran_65 (New User, Joined: 01 Apr 2005, Posts: 46):
Hi Friends,
I have the following scenario:
One job performs an NDM transfer (getting a file from a server to the mainframe).
A second job processes that file.
I want to combine the two jobs into a single job.
When I tried combining them, I ran into the following problem:
The NDM step copies the file into a GDG. I referenced the same file later in the job as the (+1) generation, but the job ran using the previous (-1) generation instead. I also tried the (0) generation, but then it reports that the file is not present.
Note: the two jobs call different procs.
Please help me, and let me know if the query is not clear.
Thanks,
Kiran
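(For reference: within a single job, a new GDG generation created as (+1) is normally also read back as (+1) in later steps, because relative generation numbers only roll forward at end of job. A minimal sketch using the dataset name from this thread; the step, DD, and program names are illustrative, and it assumes the creating step actually allocates the generation inside the job. If the generation is instead catalogued by the NDM server outside the job, neither (+1) nor (0) will resolve reliably within the job.)

```
//NDMSTEP  EXEC PGM=MYNDM
//OUTFILE  DD DSN=D922.KHRSQ.PDETAIL.HOU.KHRD1920(+1),
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//*
//PROCESS  EXEC PGM=MYPGM
//INFILE   DD DSN=D922.KHRSQ.PDETAIL.HOU.KHRD1920(+1),DISP=SHR
```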
dbzTHEdinosauer (Global Moderator, Joined: 20 Oct 2006, Posts: 6966, Location: porcelain throne):
Let us see your JCL.
kiran_65 (New User, Joined: 01 Apr 2005, Posts: 46):
The JCL is as follows:
Code:
// EXEC KHRU1923
//S1 EXEC KHRU1924
//*
//STEP0060 EXEC PGM=IEBGENER,
// COND=(0,NE,S1.STEP0050)
//SYSIN DD DUMMY
//SYSPRINT DD SYSOUT=*
//*
//SYSUT1 DD DSN=SYS1.KROGER.CTLLIB(SOUCMAI2),DISP=SHR
// DD DSN=SYS1.KROGER.CTLLIB(PEYEMAI3),DISP=SHR
//*
//SYSUT2 DD SYSOUT=(B,SMTP),LRECL=80

The step EXEC KHRU1923 is the NDM step.
The step EXEC KHRU1924 is the proc that processes the file.
Let me know if this doesn't clarify things.
In KHRU1924, the file used is D922.KHRSQ.PDETAIL.HOU.KHRD1920(0).
I also tried D922.KHRSQ.PDETAIL.HOU.KHRD1920(+1), but then it reports that the file is not cataloged.
Thanks,
Kiran
superk (Global Moderator, Joined: 26 Apr 2004, Posts: 4652, Location: Raleigh, NC, USA):
This concept is never going to work. You need to re-think your approach. Keeping the two jobs is the best option, in my opinion.
vijaynagabhairava (New User, Joined: 13 Apr 2008, Posts: 3, Location: India):
Hi Kiran,
I am not sure whether the following logic works, but you can give it a try if you want:
"Instead of copying the file directly into the GDG, copy it to a temporary dataset (like &&SAMPLE), pass it to the next step, and process it there. Then add a simple copy step (using the IEBGENER utility) at the end of the job to copy the data from the temporary dataset to the GDG."
Once again, I am not sure whether it actually works; it is up to you whether to try it. If you do, please let me know whether it worked.
Thanks,
Vijay
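Vijay's suggestion could be sketched roughly as below. All step, DD, and program names here are placeholders, and (as he says) the approach is untested; it also assumes the receiving step can be pointed at a temporary dataset at all:

```
//STEP1    EXEC PGM=MYNDM              receive into a temporary dataset
//OUTFILE  DD DSN=&&SAMPLE,DISP=(NEW,PASS),
//            UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//STEP2    EXEC PGM=MYPGM              process the passed dataset
//INFILE   DD DSN=&&SAMPLE,DISP=(OLD,PASS)
//STEP3    EXEC PGM=IEBGENER           copy to the GDG at end of job
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//SYSUT1   DD DSN=&&SAMPLE,DISP=(OLD,DELETE)
//SYSUT2   DD DSN=D922.KHRSQ.PDETAIL.HOU.KHRD1920(+1),
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE),
//            DCB=*.SYSUT1
```

Note this only rearranges where the file lands inside the job; it does not by itself address the timing issue discussed elsewhere in the thread.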
satheeshkamal (New User, Joined: 09 Jan 2007, Posts: 28, Location: Chennai):
Could you post exactly how you refer to your file in proc KHRU1923 and in KHRU1924?
satheeshkamal (New User, Joined: 09 Jan 2007, Posts: 28, Location: Chennai):
Oops! Sorry... I think I should wear glasses...
Quote:
I also tried D922.KHRSQ.PDETAIL.HOU.KHRD1920(+1), but then it reports that the file is not cataloged.

Is this showing up during a JCL check?
If so, it won't give you a problem when you actually run the JCL.
kkxlnc (New User, Joined: 12 Jun 2005, Posts: 44, Location: Boston):
Hi,
You can't refer to the same file being NDM'ed in the same job that processes it. The problem is that the NDM process takes some time to completely transfer the file, so the file is not available for a few seconds or minutes. Your second step tries to find that file immediately, and so it never will.
Hence you need a delay of a few minutes between these steps. I would suggest you do it through different jobs with some delay between them.
Regards.
Peter Poole (New User, Joined: 07 Jan 2009, Posts: 50, Location: Scotland):
This has come up a few times recently, so...
What typically happens with a batch NDM job is that the NDM step goes to NDM and says "Do this please". The step then ends, saying "I've done my bit", and the next step runs.
NDM may or may not run successfully, and will take however long it takes to do whatever it's been asked to do, so you can't guarantee if or when the dataset you care about will be in place.
In a development or ad hoc environment, you're pretty much stuffed at this point. You'll need to wait and see the file in place before submitting the next step(s) as a separate job.
In a production environment, you may have a couple of further options, depending on the software you have installed. Most scheduling packages, such as CA-7, will allow you to define the cataloguing of a dataset as a trigger event, so you can tell it "When the NDM job has run and the dataset has been catalogued, submit the next job".
The more convoluted option is to use CA-OPS/MVS or another automated operations product to watch for the completion message from NDM and then submit the follow-up step(s).
In either case, I suggest you consult your local Ops Analysts to see which they prefer before re-inventing any wheels!
Cheers.
northman22 (New User, Joined: 02 Mar 2009, Posts: 3, Location: homer alaska):
You can set up the NDM process to submit another batch job based on the completion code of the NDM step. It's been a while, but it is something like:
if stepxxxx cond code=0000,
then submit job=xxx.xxxxx.xxx,
else
submit job=xxx.xxxxx.xxx
Look at the NDM manual; I may not have the exact commands right.
I used to have a batch job on CA-7 NDM a file to another site, then submit a CA-7 batch trailer job and post a job waiting on the host CA-7 that had a user comment in it. If the NDM failed, it would run another batch trailer job posting to the CA-7 job that the NDM had failed, so the operators could see it.
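From memory, the Connect:Direct (NDM) process language supports a conditional RUN JOB along these lines. The node, dataset, and member names below are placeholders, and the exact syntax should be checked against the Connect:Direct manual:

```
SENDFILE PROCESS SNODE=REMOTE.NODE
COPY01   COPY FROM (PNODE DSN=MY.INPUT.FILE) -
              TO (SNODE DSN=MY.OUTPUT.FILE DISP=(NEW,CATLG))
         IF (COPY01 EQ 0) THEN
            RUN JOB (DSN=MY.JCL.LIB(GOODJOB))
         ELSE
            RUN JOB (DSN=MY.JCL.LIB(FAILJOB))
         EIF
```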
northman22 (New User, Joined: 02 Mar 2009, Posts: 3, Location: homer alaska):
You also have to specify the site where you want the job submitted, just like for the NDM DSN. Like I said, it's been a while and I don't have access to the exact commands.