Ralph260260
New User
Joined: 20 Oct 2014 Posts: 1 Location: Germany
Hello,
this is my first post here in this forum.
I'm 54 years old and I'm writing from Germany.
I've been doing mainframe jobs for over 20 years.
Today we had this problem on board:
A man in job preparation scheduled a job a second time, and so a dataset was sent to a SAP system in duplicate.
Now we have a meltdown here.
Headquarters asked me how we can prevent this kind of error in the future.
1st idea: give the job a new design: copy the input file to another dataset name and delete the input.
But we still have the human factor to deal with.
What facilities are available in OPC or RVS to block duplicate incoming or outgoing files?
Or is there a parameter to set on this job, so that the job can run only once a day?
Does somebody have some helpful ideas for me?
That would be great.
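The first idea could be sketched in JCL roughly like this. This is only an illustration; all dataset names here are invented, and the exact allocation parameters would depend on the site. The point is that the copy must succeed before the input is deleted, so a rerun finds no input to send.

```jcl
//* Sketch of idea 1 (dataset names are invented examples):
//* copy the input to a work dataset, then delete the input.
//COPY    EXEC PGM=IEBGENER
//SYSUT1   DD DSN=PROD.SAP.INPUT,DISP=SHR
//SYSUT2   DD DSN=PROD.SAP.INPUT.COPY,
//            DISP=(NEW,CATLG,DELETE),
//            LIKE=PROD.SAP.INPUT
//SYSPRINT DD SYSOUT=*
//SYSIN    DD DUMMY
//* Delete the original only if every previous step got RC=0.
//DELETE  EXEC PGM=IDCAMS,COND=(0,NE)
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DELETE PROD.SAP.INPUT
/*
```

A rerun of the send job would then fail at allocation of the (now deleted) input, instead of silently sending the file a second time.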
PeterHolland
Global Moderator
Joined: 27 Oct 2009 Posts: 2481 Location: Netherlands, Amstelveen
Hello Ralph, and welcome here.
First, fire the person responsible.
After an application is defined in OPC, nobody should be allowed to run a completed production job from OPC, or outside of OPC.
Your organization has to take care of that.
If the job triggered from OPC fails, it should be restarted only by people specialised in that, and only after the person responsible for the application has taken all recovery actions (such as checking whether a duplicate dataset has been sent).
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3051 Location: NYC,USA
Ralph,
1) We submit the jobs' scheduling conditions through Changeman, and the OPC scheduling team ensures that the job does not run more than once a day.
2) If you know the arrival time of the file, e.g. 11 am, then set up a time-dependent job that triggers at 12 noon (one hour of buffer), so that even if another file comes in later it won't be processed that day.
3) If you have DB2, create a control table holding the job name and a run timestamp. If a row already exists for this job and today's date, it means this is the second run today, so bypass everything and complete the job without sending anything.
4) Lastly, use LISTCAT to find the creation date of the file; if it is today's date, you can bypass the remaining steps and not send anything to SAP.
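Idea 3 might look roughly like this as a JCL step running the DSNTEP2 sample program. The table name (JOBCTL.RUNLOG), job name, and DB2 subsystem id (DSN1) are invented for illustration, and the plan name for DSNTEP2 varies by site; this is a sketch, not a tested implementation.

```jcl
//* Sketch of idea 3 (table, job and subsystem names invented):
//* query a DB2 control table for a row from today's date.
//CHKRUN  EXEC PGM=IKJEFT01
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  SELECT JOBNAME, RUN_TS
    FROM JOBCTL.RUNLOG
   WHERE JOBNAME = 'SAPSEND1'
     AND DATE(RUN_TS) = CURRENT DATE;
/*
//SYSTSIN  DD *
  DSN SYSTEM(DSN1)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2)
  END
/*
```

On the first run of the day the job would also insert a row (e.g. `INSERT INTO JOBCTL.RUNLOG VALUES ('SAPSEND1', CURRENT TIMESTAMP)`); a small checking program or condition-code logic would then let later steps bypass the send when today's row already exists.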
Gabriel Araujo Alves
New User
Joined: 20 Jul 2010 Posts: 38 Location: Brazil
Welcome Ralph,
there are lots of ways to enforce a unique execution per day.
Actually your schedule shouldn't allow it, but you can write a program that controls the executions based on the ODATE, preventing these duplicate datasets.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8797 Location: Welsh Wales
One of the ways I've done it in the past is for the job to set up a dummy dataset, with no delete at the end of the job.
The dummy was only deleted once all dependent jobs / applications had completed, or at a set time of day.
Subsequent runs outside of the permitted time scale / application cycle would fail with a JCL error.
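This guard step can be sketched as a first step that allocates a "run marker" dataset as NEW. On a rerun the marker is already cataloged, so (on an SMS-managed system) the allocation fails and the job ends in a JCL error before anything is sent. The marker dataset name below is an invented example.

```jcl
//* Sketch of the dummy-dataset guard (marker name invented).
//* First run:  marker does not exist, allocation succeeds.
//* Rerun:      marker already cataloged -> JCL error, job stops.
//GUARD   EXEC PGM=IEFBR14
//MARKER   DD DSN=PROD.SAPSEND.RUNFLAG,
//            DISP=(NEW,CATLG,DELETE),
//            SPACE=(TRK,(1,0)),UNIT=SYSDA
```

A housekeeping job (or the last dependent application) deletes the marker at the end of the cycle or at a fixed time of day, which re-enables the next day's run.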