Someone in job preparation/scheduling started a job a second time,
and so a dataset was sent to an SAP system twice.
Now we have a real meltdown here.
I was asked by headquarters how we can prevent this kind of error in the future.
1. Idea: give the job a new design: copy the input file to another file name and then delete the input (rough sketch below).
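Roughly what I mean, as a REXX sketch that could run as the last step of the job (under IKJEFT01 in batch); the dataset names are only placeholders, not our real ones:

/* REXX - copy the input to a 'done' dataset, then delete the input, */
/* so that an accidental second run finds no file to send.           */
in   = "'PROD.SAP.INPUT'"            /* placeholder dataset names    */
done = "'PROD.SAP.INPUT.DONE'"
"ALLOC FI(IN) DA("in") SHR REUSE"
"ALLOC FI(OUT) DA("done") LIKE("in") NEW CATALOG REUSE"
"EXECIO * DISKR IN (STEM rec. FINIS"          /* read the input      */
readrc = rc
"EXECIO" rec.0 "DISKW OUT (STEM rec. FINIS"   /* write the copy      */
writerc = rc
"FREE FI(IN OUT)"
IF readrc = 0 & writerc = 0 THEN
  "DELETE" in       /* remove the original only after a clean copy   */
ELSE
  EXIT 8            /* on any error leave everything in place        */
EXIT 0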
But we still have the human factor, the blunderer.
So what facilities are available in OPC or in RVS to block duplicate incoming or outgoing files?
Or is there a parameter that can be set on the job so that it can run only once a day?
Hello Ralph, and welcome here.
First, fire the person responsible.
Once an application is defined in OPC, nobody should be allowed to rerun a completed production job, whether from OPC or outside of OPC.
Your organization has to take care of that.
If the job triggered from OPC fails, it should be restarted only by people
specialised in that, and only after the person responsible for the application has taken all recovery actions (such as checking whether a duplicate dataset may have been sent).
1) We submit the jobs' scheduling conditions via Changeman, and the OPC scheduling team makes sure that the job does not run more than once a day.
2) If you know the arrival time of the file, say 11 am, then set up a time-dependent job that triggers at 12 noon (one hour of buffer), so that even if a second file arrives later, the job will not execute again.
3) If you have DB2, we usually create a control table for this with a timestamp and the job name in it. If a row already exists for this job and today's date, it is the second run of the day, so we bypass everything and just complete the job (see the DSNREXX sketch after this list).
4) Last one: use LISTCAT to get the creation date of the file; if it is today's date, you can bypass the other steps and not send anything to SAP (a sketch for this follows below as well).
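For idea 3, here is a rough sketch of that DB2 check using DSNREXX; the table DB2CTL.RUN_LOG, the subsystem DB2P and the job name are just examples, not anything from your system:

/* REXX - exit 4 when a row for this job and today already exists    */
jobname = 'SAPFEED1'                            /* placeholder name   */
ADDRESS TSO "SUBCOM DSNREXX"                    /* host cmd present?  */
IF rc <> 0 THEN x = RXSUBCOM('ADD','DSNREXX','DSNREXX')
ADDRESS DSNREXX "CONNECT DB2P"                  /* placeholder ssid   */
sqlstmt = "SELECT COUNT(*) FROM DB2CTL.RUN_LOG",
          "WHERE JOBNAME = '"jobname"' AND DATE(RUN_TS) = CURRENT DATE"
ADDRESS DSNREXX "EXECSQL DECLARE C1 CURSOR FOR S1"
ADDRESS DSNREXX "EXECSQL PREPARE S1 FROM :sqlstmt"
ADDRESS DSNREXX "EXECSQL OPEN C1"
ADDRESS DSNREXX "EXECSQL FETCH C1 INTO :runs"
ADDRESS DSNREXX "EXECSQL CLOSE C1"
ADDRESS DSNREXX "DISCONNECT"
IF runs > 0 THEN EXIT 4   /* second run today - tell JCL/OPC to bypass */
/* first run of the day: INSERT today's row here, then carry on        */
EXIT 0

The later steps of the job can then be skipped with something like COND=(4,EQ,CHKSTEP), or with a matching OPC recovery rule.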
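And for idea 4, the same kind of check against the creation date; this sketch uses LISTDSI instead of parsing LISTCAT output, again with a placeholder dataset name:

/* REXX - exit 4 when the file's creation date is already today       */
dsn = "'PROD.SAP.INPUT'"                        /* placeholder        */
IF LISTDSI(dsn) > 4 THEN EXIT 8                 /* not found etc.     */
today = LEFT(DATE('S'),4)'/'RIGHT(DATE('D'),3,'0')  /* yyyy/ddd       */
IF SYSCREATE = today THEN DO                    /* set by LISTDSI     */
  SAY 'File was already created today - nothing sent to SAP'
  EXIT 4                                        /* bypass the send    */
END
EXIT 0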
There are lots of ways to enforce a unique execution per day.
Actually your schedule shouldn't allow it in the first place, but you can also write a program that controls the executions based on the ODATE, preventing these duplicate datasets.