
How to track incoming input files that trigger a job?


IBM Mainframe Forums -> JCL & VSAM

shreya19 (New User)
Joined: 13 Mar 2014 | Posts: 34 | Location: USA
Posted: Thu Feb 25, 2016 6:15 pm

I receive around 100 files each day.
A job is triggered when a file is received; it processes the file and then deletes the input file. The output of the job is available in SAR.

Now I have to track whether all the files have been received each day. What would be the simplest way to do this?
The set of files received is not the same every day.

What I thought of: since the trigger job runs as soon as a file is received, add a step to it that copies the input file name to a PS, appending each file name to the same PS. Then manually create a list of expected files and compare the PS against that list.
Issues faced:
1. Since the file list is not the same for each day, how do I check what day it is, in order to compare the PS with that particular day's list?
2. Since the input file is deleted by the trigger job, testing would be difficult.
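
The kind of append step I had in mind is roughly this (an untested sketch; the tracking dataset name is only a placeholder, and the incoming file's DSN would have to be passed to the exec by the trigger job somehow):

/* REXX - untested sketch: append one received file name to a        */
/* tracking PS. 'MY.TRACK.FILELIST' is only a placeholder.           */
arg infile .                         /* DSN of the file just received */
if infile = '' then exit 8           /* nothing passed, nothing to log*/
rec.1 = infile                       /* one record = one file name    */
"ALLOC FI(TRACK) DA('MY.TRACK.FILELIST') MOD REUSE"  /* MOD = append  */
"EXECIO 1 DISKW TRACK (STEM rec. FINIS"
"FREE FI(TRACK)"
exit 0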

Any simpler approach?

Abid Hasan (New User)
Joined: 25 Mar 2013 | Posts: 88 | Location: India
Posted: Thu Feb 25, 2016 7:09 pm

Hello,

Going by the solution you already have:
a. The point where the file is received can be enhanced to write the received file name to a dataset allocated with DISP=MOD, so that each file name is appended after the previous one.
b. I am assuming you have a checkpoint at which you know for sure that 'all 100 or so' files have been received. At that point, execute a simple REXX which would:
1. Calculate the day of the week.
2. Compare your earlier-created dataset against 'that' day's expected file list, which can be pre-saved in another dataset or a PDS member - your choice. (A rough sketch follows below.)
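
A rough REXX sketch of (b), assuming the expected lists are kept as weekday-named members of a PDS and the trigger jobs have been appending names to a tracking PS (all dataset and member names below are placeholders):

/* REXX - rough sketch only; dataset and member names are assumptions */
day    = date('W')                    /* e.g. 'Thursday'               */
member = translate(left(day,3))       /* -> 'THU', expected-list member*/

"ALLOC FI(EXPECT) DA('MY.EXPECT.LISTS("member")') SHR REUSE"
"EXECIO * DISKR EXPECT (STEM exp. FINIS"
"FREE FI(EXPECT)"

"ALLOC FI(ACTUAL) DA('MY.TRACK.FILELIST') SHR REUSE"
"EXECIO * DISKR ACTUAL (STEM act. FINIS"
"FREE FI(ACTUAL)"

got. = 0                              /* note every file that arrived  */
do i = 1 to act.0
   key = translate(strip(act.i))
   got.key = 1
end

missing = 0                           /* report expected-but-missing   */
do i = 1 to exp.0
   key = translate(strip(exp.i))
   if key = '' then iterate
   if got.key = 0 then do
      say 'MISSING:' key
      missing = missing + 1
   end
end
if missing > 0 then exit 8            /* non-zero RC to flag the step  */
exit 0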

I remember working on a similar solution in my last engagement; we had NDM there, so getting the list of files received on a given day into a PS file was a simple task. If you have a similar setup, then it can be done without much effort - provided you have an identifiable list of files that you're expecting on a given day.

Hope this helps.

Willy Jensen (Active Member)
Joined: 01 Sep 2015 | Posts: 734 | Location: Denmark
Posted: Thu Feb 25, 2016 7:16 pm

1. Add a step to the triggered job that adds the date in front of the file name in your flat file, in a format suitable for sorting and filtering (e.g. YYYYMMDD).
2. Generate a list of expected files, also with the date at the front.
3. Use a compare program, e.g. ISPF SUPERC, to outline the differences.
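
A rough sketch of step 2, the expected-file side (dataset names are placeholders; step 1 would build the 'received' file the same way, and the SUPERC step then compares the two):

/* REXX - rough sketch of step 2; dataset names are placeholders.     */
"ALLOC FI(INLIST) DA('MY.EXPECT.LIST') SHR REUSE"
"EXECIO * DISKR INLIST (STEM in. FINIS"
"FREE FI(INLIST)"

today = date('S')                     /* sortable YYYYMMDD             */
do i = 1 to in.0
   out.i = today strip(in.i)          /* e.g. '20160225 MY.INPUT.F1'   */
end

"ALLOC FI(OUTLIST) DA('MY.EXPECT.TODAY') OLD REUSE"
"EXECIO" in.0 "DISKW OUTLIST (STEM out. FINIS"
"FREE FI(OUTLIST)"
exit 0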

Terry Heinze (JCL Moderator)
Joined: 14 Jul 2008 | Posts: 1248 | Location: Richfield, MN, USA
Posted: Thu Feb 25, 2016 8:41 pm

I'd try Willy's solution, but in step 1 I'd write to a VSAM file with a key consisting of the current date and dataset name. I might include a sequence number as part of the key in case two files with the same name arrived on the same day.

shreya19 (New User)
Joined: 13 Mar 2014 | Posts: 34 | Location: USA
Posted: Fri Feb 26, 2016 11:36 am

Thanks everyone for the replies!

Abid, we use NDM too. Did you use a different approach for files received via NDM? If so, please suggest it.

Abid Hasan (New User)
Joined: 25 Mar 2013 | Posts: 88 | Location: India
Posted: Fri Feb 26, 2016 12:10 pm

Hello Shreya,

shreya19 wrote:

Did you use a different approach for files received via NDM? If so, please suggest it.

It has been a while since I worked with NDM, but from what I remember, NDM writes a log of all the activities it performs.
One of the sites I worked for earlier used to dump the NDM statistics log into GDG versions at business end-of-day, every day. We would extract the file-transmission (to/from) data from the NDM statistics log for a specific date/time interval (using REXX or COBOL - programmer's choice; we were using REXX).

This would give us a PS file with all the file names that were handled by NDM during the specified interval. The data was formatted to give:
a. Business path name (LAN/server etc.)
b. Mainframe file/dataset name
c. Date(s) on which NDM received/completed the request to process the file
d. Time(s) at which the process was initiated and completed
e. Record count

We'd receive an Excel sheet from the business with the list of files they had sent us on a given business day; this would be uploaded to the mainframe as a CSV and reformatted to give file name, date/time stamps and record count.
A final step would compare the NDM extract dataset with the business-provided dataset; and voila.
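
For illustration only, the final compare could look something like this in REXX, assuming the business CSV comes in as 'file-name,date,time,record-count' and the NDM extract has the dataset name as the first word of each record (both layouts and all dataset names here are assumptions - your extract will look different):

/* REXX - rough sketch only; CSV layout, field order and dataset      */
/* names are assumptions, adjust to what your site really produces.   */
"ALLOC FI(NDMLIST) DA('MY.NDM.EXTRACT') SHR REUSE"
"EXECIO * DISKR NDMLIST (STEM ndm. FINIS"
"FREE FI(NDMLIST)"

"ALLOC FI(BUSLIST) DA('MY.BUSINESS.CSV') SHR REUSE"
"EXECIO * DISKR BUSLIST (STEM csv. FINIS"
"FREE FI(BUSLIST)"

seen. = 0                            /* datasets NDM actually handled  */
do i = 1 to ndm.0
   parse var ndm.i dsn .             /* assume DSN is the first word   */
   dsn = translate(strip(dsn))
   seen.dsn = 1
end

problems = 0                         /* business says sent, NDM silent */
do i = 1 to csv.0
   parse var csv.i fname ',' fdate ',' ftime ',' fcount
   fname = translate(strip(fname))
   if fname = '' then iterate
   if seen.fname = 0 then do
      say 'NOT RECEIVED:' fname 'sent' fdate ftime 'records' fcount
      problems = problems + 1
   end
end
if problems > 0 then exit 8          /* flag so an exception report runs */
exit 0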

In case of exceptions, a report was written; otherwise, the batch stream was released.

Did a quick search, came up with this; see if it is of any help to you.

Hth.

shreya19 (New User)
Joined: 13 Mar 2014 | Posts: 34 | Location: USA
Posted: Fri Feb 26, 2016 2:13 pm

Thanks Abid! This helps a lot. It's a very simple approach and gives a good direction to proceed in.