kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
Hi!
I have a PS file which contains the list of all the flat files for which I need to get record counts.
INPUT: ---> this is the PS file containing the names of the other GDG/flat files
Filename
======
ABCD.DATA.FILE1.G0001V00
ABCD.DATA.FILE2.G0005V00
OUTPUT:
Filename Count
====== ========
ABCD.DATA.FILE1.G0001V00 0000000050
ABCD.DATA.FILE2.G0005V00 0000001000
Since there are a huge number of GDG/flat files mentioned in the PS (in the 1000s!), it's difficult to write individual steps to get the record count of each and every file mentioned using the code below:
//SYSIN DD *
* COPY THE INPUT AND WRITE ONLY A TRAILER LINE WITH THE RECORD COUNT
  OPTION COPY
  OUTFIL REMOVECC,NODETAIL,
    TRAILER1=(20:'NUMBER OF RECORDS',COUNT=(M10,LENGTH=8))
//*
Also, the content of the PS file (which contains the list of files for which record counts are required) keeps changing every day. Is there a way to get the record counts for a huge set of flat files mentioned in a PS?
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2002 Location: UK
Yes - you could use Rexx or SORT or a program to read your input file and generate the necessary job steps/control cards.
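Something along these lines, for example - a rough, untested sketch only. Every dataset, DD and member name below is just a placeholder, and it assumes each record of the list holds exactly one dataset name:

/* REXX - sketch only: read a list of dataset names and build one  */
/* record-count SORT step per dataset. All names are placeholders. */
"ALLOC FI(FILELIST) DA('YOUR.LIST.PS') SHR REUSE"
"EXECIO * DISKR FILELIST (STEM dsn. FINIS"
"FREE FI(FILELIST)"

j = 0
do i = 1 to dsn.0
  name = strip(dsn.i)
  if name = '' then iterate            /* skip blank records        */
  call addline "//CNT"right(i, 4, '0')" EXEC PGM=SORT"
  call addline "//SYSOUT   DD SYSOUT=*"
  call addline "//SORTIN   DD DISP=SHR,DSN="name
  call addline "//SORTOUT  DD SYSOUT=*"
  call addline "//SYSIN    DD *"
  call addline "  OPTION COPY"
  call addline "  OUTFIL REMOVECC,NODETAIL,"
  call addline "    TRAILER1=('"name"',"
  call addline "      50:COUNT=(M10,LENGTH=10))"
  call addline "//*"
end
jcl.0 = j

/* Write the generated steps to an FB-80 member (placeholder name) */
"ALLOC FI(JCLOUT) DA('YOUR.GEN.JCL(COUNTS)') SHR REUSE"
"EXECIO" jcl.0 "DISKW JCLOUT (STEM jcl. FINIS"
"FREE FI(JCLOUT)"
exit 0

addline: procedure expose j jcl.
  parse arg text
  j = j + 1
  jcl.j = text
return

A JOB card still has to go on the front, and with files numbering in the 1000s the steps would have to be split over several generated jobs, since one job can hold at most 255 steps.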
prino
Senior Member
Joined: 07 Feb 2009 Posts: 1101 Location: Oostende, Belgium
kitchu84 wrote:
I have a PS file which contains the list of all the flat files for which I need to get record counts.
.
.
Since there are a huge number of GDG/flat files mentioned in the PS (in the 1000s!), it's difficult to write individual steps to get the record count of each and every file mentioned using the code below:
.
.
Also, the content of the PS file (which contains the list of files for which record counts are required) keeps changing every day. Is there a way to get the record counts for a huge set of flat files mentioned in a PS? Please advise.
Maybe you should first explain why the flipping hell you actually need to be doing this?
Having just a mere 25 years of experience on mainframes, I've never come across a requirement that seems as ridiculous as this... (Unless it's a ploy to charge your customers for CPU time)
superk
Moderator Team Head
Joined: 26 Apr 2004 Posts: 4650 Location: Raleigh, NC, USA
Prino, by my count, I think there have been about a dozen topics I've read in the past month all on this same theme. Never an explanation, and most seem to end up unresolved.
I don't understand this statement:
Quote:
... it's difficult to write individual steps to get the record count of each and every file mentioned ...
Why is it difficult? Once you define the process, the machine writes and then executes the code automagically. It's a process called automation.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8657 Location: Back in jolly old England
But surely if the programs creating these datasets were correctly written, they would display a count of records created, and any programs reading these files would display a count of records read.
Why read the files again just to get the counts when, if things had been done correctly in the first place, there would be no need?
Bill Woodger
DFSORT Moderator
Joined: 09 Mar 2011 Posts: 7314
If you really want to go ahead anyway, at least pick a sub-set of the files first, so the user gets a feel for what you think their requirements are. Say a sub-set of 50 files or so; you can do that all by hand/quick-and-dirty automation using what you know already.
BUT
I am also at a loss as to why anyone would want to do this. In what way would it be useful to know how many records are on each of a vast number of files?
It seems from your comment about the input file-of-files changing daily, that the intention is to run it daily. I'd like to see the scheduler dependencies for the job(s)! Or will it not matter if some files are from yesterday and some are from today, or haven't been updated for a week, quarter, year, or from a job which abended and has not yet completed, whatever.
You would also need to collate all the outputs. Then stick them in a spreadsheet so that someone can deal with all the flaky figures in a convenient way.
We can continue to pick holes in your request (I'm sure we can find lots more), but it would be more fruitful if we knew what the business requirement was - forget "how to do it" for now, and let us know "what the user actually wants".
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6970 Location: porcelain throne
His user is probably like the manager who used to be here.
Any time a copybook or model is no longer used,
normal shops remove the element from the production library.
But to him,
that meant one less module or copybook
that he could tell others he was responsible for.
enrico-sorichetti
Global Moderator
Joined: 14 Mar 2007 Posts: 10457 Location: italy
Quote:
I am also at a loss as to why anyone would want to do this. In what way would it be useful to know how many records are on each of a vast number of files?
Bill, have You noticed how many threads dealing with counting are around here?
It also looks as if applications are shooting out zillions of duplicate records
(again judging from the number of threads dealing with getting rid of duplicates),
but maybe the two issues are really tied together!
In my more than 30 years of IT practice I have had to deal with getting rid of duplicates at most five times, and then only in real emergency situations with close scrutiny on data reconciliation.
Bill Woodger
DFSORT Moderator
Joined: 09 Mar 2011 Posts: 7314
enrico, you make an interesting point. You're suggesting that he might be wanting the record counts from thousands of files so that he can do some sort of reconciliation to see where data is being lost or duplicated?
Yoiks. It actually makes sense. I had considered Dick's point, but then thought someone like that must prefer Cylinders to records. I thought of a press release (XYZ Mainframe Services processes billionth transaction this year!), but then why daily (although the daily was only my guess, I suppose)?
And you think all these duplicates requests and counting requests might be the same sort of thing? Mmmm....
OK, lots of self-reconciliation retro-fitted. Outsource it to me, "Old School" solution, 500 euros per program. Your own solution, TBD euros per program.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8657 Location: Back in jolly old England
I actually worked on a project where the count reconciliations were automated rather than having the operators fill in paper sheets from counts displayed on the screen.
Back in 1979, that was.
Bill Woodger
DFSORT Moderator
Joined: 09 Mar 2011 Posts: 7314
You learn, don't you? It's a computer. Get it to do all the (meaningful, I guess I'd better add, given the thread) work it can.
kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
Hi All,
By "difficult" I meant that it would not be advisable to write code to get the count for each and every file referred to in the job. Since these jobs were created long ago, the programs which created the datasets do not have logic to display the record counts.
My client's requirement was to display the details of the jobs run - job id, job name, start time, end time, return code etc. - and all the file names used in each job with their respective counts. The job details could be fetched from a SAR unload, and similarly the file names, but I don't see the file counts being displayed unless a SORT step is used or the programs used in the job have the logic to display the count.
I believe that under such circumstances there might be a better solution than what I am thinking of (i.e. displaying the count of each and every file). I need your feedback and suggestions. Please advise how to display counts in a case where we don't have programs taking care of displaying the count and we have multiple files used in the job.
Thanks,
Skolusu
Senior Member
Joined: 07 Dec 2007 Posts: 2205 Location: San Jose
kitchu84,
Please answer all of the questions below:
1. What is the LRECL and RECFM of the file that contains the DSN names?
2. Do all the files in the list have the same DCB properties (LRECL, RECFM, ...)?
3. Do you have the authority to submit a job via INTRDR?
4. What is the maximum number of files you can have in the list file?
kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
Hi Skolusu,
1. The file that contains the DSN names has LRECL=150 and RECFM=FB.
2. No, the files in the list have different DCB properties.
3. Unfortunately no, we do not have access to submit a job via INTRDR.
4. The list file can contain at most 1000 files.
Thanks,
Priyanka.
Skolusu
Senior Member
Joined: 07 Dec 2007 Posts: 2205 Location: San Jose
kitchu84 wrote:
Hi Skolusu,
3. Unfortunately no, we do not have access to submit a job via INTRDR.
Thanks,
Priyanka.
A DFSORT solution is ruled out as it involves generating the JCL and submitting it to the INTRDR.
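For the record, the last mile of that kind of solution is usually just an allocation of the internal reader and a write to it. A rough, untested REXX sketch of that step only - the DD name is invented, it assumes the generated job (JOB card plus steps) is already sitting in a stem called jcl., and it is not necessarily the DFSORT-driven variant Skolusu has in mind:

/* REXX - sketch: send an already-generated job (JOB card plus     */
/* steps, held in the stem jcl.) straight to the internal reader.  */
/* Needs the INTRDR authority that is missing here.                */
"ALLOC FI(JCLRDR) SYSOUT(A) WRITER(INTRDR) RECFM(F) LRECL(80) REUSE"
"EXECIO" jcl.0 "DISKW JCLRDR (STEM jcl. FINIS"
"FREE FI(JCLRDR)"                      /* freeing passes the job to JES */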
kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
Hi Skolusu,
Still, if you could let us know the solution, it will help us learn.
Thanks,
Priyanka.
Bill Woodger
DFSORT Moderator
Joined: 09 Mar 2011 Posts: 7314
kitchu84 wrote:
[...]
I believe that under such circumstances there might be a better solution than what I am thinking of (i.e. displaying the count of each and every file). I need your feedback and suggestions. Please advise how to display counts in a case where we don't have programs taking care of displaying the count and we have multiple files used in the job.
[...]
What do your boss and his boss and his boss think of all this so far? Somebody has to know whether it is possible to go back to the client and see if there is any room for compromise. What about just displaying "major" file counts, rather than all of them? What about looking for a "space used" solution? (Is the client interested from a "charging" point of view, from a systems reconciliation point of view, or what?)
At the end of the day, you have the possibility of taking your file-of-files, Rexx and a JCL "template", and generating a number of huge jobs with lots of steps where you have filled in the dataset name in the step-template and tagged them all together - see the sketch at the end of this post.
I'd see if it was possible to get anything out of the client first.
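To make the template idea concrete, a very rough and untested sketch. The skeleton member, the &DSN marker and every name are invented for illustration; it assumes the list of dataset names has already been read into the stem dsn. as in the earlier sketch, and that the skeleton member holds one counting step with &DSN where the dataset name belongs:

/* REXX - sketch: fill a one-step JCL skeleton for every dataset.  */
/* The skeleton member and the &DSN marker are placeholders.       */
"ALLOC FI(SKEL) DA('YOUR.JCL.SKELS(CNTSTEP)') SHR REUSE"
"EXECIO * DISKR SKEL (STEM skel. FINIS"
"FREE FI(SKEL)"

o = 0
do i = 1 to dsn.0                      /* dsn. read as shown earlier */
  do k = 1 to skel.0
    line = skel.k
    p = pos('&DSN', line)              /* first occurrence only      */
    if p > 0 then do
      line = left(line, p - 1) || strip(dsn.i) || substr(line, p + 4)
    end
    o = o + 1
    out.o = line
  end
end
out.0 = o                              /* out. now holds all steps   */

Adding the JOB cards and splitting the output so that no generated job goes over the 255-step limit is left out of the sketch.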
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1786 Location: Bloomington, IL
kitchu84 wrote:
3. Unfortunately no, we do not have access to submit a job via INTRDR.
Is Rexx available to you?
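If it were, even a brute-force count is only a few lines per dataset. A rough, untested sketch - the dataset name is just the example from the top of the thread, and reading a whole file merely to count it would be slow for large ones:

/* REXX - sketch: count the records in one dataset by reading it   */
/* in blocks of 1000 records. Dataset name is only an example.     */
count = 0
"ALLOC FI(INDD) DA('ABCD.DATA.FILE1.G0001V00') SHR REUSE"
do until rc <> 0                       /* rc = 2 at end of file      */
  "EXECIO 1000 DISKR INDD (STEM rec."
  count = count + rec.0                /* rec.0 = records just read  */
end
"EXECIO 0 DISKR INDD (FINIS"           /* close the input            */
"FREE FI(INDD)"
say 'ABCD.DATA.FILE1.G0001V00' right(count, 10, '0')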
kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
Hello Skolusu,
It would be great if you could still suggest the solution you had in mind, assuming we had access to submit jobs through the INTRDR.
It will be useful from a learning perspective.
Thanks,
kitchu84
New User
Joined: 02 Dec 2006 Posts: 33 Location: chennai
@Akatsukami
No, we cannot use REXX either.