IBM Mainframe Forum Index
 
Need a program (any) to count records in huge files


IBM Mainframe Forums -> DFSORT/ICETOOL
soumyaranjan007

New User


Joined: 30 Aug 2006
Posts: 30
Location: mumbai

PostPosted: Tue Jul 27, 2010 12:35 pm

Hi,

I have a huge task: counting the records in some very large flat files and PDS members. Does anybody have a program that would reduce my effort?

Regards,
Soumya
Anuj Dhawan

Superior Member


Joined: 22 Apr 2006
Posts: 6250
Location: Mumbai, India

PostPosted: Tue Jul 27, 2010 3:11 pm

What are you looking for - number of records in some files or "count of files" itself?
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Tue Jul 27, 2010 7:47 pm

Hello,

Once gotten, how will the answers be used?
soumyaranjan007

New User


Joined: 30 Aug 2006
Posts: 30
Location: mumbai

PostPosted: Tue Jul 27, 2010 9:36 pm

I am looking for the "number of records in some files". I want a program I can use as a command.
sqlcode1

Active Member


Joined: 08 Apr 2010
Posts: 577
Location: USA

PostPosted: Tue Jul 27, 2010 9:44 pm

soumyaranjan007,

Did you say PDS files?

Use the job below to get the combined record count of multiple (concatenated) input files.

Code:
//SORT01  EXEC PGM=SORT                                                 
//SORTIN DD DISP=SHR,DSN=YOUR.INPUT.FILE1                               
//       DD DISP=SHR,DSN=YOUR.INPUT.FILE2                               
//       DD DISP=SHR,DSN=YOUR.INPUT.FILE3                               
//       DD DISP=SHR,DSN=YOUR.INPUT.FILE4                               
//SORTOUT DD SYSOUT=*                                                   
//SYSIN DD *                                                           
 OPTION COPY                                                           
 OUTFIL NODETAIL,REMOVECC,                                             
 TRAILER1=('RECORDS COUNT',COUNT=(M11,LENGTH=12))                       
//SYSOUT DD SYSOUT=*                                                   
//                                                                     
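For reference, ICETOOL also has a COUNT operator that does the same thing without OUTFIL statements. This is a sketch, assuming placeholder dataset names and a DFSORT level recent enough to support the WRITE and DIGITS options of COUNT:

```jcl
//COUNT01 EXEC PGM=ICETOOL
//TOOLMSG  DD SYSOUT=*
//DFSMSG   DD SYSOUT=*
//IN       DD DISP=SHR,DSN=YOUR.INPUT.FILE1
//CTOUT    DD SYSOUT=*
//TOOLIN   DD *
* COUNT THE RECORDS IN THE IN DD AND WRITE THE COUNT TO CTOUT
  COUNT FROM(IN) WRITE(CTOUT) DIGITS(12)
/*
```

Without WRITE, the count still appears in the TOOLMSG messages, so even older DFSORT levels can use plain `COUNT FROM(IN)`.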


Thanks,
soumyaranjan007

New User


Joined: 30 Aug 2006
Posts: 30
Location: mumbai

PostPosted: Wed Jul 28, 2010 2:25 pm

This job is taking a long time for a huge file, like 10 corors of records and more. Is there any other way to deil with those files??
PeterHolland

Global Moderator


Joined: 27 Oct 2009
Posts: 2481
Location: Netherlands, Amstelveen

PostPosted: Wed Jul 28, 2010 2:37 pm

What is a "coror" ? And "deil" ?
smijoss

Active User


Joined: 30 Aug 2007
Posts: 114
Location: pune

PostPosted: Wed Jul 28, 2010 2:50 pm

Check the CPU usage!!

Or, if it's a one-time activity, you can run it for one file at a time.

(I assume "deil" means DEAL.)
Anuj Dhawan

Superior Member


Joined: 22 Apr 2006
Posts: 6250
Location: Mumbai, India

PostPosted: Wed Jul 28, 2010 4:46 pm

PeterHolland wrote:
What is a "coror" ? And "deil" ?
S/He meant "crore", which is equivalent to* 10 million. And by "deil", possibly s/he meant "deal".

* These are local Indian terms for numbers; I've never heard them used in other parts of the world. E.g., 1 million is 10 lakhs.
sqlcode1

Active Member


Joined: 08 Apr 2010
Posts: 577
Location: USA

PostPosted: Wed Jul 28, 2010 5:29 pm

I have run this job for files with more than 100 million records and it worked just fine.

Wait until Frank or Kolusu responds, but I am not aware of any method that is more efficient. In any case, it would be useful if you post your entire sysout.

Thanks,
PeterHolland

Global Moderator


Joined: 27 Oct 2009
Posts: 2481
Location: Netherlands, Amstelveen

PostPosted: Wed Jul 28, 2010 6:33 pm

Anuj,

Thank you very much. So in an Indian restaurant I can order a crore of Tandoori chicken legs? :-)
Anuj Dhawan

Superior Member


Joined: 22 Apr 2006
Posts: 6250
Location: Mumbai, India

PostPosted: Wed Jul 28, 2010 6:53 pm

Sure, why not? :-)
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Wed Jul 28, 2010 9:56 pm

Hello,

Dare I ask . . .

Why does someone believe these record counts are needed?

If there is someone who actually believes these numbers will be used for something productive, the processes that create these files and the processes that use these files should be modified to generate these counts. . .
PeterHolland

Global Moderator


Joined: 27 Oct 2009
Posts: 2481
Location: Netherlands, Amstelveen

PostPosted: Wed Jul 28, 2010 11:39 pm

Hello Dick,

Quote:

If there is someone who actually believes these numbers will be used for something productive, the processes that create these files and the processes that use these files should be modified to generate these counts. . .



I grew up in environments like that: the file-generating software gave the record counts, hash totals, and all that stuff. The file-processing software only had to check those numbers.
Maybe nowadays companies don't want to do those basics anymore?
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Thu Jul 29, 2010 12:15 am

Hi Peter,

Quote:
I grew up in environments like that, the file generating software gave
the record number(s), hash totals and all that stuff.
Yup. And these were recorded manually in log books so that people could make sure the ending count/total from one run was the same as the beginning count/total of the next. . .

However, none of the organizations I've worked with for a very long time continue the practice. . . If anything like this is tracked, it is written to a "log" in a database or VSAM file.

If the TS would answer the question from many replies ago:
Quote:
Once gotten, how will the answers be used?
I believe we might be able to provide something workable that would not require passing all of the data. . .
soumyaranjan007

New User


Joined: 30 Aug 2006
Posts: 30
Location: mumbai

PostPosted: Fri Jul 30, 2010 4:03 pm

The above job was abending with ABEND S322 when I tried to run it for a 140-million-record file.

Is it possible with a REXX program?
Anuj Dhawan

Superior Member


Joined: 22 Apr 2006
Posts: 6250
Location: Mumbai, India

PostPosted: Fri Jul 30, 2010 4:16 pm

An S322 abend is related to exceeding the amount of CPU time the job or job step can use. If you have no TIME= parameter on the JOB card or the EXEC cards, the system may have a default for the amount of CPU time your job can use. If so, try again with a higher value of TIME=.

REXX, on the other hand, is not a good choice for the task you've mentioned.
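As a sketch, TIME= goes on the JOB card or on the EXEC card of the long-running step. The job name, accounting field, class, and limits below are placeholders; your installation's defaults and maximums will differ:

```jcl
//COUNTJOB JOB (ACCT),'RECORD COUNT',CLASS=A,MSGCLASS=X,
//         TIME=30
//* TIME= on the JOB card limits CPU minutes for the whole job;
//* it can also be coded per step as (minutes,seconds):
//SORT01  EXEC PGM=SORT,TIME=(10,30)
```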
dbzTHEdinosauer

Global Moderator


Joined: 20 Oct 2006
Posts: 6966
Location: porcelain throne

PostPosted: Fri Jul 30, 2010 5:57 pm

Increasing the BUFNO parameter on your DD statement may also help.

But, as others have said, ensure that the job is submitted to the proper class.

Which class? A job class that has more time allocated to it,
and not a job card TIME parameter;
you have to ask your system support people which job class is allocated more time.
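For illustration, BUFNO is coded through the DCB parameter on the DD statement. The dataset name and buffer count here are placeholders; a useful value depends on block size and available region:

```jcl
//* More buffers can reduce I/O waits on a large sequential read;
//* 30-50 is a common starting point to experiment with.
//SORTIN  DD DISP=SHR,DSN=YOUR.INPUT.FILE1,DCB=BUFNO=50
```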
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Sat Jul 31, 2010 2:42 am

Hello,

If you change from some utility to REXX, you will most likely need even more CPU time.

Is there some reason you have not posted why these counts are needed (as has been requested)?
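For completeness, counting in REXX is possible but, as noted above, will generally burn more CPU than DFSORT on large files. A minimal sketch under TSO/E, assuming a placeholder dataset name:

```rexx
/* REXX - count records in a dataset; slow for very large files */
"ALLOC F(INDD) DA('YOUR.INPUT.FILE1') SHR REUSE"
count = 0
do forever
  "EXECIO 1000 DISKR INDD (STEM REC."   /* read up to 1000 records */
  count = count + rec.0                 /* rec.0 = records actually read */
  if rc <> 0 then leave                 /* rc = 2 at end of file */
end
"EXECIO 0 DISKR INDD (FINIS"            /* close the file */
"FREE F(INDD)"
say "Record count:" count
```

Reading in batches of 1000 via the STEM keeps EXECIO overhead down, but every record still passes through the interpreter, which is why the utility approach wins at this scale.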