Performance monitoring in file I/O operation

 
IBMMAINFRAMES.com Support Forums -> JCL & VSAM
adarsh.bhalke

New User


Joined: 06 May 2007
Posts: 16
Location: pune

PostPosted: Tue Jun 12, 2007 8:55 pm    Post subject: Performance monitoring in file I/O operation

Hi,
I am performing file I/O operations on a vast amount of data, but when I submit the JCL it takes a long time to return the result.

Can anybody tell me how to monitor the performance here?

expat

Global Moderator


Joined: 14 Mar 2007
Posts: 8593
Location: Back in jolly old England

PostPosted: Tue Jun 12, 2007 9:00 pm    Post subject:

The I/O for any file is dependent on the access method being used.

Is it VSAM? If yes, is it a KSDS, and is the access random, sequential, or skip sequential?
dick scherrer

Site Director


Joined: 23 Nov 2006
Posts: 19270
Location: Inside the Matrix

PostPosted: Wed Jun 13, 2007 2:57 am    Post subject:
Reply with quote

Hello,

How many records equals "vast"? Does this job use a single data source (i.e. the vast file)? Does this job interface with any database tables?

What do you base the "taking more time" on? More than some similar job takes, or just more than you'd prefer? Has this been running for some period of time, or is this a new process that has no history?

What kind of monitoring did you have in mind?

Once you post more info about your process and reply to the questions asked, we may be able to clarify things.
Devzee

Active Member


Joined: 20 Jan 2007
Posts: 684
Location: Hollywood

PostPosted: Wed Jun 13, 2007 9:15 am    Post subject:

Does your data reside on TAPE?
adarsh.bhalke

New User


Joined: 06 May 2007
Posts: 16
Location: pune

PostPosted: Mon Jun 25, 2007 5:17 pm    Post subject: Re: Performance monitoring in file I/O operation

No. Actually, my first sequential file contains more than 10,000,000 records and the second file contains 100 match keys. If a match key from the second file matches any record in the first file, I have to write that record to a different (third) file. The first file is multivolume. When I submit the job it takes more than 5 hours, because for a particular match key from the second file there are more than 10,000 matching records in the first file.
dick scherrer

Site Director


Joined: 23 Nov 2006
Posts: 19270
Location: Inside the Matrix

PostPosted: Mon Jun 25, 2007 6:27 pm    Post subject:

Hello,

Are the records in the second file just the match keys and nothing else?

How long does it take to read the 10 million records if the match is not being performed? (If you don't have some code that will do this, just "copy" the file with IEBGENER or SORT and assign the output file to DUMMY.) Knowing how long it takes to pass the data will help in making an estimate of how long the "real" process should run.
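A minimal sketch of such a timing step, assuming a hypothetical dataset name (YOUR.BIG.FILE is a placeholder for the real 10-million-record dataset):

```jcl
//COPYTEST EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DSN=YOUR.BIG.FILE,DISP=SHR
//SYSUT2   DD DUMMY,DCB=*.SYSUT1
//SYSIN    DD DUMMY
```

The elapsed time of this step is roughly the floor for any single-pass solution.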

I would expect we can get your process to run in the time it takes to read all of the records plus 10% (or less), if I've correctly understood your requirement. If it takes almost 5 hours merely to read the data, we will have to look further.

Please post back with the answers to the questions above.
expat

Global Moderator


Joined: 14 Mar 2007
Posts: 8593
Location: Back in jolly old England

PostPosted: Mon Jun 25, 2007 6:28 pm    Post subject:

What are you saying here .... that you read the first file and then go through the second file looking for matches, and then do it all again for the next record of the first file?

Please, tell me that this is NOT happening here.

Have you considered making file 1 a KSDS and then doing random reads (on key) driven by the second file? That way you only read file 2 once.
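A sketch of that conversion, with hypothetical dataset names and a hypothetical key definition (KEYS(10 0) assumes a 10-byte key at offset 0 -- substitute your real key length and position). Note that REPRO into a KSDS requires the input to be in key sequence, so sort it first if necessary:

```jcl
//DEFLOAD  EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE CLUSTER (NAME(YOUR.FILE1.KSDS)  -
         INDEXED                         -
         KEYS(10 0)                      -
         RECORDSIZE(80 80)               -
         CYLINDERS(500 50))
  REPRO INDATASET(YOUR.FILE1.SEQ)        -
        OUTDATASET(YOUR.FILE1.KSDS)
/*
```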
dick scherrer

Site Director


Joined: 23 Nov 2006
Posts: 19270
Location: Inside the Matrix

PostPosted: Mon Jun 25, 2007 6:58 pm    Post subject:

Hello,

If the "second" file holds only the 100 keys, they could be put into an array and SEARCHed using only a single pass of the 10-million-record file. . . If the array were built "in sequence", SEARCH ALL might save even more time.

If I've understood the requirement, there wouldn't need to be any other processing. . . .
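The single-pass idea, sketched here in Python rather than COBOL for brevity (the key length and field layout are hypothetical; the COBOL equivalent is a table declared with OCCURS ... ASCENDING KEY and probed with SEARCH ALL):

```python
def match_records(match_keys, big_file_records, key_len=10):
    """One pass over the small key file, then one pass over the big file."""
    keys = set(match_keys)  # hashed lookup; SEARCH ALL uses a binary search
    # keep each record whose leading key_len bytes hit a match key
    return [rec for rec in big_file_records if rec[:key_len] in keys]

# toy usage: keys occupy the first 10 bytes of each record
keys = ["KEY0000001", "KEY0000002"]
recs = ["KEY0000001 DATA A", "KEY0000009 DATA B", "KEY0000002 DATA C"]
print(match_records(keys, recs))
```

However the lookup is implemented, the point is the same: the 10-million-record file is read exactly once.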