Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
Which would be better, considering there would be around 60,000 to 70,000 records in each of the input files?
This volume should not be an issue either way.
Quote:
Try to run both the JOBs at the same time.
I'd not run them concurrently - there may be performance issues as well as allocation issues. If you run both tests, I'd suggest you run them serially and alternate the order of execution over multiple tests.
Joined: 22 Apr 2006 Posts: 6248 Location: Mumbai, India
Hi Dick,
dick scherrer wrote:
Quote:
Try to run both the JOBs at the same time.
I'd not run them concurrently - there may be performance issues as well as allocation issues. If you run both tests, I'd suggest you run them serially and alternate the order of execution over multiple tests.
Recently, I ran one job twice with the same inputs and setup, but the CPU usage differed between the runs, which is why I suggested the above.
As for the TS's concern: using the same DSN in both "concurrent" jobs might be a problem (one job might go on 'hold' waiting for the data set), but the DSN can be renamed, while keeping the same contents, if someone is concerned about performance issues.
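The 'hold' mentioned above comes from data-set serialization: a job that requests a DSN with DISP=OLD gets exclusive use, so a second job asking for the same DSN waits until the first ends, while DISP=SHR lets both read it concurrently. A minimal sketch (job, program, and data-set names are hypothetical):

```
//JOBA     JOB (ACCT),'READER A',CLASS=A,MSGCLASS=X
//* DISP=OLD would take an exclusive ENQ on the data set,
//* making any other job that requests it wait ('hold').
//* DISP=SHR, as below, lets both jobs read it at once.
//STEP1    EXEC PGM=MYPROG
//INFILE   DD DSN=PROD.INPUT.FILE,DISP=SHR
//SYSOUT   DD SYSOUT=*
```

So for read-only inputs, coding DISP=SHR in both jobs avoids the wait without having to rename the data set.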
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
but the CPU usage differed between the runs
This should not have an impact on the amount of CPU the "test" job uses. It may affect wall-clock time, but not the CPU time needed.
When trying to compare two processes like these, it is a good idea to run them in the same job, multiple times, alternating the sequence. This will give a decent approximation of a benchmark without needing to set up a formal benchmark environment.
The two items to look at for a simple comparison are CPU time and I/O (EXCPs).
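The serial, alternating-order test described above can be sketched as a single job with the two contenders as back-to-back steps (program, data-set names, and sort control card below are hypothetical):

```
//BENCH    JOB (ACCT),'COMPARE',CLASS=A,MSGCLASS=X
//* Pass 1: sort-utility step first, COBOL step second.
//* Resubmit with the two steps in the opposite order for
//* pass 2, then repeat several times and average the figures.
//SORTSTEP EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=TEST.INPUT.FILE,DISP=SHR
//SORTOUT  DD DSN=&&SRT,DISP=(NEW,PASS),UNIT=SYSDA,
//            SPACE=(CYL,(10,5),RLSE)
//SYSIN    DD *
  SORT FIELDS=(1,10,CH,A)
/*
//COBSTEP  EXEC PGM=MYCOBOL
//STEPLIB  DD DSN=TEST.LOADLIB,DISP=SHR
//INFILE   DD DSN=TEST.INPUT.FILE,DISP=SHR
//OUTFILE  DD DSN=&&CBL,DISP=(NEW,PASS),UNIT=SYSDA,
//            SPACE=(CYL,(10,5),RLSE)
//SYSOUT   DD SYSOUT=*
```

Depending on installation options, the per-step CPU time and EXCP counts to compare can be read from the step-termination messages in the job log or from the SMF type 30 records for the job.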
Thanks a lot for all your suggestions. I have decided to go with the program and not the JCL - not because of any performance issues, but because I can handle exceptions better through a COBOL program.