Husna
New User
Joined: 02 Jun 2006 Posts: 49
I have a job which takes 10 hours to execute. My requirement is to make it complete within an hour. How can I do this without any class changes (priority changes too)?
Can any one of you please let me know a solution for this?
Awaiting your reply.
Bharanidharan
New User
Joined: 20 Jun 2005 Posts: 86 Location: Chennai, India
The time a job takes to complete can depend on many parameters. I can think of these:
1. Type and volume of business processing - how many records get processed.
2. When the job runs - at what time the job is presented to JES (I assume the 10-hour duration you mention is the time from when you submitted the job until it completed).
3. How many external connections your job handles, and how efficient your VTAM setup is; these can prolong the job's completion time.
4. Resource contention and deadlocks.
If you feel these factors are constant and have no effect on your process, can you expand your question a bit? Is it a hypothetical question or a real situation? If the latter, where do you face it?
Hanfur
Active User
Joined: 21 Jun 2006 Posts: 104
Check the shop's available job classes first, and run the job in a high-priority class.
If VSAM processing is involved, try tuning the buffers to reduce processing time.
If it is a batch DB2 program that takes most of the time, you have to EXPLAIN the SQL queries and fine-tune them as per your shop's standards.
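As a sketch of the VSAM buffering idea, one common approach is an AMP parameter on the DD statement. The dataset name and buffer counts below are made up for illustration; as a rule of thumb, sequential processing benefits from extra data buffers (BUFND) and random access from extra index buffers (BUFNI):

```jcl
//* Extra data buffers for mostly-sequential VSAM processing
//VSAMIN   DD DSN=PROD.CUSTOMER.KSDS,DISP=SHR,
//            AMP=('BUFND=30,BUFNI=2')
//* For heavy random access, favour index buffers instead, e.g.:
//*           AMP=('BUFND=5,BUFNI=20')
```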
-Han.
pbgunasekar
New User
Joined: 28 May 2005 Posts: 26 Location: Chennai
Try to avoid taking backups of master VSAM files more often than necessary.
Regards,
Guna
GlobalGyan
New User
Joined: 31 Jan 2006 Posts: 28
My team did something like this a year back for a project. A daily job was running for about 4 hours and the requirement was to reduce it by at least 40%. As Bharanidharan pointed out, we looked into the records being processed. After some analysis we found that the number of records processed was too high, and that the daily input feed was very large but changed very little from one day to the next. So we wrote a program to determine what needed to be processed and what did not. After writing that simple COBOL program, the job time went from 240 minutes to 20 minutes! :-)
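One common way to build such a delta (not necessarily what GlobalGyan's team coded) is a DFSORT JOINKEYS step that keeps only today's records that do not appear verbatim in yesterday's feed. This sketch assumes fixed 80-byte records, uses the whole record as the join key, and all dataset names are invented for illustration:

```jcl
//* Write to SORTOUT only the F1 (today) records with no exact
//* match in F2 (yesterday) - i.e. new or changed records
//DELTA   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTJNF1 DD DSN=PROD.FEED.TODAY,DISP=SHR
//SORTJNF2 DD DSN=PROD.FEED.YESTERDAY,DISP=SHR
//SORTOUT  DD DSN=PROD.FEED.DELTA,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(50,10),RLSE),
//            DCB=(RECFM=FB,LRECL=80)
//SYSIN    DD *
  JOINKEYS FILE=F1,FIELDS=(1,80,A)
  JOINKEYS FILE=F2,FIELDS=(1,80,A)
  JOIN UNPAIRED,F1,ONLY
  REFORMAT FIELDS=(F1:1,80)
  SORT FIELDS=COPY
/*
```

The downstream job then processes only PROD.FEED.DELTA instead of the full feed.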
-GlobalGyan
parikshit123
Active User
Joined: 01 Jul 2005 Posts: 269 Location: India
Hi,
You might like to use STROBE to analyze which part of your program is taking the most time to execute.
There are any number of factors that affect the efficiency of the code. You need to analyze the statistics of previous runs of the program, try to find optimization opportunities, and implement them!
Good luck. It is a very painful task, I believe, but greatly rewarding as well.
Thanks,
Parikshit.
MCEVOY
New User
Joined: 21 Nov 2005 Posts: 18
Without knowing more about your job, it is difficult to offer help here.
My experience with such long-running jobs is that they often involve a lot of I/O, which is very time-consuming.
You can reduce the time spent on I/O in a number of ways, but the following are fruitful approaches to pursue:
* Ensure that the block sizes for your files are large (full-track or half-track). Smaller block sizes imply smaller buffers and more EXCPs to read/write.
* For really large files, code BUFNO= to get more buffers. The default is 5, but more can help speed things up.
* For files that are passed within the job but not needed outside it, use UNIT=VIO. This creates the file in memory rather than on disk (apart from paging), thereby avoiding device I/O altogether.
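The bullet points above might look like this on the DD statements (dataset names and space figures are made up for illustration; BLKSIZE=27920 is half-track blocking for an FB/80 file on 3390 DASD):

```jcl
//* More buffers for a large sequential input file (default is 5)
//BIGIN    DD DSN=PROD.DAILY.FEED,DISP=SHR,DCB=BUFNO=30
//* Half-track blocking for an FB/80 output file
//REPORT   DD DSN=PROD.DAILY.RPT,DISP=(NEW,CATLG),
//            UNIT=SYSDA,SPACE=(CYL,(100,20),RLSE),
//            DCB=(RECFM=FB,LRECL=80,BLKSIZE=27920)
//* Intermediate file passed between steps, kept in virtual storage
//TEMP     DD DSN=&&WORK,DISP=(NEW,PASS),UNIT=VIO,
//            SPACE=(CYL,(50,10)),
//            DCB=(RECFM=FB,LRECL=80,BLKSIZE=27920)
```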
Finally, the recommendation to use STROBE to analyse where the time is being spent is good advice. You may need to get your application support team to do this for you. You may discover that there are radical improvements that can be made to the logic.
If you are processing DB2 tables, ensure that your tables have been REORGed and your plans rebound. Use the EXPLAIN command to examine the access paths. You might be dismayed to discover that your SELECT statements are resulting in a tablespace scan every time. This can often happen with plans bound when the tables were new and possibly empty on first implementation, where a tablespace scan made sense; once the data builds up, it becomes outrageously expensive.
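As a sketch of running EXPLAIN in batch, one way is via the DSNTEP2 sample program under batch TSO. The subsystem name, load library, authid, and query below are all assumptions for illustration; in the PLAN_TABLE output, ACCESSTYPE 'R' indicates a tablespace scan:

```jcl
//EXPLAIN EXEC PGM=IKJEFT01
//STEPLIB  DD DSN=DSN.SDSNLOAD,DISP=SHR
//SYSTSPRT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DB2P)
  RUN PROGRAM(DSNTEP2) PLAN(DSNTEP2)
//SYSIN    DD *
  EXPLAIN PLAN SET QUERYNO = 1 FOR
    SELECT CUSTNO, BALANCE FROM MYAUTH.ACCOUNTS
     WHERE CUSTNO = '0000012345' ;
  SELECT QUERYNO, ACCESSTYPE, ACCESSNAME
    FROM MYAUTH.PLAN_TABLE
   WHERE QUERYNO = 1 ;
/*
```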
Good luck.
Ravi Rao
New User
Joined: 28 Feb 2006 Posts: 3
Hi,
There can be many reasons for the job to run so long.
It would be good if you could analyze which programs the job executes and see how the data is being accessed from the files in the JCL.
Maybe the program is opening and closing files for each record being processed. You would have to look at reducing I/O to improve the processing time.