Performance Engineering of a JOB.


IBM Mainframe Forums -> Testing & Performance
cybertaurean
Posted: Mon Apr 26, 2010 4:32 pm

Hi All,

I would like some insight into the factors that need to be considered when planning performance improvement for a job.

What counts as a performance improvement?

Is it less execution time?
Fewer I/Os performed?
Less CPU used?

There may be cases where we can reduce the runtime of a job while its CPU consumption stays high, still making it a costly job. Below are the statistics of a job that runs for roughly 30 to 60 minutes and was reported as a costly job:

Code:
JOB NAME    XXXXXXXX  JOBXXXXX                              CPU TIME   00:11:15.38           SRB TIME            00:00:01.09
PROGRAMMER  XXX-XXXX                                        INIT DATE  04/25/2010 2010.115   INITIATION TIME     01:00:11.30
SYSTEM ID   SYSX                                            TERM DATE  04/25/2010 2010.115   TERMINATION TIME    01:29:36.16
CLASS       N         PERF. GROUP      0                    SERV UNITS 25,715,771            ELAPSED TIME        00:29:24.86
MAIN STORAGE UNITS           0     CPU UNITS      25,674,215     SRB UNITS          41,473     I/O UNITS              83   


Now, my understanding is that a job taking 30 to 60 minutes might not be considered a costly one. Looking at these statistics, could you advise what the basis for the cost might be?

Note: the cost of this job over a 9-week daily run was $14,128.84.
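To put the listing above into comparable units, here is a quick sketch (the time-parsing helper is only illustrative, not a real SMF parser; the values are copied from the listing):

```python
# Convert the hh:mm:ss.hh fields from the job-end listing above to
# seconds and compare CPU time against elapsed time.

def to_seconds(hhmmss: str) -> float:
    h, m, s = hhmmss.split(":")
    return int(h) * 3600 + int(m) * 60 + float(s)

cpu = to_seconds("00:11:15.38")       # TCB (CPU) time from the listing
srb = to_seconds("00:00:01.09")       # SRB time
elapsed = to_seconds("00:29:24.86")   # elapsed (wall-clock) time

# Fraction of the elapsed time the job spent on a processor
cpu_busy = (cpu + srb) / elapsed
print(f"CPU-busy fraction: {cpu_busy:.2%}")   # roughly 38%
```

Read together with the 83 I/O units in the listing, this suggests the job's cost is dominated by CPU rather than I/O.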

Regards,
Sumesh
Robert Sample
Posted: Mon Apr 26, 2010 5:12 pm

Quote:
What is counted as a performance improvement

Is it lesser execution time?
Lesser # of IO's performed?
Lesser # of CPU used?
Yes.

Performance tuning can focus on any of these; what gets tuned usually depends on what the bottleneck is. Since you did not provide any details about CPU usage, I/O usage, or anything else, you need to consult your site support group for assistance in tuning the job. Thirty minutes is not an excessive run time; we have jobs that run several hours each day.

Quote:
Note - The cost of this job for 9 week run (daily) was $14,128.84.
This is completely useless information to give us; job costs depend on the chargeback scheme used, and that is entirely site-dependent. For example, my site does not do chargeback, so as far as accounting is concerned the job cost would be $0.
cybertaurean
Posted: Mon Apr 26, 2010 5:27 pm

Thanks Robert,

I didn't quite understand the term "chargeback scheme". The point I was trying to make was:

9-week run = 9 x 7 = 63 days
Runtime = approx. 45 min/day

Approx. total runtime = 63 x 45 x 60 = 170,100 seconds
Total cost reported = $14,128.84
Cost per second = approx. $0.0831

LOL, the above might be a naive way of looking at things, since I have no idea how any of this is calculated. I just wanted to ask whether anything can be inferred from it.
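The arithmetic above can be checked in a few lines (a sketch only; as noted elsewhere in the thread, the per-second rate is meaningful only under a given site's chargeback scheme):

```python
# Rough cost-per-second check for the figures quoted above.
days = 9 * 7                  # 9-week daily run
runtime_s = days * 45 * 60    # assuming ~45 minutes per day
total_cost = 14128.84         # reported cost in dollars

print(runtime_s)                         # 170100 seconds
print(round(total_cost / runtime_s, 4))  # 0.0831 dollars per second
```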


Would you be familiar with tools used for performance analysis? I have heard about Strobe and Tritune.

What granularity of analysis can these tools present? Can they go as far as identifying the step, or better still, the instruction that is CPU-intensive?

Would you have any reference material on Tritune?

Regards,
Sumesh
Robert Sample
Posted: Mon Apr 26, 2010 5:37 pm

I am familiar with Strobe from past positions; it can analyze jobs right down to the line of source code. I do not know about Tritune.

Quote:
LOL, the above might be a naive way of looking at things, since I have no idea how any of this is calculated. I just wanted to ask whether anything can be inferred from it.
Not from the cost; your example would be totally meaningless at my site, where the cost is considered $0 because the system is treated as overhead. If your site is reporting a cost of $14,128.84, then it is up to YOUR site to determine the factors behind that cost; we have absolutely no way to identify them. You must consult your site support group if you want to identify the factors that influence the cost.

You need to start your tuning effort by using Strobe (or whatever) to identify the bottleneck in processing. This could be CPU, or it could be a disk file; there is no way to tell without looking. You then make changes to eliminate the bottleneck, at which time another bottleneck will appear. You continue the process until either the cost of making more changes is higher than the potential savings or you run out of time to tune it.
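That stopping rule can be sketched as a small loop (every bottleneck name and figure below is hypothetical, purely to illustrate the decision):

```python
# Iterative tuning: keep fixing the current biggest bottleneck only
# while the expected saving exceeds the cost of making the change.
# All names and figures are invented for illustration.

bottlenecks = [
    {"name": "unblocked sequential file",   "fix_cost": 1,  "saving": 50},
    {"name": "repeated table lookup",       "fix_cost": 5,  "saving": 20},
    {"name": "micro-tuning the inner loop", "fix_cost": 40, "saving": 2},
]

fixed = []
for b in bottlenecks:                  # ordered by current impact
    if b["fix_cost"] >= b["saving"]:   # further changes cost more
        break                          # than they save: stop tuning
    fixed.append(b["name"])

print(fixed)   # the first two fixes are worth making; the third is not
```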
cybertaurean
Posted: Mon Apr 26, 2010 5:48 pm

Quote:
"You then make changes to eliminate the bottleneck, at which time another bottleneck will appear. You continue the process until either the cost of making more changes is higher than the potential savings or you run out of time to tune it."


Very good point, Robert!

As for the cost, I will probably contact someone at the site to get the details.

Let me check on Tritune elsewhere (it's the software available at my end at the moment, and I need to learn it).

Thanks a lot for your help, Robert. Have a good day!

- Sumesh
Robert Sample
Posted: Mon Apr 26, 2010 5:51 pm

Glad to hear it helped! Have a good day yourself and good luck with tuning.
dick scherrer
Posted: Mon Apr 26, 2010 11:38 pm

Hello,

You can measure where the current "expense" is in many ways.

To improve performance, I've had the most success by looking at what a process is supposed to accomplish and then at how it is accomplished. Often (usually), wholesale improvement of a "bad" process can be realized by changing the way the process is implemented.

Unfortunately, this is not automatic and takes real effort. For example, one of the "success stories" involved eliminating more than a billion "hits" to database tables. The way the code was written, each access looked okay on its own, but I questioned why such a small business requirement needed so much processing. The answer: the coder took the easy way out. We basically scrapped the code and replaced it with about five times as much code as the original, but it performed incredibly better.

Some people believe that if something runs a long time or uses lots of resources, it is a big job. I believe that the amount of work to be done should determine what a big job is.
expat
Posted: Tue Apr 27, 2010 11:39 am

As Dick has said, the process itself is a great cause of wasted resource. You need to understand the process in great depth to be able to recommend or try improvements.

One project used an ESDS of about 4 million records; by introducing a couple of alternate indexes (AIXes) for the file, the processing was amended and used far less resource. Previously it used a sequential read to find data, but by analysing the most frequently used record fields and then building an AIX for finding data, the need to read almost every record every time was removed.

Again, as Dick has said, the programmer took the easy option.
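The effect expat describes, replacing a sequential scan with a keyed alternate-index lookup, can be mimicked in miniature. In this sketch a Python dict stands in for the VSAM AIX, and the record layout is invented:

```python
# Sequential search vs. keyed lookup over the same records.
# A dict plays the role of the alternate index (AIX) here.

records = [{"key": f"K{i:07d}", "data": i} for i in range(100_000)]

def seq_find(target):
    """Sequential read: touch records one by one until a match."""
    reads = 0
    for rec in records:
        reads += 1
        if rec["key"] == target:
            return rec, reads
    return None, reads

# Build the "alternate index" once, then locate records by key directly.
aix = {rec["key"]: rec for rec in records}

rec, reads = seq_find("K0099999")
print(reads)                     # 100000 reads for the worst case
print(aix["K0099999"] is rec)    # same record found with one keyed access
```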
cybertaurean
Posted: Tue Apr 27, 2010 7:37 pm

Thanks a lot for the replies!

Another option I thought about was splitting the process into sub-processes and running them in parallel. That would reduce the runtime of the entire process. However, would this change the CPU utilization?

Regards,
Sumesh
enrico-sorichetti
Posted: Tue Apr 27, 2010 7:43 pm

Quote:
However, would this change the CPU utilization?

No! Why do you think so?
If you have work to do, you will have to use the CPU!
dick scherrer
Posted: Tue Apr 27, 2010 7:45 pm

Hello,

Quote:
I thought about splitting the process into sub-processes and running them in parallel.
So, you are not really interested in fixing the problem...?

Not only will this not save any CPU (which is probably not the issue anyway), but while these run in parallel they will have a bigger negative impact on the rest of the system, as well as conflicting with themselves, causing even worse overall performance.

Spreading it around this way is just another form of laziness, and it will waste more resources in the long term. If there is true interest in performance improvement, do some research and improve the process(es).
cybertaurean
Posted: Tue Apr 27, 2010 8:36 pm

Not at all. I'm trying to learn everything I can that will help me improve the performance.

I got this hint about splitting the process from somewhere and thought I'd ask about it.

So that busts one myth for me. Thanks, all!
Robert Sample
Posted: Tue Apr 27, 2010 11:33 pm

A job is either CPU-bound or I/O-bound. A CPU-bound job could reduce all I/O counts to zero and the run time would not change much (if at all). An I/O-bound job could reduce all CPU usage by 99% and the run time would not change much (if at all).

Splitting a process can help IF
(1) the process is CPU-bound,
(2) the LPAR is assigned multiple processors, and
(3) the initiators can run on different processors.

If your LPAR is running on a single processor, and the job is CPU-bound, splitting the job into multiple processes could actually hurt performance since more address spaces would be competing for the processor time.
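A toy model of the point above (all figures invented): splitting a CPU-bound job shortens elapsed time only when the pieces can truly run side by side, and the total CPU consumed does not shrink either way.

```python
# Model a CPU-bound job split into three equal pieces.
# Elapsed time shrinks only if enough processors are available;
# total CPU time is the same in every case.

pieces = [10, 10, 10]   # hypothetical CPU minutes per split piece

total_cpu = sum(pieces)           # 30 CPU minutes, split or not
serial_elapsed = sum(pieces)      # one initiator / one processor: 30
parallel_elapsed = max(pieces)    # three initiators on three CPs: 10

print(total_cpu, serial_elapsed, parallel_elapsed)
```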
dick scherrer
Posted: Wed Apr 28, 2010 1:26 am

Hello,

Quote:
splitting the job into multiple processes could actually hurt performance since more address spaces would be competing for the processor time
Similarly, if the split processes use the same data, a variety of bad things can occur.

Two of these are channel/device contention for the files, and buffer flushing in a multi-user database environment (causing physical I/O that could have been only logical I/O).
All times are GMT + 6 Hours