rohin
New User
Joined: 29 Apr 2005 Posts: 21 Location: Gurgaon, India
Hi,
My requirement is to subtract two dates to get the number of days between them. What is the most efficient way to do that? We are considering:
(1) DB2, using (CURRENT DATE - WS-DATE)
(2) The COBOL intrinsic function INTEGER-OF-DATE, then taking the difference
(3) Any other method?
Thanks in advance
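For reference, option (2) can be sketched as below (a minimal illustration only, assuming the dates are held as PIC 9(8) fields in YYYYMMDD form; the data names are invented):

```cobol
       WORKING-STORAGE SECTION.
       01  WS-DATE-1        PIC 9(8) VALUE 20050101.  *> YYYYMMDD
       01  WS-DATE-2        PIC 9(8) VALUE 20050429.
       01  WS-DAYS-BETWEEN  PIC S9(9) COMP.

       PROCEDURE DIVISION.
           COMPUTE WS-DAYS-BETWEEN =
               FUNCTION INTEGER-OF-DATE (WS-DATE-2)
             - FUNCTION INTEGER-OF-DATE (WS-DATE-1)
           DISPLAY 'DAYS BETWEEN: ' WS-DAYS-BETWEEN
           GOBACK.
```

INTEGER-OF-DATE converts a YYYYMMDD integer into a day count (days since 31 December 1600), so subtracting two converted values gives the number of days between the dates.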
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Most sites have standard date arithmetic routines. If your site has them, they should be used unless you can justify not using them.
Bill O'Boyle
CICS Moderator
Joined: 14 Jan 2008 Posts: 2501 Location: Atlanta, Georgia, USA
Overview: intrinsic FUNCTIONs were introduced with COBOL/370. The LE (Language Environment) callable date routines can be used from VS COBOL II and OS/VS COBOL, provided LE has been installed in the LPAR; LE became integrated with the compiler as of COBOL/370, about 20 years ago. Try not to mix intrinsic FUNCTIONs and LE callable date routines, as you will probably get different results unless you are aware of the relevant compiler option (INTDATE). End Overview.
As Nic said, most shops have their own standardized/canned methods for date manipulation. You should check with your administrators.
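For anyone going the LE callable route, here is a hedged sketch of CEEDAYS (the data names are invented; check your shop's LE documentation for proper feedback-code handling):

```cobol
      *    CEEDAYS converts a character date to a Lilian day number
      *    (days since 14 October 1582).
       01  WS-INPUT-DATE.
           05  WS-IN-LL     PIC S9(4) COMP VALUE 8.
           05  WS-IN-STR    PIC X(8)  VALUE '20050429'.
       01  WS-PICTURE.
           05  WS-PIC-LL    PIC S9(4) COMP VALUE 8.
           05  WS-PIC-STR   PIC X(8)  VALUE 'YYYYMMDD'.
       01  WS-LILIAN        PIC S9(9) COMP.
       01  WS-FC            PIC X(12).

           CALL 'CEEDAYS' USING WS-INPUT-DATE, WS-PICTURE,
                                WS-LILIAN, WS-FC
      *    Subtract two Lilian day numbers to get the days between.
```

Note that Lilian day numbers (CEEDAYS) and the default INTEGER-OF-DATE day counts use different base dates, which is one reason mixing the two styles can give different results.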
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Quote:
Which is the most efficient way to do that?
Why do you care about efficiency? Since COBOL on current z/OS machines executes 10 million to 100 million lines of code per second of CPU time, you would have to execute your COBOL statements billions or trillions of times to see any benefit from "efficiency".
These days, unless you actually have a performance problem, you have already spent more time and energy (just by raising the issue) worrying about efficiency than you could possibly save in many years of program execution.
rohin
New User
Joined: 29 Apr 2005 Posts: 21 Location: Gurgaon, India
Thanks for your suggestion. Actually, we were looking at a program which used the COBOL function INTEGER-OF-DATE, and a Strobe report confirmed that this is the statement consuming the highest CPU%. The program itself runs for over 2 hours daily (90 clocks) and produces output files containing 12 million records.
Now, does it make sense to worry about this, or should we treat it as trivial?
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
It should not be too difficult to run benchmarks of different ways to do this arithmetic (it is already working).
I suggest you pay attention not only to the difference in time taken, but also to whether the correct answer is returned in all cases (do you want it faster, or do you want it correct?).
Has anyone looked at how the fields being used are defined? If, for example, they are zoned decimal, there will be a major hit on CPU used . . .
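As a hedged illustration of the point about field definitions (the data names are invented, and actual costs depend on compiler options and the machine):

```cobol
      *    Zoned decimal (DISPLAY): arithmetic usually forces
      *    pack/unpack conversions on every COMPUTE.
       01  WS-DAYS-ZONED   PIC S9(9).
      *    Packed decimal: cheaper for decimal arithmetic.
       01  WS-DAYS-PACKED  PIC S9(9) COMP-3.
      *    Binary: generally cheapest for pure integer work,
      *    such as the result of INTEGER-OF-DATE.
       01  WS-DAYS-BINARY  PIC S9(9) COMP.
```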
Akatsukami
Global Moderator
Joined: 03 Oct 2009 Posts: 1788 Location: Bloomington, IL
rohin wrote:
Thanks for your suggestion. Actually, we were looking at a program which used the COBOL function INTEGER-OF-DATE, and a Strobe report confirmed that this is the statement consuming the highest CPU%. The program itself runs for over 2 hours daily (90 clocks) and produces output files containing 12 million records.
Now, does it make sense to worry about this, or should we treat it as trivial?
What percentage of CPU does it consume? "Highest" is so vague as to be meaningless. If it is 20%, there may well be scope for improvement; if it is 0.2% (and every other statement is only 0.0002%), making it 100 times as efficient will do no good.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Quote:
Thanks for your suggestion. Actually, we were looking at a program which used the COBOL function INTEGER-OF-DATE, and a Strobe report confirmed that this is the statement consuming the highest CPU%. The program itself runs for over 2 hours daily (90 clocks) and produces output files containing 12 million records. Now, does it make sense to worry about this, or should we treat it as trivial?
You have provided nowhere near enough information to answer this question. The 12 million records may -- or may not -- be relevant, depending on the job.
Is the job I/O-bound or CPU-bound? If it is I/O-bound, then reducing the CPU time of that statement will impact the overall run time of the program by a grand total of ... ZERO ... seconds since the job's critical path is I/O, not CPU.
Assuming that the job is CPU-bound and not I/O-bound, the next question is how much CPU time the program takes. If it takes 15 seconds of CPU in those 2 hours of elapsed time, then you need to change when the program runs so it is not competing so heavily for resources (or change the WLM policy to give the job more resources). Note that moving the job start time or changing the WLM policy can improve job performance whether the job is I/O-bound or CPU-bound.
Next, if the job uses (for example) 30 minutes of CPU time in the total elapsed time, what percentage of that time does STROBE say is spent on that one statement? If it is the statement with the highest CPU usage but accounts for only, say, 5% of the total CPU time, then there is not going to be much of a difference in the job no matter how you do the calculation.
If the program is CPU-bound and is using 30 minutes of CPU time (for example), and that one statement is using 98% of that CPU time (for example), then -- and only then -- can you say that looking into improving the performance of that statement will help your job.
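Robert's reasoning is Amdahl's law in miniature. With hypothetical numbers (none of these come from the thread): if the statement accounts for a fraction p of the CPU time and you speed it up by a factor s, the best you can do is

```latex
T_{\text{new}} = T_{\text{old}} \left( (1 - p) + \frac{p}{s} \right)
```

With p = 0.05 on a 30-minute CPU budget, even an infinite speed-up (s to infinity) saves only 0.05 x 1800 = 90 CPU seconds; with p = 0.98, the same infinite speed-up would cut the 30 minutes to about 36 seconds.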
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
If you have 12 million records, presumably you are doing at least 24 million (perhaps many more) function calls. That means you will, many times over, be converting a date that you have already converted.
Since we know nothing about your processing, it is difficult to say anything concrete. You are Strobe-ing, so presumably you feel there is a problem.
The best way to improve the performance of a piece of code is to execute it as few times as possible while still servicing the requirement accurately. Best of all is not to execute it at all (which can't always be done).
Is your "input" data, or can it be, in "date order" for at least one of the dates?
If you have identified the date processing as the problem, are you doing all other "selection" prior to processing the dates, so that no extraneous conversions are done?
If performance is a problem, look at the whole program/system and don't be afraid to "redesign" if that will do it (usually this does not involve a re-write as such, but a rearrangement of code over different programs).
EDIT: Just to make it obvious, the more "in date order" the data is, the more often a test like "if this date is the same as the last, we can use what we already know" will save you doing anything else. The same principle applies to any keyed reads or "look-up" files/tables.
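The "same as the last" test can be sketched like this (a hedged illustration with invented data names, assuming one of the two dates repeats across consecutive records):

```cobol
      *    Cache the last conversion; only call the function when
      *    the incoming date actually changes.
           IF WS-IN-DATE NOT = WS-LAST-DATE
               COMPUTE WS-LAST-INT =
                   FUNCTION INTEGER-OF-DATE (WS-IN-DATE)
               MOVE WS-IN-DATE TO WS-LAST-DATE
           END-IF
           COMPUTE WS-DAYS = WS-LAST-INT - WS-BASE-INT
```

On sorted (or even partially sorted) input this can cut the 24 million calls down to one call per distinct date.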