Rameshs (New User)
Joined: 15 Jun 2005; Posts: 53; Location: India, Chennai
Which method is faster: a PERFORM loop or FUNCTION REVERSE?
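For context, the two approaches being compared might look like the following sketch (field names and lengths are illustrative, not from the thread):

```cobol
       WORKING-STORAGE SECTION.
       01  WS-INPUT         PIC X(10) VALUE 'ABCDEFGHIJ'.
       01  WS-OUTPUT        PIC X(10).
       01  WS-I             PIC 9(4) COMP.
       01  WS-J             PIC 9(4) COMP.
       PROCEDURE DIVISION.
      *    Method 1: the intrinsic function
           MOVE FUNCTION REVERSE(WS-INPUT) TO WS-OUTPUT

      *    Method 2: a manual PERFORM loop using reference
      *    modification, copying one byte at a time back to front
           MOVE 11 TO WS-J
           PERFORM VARYING WS-I FROM 1 BY 1 UNTIL WS-I > 10
              SUBTRACT 1 FROM WS-J
              MOVE WS-INPUT(WS-I:1) TO WS-OUTPUT(WS-J:1)
           END-PERFORM
```

Both produce the same result; the question in this thread is which one costs fewer cycles.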
ykishor (New User)
Joined: 11 Aug 2007; Posts: 24; Location: my pc
Surely FUNCTION REVERSE, but there could be exceptions depending on the kind of data.
Bill O'Boyle (CICS Moderator)
Joined: 14 Jan 2008; Posts: 2501; Location: Atlanta, Georgia, USA
I'm not thoroughly convinced that some of these FUNCTIONs are quicker than the old-fashioned method. However, in this day and age, with the speed of these machines, the difference is probably immeasurable, and "fast" is diluted to a relative term. What's more important is ease of maintenance, and perhaps other folks will gravitate toward the FUNCTIONs rather than a manual method. If we were on a 360/20, then I'd say do it manually.
Regards,
ykishor (New User)
Joined: 11 Aug 2007; Posts: 24; Location: my pc
Yes sir, it could be. I'd just like to share one thing: it makes much more of a difference when you do the same thing for, say, 50 million records. There is a performance difference between the following:

MOVE 0 TO WS-VAR-NAME.
MOVE ZERO TO WS-VAR-NAME.   (faster for a huge volume of data)
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Hello,
It would surely be educational to look at the generated assembler for these two MOVEs. You might also see different code generated depending on the size and data type of WS-VAR-NAME.
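One way to look at the generated code (assuming IBM Enterprise COBOL; step and dataset names below are illustrative) is to compile with the LIST option, which prints the pseudo-assembler expansion of each statement in the compiler listing:

```cobol
      *  Either pass the option on the compile step's PARM, e.g.
      *  //COMPILE EXEC PGM=IGYCRCTL,PARM='LIST,SOURCE'
      *  or put it on a CBL statement at the top of the source:
       CBL LIST,SOURCE
       IDENTIFICATION DIVISION.
       PROGRAM-ID. MOVETEST.
      * Compare the assembler generated for these two statements
      * in the listing's "Assembler expansion" section.
```

Comparing the expansions of the two MOVEs in the listing settles the question for a given field definition, rather than guessing.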
Terry Heinze (JCL Moderator)
Joined: 14 Jul 2008; Posts: 1249; Location: Richfield, MN, USA
Completely agree with Bill. My understanding of FUNCTIONs is that most, if not all, of them call LE subroutines and are probably more CPU-intensive. But since CPUs have become so fast, efficiency has taken a back seat to readability (maintainability), and rightfully so. In a situation where the function is to be performed millions of times, a simple benchmark test might be in order.
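A minimal benchmark harness along those lines might look like this (a sketch only: it measures coarse wall-clock time via FUNCTION CURRENT-DATE, not CPU time; the field names and iteration count are illustrative):

```cobol
       WORKING-STORAGE SECTION.
       01  WS-TEXT          PIC X(80) VALUE 'SOME TEST DATA'.
       01  WS-REV           PIC X(80).
       01  WS-START         PIC X(21).
       01  WS-END           PIC X(21).
       PROCEDURE DIVISION.
      *    Capture a timestamp, run the candidate statement many
      *    times, capture another timestamp, and compare.
           MOVE FUNCTION CURRENT-DATE TO WS-START
           PERFORM 50000000 TIMES
              MOVE FUNCTION REVERSE(WS-TEXT) TO WS-REV
           END-PERFORM
           MOVE FUNCTION CURRENT-DATE TO WS-END
           DISPLAY 'START: ' WS-START
           DISPLAY 'END:   ' WS-END
           GOBACK.
```

Run the same harness with the PERFORM-loop version in place of the FUNCTION REVERSE statement and compare; for serious numbers you'd want CPU time from the job step accounting rather than elapsed time.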
ykishor (New User)
Joined: 11 Aug 2007; Posts: 24; Location: my pc
Quote:
    efficiency has taken a back seat to readability (maintainability) and rightfully so

This view is shared by a lot of people, including my bosses! But I agree only partially. I would rather do a job with a complicated ICETOOL/SAS solution (much faster) than a COBOL program (heavy in terms of I/O and elapsed time).
Robert Sample (Global Moderator)
Joined: 06 Jun 2008; Posts: 8696; Location: Dubuque, Iowa, USA
Quote:
    I would try to do a job using a complicated ICETOOL/SAS (much faster) rather than a COBOL program (heavy in terms of I/O and elapsed time)

It all depends. Some years back, a SAS program of mine that took a week or so to develop was converted to COBOL for production reasons. The SAS job took twice as much I/O and about four times as much CPU time as the COBOL program, which makes sense, since SAS has to create the program data vector, and it writes a lot of data to the WORK file and reads it back in when requested. On the other hand, the COBOL program took more than two months to develop versus the one week for the SAS program.
For quick-and-dirty or ad hoc reports, I use SAS. When there are massive amounts of data to process (millions of records) every day/week/month (i.e., a production situation), I use COBOL, because the savings over time pay for the additional cost of development.