asimkp
New User
Joined: 24 Aug 2008 Posts: 4 Location: Bangalore,India
My job runs a SAS program and macros and is taking too much time.
The SAS datasets we are creating are pretty huge (30 GB of data / 500,000 tracks), and the same datasets are used in the subsequent jobs, which in turn take 3+ hours of processing time.
The LRECL of the dataset is 27648 and we are using BLKSIZE=0.
I have tried the following options to tune my JCL, but there is no change in the EXCP count or the elapsed time:
1. Added BUFNO=15 to the input datasets.
2. Increased the REGION parameter.
Can you please help me tune my JCL?
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Telling us that something ... is taking too much time is just useless whining.
Come back when you have data, not impressions.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Quote:
I have tried the following options to tune my JCL, but there is no change in the EXCP count or the elapsed time:
1. Added BUFNO=15 to the input datasets.
2. Increased the REGION parameter.
This indicates the job is CPU-bound, not I/O-bound. SAS is quite efficient at handling I/O already, so you're not likely to find many ways to improve it. Increasing the region parameter won't actually do anything to improve performance in 99.99% of cases -- CPU-bound processes aren't usually waiting for memory; they are waiting to do work on the processor. About your only choice is to analyze the SAS code and make improvements in it (indexing instead of sorts, minimizing SAS work files, and so forth).
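Robert's suggestion of indexing instead of sorting can be illustrated outside SAS. The sketch below is plain Python, purely illustrative (the record layout and sizes are invented, and no SAS semantics are implied); it shows why building a keyed lookup structure once beats re-sorting the data for every retrieval:

```python
import random

random.seed(42)  # deterministic data for the illustration

# A stand-in "dataset" of (key, payload) records.
records = [(random.randrange(10_000), f"payload-{i}") for i in range(10_000)]

# Approach 1: sort the whole dataset every time a keyed report needs it.
def lookup_by_sorting(records, key):
    ordered = sorted(records)          # O(n log n) cost paid on EVERY call
    return [payload for k, payload in ordered if k == key]

# Approach 2: build an index once, then answer each lookup cheaply.
def build_index(records):
    index = {}
    for k, payload in records:
        index.setdefault(k, []).append(payload)
    return index

index = build_index(records)           # one-time cost
hits = index.get(records[0][0], [])    # every later "report" is a cheap probe
```

The trade-off is the same one a SAS index buys you: pay the organizing cost once instead of once per pass.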
Ronald Burr
Active User
Joined: 22 Oct 2009 Posts: 293 Location: U.S.A.
asimkp wrote:
... and using the same datasets in the subsequent jobs, which in turn takes 3+ hours of processing time.
If you are processing the same 30-gigabyte dataset multiple times for different processes, try extracting ALL of the fields needed by ANY/ALL of the processes in a SINGLE pass of the 30 GB dataset, then work on that extract for each of the individual processes.
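The single-pass extract pattern is language-independent; here is a minimal Python sketch (field names and records are invented for illustration, not taken from the poster's job):

```python
# Stand-in for the huge master dataset: wide records, many fields no report uses.
master = [
    {"acct": "001", "name": "A", "bal": "10.00", "region": "N", "filler": "x" * 100},
    {"acct": "002", "name": "B", "bal": "20.00", "region": "S", "filler": "y" * 100},
]

# The UNION of the fields that ANY downstream report needs.
NEEDED = ["acct", "bal", "region"]

def extract_once(master_rows, needed):
    """One pass over the master: keep only the fields any report uses."""
    return [{f: row[f] for f in needed} for row in master_rows]

extract = extract_once(master, NEEDED)

# Every report now reads the small extract instead of the huge master.
def report_balances(rows):
    return sum(float(r["bal"]) for r in rows)

def report_by_region(rows, region):
    return [r["acct"] for r in rows if r["region"] == region]
```

The master is read once; each report then pays I/O proportional to the narrow extract, not the full record length.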
asimkp
New User
Joined: 24 Aug 2008 Posts: 4 Location: Bangalore,India
Enrico - Apologies for the generalisation. Here are the numbers:

STEPNAME PROCSTEP RC EXCP  CPU   SRB CLOCK  SERV PG
SAS003   SAS      00 2015K 13.34 .54 186.36 152M 0

EXCP = 2015K
CPU = 13.34 minutes
Elapsed time = 186.36 minutes

Robert - Thanks for the suggestion; we are indeed using 12 SORTWK files, though in OPTIONS we specified 6. But again, touching the SAS code would be like redoing it, which will demand time.
Ronald - We already have that logic in place: the SAS programs extract the required fields for every job.
Thanks for your help. Any more suggestions are welcome.
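For reference, the ratio the later replies discuss works out as follows (simple arithmetic on the posted figures, nothing assumed beyond them):

```python
# Figures from the step statistics above.
cpu_minutes = 13.34
elapsed_minutes = 186.36

# CPU time as a fraction of elapsed time; the remainder is wait time
# of various kinds (I/O, queuing, etc.).
cpu_fraction = cpu_minutes / elapsed_minutes
print(f"CPU is {cpu_fraction:.1%} of elapsed time")
```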
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Given your reluctance to tune the code, contact your site support group for assistance. They may be able to place the job in another service class to give it more CPU and get it done more quickly. However, the ratio of CPU time to elapsed time doesn't seem wildly out of line based on what I've seen of modern mainframes, so I suspect the only way you could possibly make substantial improvements in the elapsed time (which may not be possible, of course) would be to rewrite the SAS code more efficiently.
Ronald Burr
Active User
Joined: 22 Oct 2009 Posts: 293 Location: U.S.A.
asimkp wrote:
Ronald - We have the same logic in place of extracting the required fields in SAS programs for every job.
Just because it's the SAME logic doesn't mean that it is the most EFFICIENT logic.
I once tuned an application that generated 9 different reports from a 7-reel master file with a 4K LRECL. EACH job took an hour and a half to run (there were no tape silos in those days), making a total of 13.5 hours of elapsed time for the 9 reports. I found that most of the reports used many of the same fields, with some unique fields depending on the report - and COMBINED, all of the fields used by the 9 reports totaled only 165 bytes. SO, I reworked the application to extract ALL referenced fields in ONE pass (1.5 hours), and then redefined the "master file" in each of the individual report programs to be the 165-byte extract file instead of the "real" master file. Afterward, each report program took only 5 minutes to run instead of 1.5 hours, so the total elapsed time for the 9 reports went from 13.5 hours to 2.25 hours.
Did it take a little work? Yes. I had to change the JCL in one job to add a SORT step to build the extract, change the JCL for the 9 report steps to point to the new extract file, create a copybook for the new "master file", and change the copybook name and re-compile 9 modules. But saving 11.25 hours of elapsed time per day seemed to be worth the effort (and it allowed OTHER applications to use those 11.25 hours of initiator time).
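The arithmetic in that example checks out, using only the figures quoted above:

```python
REPORTS = 9
HOURS_PER_REPORT_BEFORE = 1.5          # each report read the full master

before = REPORTS * HOURS_PER_REPORT_BEFORE      # 13.5 hours total

extract_pass_hours = 1.5                         # one pass to build the extract
minutes_per_report_after = 5
after = extract_pass_hours + REPORTS * minutes_per_report_after / 60  # 2.25 hours

saved = before - after                           # 11.25 hours per day
```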
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
But again, touching the SAS code would be like re-doing the code, which will demand time.
If you really want to improve the process(es), invest the time. . .
Basically, what I read is that you have something that runs poorly and you want some magic bullet to fix it (with no thought or effort required). Not very realistic.
Just because someone implemented a "thing" and then it was cloned to make several other "things" is no reason that all should continue as is. . .
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Worse than that...
they do not even know what a reasonable time would be.