KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
Hi,
While running a mainframe job that executes a program to process records from a table and write them to a file, it is taking nearly 3 hrs to process around 800,000 records (LRECL 150).
Has any of you faced the same situation?
Please help, as it's an emergency.
Craq Giegerich
Senior Member
Joined: 19 May 2007 Posts: 1512 Location: Virginia, USA
Some details would make it a lot easier to offer suggestions: JCL, program details, etc.
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
A batch program is invoked through this job. The program reads nearly 8 million records from the database and writes two output files, one with a volume of 16 million records and the other with 8 million.
Please let me know if you need more information.
Phrzby Phil
Senior Member
Joined: 31 Oct 2006 Posts: 1050 Location: Richmond, Virginia
Could you possibly help someone with this little info?
How, e.g., is the program getting the records from the table? What does the WHERE look like? Got indexes? How's your CPU time?
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
Yeah, it's reading from a couple of tables with different cursors, processing the data, and writing it into two files. The WHERE condition is a simple primary-key check, with no joins at all in the cursors. The CPU time taken for this job is only 21 mins.
I'd be glad to provide any more info needed.
Craq Giegerich
Senior Member
Joined: 19 May 2007 Posts: 1512 Location: Virginia, USA
What are the output files like -- RECFM, LRECL, BLKSIZE, tape or disk?
Craq Giegerich
Senior Member
Joined: 19 May 2007 Posts: 1512 Location: Virginia, USA
Well, I hope the job has finished; it has been 3 hours and you still have not given us enough information to answer your emergency request!
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
As has been mentioned, you have provided little or nothing that we can use to help you. . . .
Quote:
it's reading from a couple of tables with different cursors, processing the data, and writing it into two files.
It might help if you posted the SQL code that defines these.
How many rows are in the 2 tables? It may be far faster to unload the tables and use the sequential files as input to your process.
Has this job run faster previously? Is this the first attempt to run it with the full volume?
Might your wall time be caused by contention within the database, due to other tasks using these tables while your job is running?
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
This is the first attempt to run this job with full volume. The data in one table is around 8 million records, and in the other it is around 80,000 records. Unloading the tables and processing the sequential files is not acceptable.
There are two cursors used; the second cursor is nested inside the first.
As I've mentioned, the cursors are based on simple conditions but fetch more than 8 million records.
I mainly want to know if any of you have faced a similar situation.
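For illustration, here is a minimal COBOL/DB2 sketch of the nested-cursor pattern being described; the table, column, and host-variable names are hypothetical (the actual code was never posted), and status checking is trimmed for brevity. The point it shows: the inner cursor is opened, fetched, and closed once per outer row, so an 8-million-row outer cursor drives roughly 8 million inner-cursor cycles, which commonly produces long elapsed times even while CPU time stays modest.
Code:
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       01  WS-ACCT-ID       PIC X(10).
       01  WS-ACCT-DATA     PIC X(140).
       01  WS-REF-CODE      PIC X(08).

           EXEC SQL
               DECLARE CSR1 CURSOR FOR
               SELECT ACCT_ID, ACCT_DATA
                 FROM BIG_TABLE
           END-EXEC.

           EXEC SQL
               DECLARE CSR2 CURSOR FOR
               SELECT REF_CODE
                 FROM SMALL_TABLE
                WHERE REF_ID = :WS-ACCT-ID
           END-EXEC.

       MAIN-PROCESS.
           EXEC SQL OPEN CSR1 END-EXEC
           PERFORM FETCH-OUTER
           PERFORM UNTIL SQLCODE = +100
      *        One inner OPEN/FETCH/CLOSE per outer row --
      *        roughly 8 million cycles at full volume
               EXEC SQL OPEN CSR2 END-EXEC
               EXEC SQL FETCH CSR2 INTO :WS-REF-CODE END-EXEC
               EXEC SQL CLOSE CSR2 END-EXEC
               PERFORM WRITE-OUTPUT-FILES
               PERFORM FETCH-OUTER
           END-PERFORM
           EXEC SQL CLOSE CSR1 END-EXEC.

       FETCH-OUTER.
           EXEC SQL
               FETCH CSR1 INTO :WS-ACCT-ID, :WS-ACCT-DATA
           END-EXEC.
If the inner SELECT really is a simple primary-key check, folding it into the outer cursor as a join (or an EXISTS subquery) lets DB2 do the lookup once per row on its side instead of paying a full OPEN/FETCH/CLOSE round trip per record.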
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
Unloading the tables and processing the sequential files is not acceptable
Who dictated this? What business reason is there to not do this if the runtime would drop from multiple hours to a fraction of an hour?
Quote:
I mainly want to know if any of you have faced a similar situation.
Yes, regularly, on different systems with different performance issues.
It sounds like you want to "match" data in the 2 tables. I suggest you try the unload-and-process approach (and if you need to match the 2 files, use a 2-file match/merge, not some kludge with arrays). . .
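For anyone unfamiliar with the technique, here is a minimal sketch of a sequential 2-file match over the unloaded files. It assumes both files have been sorted ascending on the same key; the file, field, and DD names and the 10-byte key are invented for illustration, and it assumes HIGH-VALUES never occurs as a real key.
Code:
       IDENTIFICATION DIVISION.
       PROGRAM-ID. MATCH2.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT BIG-FILE   ASSIGN TO BIGIN.
           SELECT SMALL-FILE ASSIGN TO SMALLIN.
           SELECT OUT-FILE   ASSIGN TO MATCHOUT.
       DATA DIVISION.
       FILE SECTION.
       FD  BIG-FILE.
       01  BIG-REC.
           05  BIG-KEY          PIC X(10).
           05  FILLER           PIC X(140).
       FD  SMALL-FILE.
       01  SMALL-REC.
           05  SMALL-KEY        PIC X(10).
           05  FILLER           PIC X(70).
       FD  OUT-FILE.
       01  OUT-REC              PIC X(150).
       WORKING-STORAGE SECTION.
       01  WS-BIG-KEY           PIC X(10).
       01  WS-SMALL-KEY         PIC X(10).
       PROCEDURE DIVISION.
       MAIN.
           OPEN INPUT BIG-FILE SMALL-FILE
           OPEN OUTPUT OUT-FILE
           PERFORM READ-BIG
           PERFORM READ-SMALL
           PERFORM UNTIL WS-BIG-KEY = HIGH-VALUES
               EVALUATE TRUE
                   WHEN WS-BIG-KEY < WS-SMALL-KEY
      *                No partner on the small file: check fails
                       PERFORM READ-BIG
                   WHEN WS-BIG-KEY > WS-SMALL-KEY
                       PERFORM READ-SMALL
                   WHEN OTHER
      *                Keys match: the "check" passes
                       WRITE OUT-REC FROM BIG-REC
                       PERFORM READ-BIG
               END-EVALUATE
           END-PERFORM
           CLOSE BIG-FILE SMALL-FILE OUT-FILE
           GOBACK.
       READ-BIG.
           READ BIG-FILE
               AT END MOVE HIGH-VALUES TO WS-BIG-KEY
               NOT AT END MOVE BIG-KEY TO WS-BIG-KEY
           END-READ.
       READ-SMALL.
           READ SMALL-FILE
               AT END MOVE HIGH-VALUES TO WS-SMALL-KEY
               NOT AT END MOVE SMALL-KEY TO WS-SMALL-KEY
           END-READ.
Each file is read exactly once, front to back, so the cost is two sequential passes plus the sorts: no arrays, and no per-record database calls.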
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
The problem is we're not supposed to change the existing job format.
Moreover, it's based on keys. I'm not familiar with any flat files that can be handled with keys.
We're not trying to match two tables; we're trying to do a check before taking the data from the tables and loading it into a file.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
I'm not familiar with any flat files that can be handled with keys.
In a flat file, the keys are the field(s) the file is sorted by.
Quote:
We're not trying to match two tables; we're trying to do a check
What kind of check? To make sure something from one table exists or does not exist in the other? That is a match. . .
If you post the nested cursor code, it may help someone make a suggestion.
Quote:
The problem is we're not supposed to change the existing job format.
Typically, I agree with this "rule". However, if something is implemented in such a way as to be unsupportable, the format may need to be changed. What if it ran 40 hours rather than only a few? I suspect the no-format-change rule would be waived to get the process to run.
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6248 Location: Mumbai, India
Hi,
If you post a little about how the program does
Quote:
process records from a table and write them to a file
(in a literal sense, the code you are using), it might fetch better suggestions.
Itanium
Active User
Joined: 22 Jan 2006 Posts: 114 Location: India
dick,
Hats off to your patience.
KumaranJeeva,
Please help us to help you!!!
Thanks.
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
Hi all,
Thanks for all your suggestions. After making a change in the query, the problem is now solved.
Once again, thank you very much for your suggestions.
Phrzby Phil
Senior Member
Joined: 31 Oct 2006 Posts: 1050 Location: Richmond, Virginia
After all of this assistance, wouldn't you like to share what you've learned with the community?
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello Kumaran J,
Thank you for posting that your process is now working.
If you can post what you did to improve performance, it will probably help someone else who encounters a similar problem.
KumaranJeeva
New User
Joined: 10 Apr 2008 Posts: 7 Location: London
The change was a code error; it had nothing to do with optimization. That's why I didn't post it here. But whenever a situation like this arises, the culprit will usually be either the query itself or the positioning of the query in the code.
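Since the actual fix was never shared, here is a hypothetical COBOL/DB2 illustration of the kind of positioning error being alluded to: a SELECT whose result does not vary per record, coded inside the per-record loop so it executes millions of times instead of once. All table, column, host-variable, and paragraph names are invented.
Code:
       WORKING-STORAGE SECTION.
           EXEC SQL INCLUDE SQLCA END-EXEC.
       01  WS-RUN-DATE      PIC X(10).
       01  WS-JOB-NAME      PIC X(08).

      *    BAD: this SELECT does not depend on the record being
      *    processed, yet it runs once per fetched row --
      *    millions of executions at full volume.
       PROCESS-RECORD-BAD.
           EXEC SQL
               SELECT RUN_DATE
                 INTO :WS-RUN-DATE
                 FROM CONTROL_TABLE
                WHERE JOB_NAME = :WS-JOB-NAME
           END-EXEC
           PERFORM FORMAT-AND-WRITE.

      *    BETTER: execute the invariant SELECT once, before the
      *    main fetch loop, and reuse the host variable.
       INIT-PARA.
           EXEC SQL
               SELECT RUN_DATE
                 INTO :WS-RUN-DATE
                 FROM CONTROL_TABLE
                WHERE JOB_NAME = :WS-JOB-NAME
           END-EXEC.

       PROCESS-RECORD-GOOD.
           PERFORM FORMAT-AND-WRITE.
Moving one such statement out of an 8-million-iteration loop removes 8 million SQL calls without touching the job format at all, which would fit the "code error, not optimization" description.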
Phrzby Phil
Senior Member
Joined: 31 Oct 2006 Posts: 1050 Location: Richmond, Virginia
Still - so secretive. An experienced user may not learn from your "code error," but a new user certainly will. How about giving it a try?
If you are embarrassed by the error - and who hasn't been? - remember, we really don't know who you are, just who you are pretending to be.