sushanth bobby
Senior Member
Joined: 29 Jul 2008 Posts: 1020 Location: India
Hi,
I would like to know whether it is worth the effort to convert our current DB2 programs to use multi-row FETCH and INSERT.
I would also like to know whether any other shop has done something like this, because a huge amount of effort is required when the number of programs to convert is high.
Thank You,
Sushanth
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
Well, we did, and we are still doing it gradually.
Only the big fetchers and inserters, and it is definitely worth the effort:
up to a 30% CPU gain.
sushanth bobby
Senior Member
Joined: 29 Jul 2008 Posts: 1020 Location: India
Thank you, GuyC.
In our shop they are currently looking for performance benefits in our applications across all domains (client/server side as well as mainframe).
I suggested the multi-row conversion for our current DB2 application programs. The funny thing is, from then on I didn't hear from them about it. So I was wondering: is this really a weird conversion, or is it worth the effort to do something like this?
Again, thank you very much, GuyC!
Sushanth
sushanth bobby
Senior Member
Joined: 29 Jul 2008 Posts: 1020 Location: India
Quote: |
This is regarding the post (sub: Advice on normal db2 prg to multi-row support conversion).
I was thinking along the same lines for our shop, for converting our COBOL DB2 insert programs.
Initially we evolved from 'single record read from PS file, single insert into target table' to 'single record read from PS file, single insert into a global temp table, then bulk select from the global temp table and insert into the target table'. This conversion gave us a small CPU gain.
I haven't coded, or even seen, a multi-row version of these programs, so can you tell me what complexities this conversion will have and what issues you have faced? I have just gone through the documents on multi-row inserts; the changes I can understand are the cursor usage, which is a bit different here, and the usage of arrays. Do you see anything else? And where is the CPU gain coming from?
Thanks
Rakesh |
Hi Rakesh,
In the open forum, more people can join and share.
A created temporary table requires the use of workfiles (database/table spaces), which are used for certain SQL operations.
You might have to change your process: for example, move your data from the PS file into a COBOL array and use that array in the INSERT.
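In pseudocode (in the same spirit as the examples elsewhere in this thread; the array names, sizes, and columns below are only illustrative, so check the exact syntax against your DB2 version's manual), a multi-row INSERT might look like this:
Code: |
* working storage: one host-variable array per column
* (DB2 expects separate arrays, not an array of group items)
01  WS-HVA.
    05  WS-COL1  OCCURS 100 TIMES PIC X(10).
    05  WS-COL2  OCCURS 100 TIMES PIC S9(9) COMP.
* read up to 100 PS records into the arrays, keep the count
* in WS-CNT, then insert the whole block in one statement:
EXEC SQL
    INSERT INTO TARGET_TABLE (COL1, COL2)
    VALUES (:WS-COL1, :WS-COL2)
    FOR :WS-CNT ROWS
END-EXEC |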
Quote: |
Where is the CPU gain coming from? |
The CPU gain comes from the reduced number of SQL statements executed, I think: one FETCH or INSERT call to DB2 moves many rows instead of one.
Thank You,
Sushanth
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
instead of one loop :
Code: |
* code is more pseudocode than actual cobol
open
fetch into row
perform until sqlcode > 0
    process row
    fetch into row
end-perform
close |
you need two :
Code: |
* code is more pseudocode than actual cobol
open
fetch into row-table
move 1 to ix-row
perform until sqlcode > 0 and ix-row > sqlerrd(3)
    perform until ix-row > sqlerrd(3)
        process row (ix-row)
        add 1 to ix-row
    end-perform
    fetch into row-table
    move 1 to ix-row
end-perform
close |
So it requires some effort.
It all depends on how expensive your CPU seconds are and how cheap your man-hours are.
The benefit comes from the reduced number of calls to DB2. |