michaeltai (New User)
Joined: 23 Jul 2005, Posts: 20
I have an input file with 950,000 lines, and for each line I have to insert 28 records into the table, so the batch job runs for more than 6 hours. Is there any way to reduce the run time if I don't want to write the records to a file and use the LOAD utility afterwards? Someone told me to issue EXEC SQL COMMIT END-EXEC every N records. But why? Is it related to the DB2 log? How much time will it save? Are there any other methods to improve the performance?
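For illustration only, a minimal COBOL sketch of the "commit every N records" pattern (the counter name WS-COMMIT-COUNT, the paragraph names, and the interval of 1,000 lines are assumptions, not from the actual job). Each COMMIT ends the unit of work, which releases the locks held so far, lets DB2 externalize log records, and limits how much work would have to be rolled back if the job fails:

      * Illustrative sketch: commit after every 1,000 input lines
       PROCESS-INPUT.
           PERFORM UNTIL WS-END-OF-FILE = 'Y'
               READ INPUT-FILE
                   AT END
                       MOVE 'Y' TO WS-END-OF-FILE
                   NOT AT END
                       PERFORM INSERT-28-ROWS
                       ADD 1 TO WS-COMMIT-COUNT
                       IF WS-COMMIT-COUNT >= 1000
                           EXEC SQL COMMIT END-EXEC
                           MOVE ZERO TO WS-COMMIT-COUNT
                       END-IF
               END-READ
           END-PERFORM
      *    Commit the final partial batch
           EXEC SQL COMMIT END-EXEC.

Restart logic (remembering the last committed input record so the job can resume there) would also be needed in practice, but is left out of the sketch.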
raghunathns (Active User)
Joined: 08 Dec 2005, Posts: 127, Location: rochester
Get exclusive control of the table space and then insert the records.
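As a sketch, the usual way to take that lock from within the program is an explicit LOCK TABLE statement (the table name here is just a placeholder):

           EXEC SQL
               LOCK TABLE MY_TABLE IN EXCLUSIVE MODE
           END-EXEC

Note that a COMMIT normally releases this lock, so with periodic commits the LOCK TABLE would have to be reissued after each commit, or the table space could be started for exclusive access outside the program instead.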
michaeltai (New User)
Joined: 23 Jul 2005, Posts: 20
We are in a conversion environment, so the tablespace is not accessed by anyone besides us. Also, we already lock that table before we insert the records.