murugan_mf
Active User
Joined: 31 Jan 2008 Posts: 148 Location: Chennai, India
I have to read one table having more than 10,000 rows and insert into 4 different tables.
Currently, for each and every insert I am committing to the database, but this is not optimized.
How can I use a savepoint to get rid of this, or is there any other way?
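One common way to avoid a commit per insert is to commit once per batch of rows. A minimal sketch of that idea in Python using the standard-library sqlite3 module (sqlite3 stands in for DB2 here; the table names `source_tab`, `tab_a`..`tab_d` and the column layout are made up for illustration):

```python
import sqlite3

def copy_rows(conn, batch_size=1000):
    """Read all rows from the source table and insert each into four
    target tables, committing once per batch instead of per insert."""
    cur = conn.cursor()
    cur.execute("SELECT id, payload FROM source_tab")
    pending = 0
    for row in cur.fetchall():
        for target in ("tab_a", "tab_b", "tab_c", "tab_d"):
            conn.execute(
                f"INSERT INTO {target} (id, payload) VALUES (?, ?)", row)
            pending += 1
            if pending >= batch_size:
                conn.commit()  # one commit per batch, not per insert
                pending = 0
    conn.commit()  # flush the final partial batch

# usage: an in-memory database with the hypothetical tables
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source_tab (id INTEGER, payload TEXT)")
for t in ("tab_a", "tab_b", "tab_c", "tab_d"):
    conn.execute(f"CREATE TABLE {t} (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO source_tab VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(100)])
copy_rows(conn)
print(conn.execute("SELECT COUNT(*) FROM tab_a").fetchone()[0])  # 100
```

The batch size is a tuning knob: larger batches mean fewer commits but longer-held locks and more work lost on a rollback.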
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
for each and every insert I am committing to the database, but this is not optimized.
Typically, SELECTs are optimized when possible. INSERTs are always expensive.
Quote:
How can I use a savepoint to get rid of this, or is there any other way?
What exactly do you want to "get rid of"? What does the term "savepoint" mean to you? It is not a database term...
One alternative would be to create 4 QSAM files and load them externally rather than within the code. For most situations, 10,000 is not very many rows - how often does this process run?
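The external-load alternative amounts to fanning each source row out to four sequential files and letting a bulk LOAD utility populate the tables afterwards. A minimal sketch of the fan-out step, again in Python rather than COBOL/QSAM, with in-memory buffers standing in for the 4 datasets (all names are hypothetical):

```python
import csv
import io

def unload_to_files(rows, writers):
    """Write every source row to each of the four output files;
    a bulk LOAD utility would then populate the target tables."""
    for row in rows:
        for w in writers:
            w.writerow(row)

# usage: four in-memory buffers standing in for the 4 QSAM files
buffers = [io.StringIO() for _ in range(4)]
writers = [csv.writer(b) for b in buffers]
rows = [(i, f"row{i}") for i in range(10)]
unload_to_files(rows, writers)
print(len(buffers[0].getvalue().splitlines()))  # 10
```

Moving the inserts out of the program trades unit-of-work logging for a utility load, which is usually much cheaper for bulk volumes.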