I have an input file with 950,000 lines, and for each line I have to insert 28 records into a DB2 table, so the batch job runs for more than 6 hours. Is there any way to reduce the run time if I don't want to write the records to a file and use the LOAD utility afterwards? Someone told me to issue EXEC SQL COMMIT END-EXEC every N records. But why? Is it related to the DB2 log? How much time would it save? Are there any other methods to improve the performance?
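To make the "commit every N records" suggestion concrete, here is a minimal sketch of the pattern. This is not the COBOL/DB2 program itself: it uses Python with SQLite as a stand-in database, and the table name, column, and the interval of 1000 are all hypothetical. The idea it illustrates is the same: each commit ends a unit of work, so log records accumulated since the last commit can be reclaimed and locks are released, instead of the whole 950,000 × 28 inserts forming one enormous unit of work.

```python
import sqlite3

COMMIT_INTERVAL = 1000  # "N": commit after this many inserts (tuning choice)

def load_records(conn, records):
    """Insert records one at a time, committing every COMMIT_INTERVAL rows."""
    cur = conn.cursor()
    pending = 0
    for rec in records:
        cur.execute("INSERT INTO target_table (value) VALUES (?)", (rec,))
        pending += 1
        if pending >= COMMIT_INTERVAL:
            conn.commit()  # end the unit of work; releases locks held so far
            pending = 0
    conn.commit()          # commit the final partial batch

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target_table (value INTEGER)")
load_records(conn, range(5000))
count = conn.execute("SELECT COUNT(*) FROM target_table").fetchone()[0]
print(count)
```

Note the final commit after the loop, so the last partial batch is not lost; a restartable batch job would also checkpoint its input position at each commit so it can resume after an abend rather than rerun from the start.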