I have an MPP that is triggered by an IMS transaction. The MPP is a COBOL program and is table driven: it reads certain DB2 tables and, depending on the input given to the MPP and the results of the queries, selects the names of the modules it then calls dynamically. The module names themselves are also stored in a DB2 table.
Now, the number of hits to this IMS transaction runs into millions per day. The module itself is very lightweight and does not consume many CPU seconds per execution, but with millions of hits the cumulative CPU time is huge, which makes it one of the most CPU-consuming transactions in the organisation.
So the only way to improve performance is to eliminate the SQL queries from the module and make the table contents accessible from memory rather than from DB2. Even though each SELECT consumes only a few microseconds of CPU, the cumulative effect is significant.
The solution I am thinking of is to write a module that reads the full contents of the DB2 tables and keeps them in its working storage (the number of rows in the DB2 tables is very small). This module would be preloaded in the IMS region so that its working-storage contents persist across transactions. The MPP in question would then simply call this module and pick up the contents through its linkage section.
Can somebody verify this solution or suggest a better one?
Also, if this is a good solution, how do I do the preloading in the IMS region?
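To make the idea concrete, here is a rough sketch of the kind of preloaded cache module described above. All names here are made up (TBLCACHE, MODULE_TABLE, the columns and the field layout); the real copybook and SQL would come from your shop's tables, and the table limit of 200 entries is an arbitrary assumption:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. TBLCACHE.
      * Hypothetical preloaded cache module. On the first call it
      * loads the driving table from DB2; all later calls are served
      * from working storage, so no SQL is executed for them.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-LOADED             PIC X        VALUE 'N'.
       01  WS-ROW-CNT            PIC S9(4)    COMP VALUE 0.
       01  HV-KEY                PIC X(08).
       01  HV-MODULE             PIC X(08).
       01  WS-CACHE.
           05  WS-ENTRY OCCURS 200 TIMES
                        INDEXED BY WS-IX.
               10  WS-KEY        PIC X(08).
               10  WS-MODULE     PIC X(08).
           EXEC SQL INCLUDE SQLCA END-EXEC.
           EXEC SQL DECLARE CSR CURSOR FOR
                SELECT TBL_KEY, MODULE_NAME
                  FROM MODULE_TABLE
           END-EXEC.
       LINKAGE SECTION.
       01  LS-CACHE.
           05  LS-ROW-CNT        PIC S9(4)    COMP.
           05  LS-ENTRY OCCURS 200 TIMES.
               10  LS-KEY        PIC X(08).
               10  LS-MODULE     PIC X(08).
       PROCEDURE DIVISION USING LS-CACHE.
           IF WS-LOADED = 'N'
              PERFORM LOAD-TABLE
              MOVE 'Y' TO WS-LOADED
           END-IF
      *    Hand the cached rows back to the caller.
           MOVE WS-ROW-CNT TO LS-ROW-CNT
           PERFORM VARYING WS-IX FROM 1 BY 1
                   UNTIL WS-IX > WS-ROW-CNT
              MOVE WS-ENTRY (WS-IX) TO LS-ENTRY (WS-IX)
           END-PERFORM
           GOBACK.
       LOAD-TABLE.
           EXEC SQL OPEN CSR END-EXEC
           EXEC SQL FETCH CSR INTO :HV-KEY, :HV-MODULE END-EXEC
           PERFORM UNTIL SQLCODE NOT = 0
                   OR WS-ROW-CNT = 200
              ADD 1 TO WS-ROW-CNT
              SET WS-IX TO WS-ROW-CNT
              MOVE HV-KEY    TO WS-KEY (WS-IX)
              MOVE HV-MODULE TO WS-MODULE (WS-IX)
              EXEC SQL FETCH CSR INTO :HV-KEY, :HV-MODULE END-EXEC
           END-PERFORM
           EXEC SQL CLOSE CSR END-EXEC.
```

The MPP would define a matching LS-CACHE layout in its own working storage and do CALL 'TBLCACHE' USING that area; as long as the module stays preloaded and is not reloaded between transactions, only the very first call after region start pays the DB2 cost.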
First of all, this is not really an IMS-related query.
Performance should be improved where it actually degrades. Is the cost in the DB2 queries or in the dynamic calls? Do some performance testing with Strobe or whichever utility your shop uses.
1. If it is just the DB2 query and the number of rows is small, copy all the program names into your working-storage section and do a dynamic call (based on the condition, move the program name into the variable).
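For option 1, a minimal sketch of that pattern — all field and program names here are invented for illustration:

```cobol
       WORKING-STORAGE SECTION.
       01  WS-TXN-TYPE           PIC X.
       01  WS-PGM-NAME           PIC X(08).
       01  WS-PARM-AREA          PIC X(100).
       PROCEDURE DIVISION.
      *    Pick the target program name based on the condition.
           EVALUATE WS-TXN-TYPE
              WHEN 'A'   MOVE 'PROGA   ' TO WS-PGM-NAME
              WHEN 'B'   MOVE 'PROGB   ' TO WS-PGM-NAME
              WHEN OTHER MOVE 'PROGDFLT' TO WS-PGM-NAME
           END-EVALUATE
      *    CALL through an identifier remains a dynamic call.
           CALL WS-PGM-NAME USING WS-PARM-AREA
```

The trade-off, as the original poster notes later in the thread, is that the name list is now hard-coded in the program instead of coming from the table.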
2. If it is due to run-time loading, avoid it and link all the sub-modules into a single static load module (check whether your IMS region can handle a load module of that size):
```cobol
           CALL 'PROGRAM-A' USING VAR-A
           CALL 'PROGRAM-B' USING VAR-B
```
If it is just the DB2 query and the number of rows is small, copy all the program names into your working-storage section and do a dynamic call.
Very interesting proposition, but we had already considered this option.
There is a deliberate reason why this module was made table driven: the number of programs might increase, and existing program names might also change. Our client wanted the module to remain unchanged under any of these circumstances, now and in the future. If we took your approach, we would have to change the module every time a dynamically called module is added or renamed.
As I said, a single thread of the transaction performs very well; we do not have a performance issue there. It is just that, by sheer volume, the total CPU seconds are very high, so we are going after the SQL queries, which we can minimise.