jzhardy
Active User
Joined: 31 Oct 2006 Posts: 139 Location: brisbane
Looking to sound out this group for advice and suggestions on a technical solution.
The requirements are - and these have been decided, and are not negotiable - to build a batch process to:
o transform CSV record sets into XML using reference data + COBOL
o approx 30,000 reference data records will exist, each defining the mapping from a CSV record set element to a node on the output XML tree.
o average daily input will be 50,000 CSV record sets, but peak volume may be as high as 10,000,000 per day.
For the technical solution:
One option is to load the reference data into a suitable DB2 table and hit it repeatedly, obviously allowing for a suitably sized bufferpool.
Another option is to define the ref data statically (VALUE clauses), but this has been ruled out because the business wants a true 'data-driven' approach as opposed to 'code-driven'.
Another option, which may be more efficient, is to preload the full ref data structure into static memory. (There is a natural hash key that can be used, so access speed should be no problem.)
On this last option: is there a way to run re-entrant COBOL and capture the 'first invocation' event? The idea would be to load from VSAM (or DB2) into memory on the first invocation only - or rather into a memory area where this is permissible for re-entrant code.
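For context on what I'm picturing (a sketch only - all program, paragraph, and field names below are made up, and the record layout is illustrative): with a RENT-compiled batch COBOL program, each run unit gets its own copy of WORKING-STORAGE, initialised at first entry and persisting across subsequent calls within that run unit. So a first-time switch in WORKING-STORAGE can gate the load, something like:

```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. XFORMSUB.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * WORKING-STORAGE persists for the life of the run unit,
      * even under RENT, so this switch survives across calls.
       01  WS-FIRST-TIME-SW        PIC X VALUE 'Y'.
           88  FIRST-TIME              VALUE 'Y'.
       01  WS-REF-DATA.
           05  WS-REF-ENTRY OCCURS 30000 TIMES
               ASCENDING KEY IS WS-REF-KEY
               INDEXED BY REF-IDX.
               10  WS-REF-KEY      PIC X(20).
               10  WS-REF-XML-NODE PIC X(80).
       LINKAGE SECTION.
       01  LS-CSV-RECORD           PIC X(200).
       PROCEDURE DIVISION USING LS-CSV-RECORD.
           IF FIRST-TIME
      *       Load ref data from VSAM (or DB2) once only
              PERFORM LOAD-REF-DATA
              MOVE 'N' TO WS-FIRST-TIME-SW
           END-IF
      *    Transform the record, e.g. SEARCH ALL WS-REF-ENTRY
      *    against the natural key, then build the XML node.
           GOBACK.
```

If I understand it correctly, the caveat would be environments that keep the program resident across run units (e.g. a preinitialised or CICS-style environment), where "first invocation" means first call of the run unit rather than first call ever - which for a plain batch step is actually the behaviour wanted here.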
Other options?