rexx77
New User
Joined: 14 Apr 2008 Posts: 78 Location: Mysore
Hi All,
I am working on a requirement where I may receive a large number of records (say 500+), for each of which I need to check a back-end DB2 table and formulate the results.
For each record, I must first look up the filter conditions in another table, and then use those conditions to query the master table.
Sequential processing, where the main module takes the first record, queries the filter table, and then reads the master table, is not an option: it would take too long.
In Hadoop, we have the concept of assigning the same work to multiple nodes; each node works independently, and a master program combines the results.
In the same way, can I have a master SP that calls the same common module multiple times with different input parameters, without waiting for the return code of each call? Perhaps we could issue 5 to 10 calls to the SP, wait for all of them to complete, and then issue the next batch.
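The batch-and-wait pattern described above can be sketched outside of DB2. This is only an illustration, using Python's `concurrent.futures`; `call_common_sp` is a hypothetical stand-in for the real CALL to the common stored procedure:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical stand-in for one stored-procedure call; in practice this
# would open a connection and CALL the common SP with the record's key.
def call_common_sp(record_key):
    return f"processed:{record_key}"

def process_in_batches(record_keys, batch_size=10):
    """Issue up to `batch_size` SP calls concurrently, wait for the
    whole batch to finish, then move on to the next batch."""
    results = []
    with ThreadPoolExecutor(max_workers=batch_size) as pool:
        for start in range(0, len(record_keys), batch_size):
            batch = record_keys[start:start + batch_size]
            # submit the whole batch, then block until every call returns
            futures = [pool.submit(call_common_sp, k) for k in batch]
            results.extend(f.result() for f in futures)
    return results
```

The point of waiting per batch is to cap how many concurrent calls hit the database at once (5-10 here), rather than firing all 500+ at the same time.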
Can you please let me know whether this approach will work, or whether you have a better way of handling this requirement?
Many thanks for your time.
Rohit Umarjikar
Global Moderator

Joined: 21 Sep 2010 Posts: 3109 Location: NYC,USA
Yes, you can certainly do that.
1) Declare the SP's input parameter as a VARCHAR of whatever maximum size you need, populate it with the records (or their keys) one after another, and then call the SP. The SP can unstring the parameter into an array and process each entry until the whole string has been consumed.
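The pack-and-unstring idea in option 1 can be sketched as follows. This is a minimal illustration, not DB2 SQL PL; the delimiter and helper names are assumptions:

```python
# Hypothetical sketch: the caller packs record keys into one delimited
# string (the VARCHAR parameter), and the SP side splits it back out.
DELIM = ";"

def pack_keys(keys):
    """Caller side: build the single VARCHAR-style input parameter."""
    return DELIM.join(str(k) for k in keys)

def unpack_keys(varchar_parm):
    """SP side: 'unstring' the parameter into an array of keys."""
    return varchar_parm.split(DELIM) if varchar_parm else []
```

Whatever delimiter you choose must be a character that cannot appear inside a key, and the declared VARCHAR size has to be large enough for the worst case (500+ keys plus delimiters).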
2) Another way is to use a temp table: populate it with all the records you have (500+), then call the SP and, inside the SP, join against this temp table and process the result set.
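Option 2 trades 500+ single-row lookups for one set-based join. A rough sketch, using Python's sqlite3 purely as a stand-in for DB2 (table and column names are made up):

```python
import sqlite3

# In-memory database standing in for DB2
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Master table the SP normally queries row by row
cur.execute("CREATE TABLE master (id INTEGER PRIMARY KEY, payload TEXT)")
cur.executemany("INSERT INTO master VALUES (?, ?)",
                [(i, f"row-{i}") for i in range(1, 1001)])

# Temp table populated by the caller with the 500+ keys to process
cur.execute("CREATE TEMP TABLE work_keys (id INTEGER PRIMARY KEY)")
cur.executemany("INSERT INTO work_keys VALUES (?)",
                [(i,) for i in range(1, 501)])

# Inside the SP: one join returns every needed master row in a single pass
cur.execute("""SELECT m.id, m.payload
               FROM master m
               JOIN work_keys w ON m.id = w.id
               ORDER BY m.id""")
rows = cur.fetchall()
```

In DB2 the temp table would typically be a declared or created global temporary table, but the shape of the join is the same.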