jzhardy
Active User
Joined: 31 Oct 2006 Posts: 131 Location: brisbane
I have a specific requirement to generate a large series of reports that all depend on a single Global Temporary Table (GTT).
The cost of populating the GTT is very high. The cost of each (SQL) report is also significant, though not as costly as the initial build of the GTT.
The problem I have is that the complete sequence takes about 2 to 3 hours to run. Ideally I would like to build the GTT once and then run the 30 or so queries in parallel; by my calculations the job would then finish in about 30 minutes on average.
The obvious solution is to materialise the GTT or define a new (temporary) DB2 table, but production policy will not allow this.
I've tried splitting the reports 10-wide, but then I'm essentially building the GTT 10 times, which is very wasteful of production resources.
Suggestions for a strategy?
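For context, the reason the parallel jobs cannot simply share the table is that a GTT instance is scoped to the application process (thread) that creates it. A minimal sketch of the pattern, assuming a declared GTT (table and column names are invented for illustration):

```sql
-- A declared GTT is visible only to the thread that declares it;
-- a second, parallel job cannot see rows inserted by the first,
-- which is why each split job ends up rebuilding it.
DECLARE GLOBAL TEMPORARY TABLE SESSION.REPORT_BASE
  (ACCT_NO   CHAR(10)     NOT NULL,
   TXN_DATE  DATE         NOT NULL,
   AMOUNT    DECIMAL(15,2))
  ON COMMIT PRESERVE ROWS;
```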
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
If the GTT and the SQL account for most of the cost, then consider a VSAM or PS data set:
just unload what you need from the GTT into a PS data set and use that to create the reports.
Second, why put everything into the GTT in one shot? Split the work into one job per report, have each job filter only the data that report needs, and run them in parallel.
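A rough sketch of the unload step being suggested, assuming the site has the DB2 sample unload program DSNTIAUL available (subsystem, plan, and dataset names are invented and will vary by site). Note that since a declared GTT is not visible outside the declaring thread, the SYSIN SELECT would in practice be the same common query that currently populates the GTT, run once:

```jcl
//UNLOAD   EXEC PGM=IKJEFT01,DYNAMNBR=20
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
  DSN SYSTEM(DSN1)
  RUN PROGRAM(DSNTIAUL) PLAN(DSNTIAUL) PARMS('SQL')
  END
//SYSPUNCH DD SYSOUT=*
//SYSREC00 DD DSN=PROD.REPORT.BASE.UNLOAD,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(100,50),RLSE)
//SYSIN    DD *
  SELECT ... ;   -- the common-elements query, unloaded once
/*
```

The 30 report jobs would then read PROD.REPORT.BASE.UNLOAD in parallel instead of each rebuilding the GTT.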
jzhardy
Active User
Joined: 31 Oct 2006 Posts: 131 Location: brisbane
- Just unload what you need from the GTT into a PS data set and use that to create the reports.
This won't work. The issue is that the 30 reports depend on the GTT *and* on data in other DB2 tables. Yes, we could extract everything into PS data sets and join/merge in DFSORT etc., but that is overkill and not a realistic solution.
The point of the GTT was to assemble the common elements and thereby greatly simplify the various reports, which are still, notwithstanding, complex in their own right.
jzhardy
Active User
Joined: 31 Oct 2006 Posts: 131 Location: brisbane
To anticipate a likely suggestion:
why not hand this to Business Intelligence?
Yes, that's the best long-term option, but this is a here-and-now requirement. BI is not agile enough to accommodate the request in a short time frame.
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
Why use a GTT at all? Create a permanent DB2 table with the same layout, load the common elements into it in one batch job, and then use that single table to create the many reports in separate, parallel jobs.
If such design changes are not feasible, see whether a DBA can find you any performance improvements.
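A sketch of that approach, purely for illustration (schema, database, and tablespace names are invented, and this is exactly what would need CREATE TABLE authority in production):

```sql
-- One-off permanent table holding the common elements.
CREATE TABLE PRODSCHM.REPORT_BASE
  (ACCT_NO   CHAR(10)     NOT NULL,
   TXN_DATE  DATE         NOT NULL,
   AMOUNT    DECIMAL(15,2))
  IN DBPROD.TSREPORT;

-- Batch job 1: populate once with the expensive common query.
INSERT INTO PRODSCHM.REPORT_BASE
  SELECT ... ;   -- the query that currently builds the GTT

-- Report jobs 2..31: each joins REPORT_BASE to the other DB2
-- tables it needs, and all can run in parallel against it.
```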
jzhardy
Active User
Joined: 31 Oct 2006 Posts: 131 Location: brisbane
Unfortunately we (developers) don't have CREATE TABLE authority in production.
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Why do developers run jobs against production data? Your job, surely, is to develop, not to run.
Create the jobs that build the temporary DB2 table and give the whole lot to the production people to run.
Or get sign-off to be allowed to create a DB2 production table.
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
It looks like you are expecting a quick fix because this is a production support issue, rather than rethinking the design for the long term. In that case only a DBA can help you, by looking at the query and the other statistics for that execution.
Please approach that team at your site, as suggested earlier.