neo4u
New User
Joined: 03 Oct 2016 Posts: 8 Location: India
Hi,
I have been asked to clean up the DEV DB2 tables to improve performance. We have 300+ tables that need to be cleaned of irrelevant/unimportant data. We have identified the data that needs to be deleted from these tables, and we know the keys to delete it with.
Our approach is to:
1. Unload the data using the UNLOAD utility with LISTDEF, so that all the unloads run in a single step.
2. Write a program that reads the key file and deletes records from the 300+ tables.
The challenge/question I have is: instead of unloading the entire table space, is there a way to unload only the rows that I am deleting? I am aware of UNLOAD .. FROM TABLE .. WHEN .., but how do I code a single step to unload 300+ tables?
Does anyone have a better approach? Inviting your suggestions.
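For tables that share a table space, UNLOAD accepts repeated FROM TABLE ... WHEN clauses in a single utility statement, so one step can filter several tables at once; as far as I know, the per-table WHEN filter cannot be combined with a LISTDEF, which unloads whole tables. A sketch of the repeated-clause form (database, table, and key names here are hypothetical):

```
UNLOAD TABLESPACE DBTEST.TSCUST
  FROM TABLE SCHEMA1.TABLE1
       WHEN (ACCT_KEY IN ('A001','A002'))
  FROM TABLE SCHEMA1.TABLE2
       WHEN (CUST_KEY = 'C001')
```

Control cards like these could be generated from the key file rather than hand-coded for all 300+ tables.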
Arun Raj
Moderator
Joined: 17 Oct 2006 Posts: 2481 Location: @my desk
How about unloading all the data, then loading back only the data you need (after filtering), using LOAD REPLACE.
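The reload half of that idea would use LOAD with the REPLACE option against the filtered unload output; a minimal sketch (ddname and table name are hypothetical, and LOG NO would normally be followed by an image copy):

```
LOAD DATA REPLACE LOG NO
  INDDN SYSREC
  INTO TABLE SCHEMA1.TABLE1
```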
guptae
Moderator
Joined: 14 Oct 2005 Posts: 1208 Location: Bangalore,India
Hello Neo4u,
You can also try REORG DISCARD.
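REORG with DISCARD deletes rows matching a WHEN condition while reorganizing the table space, and can write the discarded rows to a data set, which doubles as a backup of what was removed. A sketch (database, table, key, and ddname are hypothetical):

```
REORG TABLESPACE DBTEST.TSCUST
  DISCARDDN SYSDISC
  DISCARD FROM TABLE SCHEMA1.TABLE1
          WHEN (ACCT_KEY IN ('A001','A002'))
```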
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
First talk to your DBAs and evaluate what they have to offer.
Why unload all the data from 300+ tables and load back the same data minus a relatively small number of bad rows (probably filtered with SORT)? It is a waste of resources and time once you get into millions of rows, and it becomes more dangerous and cumbersome if this turns into a recurring task or new tables get added to the list.
REORG DISCARD works per table space, so if the tables span 300+ table spaces you need that many jobs; even with one table space and 300 tables you still need 300 different jobs or control cards. If neither is a problem, it is an optimized choice, provided REORG supports all your key filter conditions.
Here is what I would consider; see if it has any issues:
1. Define a new table and insert 300 DELETE/TRUNCATE SQL statements (one for each of the 300 tables).
2. Write a COBOL program that declares a cursor on the step-1 table and uses dynamic SQL to execute each statement until end of cursor. Issue a COMMIT after each dynamic SQL statement completes successfully.
Or, if you don't want to create a new table, replace it with a PS data set and loop until EOF.
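The driver table in steps 1-2 might look like this in SQL (table, column, and key names are hypothetical); the COBOL program would FETCH each SQL_TEXT, then PREPARE and EXECUTE it, committing after each success:

```sql
CREATE TABLE ADMIN.CLEANUP_DRIVER
  (SEQ_NO   INTEGER       NOT NULL,
   SQL_TEXT VARCHAR(4000) NOT NULL);

INSERT INTO ADMIN.CLEANUP_DRIVER VALUES
  (1, 'DELETE FROM SCHEMA1.TABLE1 WHERE ACCT_KEY IN (''A001'',''A002'')'),
  (2, 'DELETE FROM SCHEMA1.TABLE2 WHERE CUST_KEY = ''C001''');
```

Adding a table to the cleanup then only means inserting another row, with no JCL changes.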
neo4u
New User
Joined: 03 Oct 2016 Posts: 8 Location: India
Arun Raj wrote:
How about unloading all the data, then loading back only the data you need (after filtering), using LOAD REPLACE.
We thought of this, but it would take a lot of time to write the SORT steps to filter the records. Each table has its key defined at a different position, so we would need to find the position of each key and then write a sort card to filter on it, which is cumbersome.
Rohit Umarjikar wrote:
1. Define a new table and insert 300 DELETE/TRUNCATE SQL statements (one for each of the 300 tables).
2. Write a COBOL program that declares a cursor on the step-1 table and uses dynamic SQL to execute each statement until end of cursor. Issue a COMMIT after each dynamic SQL statement completes successfully.
Or, if you don't want to create a new table, replace it with a PS data set and loop until EOF.
We are doing something similar to what you suggested: we have a table where we add the names of the tables to be cleaned along with the key column name, and a program reads this table, builds a dynamic DELETE query, and executes it.
My question is about the backup side: in case something goes wrong, I need a backup to put the data back into the tables.
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Quote:
My question is on the backup side
Then you should have asked that question at the start; there is no mention of backup in your original post.
What is your problem regarding backup? Surely standard backup and recovery is within your abilities?
guptae
Moderator
Joined: 14 Oct 2005 Posts: 1208 Location: Bangalore,India
Hey there,
Quote:
My question is on the backup side, in case something goes wrong I need the backup to put the data back into the tables.
Do you see any issue with an image copy, or an unload, whichever suits you? Or even QUIESCE, in case you want to recover from a quiesce point.
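A quiesce point plus a full image copy taken just before the deletes would give a clean recovery point; a sketch (database and table space names are hypothetical):

```
QUIESCE TABLESPACE DBTEST.TSCUST
COPY TABLESPACE DBTEST.TSCUST
     COPYDDN (SYSCOPY)
     FULL YES
     SHRLEVEL REFERENCE
```

If a delete run goes wrong, RECOVER TABLESPACE with TOCOPY or TOLOGPOINT could then restore the prior state.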
guptae
Moderator
Joined: 14 Oct 2005 Posts: 1208 Location: Bangalore,India
Hey there,
Another thought: do you have QMF in your shop? If yes, then you can create a QMF table with your keys (the ones you want to delete) and then create a query to delete from each base table using the QMF table.
But yes, you might need to create 300 DELETE statements and QMF tables, depending on your key values.
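The keys-table delete could be a correlated DELETE per base table; a sketch (table, column, and key names are hypothetical):

```sql
DELETE FROM SCHEMA1.TABLE1 T
 WHERE EXISTS
       (SELECT 1
          FROM ADMIN.PURGE_KEYS K
         WHERE K.KEY_VALUE = T.ACCT_KEY);
```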
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3053 Location: NYC,USA
Quote:
We are doing similar to what you suggested.
I guess that's the most efficient and flexible way to do it, unless a better solution presents itself.
Quote:
My question is on the backup side, in case something goes wrong I need the backup to put the data back into the tables.
If something goes wrong, aren't you supposed to have restart logic in place? Why would you want to put the data back?