DB2 Unload / Load Multiple tables


neo4u

New User


Joined: 03 Oct 2016
Posts: 8
Location: India

PostPosted: Tue Nov 27, 2018 3:43 pm

Hi,

I have been asked to clean up the DEV DB2 tables to improve performance. We have 300+ tables that need to be cleaned of irrelevant/unimportant data. We have identified the data to be deleted from our tables, and we know the keys to delete it with.

Our approach is to:
1. Unload the data using the UNLOAD utility with LISTDEF, so that all the unloads run in a single step.
2. Write a program that reads the key file and deletes records from the 300+ tables.

The challenge/question I have is: instead of unloading the entire tablespace, is there a way I can unload only the rows that I am deleting? I am aware of UNLOAD .. FROM TABLE .. WHEN .., but how do I code a single step to unload 300+ tables?

Does anyone have a better approach? Your suggestions are welcome.
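For reference, a minimal sketch of the two UNLOAD styles in question, with made-up database, table space, table and key names; whether FROM TABLE .. WHEN can be combined with a LISTDEF list in a single statement should be verified against the Db2 Utility Guide for your version:

Code:
    -- Option 1: one step for many objects via LISTDEF (unloads every row)
    LISTDEF DEVLIST INCLUDE TABLESPACE DEVDB.*
    TEMPLATE UNLDTMP DSN 'HLQ.UNLD.&DB..&TS.'
             UNIT SYSDA DISP(NEW,CATLG,CATLG)
    UNLOAD LIST DEVLIST UNLDDN UNLDTMP

    -- Option 2: row filtering with WHEN, coded table by table
    UNLOAD TABLESPACE DEVDB.TS00001
      FROM TABLE SCHEMA1.TABLE1
           WHEN (ACCT_KEY IN ('K1','K2'))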
Arun Raj

Moderator


Joined: 17 Oct 2006
Posts: 2481
Location: @my desk

PostPosted: Tue Nov 27, 2018 6:56 pm

How about unloading all the data, then loading back only the data you need (after filtering), using LOAD REPLACE?
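A rough sketch of the load-back for one table, with made-up names; the PUNCHDDN output from the UNLOAD step would normally supply the full INTO TABLE field list:

Code:
    TEMPLATE SYSREC DSN 'HLQ.UNLD.FILTERED.&TS.' UNIT SYSDA
    -- Replace the table contents with the filtered unload file
    LOAD DATA INDDN SYSREC REPLACE
      INTO TABLE SCHEMA1.TABLE1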
guptae

Moderator


Joined: 14 Oct 2005
Posts: 1208
Location: Bangalore,India

PostPosted: Tue Nov 27, 2018 7:59 pm

Hello Neo4u,

You could also try REORG DISCARD.
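A minimal sketch with invented names; DISCARDDN writes the discarded rows to a data set, which doubles as a record of what was removed:

Code:
    TEMPLATE DISCTMP DSN 'HLQ.REORG.DISCARD.&TS.' UNIT SYSDA
    REORG TABLESPACE DEVDB.TS00001
      DISCARDDN DISCTMP
      DISCARD FROM TABLE SCHEMA1.TABLE1
              WHEN (ACCT_KEY IN ('K1','K2'))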
Rohit Umarjikar

Global Moderator


Joined: 21 Sep 2010
Posts: 3053
Location: NYC,USA

PostPosted: Tue Nov 27, 2018 10:36 pm

First talk to your DBAs and evaluate what they have to offer.

Why unload all the data from 300+ tables and load the same data back after eliminating a comparatively small number of bad rows (probably using SORT)? That is a waste of resources and time once you get into millions of rows, and it becomes more dangerous and cumbersome if this turns into a recurring task or new tables get added to the list.

REORG DISCARD works per table space, so if you have 300+ table spaces you need that many jobs, and even with one table space holding 300 tables you still need 300 control cards. If that is not the case, it is an optimized choice, provided REORG's WHEN condition supports all of your key filters.

I can also think of this; see if you find any issue implementing it (a rough sketch follows after the list):
1. Define a new table and insert the 300 DELETE/TRUNCATE SQL statements (one per table).
2. Write a COBOL program that declares a cursor on the step-1 table and uses dynamic SQL to execute each statement until end of cursor, issuing a COMMIT after each dynamic statement that is processed successfully.
Or
If you don't want to create a new table, replace it with a PS data set and loop until EOF.
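A sketch of the driver-table idea, with invented table and host-variable names; the DECLARE/FETCH/EXECUTE IMMEDIATE statements would sit inside EXEC SQL ... END-EXEC in the COBOL program:

Code:
    -- Step 1: driver table, one purge statement per target table
    CREATE TABLE DEVSCHEMA.PURGE_DRIVER
          (SEQ_NO   INTEGER       NOT NULL,
           SQL_TEXT VARCHAR(1000) NOT NULL);
    INSERT INTO DEVSCHEMA.PURGE_DRIVER VALUES
          (1, 'DELETE FROM SCHEMA1.TABLE1 WHERE ACCT_KEY IN (''K1'',''K2'')');

    -- Step 2: what the COBOL program issues for each driver row
    -- (WITH HOLD keeps the cursor open across the per-statement commits)
    DECLARE C1 CURSOR WITH HOLD FOR
      SELECT SQL_TEXT FROM DEVSCHEMA.PURGE_DRIVER ORDER BY SEQ_NO;
    FETCH C1 INTO :WS-SQL-TEXT;
    EXECUTE IMMEDIATE :WS-SQL-TEXT;
    COMMIT;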
neo4u

New User


Joined: 03 Oct 2016
Posts: 8
Location: India

PostPosted: Wed Nov 28, 2018 11:45 am

Arun Raj wrote:
How about unloading all the data, then loading back only the data you need (after filtering), using LOAD REPLACE?


We thought of this, but it would take a lot of time to write the SORT steps to filter the records. Each table has its key defined at a different position, so we would need to find the position of the key in every unload file and then write a sort card to filter on it, which is cumbersome.


Rohit Umarjikar wrote:
First talk to your DBAs and evaluate what they have to offer.

Why unload all the data from 300+ tables and load the same data back after eliminating a comparatively small number of bad rows (probably using SORT)? That is a waste of resources and time once you get into millions of rows, and it becomes more dangerous and cumbersome if this turns into a recurring task or new tables get added to the list.

REORG DISCARD works per table space, so if you have 300+ table spaces you need that many jobs, and even with one table space holding 300 tables you still need 300 control cards. If that is not the case, it is an optimized choice, provided REORG's WHEN condition supports all of your key filters.

I can also think of this; see if you find any issue implementing it:
1. Define a new table and insert the 300 DELETE/TRUNCATE SQL statements (one per table).
2. Write a COBOL program that declares a cursor on the step-1 table and uses dynamic SQL to execute each statement until end of cursor, issuing a COMMIT after each dynamic statement that is processed successfully.
Or
If you don't want to create a new table, replace it with a PS data set and loop until EOF.


We are doing something similar to what you suggested:
1. Define a new table and insert the 300 DELETE/TRUNCATE SQL statements (one per table).
We have a table where we add the names of the tables that need cleaning and the key column name. A program reads this table, builds a dynamic DELETE query and executes it.
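For illustration, with an invented control-table row (the table, column and key names are assumptions), the generated statement would look something like:

Code:
    -- Driver row: TBL_NAME = 'SCHEMA1.TABLE1', KEY_COL = 'ACCT_KEY'
    -- Statement built and executed dynamically by the program:
    DELETE FROM SCHEMA1.TABLE1
     WHERE ACCT_KEY IN ('K1','K2');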

My question is on the backup side: in case something goes wrong, I need a backup to put the data back into the tables.
Nic Clouston

Global Moderator


Joined: 10 May 2007
Posts: 2455
Location: Hampshire, UK

PostPosted: Wed Nov 28, 2018 2:41 pm

Quote:
My question is on the backup side

Then you should have asked that question at the start. There is no mention of backup in your original post.

What is your problem regarding backup? Surely standard backup and recovery is within your abilities?
guptae

Moderator


Joined: 14 Oct 2005
Posts: 1208
Location: Bangalore,India

PostPosted: Thu Nov 29, 2018 9:17 am

Hey there,

Quote:

My question is on the backup side: in case something goes wrong, I need a backup to put the data back into the tables.



Do you see any issue with taking an image copy, or an unload, whichever suits you? Or even a QUIESCE, in case you want to recover from a quiesce point.
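A minimal before-image sketch with made-up names; the quiesce point recorded in SYSIBM.SYSCOPY is what a point-in-time RECOVER (TOCOPY or TOLOGPOINT) would go back to if the clean-up had to be undone:

Code:
    -- Consistent recovery point plus a full image copy before any deletes
    QUIESCE TABLESPACE DEVDB.TS00001
    TEMPLATE COPYTMP DSN 'HLQ.IC.&DB..&TS.' UNIT SYSDA
    COPY TABLESPACE DEVDB.TS00001
         COPYDDN(COPYTMP) FULL YES SHRLEVEL REFERENCE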
guptae

Moderator


Joined: 14 Oct 2005
Posts: 1208
Location: Bangalore,India

PostPosted: Thu Nov 29, 2018 2:37 pm

Hey there,

Another thought: do you have QMF in your shop? If yes, you can create a QMF table holding the keys you want to delete and then write a query that deletes from the base table using that QMF table.

But yes, you might need to create 300 DELETE statements and QMF tables, depending on your key values.
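Sketched with invented names; the keys table is just an ordinary table that QMF (or SPUFI) could populate, one per base table:

Code:
    CREATE TABLE DEVSCHEMA.PURGE_KEYS_T1 (ACCT_KEY CHAR(10) NOT NULL);
    -- Populate PURGE_KEYS_T1 with the keys to be removed, then:
    DELETE FROM SCHEMA1.TABLE1
     WHERE ACCT_KEY IN (SELECT ACCT_KEY FROM DEVSCHEMA.PURGE_KEYS_T1);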
Rohit Umarjikar

Global Moderator


Joined: 21 Sep 2010
Posts: 3053
Location: NYC,USA

PostPosted: Thu Nov 29, 2018 7:44 pm

Quote:
We are doing something similar to what you suggested.
I guess that's the most efficient and flexible way to do it, unless a better solution comes along.
Quote:
My question is on the backup side: in case something goes wrong, I need a backup to put the data back into the tables.
If something goes wrong, aren't you supposed to have restart logic in place? Why would you want to put the data back?