Jimy Carol
New User
Joined: 31 Aug 2012 Posts: 25 Location: Australia
Hi,
I am trying to create a tool that can automatically generate data and load a PBG table space so that it ends up with a required number of partitions.
The requirement arises from this situation: when a new PBG table space is created, DB2 initially allocates 1 partition. When the first partition is completely filled, DB2 adds a second partition, and so on. In our test environment we frequently need to create a PBG table space with a certain number of partitions (let's say 5) and load it with test data.
The tool would accept the required number of partitions as input, then create and load data so that the result is a PBG table space with that many partitions, all filled with test data.
Does such a utility already exist? Any hints or possible solutions are highly appreciated.
Regards,
Carol
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
Your premise is wrong.
Quote:
NUMPARTS integer
Indicates that the table space is partitioned. If MAXPARTITIONS is also specified, the table space is a partition-by-growth table space, otherwise, the table space is a range-partitioned table space.
integer
Specifies the number of partitions. If n is specified for a partition-by-growth table space, integer indicates the number of partitions that are initially created unless DEFINE NO is specified. integer must be a value between 1 and 4096 inclusive and must be less than or equal to the value that is specified for the MAXPARTITIONS clause.
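To illustrate the quoted rule, here is a minimal sketch of a CREATE TABLESPACE statement that defines a PBG table space with 5 partitions already created. The database name, table space name, and storage attributes are placeholder assumptions; adjust them for your environment:

```
CREATE TABLESPACE TSPBG01 IN MYDB
  MAXPARTITIONS 20   -- presence of MAXPARTITIONS makes this partition-by-growth
  NUMPARTS 5         -- 5 partitions are created up front (unless DEFINE NO)
  SEGSIZE 32;
```

Note that the partitions created this way are defined but empty, so you would still need to load data if the test scenario requires populated partitions.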
Joseph Reynolds
New User
Joined: 07 Oct 2013 Posts: 6 Location: United States
I am sure there are tools for purchase that can do this, and more.
The amount of data to get beyond 1 partition depends upon the data and data types.
Assuming no requirements on data content (i.e. constraints, RI, cardinality of values, ...) and if some number of inserts is acceptable, then this SQL will insert into a table with one integer and one character column, in this case 50000 rows. Adjust the columns and count accordingly to get the type and amount of data you want.
INSERT INTO <table> (<tabcol1>, <tabcol2>)
WITH TAL (C1, C2) AS
  (SELECT 1 AS C1,
          CHAR(CURRENT TIMESTAMP) AS C2
   FROM SYSIBM.SYSDUMMY1
   UNION ALL
   SELECT C1 + 1 AS C1,
          CHAR(CURRENT TIMESTAMP) AS C2
   FROM TAL
   WHERE TAL.C1 < 50000)
SELECT C1, C2 FROM TAL;
Inserting will drive logging, which you may not want. To avoid excessive logging, insert a small number of rows to prime the table, unload once, and then repeatedly load the unloaded data with RESUME until you have the amount of data you want.
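The repeated-load step above can be sketched as a DB2 LOAD utility control statement. A minimal sketch, reusing the placeholder names from the INSERT example; the column positions and lengths are assumptions you would match to your unloaded record layout:

```
LOAD DATA INDDN(SYSREC) RESUME YES LOG NO
  INTO TABLE <table>
  (<tabcol1> POSITION(1:4)  INTEGER,
   <tabcol2> POSITION(5:30) CHAR(26))
```

LOG NO avoids the logging overhead, but it typically leaves the table space in COPY-pending status; in a throwaway test environment you may want to add NOCOPYPEND or take an image copy afterwards.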
sushanth bobby
Senior Member
Joined: 29 Jul 2008 Posts: 1020 Location: India
Hi Carol,
You can use IEBDG, a utility for generating test data. I used it a couple of times in my previous organization, and there are a few samples of IEBDG available online.
Below are the steps I followed:
1. Set up the fields in IEBDG to match the corresponding DB2 data types
2. Edit the LOAD utility statement for the input
3. Run the LOAD
Thanks,
sushanth
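The steps above can be sketched as an IEBDG job. This is a minimal sketch only: the data set name, field names, lengths, and formats are placeholder assumptions that you would match to your DB2 column definitions (here, a 4-byte binary key followed by a 26-byte alphanumeric field, 50000 records):

```
//GENDATA  EXEC PGM=IEBDG
//SYSPRINT DD SYSOUT=*
//OUT1     DD DSN=TEST.PBG.LOADDATA,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(50,10)),
//            DCB=(RECFM=FB,LRECL=30,BLKSIZE=0)
//SYSIN    DD *
  DSD OUTPUT=(OUT1)
  FD NAME=KEYFLD,LENGTH=4,FORMAT=BI,INDEX=1
  FD NAME=CHARFLD,LENGTH=26,FORMAT=AN
  CREATE QUANTITY=50000,NAME=(KEYFLD,CHARFLD)
  END
/*
```

The generated data set would then serve as the input to the LOAD utility in step 2.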