anshuljain26
New User
Joined: 04 Apr 2010 Posts: 37 Location: Chandigarh
Hi,
We have a design change, and one of the options is for Informatica B2B to read from a DB2 table. The data record is very large: around 66,000 bytes.
Should we go for a VARCHAR datatype with a 64K page size, or a table with a CLOB datatype?
I know that CLOB data resides in an auxiliary table space, so I think Informatica B2B will take more time to read it. Is that so?
Which one is better and more feasible?
Note: Feasibility also depends on the resources available with the technology at the organization level. Kindly, please help me ignore this fact.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
You need to provide a much better explanation of what you are trying to accomplish.
If the data is 66000 bytes, how will a 64K VARCHAR help . . .
It sounds like there is, or has been, a design flaw. Why did someone believe a single piece of data needed to be this long?
If you clearly explain the business function you are trying to implement, someone may have a suggestion, as people here have most likely done something very similar in many different ways.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Follow-on:
Quote:
Note: Feasibility also depends on the resources available with the technology at the organization level. Kindly, please help me ignore this fact.
What does this really say?
I have re-read this several times and have no idea . . .
d
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Quote:
Feasibility also depends on the resources available with the technology at the organization level. Kindly, please help me ignore this fact.
Tell me something that I can understand and explain to those m*r*n*
anshuljain26
New User
Joined: 04 Apr 2010 Posts: 37 Location: Chandigarh
Thanks, Dick & enrico-sorichetti.
Firstly, I forgot to mention the exact size: it is 65,041 bytes, and since 64K is 65,536 bytes, I think the data may fit.
Secondly, we are currently writing the end-of-day report to an MQ, but due to design issues (losing the data on an outage, and the exponential increase in the size of the MQ, discussed in another post), we want to use DB2 to store the data from the mainframes, with B2B reading it and then deleting it.
We want to communicate between the mainframes and Informatica B2B with messages of 65,041 bytes each, and at end of day we can expect around 100,000 (1 lakh) such messages from the mainframes in one go.
Thirdly, on the feasibility point: I simply meant the feasibility of implementing the design and the idea.
Thanks.
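The "DB2 as a staging area, B2B reads then deletes" pattern described above can be sketched as follows. This is a minimal illustration using Python's built-in sqlite3 in place of DB2, with placeholder table and column names (not the actual design); the point is that the SELECT and DELETE run in one unit of work, so a failure or outage leaves unprocessed rows safely in the table.

```python
import sqlite3

# Staging table standing in for the DB2 table; names are placeholders.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE eod_stage (msg_id INTEGER PRIMARY KEY, msg_data TEXT)")
conn.executemany("INSERT INTO eod_stage (msg_data) VALUES (?)",
                 [("report-1",), ("report-2",)])
conn.commit()

# Consumer side: fetch a batch, process it, then delete only what was
# actually processed, and commit the delete as one unit of work.
rows = conn.execute("SELECT msg_id, msg_data FROM eod_stage ORDER BY msg_id").fetchall()
processed_ids = [(r[0],) for r in rows]   # pretend processing succeeded
conn.executemany("DELETE FROM eod_stage WHERE msg_id = ?", processed_ids)
conn.commit()

remaining = conn.execute("SELECT COUNT(*) FROM eod_stage").fetchone()[0]
print(remaining)  # 0: everything processed was removed
```

On DB2 the equivalent would be a cursor-driven fetch/delete inside one commit scope; unlike a purged MQ message, a row that is not deleted is still there after a restart.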
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
If my multiplication is correct, you want to transfer 6,504,100,000 bytes (65041 * 100000, about 6.5 billion bytes) at about the same time every day . . .
Have you ever tried moving such a massive amount of data in a short timeframe? What is the expected performance?
If the data is put into a DB2 table, it will not "get lost".
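The volume estimate above can be sanity-checked with plain arithmetic (nothing here is DB2- or B2B-specific; the numbers are the ones given in the thread):

```python
# Back-of-the-envelope check of the daily end-of-day volume.
record_bytes = 65041            # exact message length given earlier
messages_per_eod = 100_000      # "1 lakh" messages at end of day
total_bytes = record_bytes * messages_per_eod

print(total_bytes)                      # 6504100000
print(round(total_bytes / 10**9, 2))    # 6.5 (decimal gigabytes per day)
```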
anshuljain26
New User
Joined: 04 Apr 2010 Posts: 37 Location: Chandigarh
Hi,
But that will be across different records, and yes, it is happening at present via MQ.
And on performance, MQs are definitely easier for B2B to read from.
But here, should we go for the VARCHAR datatype or CLOB?
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
But here, should we go for the VARCHAR datatype or CLOB?
How was this choice picked? Why use either?
You haven't provided any info about what the process is really doing, so my saying one or the other wouldn't be proper . . .
Quote:
But that will be across different records, and yes, it is happening at present via MQ.
How long does this currently take, and using what kind of connection?
GuyC
Senior Member
Joined: 11 Aug 2009 Posts: 1281 Location: Belgium
DB2 SQL Reference:
Quote:
Maximum length of VARCHAR:
4046 bytes for 4 KB pages
8128 bytes for 8 KB pages
16320 bytes for 16 KB pages
32704 bytes for 32 KB pages
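Putting the quoted ceilings next to the 65,041-byte record from earlier in the thread makes the conclusion explicit. A quick check in plain Python (the record length is taken from the thread, the limits from the quote above):

```python
# VARCHAR ceilings per DB2 for z/OS page size, from the SQL Reference quote.
varchar_max = {"4 KB": 4046, "8 KB": 8128, "16 KB": 16320, "32 KB": 32704}
record_len = 65041  # exact message length from the thread

fits_anywhere = any(record_len <= limit for limit in varchar_max.values())
print(fits_anywhere)  # False: even a 32 KB page cannot hold the record
                      # in a VARCHAR, which is why CLOB comes up next.
```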
sushanth bobby
Senior Member
Joined: 29 Jul 2008 Posts: 1020 Location: India
Hi Anshul,
You may have to use a CLOB, since the record is close to 65,536 bytes (64K), well beyond what a VARCHAR can hold.
Quote:
Secondly, we are currently writing the end-of-day report to an MQ, but due to design issues (losing the data on an outage, and the exponential increase in the size of the MQ, discussed in another post), we want to use DB2 to store the data from the mainframes, with B2B reading it and then deleting it.
I have a couple of questions on the above:
Are you trying to store formatted report data in DB2 and retrieve it later?
Do you want to use a DB2 table as a scratch pad for temporary usage, i.e. store the data for a couple of transactions and delete it when they end?
Thanks,
Sushanth