rexx77
New User
Joined: 14 Apr 2008 Posts: 78 Location: Mysore
Greetings.
I am trying to load a file with over 65 million records into a test-region table using the DSNUPROC procedure.
The job failed with an S04E abend. I googled it, and a post on some forum mentioned creating the SORT work files on tape to get around this.
I went through the manuals to find what the DD statement for tape datasets should be, but I could not get any useful information. (I don't know the VOL=SER and LABEL parameters for my shop's tapes.)
Can anyone here give me an idea on bulk-loading data into DB2 tables, and/or help me create the temporary SORT work files on tape?
Thanks in advance.
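For what it's worth, a temporary tape work file normally needs no VOL=SER or LABEL parameters at all: leave DSN off and code DISP=(NEW,DELETE), and the system picks a scratch volume for you. Below is a minimal sketch of a pre-sort step along those lines; the dataset names are hypothetical, and the esoteric unit name TAPE is an assumption (your shop may call it CART, 3490, or something else, so check with your storage group).

Code:
//PRESORT  EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DISP=SHR,DSN=YOUR.INPUT.FILE
//SORTOUT  DD DSN=YOUR.SORTED.FILE,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(1000,500),RLSE)
//* Temporary tape work files: no DSN, VOL=SER or LABEL needed;
//* the system allocates scratch volumes. UNIT=TAPE is shop-specific.
//SORTWK01 DD UNIT=TAPE,DISP=(NEW,DELETE)
//SORTWK02 DD UNIT=TAPE,DISP=(NEW,DELETE)
//SORTWK03 DD UNIT=TAPE,DISP=(NEW,DELETE)
//SYSIN    DD *
  SORT FIELDS=(1,10,CH,A)    key position/length are placeholders
/*

Before changing work files, though, check the DB2 reason code printed with the 04E in the job output; that tells you what actually failed.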
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10889 Location: italy
We do not know anything about your environment, so any suggestion might be wrong.
Speak to your support; they should know.
rexx77
New User
Joined: 14 Apr 2008 Posts: 78 Location: Mysore
Thanks for your quick reply. Do you have any idea how to insert bulk data into tables?
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6248 Location: Mumbai, India
Quote:
over 65 million records
Just as an experiment, what happens if you try with fewer records?
rexx77
New User
Joined: 14 Apr 2008 Posts: 78 Location: Mysore
Actually, before loading to the table I sort the input file. When I specify STOPAFT=999999 in the sort and then load the sorted file, the load completes successfully.
But my testing team is not finding records which match their conditions.
That is why I would like to load all the records available in the file.
Any inputs?
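For reference, this is the kind of DFSORT control statement that gives that behaviour; the key position and length are placeholders for the actual sort key.

Code:
  OPTION STOPAFT=999999      stop after the first 999999 input records
  SORT FIELDS=(1,10,CH,A)    sort key is a placeholder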
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Quote:
But my testing team is not finding records which match their conditions.
Sounds as if you need to isolate the records which will satisfy your test team's requirements, and then load those, if you cannot solve your 'total' file load.
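If that is the route taken, a DFSORT INCLUDE along these lines would pull out just the wanted records; the position, length and key value are placeholders for the actual partition key.

Code:
  SORT FIELDS=COPY
  INCLUDE COND=(1,4,CH,EQ,C'KEY1')   select only records for one partition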
rexx77
New User
Joined: 14 Apr 2008 Posts: 78 Location: Mysore
Not like that.
The 1 million records I loaded belong to one partition's data in the file. They need to test with the other partitions' data, which is also available in the same file.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
They need to test with the other partitions' data, which is also available in the same file.
Yes, we understand this. . .
What is preventing you from selectively copying the records they do need from the file, rather than just using the first million?
shyamkumarnagabandi
New User
Joined: 12 May 2009 Posts: 7 Location: hyderabad
I think it is better to run 'COPY TABLESPACE DATABASE_TABLENAME' before loading the data to the table.
Please let me know if that is wrong.
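For reference, an image-copy step under DSNUPROC looks roughly like this; SYSTEM, UID and the database.tablespace names are placeholders, not real values.

Code:
//IMGCOPY  EXEC DSNUPROC,SYSTEM=DSN,UID='IMGCOPY'
//SYSCOPY  DD DSN=YOUR.IMAGE.COPY,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
  COPY TABLESPACE MYDB.MYTS FULL YES
/*

An image copy mainly gives you a fallback point; on its own it does not change how the load runs.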
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
How does this help with the S04E abend?
shyamkumarnagabandi
New User
Joined: 12 May 2009 Posts: 7 Location: hyderabad
Sorry. Actually, I got the S04E abend too, but after resubmitting the job it ran successfully without any changes. I don't know why that happened.
Reasons for getting abend S04E include:
-- Resource unavailable.
-- A decimal packed field contained non-numeric data when an INSERT or an UPDATE SQL statement was executed.
-- An SQL WHERE clause in a cursor contains a packed numeric working-storage field that holds non-numeric data when the OPEN CURSOR statement is executed. Initialize the field, or remove it from the SQL statement if it is not used.
suresh1624
New User
Joined: 21 Nov 2007 Posts: 28 Location: chennai
Quote:
The 1 million records I loaded belong to one partition's data in the file. They need to test with the other partitions' data, which is also available in the same file.
Looks like you are loading a partitioned table. Split the file on your partition key and try loading each partition independently. Also, could you please post the definition of the table you are loading and the output messages you got while loading?
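A sketch of what one per-partition load step might look like, with hypothetical names throughout; PART 1 REPLACE loads only that partition, so each split file becomes its own, much smaller sort.

Code:
//LOADP1   EXEC DSNUPROC,SYSTEM=DSN,UID='LOADP1'
//SYSREC   DD DISP=SHR,DSN=YOUR.SORTED.PART1
//* plus whatever work DD statements (SYSUT1, SORTOUT, SYSERR, ...)
//* your shop's LOAD jobs already use
//SYSIN    DD *
  LOAD DATA INDDN(SYSREC) LOG NO
    INTO TABLE MYSCHEMA.MYTABLE PART 1 REPLACE
/*

Note that LOG NO leaves the partition in COPY-pending until an image copy is taken, which is where a COPY step like the one earlier in the thread comes in.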