Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
Is there any way to read data from a file and load it into an external table using Easytrieve?
Devzee
Active Member
Joined: 20 Jan 2007 Posts: 684 Location: Hollywood
Yes, you can read a file in Easytrieve.
Quote:
to an external table
Is this DB2? Or what do you mean by an external table?
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
See the explanation of external tables in this link:
ibmmainframes.com/viewtopic.php?t=1635&highlight=easytrieve+table+limit
I need to load the input from the file given in the DD name into the table defined inside the program as:
FILE table1 TABLE
ARG ...
DESC ...
ENDTABLE
Also, is there any way to get the maximum number of entries in the table into a working-storage variable, which would be coded in the program as:
FILE table1 TABLE(max.no.entries)
ARG ...
DESC ...
ENDTABLE
William Thompson
Global Moderator
Joined: 18 Nov 2006 Posts: 3156 Location: Tucson AZ
Vidhya Kalyanasundaram wrote:
I need to load the input from the file given in the DD name into the table defined inside the program as:
FILE table1 TABLE
ARG ...
DESC ...
ENDTABLE
table1 is the DD name (the file should be sorted).
Quote:
Also, is there any way to get the maximum number of entries in the table into a working-storage variable, which would be coded in the program as:
FILE table1 TABLE(max.no.entries)
ARG ...
DESC ...
ENDTABLE
It doesn't look like it, but you could feed the dataset through the internal sort to get the count (the sort's TO would be the table).
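The idea of counting the records before committing to a table size can be sketched as follows. This is an illustrative Python sketch of the logic only, not Easytrieve; the function name and message text are hypothetical, modeled on the A008 diagnostic discussed in this thread.

```python
# Illustrative sketch (Python, not Easytrieve): count the records in the
# table file first, then fail fast with a clear message if the count
# exceeds the limit coded on the FILE ... TABLE(n) statement.

def check_table_fits(records, max_entries):
    """Return the record count, raising a descriptive error on overflow."""
    count = len(records)
    if count > max_entries:
        raise ValueError(
            f"TOO MANY TABLE ENTRIES: {count} records, limit {max_entries}"
        )
    return count

rows = [("AAN", "desc one"), ("BBX", "desc two"), ("CCY", "desc three")]
print(check_table_fits(rows, 1000))   # well under the limit, prints 3
```

In Easytrieve itself the equivalent pre-count would come from a prior step (e.g. the internal sort suggested above), since the table load happens outside the program's control.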
noorkh
New User
Joined: 06 Mar 2006 Posts: 76 Location: Chennai
Hi Vidhya,
Can you tell us more about your requirement? We normally use a search table to read data passed in from the JCL and to search it within the program using some other field. The table file normally works as a reference file rather than something we store into.
Please let us know more about your requirement so that we can understand your problem.
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
1. When a table is declared in an Easytrieve program as:
FILE EMPLOYEE TABLE ( 1000 )
- It indicates that the table EMPLOYEE is located in a file external to the EZT program, identified by the logical name of the table file given in the JCL, i.e. the DD name.
- The literal indicates the maximum number of entries in the table.
2. Easytrieve Plus builds this external table EMPLOYEE dynamically, with the number of entries specified in the FILE statement (here 1000), just prior to its use, by loading the table with the data from the file specified in the DD name.
3. If the number of records in the EMPLOYEE file specified in the DD statement exceeds 1000, a table overflow occurs when it is loaded into the EMPLOYEE table, the following messages appear in the spool, and the job ends with MAXCC=16:
*******A008 TOO MANY TABLE ENTRIES - EMPLOYEE
*******A014 PREMATURE TERMINATION DUE TO PREVIOUS ERROR(S)
Is there any way to load the records into the table one by one, keeping a record count that is incremented as each record enters the table? Then, when the count is about to reach the maximum, we could display a message asking to increase the table size, rather than having the job end abruptly.
Or is moving the file entries into an array variable and then searching the data with IF the only option, when the entries would otherwise be searched with the SEARCH statement?
Please reply!
Thanks in advance.
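The "load one by one with a running count" idea described above can be sketched like this. Again this is Python purely to illustrate the logic, not something Easytrieve exposes; the 90% warning threshold and all names here are made up for the example.

```python
# Illustrative sketch (Python, not Easytrieve): load table entries one by
# one, keeping a running count, and emit a warning when the count nears the
# declared maximum instead of failing abruptly at overflow.

MAX_ENTRIES = 1000
WARN_AT = int(MAX_ENTRIES * 0.9)   # warn at 90% full (arbitrary threshold)

def load_table(records, max_entries=MAX_ENTRIES, warn_at=WARN_AT):
    """Load (arg, desc) pairs, warning near capacity, erroring past it."""
    table = {}
    warnings = []
    for count, (arg, desc) in enumerate(records, start=1):
        if count > max_entries:
            raise OverflowError("TOO MANY TABLE ENTRIES")
        if count == warn_at:
            warnings.append(
                f"table {count}/{max_entries} full - consider a larger limit"
            )
        table[arg] = desc
    return table, warnings

table, msgs = load_table([("AAN", "ANDAMAN"), ("BBX", "BOMBAY")])
print(len(table))   # prints 2, no warnings at this size
```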
noorkh
New User
Joined: 06 Mar 2006 Posts: 76 Location: Chennai
Hi Vidhya,
When you declare it as a table, Easytrieve reads the records whenever you search it. You can't prevent that abend.
You need to specify a maximum limit for your table. But this is not as fast as array usage; you would be better off using an array.
There is no facility to check the number of entries against the table's maximum.
I hope this clears your doubt.
IQofaGerbil
Active User
Joined: 05 May 2006 Posts: 183 Location: Scotland
What is the reason for the 1000-entry table limit?
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
Hi,
Suppose the file TABLE1 (specified in the DD statement), which is to be loaded into the external table TABLE1 defined in the Easytrieve program, has 9000 records, but I have specified the limit for the external table as follows:
FILE TABLE1 TABLE (6000)
ARG 1 3 A
DESC 4 13 A
IARG W 3 A VALUE 'AAN'
IDESC W 13 A
*
JOB INPUT NULL
DISPLAY IARG
STOP EXECUTE
Since the table limit is exceeded, it should throw an error like this:
*******A008 TOO MANY TABLE ENTRIES - TABLE1
*******A014 PREMATURE TERMINATION DUE TO PREVIOUS ERROR(S)
This code didn't throw an error.
It throws that error only when we perform a search on the table, as in:
FILE TABLE1 TABLE
ARG 1 3 A
DESC 4 13 A
IARG W 3 A VALUE 'AAN'
IDESC W 13 A
*
JOB INPUT NULL
DISPLAY IARG
SEARCH CORRA WITH IARG GIVING IDESC
STOP EXECUTE
So we can conclude that the table is loaded only after the JOB statement, during the SEARCH operation.
Is there any way to get the count of records in the file before the JOB statement?
Please help me get the count into a variable before the JOB statement.
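The behaviour observed in the experiment above, where the overflow error only appears once a SEARCH is issued, matches a lazy-loading pattern. A Python sketch of that pattern (illustrative only, not Easytrieve; class and method names are invented for the example):

```python
# Illustrative sketch (Python, not Easytrieve): the experiment suggests the
# external table is loaded at the first SEARCH, not at declaration. A lazy
# table reproduces that: the overflow check fires only when a lookup
# forces the load.

class LazyTable:
    def __init__(self, records, max_entries):
        self._records = records      # (arg, desc) pairs, not yet loaded
        self._max = max_entries
        self._table = None           # nothing loaded at "declaration" time

    def search(self, arg):
        if self._table is None:      # first search triggers the load
            if len(self._records) > self._max:
                raise OverflowError("TOO MANY TABLE ENTRIES")
            self._table = dict(self._records)
        return self._table.get(arg)

t = LazyTable([("AAN", "ANDAMAN")], max_entries=6000)
# No error at construction; the limit check happens on the first search.
print(t.search("AAN"))   # prints ANDAMAN
```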
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
Hi, sorry.
The SEARCH statement in the second example, the one that throws the error, should read:
SEARCH TABLE1 WITH IARG GIVING IDESC
and not SEARCH CORRA.
IQofaGerbil
Active User
Joined: 05 May 2006 Posts: 183 Location: Scotland
Once more: why a limit?
What happens when that limit is breached?
It looks like you want to know when your coded limit is about to be broken, so that you can change your code to increase it. Is that correct?
How long does this cycle of check and increase continue?
Is there an absolute upper limit, and if so, what is it?
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
Instream and external tables are used in Easytrieve programs because the TABLE parameter of the FILE statement declares the file as the object of a SEARCH statement that accesses tables.
So after the tables are declared in FILE statements, the SEARCH statement can be included in the JOB section.
SEARCH statement explanation:
The SEARCH statement performs a search of a table.
SEARCH can be:
1. Coded anywhere in a JOB activity.
2. Issued any number of times against any number of tables.
The SEARCH statement has this format:
SEARCH filename WITH field-name-1 GIVING field-name-2
filename - the name of the table that appears on the FILE statement.
field-name-1 - the name of a field containing the value that is compared to the search argument. It must be the same length and type as the search argument (ARG).
field-name-2 - the name of a field into which the description is placed if a match exists between field-name-1 and the search argument. It must be the same length and type as the description (DESC).
Since this does a binary search (whereas loading the data into an array allows only a sequential search), it is an efficient searching technique.
But the drawback is that we have to specify the limit (the number of records we expect from the file the table is loaded from) in the table declaration itself (the default is just 256), and if the number of records coming from the PS file exceeds the coded limit, a table overflow occurs with MAXCC=16.
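The binary search that SEARCH performs on the sorted ARG values can be illustrated in Python with the standard `bisect` module. This is a sketch of the lookup technique only, not of Easytrieve internals; the sample ARG/DESC values are invented.

```python
# Illustrative sketch (Python, not Easytrieve): SEARCH against a sorted
# table is a binary search on the ARG values, which is why the table file
# must be sorted. bisect gives the same O(log n) lookup.
import bisect

args  = ["AAN", "BBX", "CCY"]            # sorted ARG column
descs = ["ANDAMAN", "BOMBAY", "CALCUTTA"]  # matching DESC column

def search(arg):
    """Return the DESC for arg, or None if no match (binary search)."""
    i = bisect.bisect_left(args, arg)
    if i < len(args) and args[i] == arg:
        return descs[i]
    return None

print(search("BBX"))   # prints BOMBAY
```

A linear scan of an array, by contrast, compares every entry in turn, which is the sequential search the post contrasts this with.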
Yes, as you said, I am trying to avoid this abend by knowing when the limit is about to be broken.
So I need some logic to maintain a count.
Please help me with this.
Thanks in advance,
Vidhya
IQofaGerbil
Active User
Joined: 05 May 2006 Posts: 183 Location: Scotland
I thought the 256 limit was for instream tables, not external ones like yours?
Anyhow, why not try a different approach?
Even if you do get a solution to this problem by allocating a counter, there may come a point at which the table will be too big for Easytrieve to handle, i.e. not enough core storage, giving:
*******A003 INSUFFICIENT CORE STORAGE AVAILABLE
Now, if you think that point will never be reached, why not find the maximum value that can be given for the table size and stick with that, or with something closer to your analysis of the maximum future data?
It took me five minutes to find the maximum value in my shop for the table sizes in your example (over 300,000).
However, if your data is likely to cause the storage problem sometime in the future, you should rethink your design, e.g. sort both files and do a two-file match, possibly?
Good luck.
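The two-file match suggested here can be sketched as a merge of two key-sorted inputs. This is an illustrative Python sketch of the matching logic (it assumes unique keys per input, and all names are invented); in practice the sorting would be done by the sort utility and the match in the program.

```python
# Illustrative sketch (Python, not Easytrieve): sort both files on the key,
# then step through them together - no need to hold a whole table in
# storage. Each input is a key-sorted list of (key, data) tuples.

def two_file_match(master, trans):
    """Yield (key, master_data, trans_data) for keys present in both."""
    i = j = 0
    while i < len(master) and j < len(trans):
        mk, md = master[i]
        tk, td = trans[j]
        if mk == tk:
            yield (mk, md, td)
            i += 1
            j += 1
        elif mk < tk:       # master key is behind, advance master
            i += 1
        else:               # transaction key is behind, advance transactions
            j += 1

matches = list(two_file_match([("A", 1), ("B", 2)], [("B", 9), ("C", 8)]))
print(matches)   # prints [('B', 2, 9)]
```

Each input is read once, so the cost is linear in the file sizes after sorting, regardless of how many records there are.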
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
However, if your data is likely to cause the storage problem sometime in the future, you should rethink your design, e.g. sort both files and do a two-file match, possibly?
If your table/array is to contain more than just a few "records", you should sort both sets of data and do a two-file match. Scanning a large array/table over and over is a complete waste of machine resources.
Vidhya Kalyanasundaram
New User
Joined: 19 Jul 2007 Posts: 30 Location: chennai
Thank you!
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
You're welcome.
How was your question resolved? If you post your solution, it may help someone with a similar question sometime.