dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
as this needs to be done online

Sorry, but I very much disagree.
I believe what is needed is a smooth process that gets the user the needed answers in a timely fashion. That does not make the use of online resources a requirement.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8796 Location: Welsh Wales
If it must be done in a wasteful and inappropriate manner, then you will need to look at some of the ways to waste a little less resource.
I guess that your shop does not bill each department for its usage of resources, because if it does, there will probably be some very unhappy bean counters when the resource bills start rolling in.
If you can use a unique key, then a VSAM KSDS accessed via IDCAMS from within REXX should work.
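Standard TSO/E REXX (EXECIO) cannot read VSAM directly, which is why IDCAMS comes into it; but the point of the KSDS is simply that a unique key turns a full-file scan into one direct read. A minimal Python sketch of that idea, using the standard `dbm` module as a stand-in for a keyed data set (the key and record layouts here are invented):

```python
import dbm
import os
import tempfile

# A keyed store as a stand-in for a VSAM KSDS: records are loaded once,
# then any single record is fetched directly by its unique key instead
# of scanning the whole file.
path = os.path.join(tempfile.mkdtemp(), "ksds-demo")

with dbm.open(path, "c") as db:
    # Load phase: one pass over the (hypothetical) sequential master file.
    for cust, ins in [("1001", "1234"), ("1002", "5678")]:
        db[f"{cust}:{ins}".encode()] = b"...313-byte record..."

with dbm.open(path, "r") as db:
    # Online phase: a user request becomes one keyed read, not a scan.
    rec = db.get(b"1001:1234")
    print(rec is not None)  # True: found without touching the other records
```

The same shape applies on the mainframe: a unique key makes each online request a single retrieval rather than a pass over a billion records.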
vidyaa
New User
Joined: 02 May 2008 Posts: 77 Location: chennai
What I meant to say is that the user only sees the front screen designed with panels; he does not know what is happening in the back-end REXX. All the validations are done in REXX, which issues a message when the input is in error, and the user corrects the input.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8796 Location: Welsh Wales
What happens behind the panels is usually of no interest to the user, just the results.
So how you design and implement a package that delivers the results is of no consequence to the user, which gives you a free hand to search for the best and most effective methods.
Pedro
Global Moderator
Joined: 01 Sep 2006 Posts: 2594 Location: Silicon Valley
Quote:
is there any way to incorporate this searching part in a language like COBOL and combine with REXX, as this needs to be done online.

Yes, you can write the searching part in a compiled language. But a billion records is still a billion records! With a sequential file, I do not think you will ever have a satisfactory process for an online environment. It is the reading of the file that takes a long time, and that does not change much regardless of REXX or a compiled language. I stand by my recommendation to use VSAM or a database so that less I/O is performed.
A REXX program can call a compiled program to do part of the work. The compiled program can use ISPF VGET / VPUT services to pass information to REXX.
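Pedro's recommendation is about I/O, not language: a keyed or indexed store reads a handful of blocks per request instead of the whole file. A rough illustration in Python, with SQLite standing in for "a database" (the table, column, and key names are invented):

```python
import sqlite3

# With an index on the search keys, the lookup touches only a few index
# pages instead of reading every row; changing the language of the scan
# would not help, but removing the scan does.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE master (cust_no TEXT, cust_name TEXT, ins_no TEXT, data TEXT)"
)
con.execute("CREATE INDEX ix_keys ON master (cust_no, ins_no)")
con.executemany(
    "INSERT INTO master VALUES (?,?,?,?)",
    [("1001", "XXX", "1234", "record-1"),
     ("1002", "YYY", "5678", "record-2")],
)

# The online request becomes one indexed probe.
row = con.execute(
    "SELECT data FROM master WHERE cust_no=? AND ins_no=?",
    ("1001", "1234"),
).fetchone()
print(row)  # ('record-1',)
```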
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
But a billion records is still a billion records!

I suspect a billion-record VSAM file will present "opportunities" also. . .
How long are the records (I just want to get an idea of the space required)?
How often is the current sequential file re-created (the VSAM file would probably need to be reloaded at that time also)?
How often does a user need to request info from this data?
How many users make these requests?
vidyaa
New User
Joined: 02 May 2008 Posts: 77 Location: chennai
Each record is 313 bytes long, and there can be multiple users raising requests. What we are trying to do is get the input from the user and process the valid requests in a batch cycle.
For example, consider my online screen, where we need to get the input from the user:
Customer number: 1001
Customer name: XXX
Ins number: 1234
If he enters these values, I need to validate each field separately; for example, if he enters "abcd" in the customer number, I need to prompt a message like "enter numeric values". Likewise, I have many fields, and I do separate validation for each of them until the user gives correct input.
The next step, after this initial validation, is to take all the input entered and search a file (having a billion records) for any record with the combination entered by the user. If it exists, I write it to a file; otherwise I prompt the user with "search not found" and ask him to re-enter.
This is the whole task.
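The panel-side validation described above is ordinary field-by-field checking and is cheap; it is only the billion-record search that is expensive. A hypothetical Python sketch of the validation step (the field names and rules are assumptions based on the screen shown):

```python
# Each field gets its own check, and the first failure per field produces
# the message the user would see on the panel.
def validate(fields):
    """Return a list of (field, message) pairs; empty means the input is valid."""
    errors = []
    if not fields.get("customer_number", "").isdigit():
        errors.append(("customer_number", "enter numeric values"))
    if not fields.get("customer_name", "").strip():
        errors.append(("customer_name", "enter a customer name"))
    if not fields.get("ins_number", "").isdigit():
        errors.append(("ins_number", "enter numeric values"))
    return errors

print(validate({"customer_number": "abcd", "customer_name": "XXX", "ins_number": "1234"}))
# [('customer_number', 'enter numeric values')]
print(validate({"customer_number": "1001", "customer_name": "XXX", "ins_number": "1234"}))
# []
```

The loop "re-prompt until the errors list is empty" is exactly what the REXX behind the panel does; none of this is the costly part.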
MBabu
Active User
Joined: 03 Aug 2008 Posts: 400 Location: Mumbai
313 * 1,000,000,000 bytes, plus space for various control information, might be 350 gigabytes of information, or roughly 7 3390-54 volumes, and you want to search all of that every time someone presses the Enter key? Uhh.... are you sure you have the right numbers here? REXX, or even basic COBOL, shouldn't even enter your mind for a task like this. Only a very robust database will handle this task with any reasonable response time.
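For anyone checking the arithmetic, the raw numbers work out like this (the 313-byte record length comes from the thread; the 3390-54 capacity used below is the nominal ~54 GB figure and is an approximation, before any control-information overhead):

```python
# Sizing estimate for a billion 313-byte records.
record_len = 313                      # bytes per record, from the thread
records = 1_000_000_000
raw_bytes = record_len * records      # raw data, no control information
gigabyte = 1_000_000_000
vol_3390_54 = 54 * gigabyte           # approximate 3390-54 capacity

print(raw_bytes / gigabyte)           # 313.0
print(-(-raw_bytes // vol_3390_54))   # 6  (minimum volumes, before overhead)
```

With VSAM control intervals, free space, and an index on top, an estimate in the 350 GB / 7-volume range is plausible.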
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
One of the processes I've inherited may be somewhat similar. Various users request info from history data that goes back (currently) 17 years. While the total number of records is less than a billion (the maximum number of records to scan is just under 700 million), these records vary from several hundred bytes to more than 14k bytes. The entire inventory is on carts, and there are 6 or more carts for each month.
Because users need different data for different date ranges, we queue requests; we do not want to process this as an everyday, on-request sort of process.
The users enter their selection criteria, which are stored in a common GDG, and each set of user requests is cataloged as a new generation. When the run is processed, all of the entries in the GDG are combined and the monster(s) cut loose to run for hours and hours . . . The selected output is separated by user, and that is the end of it. If a user realizes they submitted an incorrect request, they simply enter the correct one. If they tell us, we can delete the invalid request, but if not, it usually just generates a "not found" message. We do not validate in real time - far too costly.
FWIW
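The design described above (queue many requests, make one pass over the data, split the output by user) can be sketched in a few lines of Python; the record and request layouts here are invented:

```python
from collections import defaultdict

# Queued requests: user -> keys they asked for (stand-in for the GDG of
# accumulated selection criteria).
requests = {"userA": {"1001:1234"}, "userB": {"9999:0000"}}

# One scan of the (hypothetical) master file serves every queued request,
# instead of one expensive scan per request.
master = [("1001:1234", "record-1"), ("1002:5678", "record-2")]
wanted = {key: user for user, keys in requests.items() for key in keys}

output = defaultdict(list)
for key, record in master:
    if key in wanted:
        output[wanted[key]].append(record)

# Requests that matched nothing simply come back empty ("not found").
for user in requests:
    print(user, output.get(user, []))
```

The sketch maps each key to a single requester for simplicity; the real point is that the scan cost is paid once per batch run, not once per user request.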
gcicchet
Senior Member
Joined: 28 Jul 2006 Posts: 1702 Location: Australia
Hi Dick,
I have seen a similar process where the user requested data for a particular name or ID; the files were stored in name or ID order.
The data was held on hundreds of carts.
What the process did was create an index storing the first record of each cart, so that, based on the request, JCL was generated specifying the volume sequence number on which the data resided.
Gerry
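Gerry's cart index is a classic coarse index: keep only the first key of each cart, and a binary search over those keys tells you which single cart to mount. A small Python sketch with invented keys and volume serials:

```python
import bisect

# First record's key on each cart, in sort order, plus the matching carts.
first_keys = ["AARON", "GOMEZ", "NGUYEN", "TANAKA"]
carts = ["VOL001", "VOL002", "VOL003", "VOL004"]

def cart_for(key):
    """Return the cart whose key range contains `key`."""
    # bisect_right finds the first cart whose first key is > `key`;
    # the requested key therefore lives on the cart before it.
    i = bisect.bisect_right(first_keys, key) - 1
    return carts[max(i, 0)]

print(cart_for("MILLER"))  # VOL002: MILLER sorts between GOMEZ and NGUYEN
```

The generated JCL then names only that volume, so one cart is read instead of hundreds.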