I have a file with multiple records per key. I need to build a single record per key by taking values from several of its rows. I tried SPLICE but only about 50% of the output is correct. The details of the requirement are as follows:
The key is in bytes 9-20 (i.e. 001X002011111) and the sequence number is in bytes 21-22 (for the above example the sequences are 01, 02, 04, 05, 89, 90).
In the output I have to make a single record per key with the following details:
From the 01 sequence take bytes 24-27, 30-35, 50, 53
From the 02 sequence take bytes 25-28, 40-43, 45-48
From each 89 sequence record take bytes 30-33, 45-50, 60-62
From each 90 sequence record take bytes 42-45, 47-50, 52-55
I don't need any details from the 04 and 05 sequences. Kindly suggest; it's urgent.
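Since SPLICE is only getting partway there, it may help to state the transformation explicitly. Below is a minimal Python sketch of the rules described above (an illustration of the logic only, not a DFSORT solution; byte positions are 1-based and inclusive, "50, 53" from the 01 sequence is read as two single bytes, and the function name and output layout are assumptions):

```python
# Sketch of the per-key record build described above (illustration only;
# the real job would be a DFSORT/ICETOOL SPLICE or a program).

def f(rec, start, end=None):
    """Return bytes start..end (1-based, inclusive) of a record."""
    return rec[start - 1 : (end if end is not None else start)]

def build_records(lines):
    pieces = {}   # key -> extracted field groups, in input order
    order = []    # keys in first-seen order
    for rec in lines:
        key, seq = f(rec, 9, 20), f(rec, 21, 22)
        if key not in pieces:
            pieces[key] = []
            order.append(key)
        if seq == "01":
            pieces[key].append(f(rec, 24, 27) + f(rec, 30, 35) + f(rec, 50) + f(rec, 53))
        elif seq == "02":
            pieces[key].append(f(rec, 25, 28) + f(rec, 40, 43) + f(rec, 45, 48))
        elif seq == "89":   # every 89 record contributes
            pieces[key].append(f(rec, 30, 33) + f(rec, 45, 50) + f(rec, 60, 62))
        elif seq == "90":   # every 90 record contributes
            pieces[key].append(f(rec, 42, 45) + f(rec, 47, 50) + f(rec, 52, 55))
        # 04 and 05 sequence records are ignored
    return [key + "".join(pieces[key]) for key in order]
```

Note that this simply concatenates the key and the extracted fields in arrival order; the actual output layout (and what to do when an 01 or 02 record is missing for a key) still needs to be confirmed.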
amit_tater,
1) Please provide expected output for your sample input data.
2) If you have multiple records for 01, 02, 89, or 90, is it safe to assume that the fields you want in the output will have the same values for all the duplicates? For example, is it possible to have different values at positions 24-27, 30-35, 50, 53 for two 01 sequence records?
3) Provide Input RECFM/LRECL and Output RECFM/LRECL.
Quote:
From each 89 sequence record take bytes 30-33, 45-50, 60-62
From each 90 sequence record take bytes 42-45, 47-50, 52-55
Are there always three 89 records? Are there always three 90 records?
Or can the number of 89 and 90 records change from run to run? If so, what is the minimum and maximum number for each?
Quote:
Kindly suggest. Its urgent.
If you want help quickly, then you need to do a much better job of describing what it is you want to do exactly with all cases covered.
Show an example of the records in your input file (relevant fields only) for all situations you need to handle and what you expect for output. Explain the "rules" for getting from input to output. Give the starting position, length and format of each relevant field. Give the RECFM and LRECL of the input file. If the input file can have duplicates within it, show that in your example.
Pandora-box, please wait for confirmation from the TS (topic starter) of what the requirement is, especially with regard to the duplicates. Despite the "urgency" we may never hear from them again.
When we do hear, something with two or fewer passes of the data would be good...
Pandora-Box wrote:
Sure Bill!!
Will ensure that going forward
I truly appreciate your enthusiasm for providing solutions, but keep in mind that you should never need more than three passes over the data to get the desired results. You might as well write a program instead of a sort job involving multiple passes. Remember that you are only seeing a small sample of the input data; in reality the input may run into millions of records.