I have a requirement to delete duplicates from a file, and while deleting the duplicates we need to update the trailer record count.
Below is the input file before eliminating the duplicates:
AISCDCHNG2011-08-24 -----header
D010004M
D010004M
D010004M
D010004M
D010004M
D010005A
D010006M
D010007M
D010008M
D010008M
D010008M
D010008M
D010008M
D010008M
D010009M
Z0000000015 ----trailer with a count of 15 records
After the sort, my expected result is:
***************************** Top
AISCDCHNG2011-08-24
D010004M
D010005A
D010006M
D010007M
D010008M
D010009M
Z0000000006
**************************** Bott
The header should come through as-is, and the trailer should be updated with the new count.
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
Have a look through the forum, there are many trailer-update examples and dropping-duplicate examples.
While you are "dropping" them, is anything "catching" them, or do they just disappear off to some recycle-bin in the sky? If the data on the duplicate records doesn't mean anything, why were the records generated in the first place?
The record format is FB.
The record length is 50 bytes.
The structure is:
Header record starts with 'H' followed by 49 bytes of data
Detail records start with 'D' followed by 49 bytes of data
Trailer record starts with 'Z' followed by a 10-byte count and 39 bytes of spaces
The eliminated duplicates do not need to be captured in any file.
I need to achieve this requirement in one step only. Could you please help me here?
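For illustration, here is the kind of one-step control-statement sketch being aimed at, assuming a sort product that supports DFSORT's documented OUTFIL IFTRAIL operand; the key positions and the M11/LENGTH=10 count format follow the layout described above:

* Sort on record type (byte 1) plus key (bytes 2-8); the header
* collates first and the 'Z' trailer last, with details in between.
  SORT FIELDS=(1,1,CH,A,2,7,CH,A)
* Keep one record per key, dropping the duplicates.
  SUM FIELDS=NONE
* IFTRAIL: HD=YES treats the first record as a header and leaves it
* out of the count, TRLID identifies the trailer record, and TRLUPD
* rewrites positions 2-11 of it with the new detail-record count.
  OUTFIL IFTRAIL=(HD=YES,TRLID=(1,1,CH,EQ,C'Z'),
         TRLUPD=(2:COUNT=(M11,LENGTH=10)))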
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
sant532 wrote:
[...]
The eliminated duplicates do not need to be captured in any file.
I need to achieve this requirement in one step only. Could you please help me here?
I cannot understand either of these points.
How can you have data in your system which you just leave abandoned on the computer-room floor? What do you do, one day, when someone asks about that particular data? Or a part of it?
Imagine this. You have a bug. You get duplicate records. You fix the bug. Throw away the duplicates. A couple of months later, you get a query. You look at the data, query doesn't seem to make sense. You get another. Same thing. Another few over the next months. Never solved.
Or
Same thing, but you realise the date of the transactions is the same, and that particular date rings a bell. You check further with this extra knowledge, and realise that the "fix" was just not quite right... Now what? Restore and re-run to three months ago to get the data to drop it off, but to a file this time, to check to see if it gives you anything...
Or
Same thing, but you check against your log of data fixes, check the audit trails, files, work out the problem, do a new, more thorough, impact analysis, inform everyone, organise the restoration of the five records which were genuinely "duplicate" etc etc. You even have sign-offs that you were not the only one who missed picking up the different scenario hived-off in error.
For this requirement, I'm sure one step is not a problem. If it were though, what are your "rules" for "one step"? An idiotic "ban" or a "huge input file" or something else reasonable?
********************************* TOP OF DATA **********************************
SYNCSORT FOR Z/OS 1.3.2.1R U.S. PATENTS: 4210961, 5117495 (C) 2007 SYNCSO
z/OS 1.11.0
SYSIN :
SORT FIELDS=(1,1,CH,A,2,7,CH,A) 0023000
SUM FIELDS=(NONE) 0024000
OUTFIL IFTRAIL=(HD=YES,TRLID=(1,1,CH,EQ,C'Z'), 0025000
*
TRLUPD=(2:COUNT=(M11,LENGTH=10))) 0026000
WER903I SYNCSORT 1.3.2.1 IS NOT LICENSED FOR SERIAL B8A36, TYPE 2817 723, LPAR
WER903I PRODUCT WILL STOP WORKING IN 43 DAYS UNLESS A VALID KEY IS INSTALLED.
WER268A OUTFIL STATEMENT : SYNTAX ERROR
WER211B SYNCSMF CALLED BY SYNCSORT; RC=0000
******************************** BOTTOM OF DATA ********************************
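The WER268A flags the OUTFIL statement; the likely cause is that IFTRAIL is a DFSORT operand which this SyncSort 1.3.2 level does not recognise. If IFTRAIL is not available, one possible single-step workaround is to drop the incoming trailer and let OUTFIL build a fresh one. Take this as a sketch to verify against your own sort product: COUNT-1 and the M11 edit mask are taken from DFSORT's OUTFIL TRAILER1 documentation, and their support at this SyncSort level is an assumption to check.

  SORT FIELDS=(1,1,CH,A,2,7,CH,A)
  SUM FIELDS=NONE
* Drop the old trailer, then rebuild it: 'Z', a 10-byte count of the
* records written minus one (the header), padded with blanks to 50.
  OUTFIL REMOVECC,
         OMIT=(1,1,CH,EQ,C'Z'),
         TRAILER1=(C'Z',COUNT-1=(M11,LENGTH=10),39X)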