raghavmcs
Active User
Joined: 14 Jul 2005 Posts: 105
Dear Experts,
Before posting, I searched the forum for about an hour but did not find anything that answered my question.
I have a flat file with a record length of 100. A field defined as packed decimal of length 4 starts at position 23 of the record.
Today's file in the system has a non-numeric value in one or more of the records, and before uploading this file into production for an emergency fix I need to remove those records.
I tried to find these records in File-AID by searching with the Quick option for anything NE 0, but it reported nothing.
The program cannot handle this situation in production; as I said, I am working on an emergency fix to avoid the problem in tonight's run.
I hope I am clear and am posting in the right forum. Please post a quick way to do this. Thanks.
Arun Raj
Moderator
Joined: 17 Oct 2006 Posts: 2481 Location: @my desk
Frank Yaeger
DFSORT Developer
Joined: 15 Feb 2005 Posts: 7129 Location: San Jose, CA
raghavmcs,
I don't understand why you would expect a check for NE 0 to find non-numeric values.
Here's a DFSORT job that will remove records that have a non-numeric PD value in positions 23-26.
Code:
//S1 EXEC PGM=ICEMAN
//SYSOUT DD SYSOUT=*
//SORTIN DD DSN=... input file
//SORTOUT DD DSN=... output file
//SYSIN DD *
OPTION COPY
OMIT COND=(23,4,PD,NE,NUM)
/*
If it doesn't do what you want, then you need to give more details about what these non-numeric values look like - show some examples of these values in hex. Also give the RECFM and LRECL of your input file.
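[Editorial note: the OMIT statement above relies on DFSORT's NUM test. By my reading of the DFSORT documentation, a packed-decimal field passes the test when every digit nibble is 0-9 and the trailing sign nibble is hex A-F; anything else is "non-numeric". The same check can be sketched outside the mainframe, e.g. to inspect a downloaded copy of the file. The helper names below are mine; the record layout (LRECL 100, 4-byte PD field at position 23) is taken from the question.]

```python
def is_valid_pd(field: bytes) -> bool:
    """True if `field` is valid packed decimal: every digit nibble
    is 0-9 and the final (sign) nibble is hex A-F - the same rule
    DFSORT's PD NUM test applies, as I read the documentation."""
    nibbles = []
    for b in field:
        nibbles.append(b >> 4)    # high nibble
        nibbles.append(b & 0x0F)  # low nibble
    *digits, sign = nibbles
    return all(d <= 9 for d in digits) and sign >= 0x0A

def drop_bad_records(records, start=22, length=4):
    """Keep only records whose PD field at 1-based position 23
    (0-based offset 22), length 4, is numeric."""
    return [r for r in records if is_valid_pd(r[start:start + length])]

# Two hypothetical 100-byte records: the first carries a valid PD
# value X'0001234C' at positions 23-26, the second is corrupted
# with X'C1C2C3C4' (EBCDIC "ABCD") in the same positions.
good = b"\x00" * 22 + b"\x00\x01\x23\x4C" + b"\x40" * 74
bad = b"\x00" * 22 + b"\xC1\xC2\xC3\xC4" + b"\x40" * 74
kept = drop_bad_records([good, bad])  # only `good` survives
```

This mirrors the DFSORT job's behavior (drop records whose PD field fails the numeric test) but is only a cross-check sketch, not a substitute for running the job against the actual dataset.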
raghavmcs
Active User
Joined: 14 Jul 2005 Posts: 105
Experts, thanks for all your replies and suggestions.
Frank, I appreciate your solution; I did not know that DFSORT had a numeric-check option, my bad.
We were able to recover the correct file, which is going to be uploaded in tonight's run. The recovered file had all the correct data.
Frank,
I still ran your solution against the file I had, and it gave the desired results by skipping the bad records. Thanks.