IBM Mainframe Forum Index

CSV file blank data


IBM Mainframe Forums -> JCL & VSAM
prasadplease

New User


Joined: 02 Sep 2006
Posts: 31
Location: Mumbai

Posted: Wed Aug 12, 2009 4:03 pm

Hi,
My job emails a sequential file (with ';' delimiters) as a CSV attachment.
The problem is that if any field in a particular record has junk data, all data following that field appears blank in the CSV file.
I can see the data in the sequential file but not in the CSV file.

Has anyone faced this problem before?

Thanks,
Prasad.
prasadplease
Posted: Wed Aug 12, 2009 4:26 pm

It would also help if someone could guide me on how to check in SORT whether an alphanumeric field contains junk or null data.
expat

Global Moderator


Joined: 14 Mar 2007
Posts: 8797
Location: Welsh Wales

Posted: Wed Aug 12, 2009 4:28 pm

What is the definition of junk?
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8696
Location: Dubuque, Iowa, USA

Posted: Wed Aug 12, 2009 4:47 pm

I echo expat: what are you defining as junk characters? I've transferred many dozens of .csv files using tabs, commas, and semicolons as delimiters and never had a problem. But then, I do insist on having quote marks around any alphanumeric fields if there's any chance of non-display data in them.
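[Editor's aside, not part of the original thread: the column shift described in the opening post, and the quoting Robert recommends as a guard against it, can be sketched with Python's `csv` module. The data values here are made up for illustration.]

```python
# Sketch: quoting alphanumeric fields when writing a ';'-delimited file,
# so a delimiter (or other junk) embedded in a field cannot shift the
# columns that follow it. Illustrative only; the thread's actual file
# is produced on the mainframe, not by Python.
import csv
import io

buf = io.StringIO()
writer = csv.writer(buf, delimiter=";", quoting=csv.QUOTE_NONNUMERIC)

# The first field contains the delimiter itself; quoting keeps it
# as one field instead of splitting it into two.
writer.writerow(["AB;C", 123, "plain"])
print(buf.getvalue().strip())  # "AB;C";123;"plain"
```

With `QUOTE_NONNUMERIC`, every non-numeric field is wrapped in quotes, which is the same defensive convention Robert describes for fields that might hold non-display data.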
prasadplease
Posted: Wed Aug 12, 2009 5:16 pm

By junk data I mean non-displayable data (the kind you need to find with F P'.' in ISPF).

@Robert
I tried using quotes around the particular field, but I am still facing the problem.
Robert Sample
Posted: Wed Aug 12, 2009 6:13 pm

Actually, you have two problems then. The first is that if the data is not displayable, your transfer of the file to another machine in text mode will do unpredictable things to your data. This is most likely the cause of your original issue. If you transfer the file in binary, it will not be readable on the other machine since .csv files tend to be used on Windows machines.

Therefore, you have two options:
1) convert the non-display data to displayable EBCDIC data on the mainframe before transferring to the other machine -- which will probably resolve your other issue
2) give up on the .csv idea and transfer the file as binary, then use a routine on the other machine to convert -- byte by byte -- the file from EBCDIC to a collating sequence recognized by the other machine.
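[Editor's aside, not part of the original thread: a minimal sketch of option 2's byte-by-byte conversion, assuming Python on the receiving machine and the cp037 (US EBCDIC) code page; the actual code page depends on the system.]

```python
# Sketch of option 2: after a binary transfer, the receiving machine
# converts the raw EBCDIC bytes to its own character set. Python ships
# a "cp037" codec for US EBCDIC.

def ebcdic_to_ascii(data: bytes, codepage: str = "cp037") -> str:
    """Decode raw EBCDIC bytes into a readable string on the target box."""
    return data.decode(codepage)

# "ABC;123" encoded in cp037 (A=X'C1', ';'=X'5E', '1'=X'F1', ...):
record = bytes([0xC1, 0xC2, 0xC3, 0x5E, 0xF1, 0xF2, 0xF3])
print(ebcdic_to_ascii(record))  # ABC;123
```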
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

Posted: Wed Aug 12, 2009 9:56 pm

Hello,

Quote:
The problem is if any field in any particular record of the file has junk data
I suggest the "bad" data be cleaned up, and the code that puts this data into the system be changed to make sure only valid data is stored in the first place...

There is typically no business reason to allow "junk" in any database table or external file.
prasadplease
Posted: Thu Aug 13, 2009 12:13 pm

Thanks everyone.

I replaced the junk/non-displayable data in the file using a sort step and then emailed the file as a CSV:

//SYSIN DD *
SORT FIELDS=COPY
ALTSEQ CODE=(0040)
OUTREC FIELDS=(1,525,TRAN=ALTSEQ)
/*
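[Editor's aside, not part of the original thread: in that step, ALTSEQ CODE=(0040) builds a translate table that remaps X'00' (nulls) to X'40' (EBCDIC space), and OUTREC's TRAN=ALTSEQ applies that table to columns 1-525 of each record. The card shown remaps only nulls; other non-display bytes would need additional CODE pairs. The effect of the translation can be sketched in Python:]

```python
# Sketch of what TRAN=ALTSEQ with CODE=(0040) does to each record:
# a 256-byte translate table maps X'00' to X'40' (EBCDIC space) and
# leaves every other byte value unchanged.

table = bytearray(range(256))   # identity table: byte n -> byte n
table[0x00] = 0x40              # CODE=(0040): null -> EBCDIC space
ALTSEQ_TABLE = bytes(table)

def clean(record: bytes) -> bytes:
    """Translate a record byte by byte through the ALTSEQ-style table."""
    return record.translate(ALTSEQ_TABLE)

dirty = b"\xC1\xC2\x00\xC3"     # "AB<null>C" in EBCDIC
assert clean(dirty) == b"\xC1\xC2\x40\xC3"   # null replaced by space
```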
dick scherrer
Posted: Thu Aug 13, 2009 7:43 pm

Hello,

Quote:
There is typically no business reason to allow "junk" in any database table or external file.
Quote:
I replaced the junk/non-displayable data from the file using a sort step
I guess it is better to waste system resources than to correct whatever causes the "bad" data...
prasadplease
Posted: Thu Aug 13, 2009 7:47 pm

I agree with you, Dick, that there should not be bad data in the first place.
The bad data was in test, and I don't expect such data in production. This is just a precautionary step.
dick scherrer
Posted: Thu Aug 13, 2009 9:46 pm

Thanks for the clarification - good luck

d