dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Possibly. . . .
From the lack of data, I was never sure I knew where the "break" was or if one eventual record could be "broken" in multiple places in the input.
Be interesting to learn what actually "fixed" this.
superk
Global Moderator
Joined: 26 Apr 2004 Posts: 4652 Location: Raleigh, NC, USA
The first couple of times I read the post, I thought it was talking about handling "carriage control", and I couldn't for the life of me figure out why that would pose a problem.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Yup,
My first thought on reading the initial post was that the CR/LF was coming in as data and the new output needed to be positioned accordingly by the program.
rahul_krg
New User
Joined: 08 Aug 2007 Posts: 7 Location: India
Hi All,
While searching for how CR/LF characters are handled in EBCDIC data, I came across this thread. I'm facing the same problem: the output record gets split into a number of records depending on the occurrences of the CR code (X'0D') in the file. The CR character will not be in the final output in the file, but it was part of the record written to the file. I've observed that if there are 3 CRs in the record, it gets split into 3 individual records.
Please help me decode this problem of handling CR on the mainframe. I'm using ZFile to generate MVS datasets from a Java program. My record contains some COMP fields.
RG
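The splitting described here can be reproduced off-host: a text-mode writer treats every X'0D' byte as a record terminator, so a single logical record whose binary fields happen to contain CR bytes comes out as several physical records. A minimal plain-Java sketch of that behavior (no JZOS needed; the bytes are made up for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class CrSplitDemo {
    // Split a byte buffer on X'0D', the way a text-mode writer
    // terminates records -- even when 0x0D is binary field data.
    static List<byte[]> splitOnCr(byte[] record) {
        List<byte[]> pieces = new ArrayList<>();
        int start = 0;
        for (int i = 0; i < record.length; i++) {
            if (record[i] == 0x0D) {
                pieces.add(Arrays.copyOfRange(record, start, i));
                start = i + 1;
            }
        }
        pieces.add(Arrays.copyOfRange(record, start, record.length));
        return pieces;
    }

    public static void main(String[] args) {
        // One logical record whose COMP fields happen to contain X'0D'
        byte[] record = {(byte) 0xC1, 0x0D, (byte) 0xC2, 0x0D, (byte) 0xC3};
        List<byte[]> out = splitOnCr(record);
        System.out.println(out.size()); // prints 3: one record came out in pieces
    }
}
```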
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote:
I'm using ZFile to generate MVS datasets from a Java program. My record contains some COMP fields.
What are the definition & attributes of the "MVS datasets"? Please post a few examples of these "COMP" fields.
Quote:
I've observed that if there are 3 CRs in the record, it gets split into 3 individual records
This is normal. I'd suggest correcting the process that creates the file to only create 1 CR per record.
rahul_krg
New User
Joined: 08 Aug 2007 Posts: 7 Location: India
Hi Dick,
Thanks for the reply. To answer your question: as we are creating the MVS dataset using the ZFile API (an IBM proprietary package) through Java, the attributes are defined at run time depending on the record structure. The datasets are variable-blocked (VB), and attributes like LRECL and storage space are calculated on the fly. But the problem is quite general and would be there with any kind of dataset.
To describe the issue in more detail: since the record contains COMP fields, we can get CR characters as part of the data. For example, the decimal value 269 will be stored as X'010D' in a COMP field, and X'0D' is the EBCDIC representation of the CR character. So on encountering this X'0D' byte the record gets split into 2 lines, even though it is data and not a CR.
So I want to avoid this situation where CR is part of the data. I can't avoid the occurrences of CR in the data since they come from the data source.
I hope the situation is clearer now.
Thanks.
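The X'010D' example above is easy to verify with plain Java: a two-byte binary (COMP) field holding decimal 269 is stored big-endian, and its low-order byte is X'0D', the CR code point:

```java
import java.nio.ByteBuffer;

public class CompByteDemo {
    public static void main(String[] args) {
        // A two-byte COMP (binary) field holding decimal 269,
        // stored big-endian as on z/OS: X'01' X'0D'
        byte[] comp = ByteBuffer.allocate(2).putShort((short) 269).array();
        System.out.printf("X'%02X%02X'%n", comp[0], comp[1]); // prints X'010D'
    }
}
```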
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Quote:
So I want to avoid this situation where CR is part of the data. I can't avoid the occurrences of CR in the data since they come from the data source.
You cannot avoid the situation!
In any record with binary data, x'00' to x'ff', there will always be some bit configurations
which are unprintable/control characters, CR in Your case.
Check how to write/read records in binary format.
rahul_krg
New User
Joined: 08 Aug 2007 Posts: 7 Location: India
Hi Enrico,
Thanks for the reply. It seems that the situation can't be avoided. But the output of my work now depends on avoiding the CR character coming in the data, since the whole output dataset gets garbled because of the split records.
When I try to use binary mode for writing records, it puts everything in one record up to the maximum record length. This is not acceptable since I'm writing variable-length records.
Is there any solution or work-around for this? I hope there is a solution for every problem.
Thanks for the help.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
As I see it,
first of all, too many confusing posts....
are You working:
under USS, with Unix-like / stream-oriented data management, or
under MVS, with record-oriented data management?
You cannot mix the two architectures for the same dataset.
If the data is text only, then it might be ok to use the PC approach for record delimiting,
but remember ...
in a mainframe environment the standard record terminator is x'15';
anything else is ... garbage or pure binary data.
If You have binary/COMP data in Your record layout,
You might need to review the application specification.
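The X'15' terminator is easy to check from Java: in the IBM-1047 EBCDIC codepage commonly used on z/OS, the newline character encodes as X'15' (NL) rather than the ASCII X'0A', while carriage return remains X'0D'. A small demo (assumes the JDK's extended charsets, which include IBM1047, are available):

```java
import java.nio.charset.Charset;

public class EbcdicTerminatorDemo {
    public static void main(String[] args) {
        // IBM-1047 maps '\n' (newline) to X'15' and '\r' (CR) to X'0D'
        Charset ebcdic = Charset.forName("IBM1047");
        byte nl = "\n".getBytes(ebcdic)[0];
        byte cr = "\r".getBytes(ebcdic)[0];
        System.out.printf("NL=X'%02X' CR=X'%02X'%n", nl, cr); // prints NL=X'15' CR=X'0D'
    }
}
```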
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
/rant on
in this thread issues are posed by three different people, adding confusion
and making things harder to understand.
I wish people would start their own threads for their problems,
so that the responders could face just one issue at a time (especially the terminology).
/rant off
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10873 Location: italy
Quote:
Is there any solution or work-around for this? I hope there is a solution for every problem.
The solution is in reading the docs...
a simple googling for "java zfile record format" gave
http://www-03.ibm.com/servers/eserver/zseries/software/java/jzos/doc/javadoc/toolkit/com/ibm/jzos/ZFile.html#ZFile(java.lang.String,%20java.lang.String)
and... yes... You can write binary variable-length records
Quote:
write

public void write(byte[] buf,
                  int offset,
                  int len)
           throws ZFileException

Write the buffer to the native file.
This method calls the fwrite() C-library routine.

Parameters:
buf - the byte array to write
offset - the offset, inclusive, in buf to start writing bytes from
len - the number of bytes to write

Throws:
ZFileException - if the native call fails

why not give it a try
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
The way we prevent this problem is to make sure that COMP data is not created on the UNIX or Win-based systems.
rahul_krg
New User
Joined: 08 Aug 2007 Posts: 7 Location: India
Hi,
COMP data will be there since it's part of the record structure. I'm reading the data file (which is an MVS dataset) according to the record structure and, after processing it, writing it back to a new MVS dataset. My Java application is running under USS, i.e. OMVS.
And yes, Enrico, docs are useful resources, and I've already gone through all the docs related to ZFile. I've also tried both record mode and stream mode to write the ZFile. The method "void write(byte[] buf, int offset, int len)" doesn't really work as it seems: it writes one record and terminates the write operation when it reaches the end of the buffer. I couldn't determine the length of each individual record since it is variable, so the value to pass for the 'len' argument is not known, and this method also didn't help. So far I couldn't write a variable-length record in binary mode using ZFile with any combination I tried. Hence I was pushed to use text mode, which then poses the CR character problem.
Well, thanks for your replies.
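For what it's worth, when a VB dataset is read in stream/binary mode with its record prefixes preserved, each record carries its own length in a 4-byte Record Descriptor Word (RDW): a 2-byte big-endian length that includes the RDW itself, followed by two X'00' bytes. So the 'len' to pass to each write() call can be recovered from the data rather than guessed. A hypothetical plain-Java sketch of that parsing (the buffer contents are made up for illustration; whether RDWs are available depends on how the input file is opened):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class RdwParseDemo {
    // Walk a buffer of VB-format records, where each record starts with a
    // 4-byte RDW: 2-byte big-endian length (RDW included) + two X'00' bytes.
    // Returns the data portion of each record.
    static List<byte[]> splitRecords(byte[] buf) {
        List<byte[]> records = new ArrayList<>();
        int pos = 0;
        while (pos + 4 <= buf.length) {
            int len = ((buf[pos] & 0xFF) << 8) | (buf[pos + 1] & 0xFF);
            records.add(Arrays.copyOfRange(buf, pos + 4, pos + len));
            pos += len;
        }
        return records;
    }

    public static void main(String[] args) {
        // Two records: 3 data bytes (RDW length 7) and 2 data bytes (RDW length 6)
        byte[] buf = {0x00, 0x07, 0x00, 0x00, 1, 2, 3,
                      0x00, 0x06, 0x00, 0x00, 4, 5};
        List<byte[]> recs = splitRecords(buf);
        System.out.println(recs.size()); // prints 2
    }
}
```

Each recovered data portion's length is then exactly the 'len' argument for one record-mode write.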
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
You're welcome,
Quote:
COMP data will be there since it's part of the record structure.
Yes, I understand that it is currently part of the record structure.
I suggest you change the record structure for the upload and re-format on the mainframe (unless the COMP data might be completely removed and no reformat would be necessary).