Zipping a mainframe EBCDIC file to ASCII on a PC using PKZIP


tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Wed Apr 18, 2007 7:25 am

Hello All

Could someone give me some actual JCL examples for

1. Zipping a mainframe EBCDIC file so that, when unzipped on a PC, it's ASCII.

The PKZIP manual is confusing.

Thanks
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Wed Apr 18, 2007 7:56 am

Hello,

Which manual(s) are you using? The PKWARE site has several versions of the manuals for download - I suspect these are for all of the current versions. The example JCL/control statements look quite straightforward.

Please post the JCL/control statements you are working with and the JES output / SYSOUT info from your run. The output should show which version you are running.

One of the first things to resolve is which version of the software you are running.

Before getting involved with the details of the compress/decompress, you need to make sure that what you zip is of such a format that it can be unzipped on an ascii platform. What kind of file(s) do you want to compress? If the data contains any packed-decimal or binary data items, you need to address them before trying to transfer them to another platform - compressed or otherwise.
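
To see why, consider what a packed item actually holds. A tiny illustration (the field names are made up, not from your file):

Code:

      * Why packed/binary items can't simply be character-translated:
      * the bytes of a COMP-3 field are not characters at all.
       01  WS-PACKED   PIC S9(5) COMP-3 VALUE +12345.
      *    stored as the three bytes X'12345C'; run those bytes
      *    through an EBCDIC-to-ASCII character translation and
      *    the value is destroyed
       01  WS-RAW      REDEFINES WS-PACKED  PIC X(3).

Character translation only works on data that really is character data, which is why the packed and binary items need to be expanded before the transfer.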
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Wed Apr 18, 2007 7:58 am

I need to add some more information. I'm creating a mainframe file containing records with alphabetic fields, COMP-3 fields, COMP fields, etc. Is it possible to zip the file on the mainframe, export it to a Windows box, unzip it, and have everything show up as ASCII?

From what I can tell, the answer is no.

Can anyone help?
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Wed Apr 18, 2007 8:23 am

Hello,

You will be your best help as you have control of the mainframe data. One of the worst things to struggle with is a downloaded mainframe file that has not been properly expanded for use on an ascii platform - typically unix, linux, and win-based target systems all need the data expanded before the download.

It is fairly simple to write a program to convert all of the numeric data to display decimal rather than packed or binary or zoned. When you convert the numeric data, you want to insert a "real" decimal point (when appropriate) and a trailing minus-sign if the number is negative.

Depending on the file, you may decide you want to use your installation's sort program to convert the data. While it may be possible to do with your sort, it is trivial to do this with a program and the program may be expanded to handle situations that cannot be done with the sort.
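
Something like this is what I mean - a minimal sketch, with all of the names invented for illustration. The edited picture supplies the "real" decimal point and prints the trailing minus sign only when the value is negative:

Code:

      * minimal sketch: unpack COMP/COMP-3 values into display form
       IDENTIFICATION DIVISION.
       PROGRAM-ID. UNPACKIT.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-PACKED-AMT     PIC S9(13)V99 COMP-3 VALUE -1234.56.
       01  WS-BINARY-KEY     PIC S9(4)     COMP   VALUE +7.
       01  WS-OUT-REC.
           05  WS-OUT-KEY    PIC 9(5).
           05  FILLER        PIC X         VALUE ','.
           05  WS-OUT-AMT    PIC 9(13).99-.
       PROCEDURE DIVISION.
      *    the edited picture does the work: decimal point inserted,
      *    '-' shown only for negative values
           MOVE WS-BINARY-KEY TO WS-OUT-KEY
           MOVE WS-PACKED-AMT TO WS-OUT-AMT
           DISPLAY WS-OUT-REC
           GOBACK.

Run that against every record (and every numeric field) and the output is plain displayable text, which compresses and transfers with no surprises.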

If you have questions, we're here :)
superk

Global Moderator


Joined: 26 Apr 2004
Posts: 4652
Location: Raleigh, NC, USA

PostPosted: Wed Apr 18, 2007 9:03 am

Code:

//PKZIP    EXEC PGM=PKZIP
//SYSPRINT DD   SYSOUT=*
//INFILE   DD   DSN=INFILE,DISP=SHR
//ZIPFILE  DD   DSN=HLQ.DATASET.ZIP,DISP=(,CATLG,DELETE),...
//SYSIN    DD   *
-ECHO
-ACTION(ADD)
-ARCHIVE_OUTFILE(ZIPFILE)
-COMPRESSION_LEVEL(NORMAL|MAXIMUM|FAST|SUPERFAST|STORE)
-DATA_DELIMITER(CR|LF|CZ)
-DATA_TYPE(TEXT)
-FILE_EXTENSION(DROP|SUFFIX|NAMEFILE)
-FILE_TERMINATOR(CR|LF|CZ)
-INFILE(INFILE)
-ZIPPED_DSN(**,WINDOWS_FILE_NAME)
/*
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Wed Apr 18, 2007 7:03 pm

Hello,

The posted JCL will run the utility, but it will not handle the numeric-format expansion needed to make the data usable on an ASCII platform.

From your post, you need more than just JCL to execute the utility.
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Mon Apr 30, 2007 12:43 am

Actually my client has software to take the data file and explode it out into ASCII text format. However, PKZIP on the Windows side doesn't seem to recognize the LRECL data (the file is variable length; I zipped with -save_lrecl) and just dumps out blocks of data.
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Mon Apr 30, 2007 1:50 am

Hello,

Between your original post and this one, things seem to have changed. . .
This
Quote:
Zipping a mainframe EBCDIC file so that, when unzipped on a PC, it's ASCII.

and this
Quote:
Actually my client has software to take the data file and explode it out into ASCII text format
are quite different. . .

This
Quote:
PKZIP on the Windows side doesn't seem to recognize the LRECL data (the file is variable length; I zipped with -save_lrecl)
adds another wrinkle. Rarely does a UNIX or Win-based system have software that will properly deal with MVS variable-length data with embedded COMP or COMP-3 fields.

As I posted previously, you will be way ahead to do the formatting work on the mainframe. Keep in mind that MVS variable-blocked records and "LRECL" are not very often supported on ASCII platforms. If you compress VB/packed data on the mainframe, there is a very good chance that it will be difficult (if even possible) to use after it is transferred.

What you see on the target system is exactly what was sent - blocked data. . . I'm sure that someone also knows that the FTP needs to be done in binary for compressed files. . .
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Mon Apr 30, 2007 3:17 am

The reason I can't blow it out on the mainframe side of things is because a text version of the record is over 32K.
William Thompson

Global Moderator


Joined: 18 Nov 2006
Posts: 3156
Location: Tucson AZ

PostPosted: Mon Apr 30, 2007 3:39 am

tjk3030 wrote:
The reason I can't blow it out on the mainframe side of things is because a text version of the record is over 32K.
OK, and your point? What "requirement" limits you to the "magical" limit of 32K?
"Blow it out"? As in, convert non-translatable hex into ASCII?
I agree with Dick - what are you holding back? Some additional "requirement" that "prevents" a swift, accurate, and correct solution to your "requirement"?
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Mon Apr 30, 2007 4:15 am

Hello,

How is this data used/created on the mainframe? If you have "standard" mainframe data, it will typically unpack, compress/zip, and download quite nicely. Please post a record layout from the mainframe program that created the problem file. The creation jcl would also help.

I believe (as Bill says) there are things you have not yet posted. If you post better details, we can provide better suggestions.

Transferring data really isn't magic but it is often made more difficult than it needs to be. I say this after many thousand unique cross-platform data transfers, including sending/receiving data from/to qsam/vsam/databases(many) on the mainframe and data to be used by MS-Office, Oracle, SQLServer, and many other databases on unix and win-based systems as well as systems that "aren't on the map".
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Mon Apr 30, 2007 4:20 am

Expand all the packed, COMP, binary fields out to text. They have written a routine to do that. The big problem now is that the LRECL information is not being picked up correctly. Evidently, when unzipped on their end, a "block of data" is considered one record. According to the PKZIP manual, if you are sending binary data you have to use the -save_lrecl command. I would have thought that on the receiving end, when unzipped, the RDW would be interpreted correctly.

Normally we send them a data cartridge; I'm trying to move them to a zipped FTP transfer.

Nothing is holding me back on my end; it's the user's end.
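
For reference, my understanding is that the RDW is just a 4-byte prefix in front of each variable-length record - roughly this, written as a COBOL group (names made up):

Code:

      * rough layout of a Record Descriptor Word (RDW)
       01  RDW.
           05  RDW-LENGTH    PIC 9(4) COMP.
      *        record length INCLUDING this 4-byte prefix
           05  RDW-RESERVED  PIC 9(4) COMP.
      *        zeros for ordinary (non-spanned) records

If whatever reads the unzipped file on the Windows side doesn't walk those prefixes, I can see how it would treat a whole block as one record.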
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Mon Apr 30, 2007 4:44 am

Here's a simple version of the rec:

01 Data-rec.
   05 key1          pic s9(4)     comp.
   05 name          pic x(50).
   05 address1      pic x(50).
   05 tbl1-length   pic s9(4)     comp.
   05 tbl2-length   pic s9(4)     comp.
   05 tbl3-length   pic s9(4)     comp.
   05 TBL1 occurs 1 to 500 times depending on tbl1-length.
      10 field1     pic s9(13)v99 comp-3.
      10 field2     pic s9(13)v99 comp-3.
      etc., etc.
   05 TBL2 occurs 1 to 500 times depending on tbl2-length.
      10 field3     pic s9(13)v99 comp-3.
      10 field4     pic s9(13)v99 comp-3.
Etc., etc.

I'm not going to type the whole thing out, but if the record were purely text, in a number of cases the record size would exceed 32K.

I'm zipping with -data(binary), -save_lrecl(y), and the other zip statements, and FTPing in binary.

The client is unzipping on a Windows-based machine using PKZIP. They have a SAS program that expands the data out to text using the above record layout. It's not picking up the LRECL data; it just seems to pick up the block data length. From what they told me today (it seems to change every day), the data looks OK but is just a continuous stream up to the end of the block - there's nothing there to indicate the end of the record. The RDW looks OK too.

Normally they have a cartridge reader plugged into the desktop & read the tape that way, run it thru the SAS program & load into their database.
That works just fine.
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Mon Apr 30, 2007 5:56 am

Yup, I too wouldn't type many, many fields by hand :)

If it were my mainframe file, I'd have a copybook though. . .

What kind of file is this? How is it built - possibly from many small flat records? Might they be candidates for the transfer, then build the composite records on the target?

All of the mainframe compatible tape drives i've used on unix/windows come with software that handles the blocking/de-blocking and puts usable files on the target. Doesn't help much when the transfer is no longer via cart. . .

One thing that might help is to split the existing "big" record into multiple smaller unpacked records and then re-combine them on the windows machine. Is the only use for these records that they are input to SAS code? I suspect that a bit of new SAS code could be put together to combine them.
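
Something along these lines - a rough sketch only, using the simplified layout posted above. The file names, the output layout, and the DEPENDING ON counts are my guesses, and only TBL1 is shown (TBL2 and TBL3 would be handled the same way):

Code:

       IDENTIFICATION DIVISION.
       PROGRAM-ID. SPLITREC.
       ENVIRONMENT DIVISION.
       INPUT-OUTPUT SECTION.
       FILE-CONTROL.
           SELECT IN-FILE  ASSIGN TO INFILE.
           SELECT OUT-FILE ASSIGN TO OUTFILE.
       DATA DIVISION.
       FILE SECTION.
       FD  IN-FILE RECORDING MODE V.
       01  DATA-REC.
           05  KEY1           PIC S9(4)     COMP.
           05  NAME           PIC X(50).
           05  ADDRESS1       PIC X(50).
           05  TBL1-LENGTH    PIC S9(4)     COMP.
           05  TBL2-LENGTH    PIC S9(4)     COMP.
           05  TBL3-LENGTH    PIC S9(4)     COMP.
           05  TBL1 OCCURS 1 TO 500 TIMES
                    DEPENDING ON TBL1-LENGTH.
               10  FIELD1     PIC S9(13)V99 COMP-3.
               10  FIELD2     PIC S9(13)V99 COMP-3.
       FD  OUT-FILE RECORDING MODE F.
       01  OUT-REC.
           05  OUT-KEY        PIC 9(5).
           05  OUT-SEQ        PIC 9(5).
           05  OUT-FIELD1     PIC 9(13).99-.
           05  OUT-FIELD2     PIC 9(13).99-.
       WORKING-STORAGE SECTION.
       01  WS-EOF             PIC X     VALUE 'N'.
       01  WS-IX              PIC S9(4) COMP.
       PROCEDURE DIVISION.
           OPEN INPUT IN-FILE OUTPUT OUT-FILE
           PERFORM UNTIL WS-EOF = 'Y'
               READ IN-FILE
                   AT END MOVE 'Y' TO WS-EOF
                   NOT AT END PERFORM WRITE-DETAILS
               END-READ
           END-PERFORM
           CLOSE IN-FILE OUT-FILE
           GOBACK.
       WRITE-DETAILS.
      *    one unpacked, fixed-length record per TBL1 entry,
      *    keyed and sequenced so the target can re-combine them
           PERFORM VARYING WS-IX FROM 1 BY 1
                   UNTIL WS-IX > TBL1-LENGTH
               MOVE KEY1           TO OUT-KEY
               MOVE WS-IX          TO OUT-SEQ
               MOVE FIELD1 (WS-IX) TO OUT-FIELD1
               MOVE FIELD2 (WS-IX) TO OUT-FIELD2
               WRITE OUT-REC
           END-PERFORM.

The fixed-length, text-only output records zip and FTP without any of the LRECL/RDW complications, and a bit of SAS on the other end can fold them back together.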

Good luck and we're here to bounce ideas off of :)
tjk3030

New User


Joined: 18 Mar 2006
Posts: 22

PostPosted: Mon Apr 30, 2007 8:32 am

Yes, it does have a copybook. Yup, I always had problems with whoever designed that particular record in our system. Not only does it have limitations in size, but it also limits the data; it is essentially an extract of our DB. If any new tables are added (there have only been a couple), it can't handle them. Plus ya gotta love PIC S9(13)V99 fields, most of which will never hold more than $999.99. And it compresses down by over 90%. Unfortunately it is used all over our system; at one point they talked about increasing it, etc., but it was a 6-month task.
Sorry about the specs overall; I didn't think it could be done, and trying to debug what a client sees is damn hard.
Originally my client said they could read it, which was true - but only for the first record.
Oh, well.

THANKS ALL FOR YOUR INPUT
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19244
Location: Inside the Matrix

PostPosted: Mon Apr 30, 2007 6:59 pm

Hello,

Where does the data "wind up" on the target?

Is the purpose of the "receiving SAS code" to reformat the data for subsequent table loads? If it is to be loaded into various database tables, there might be multiple files created (on the mainframe) from the "monster" in a format that would readily load into the target database tables. Those could be compressed for transfer, but would (I believe) be easier to de-compress.