janmejay (New User)
Joined: 22 Jun 2007; Posts: 85; Location: Bangalore
I have a requirement wherein I receive "X" zipped MVS files daily. I have to unzip them and then merge them into a single file, and I want this automated. I know the names of the MVS files and the name of the .TXT file inside the archive of every zip file.
1) Can I merge the zipped files before unzipping, and then unzip the single merged file? >> I tried this and was able to merge the zipped files, but when unzipping I get only the first file's contents in the output.
2) Or do I unzip all of them first and then merge?
3) Since these files arrive from a previous job step that copies UNIX HFS files to MVS, could I unzip or merge them as part of the OGET step?
Not sure which path I should opt for.
To unzip a single file I am using the control statements below:
ACTION=UNZIP
MODE=BINARY
ARCHIVE=SEQ/AFILE
OFILE=SEQ/UNZIPF1;inventory1.txt
where inventory1.txt is the file name inside the archive.
Note: I have to unzip them using ZIP390.
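For reference, ZIP/390 control statements like these are usually fed to the utility through SYSIN. This is only a sketch, not a definitive job: the program name, STEPLIB data set, and DD names are assumptions that depend on your site's ZIP/390 installation.

```jcl
//* Sketch only - program name, STEPLIB and DD names are assumptions;
//* check your installation's ZIP/390 manual for the exact conventions
//UNZIP1   EXEC PGM=ZIP390
//STEPLIB  DD  DSN=YOUR.ZIP390.LOADLIB,DISP=SHR
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
ACTION=UNZIP
MODE=BINARY
ARCHIVE=SEQ/AFILE
OFILE=SEQ/UNZIPF1;inventory1.txt
/*
```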
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Hello,
Suggest you do them individually until you have it all working.
Then, you can investigate combining things . . .
vasanthz (Global Moderator)
Joined: 28 Aug 2007; Posts: 1742; Location: Tirupur, India
Hello,
Quote:
> 2) Do I go ahead and unzip all of them first and then merge it.
I don't have ZIP390, but this approach seems simpler.
You could create a GDG (WELLS.TEST.GDG), then unzip the first file to WELLS.TEST.GDG(+1) (1st generation), the 2nd file to WELLS.TEST.GDG(+1) (2nd generation), and so on...
For the merging part, just reference the file as DSN=WELLS.TEST.GDG without specifying a generation number in the JCL; it will read the data from all generations (concatenation).
Hope it helps.
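A minimal sketch of this GDG approach, assuming the base is defined with IDCAMS and the merge is a plain SORT copy. The data set names, LIMIT, and space figures are illustrative, not prescriptive:

```jcl
//* One-time setup: define the GDG base
//DEFGDG  EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  DEFINE GDG (NAME(WELLS.TEST.GDG) LIMIT(30) SCRATCH)
/*
//* ... each unzip step writes its output to a new generation ...
//* Merge step: naming only the GDG base reads every cataloged
//* generation as one concatenation
//MERGE   EXEC PGM=SORT
//SYSOUT   DD  SYSOUT=*
//SORTIN   DD  DSN=WELLS.TEST.GDG,DISP=SHR
//SORTOUT  DD  DSN=WELLS.TEST.MERGED,DISP=(NEW,CATLG,DELETE),
//             UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//SYSIN    DD  *
  OPTION COPY
/*
```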
janmejay (New User)
Joined: 22 Jun 2007; Posts: 85; Location: Bangalore
I am able to unzip them individually and then merge them.
The point is that there is no fixed number of files coming every day; it can vary. I need an approach, such as a SORT card or something similar, that can handle a variable number of files.
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Hello,
When each new file arrives on the mainframe, unzip it and copy it to the +1 of a GDG set aside for this purpose. At the end of the day, run a job that names the GDG base; all generations will be processed - most recent first.
When reading the entire GDG, specify DISP=(OLD,DELETE,KEEP).
Lastly, create a new +1 that is an empty file for the next day's processing.
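The end-of-day cycle described above might look like this sketch (data set names are illustrative). Note that DISP=(OLD,DELETE,KEEP) deletes the generations after the step ends normally and keeps them if it abends:

```jcl
//* Consume the whole GDG - naming the base reads all generations,
//* most recent first; (OLD,DELETE,KEEP) uncatalogs them afterwards
//READALL EXEC PGM=SORT
//SYSOUT   DD  SYSOUT=*
//SORTIN   DD  DSN=WELLS.TEST.GDG,DISP=(OLD,DELETE,KEEP)
//SORTOUT  DD  DSN=WELLS.TEST.DAILY,DISP=(NEW,CATLG,DELETE),
//             UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//SYSIN    DD  *
  OPTION COPY
/*
//* Create an empty +1 so the next day's cycle starts clean
//RESET   EXEC PGM=IEFBR14
//NEWGEN   DD  DSN=WELLS.TEST.GDG(+1),DISP=(NEW,CATLG,DELETE),
//             UNIT=SYSDA,SPACE=(TRK,(1,1)),
//             DCB=(RECFM=FB,LRECL=80,BLKSIZE=0)
```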
janmejay (New User)
Joined: 22 Jun 2007; Posts: 85; Location: Bangalore
Ok Dick, I tried your way but with a slightly different approach. What I have done so far: I was able to merge all generations of the MVS zip files into one file.
Now I want to unzip it. When I unzipped this merged file, which is actually a combination of the various zipped generations, the output contained only the records from the last zipped file; the other files' contents were not populated.
I used this for my unzip (using ZIP390):
ACTION=UNZIP
MODE=BINARY
ARCHIVE=SEQ/AFILE
OFILE=SEQ/UNZIPF1
Can you tell me what I am missing here?
Thanks!
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Hello,
Quote:
> What I did till now is that I was able to merge all generations of MVS zip files in to one file.
I understand what you tried, but I do not believe the result is a valid compressed file. It is several files concatenated together, not one archive.
I know of no way to decompress multiple files in one execution. I believe this is what is causing the problem.
janmejay (New User)
Joined: 22 Jun 2007; Posts: 85; Location: Bangalore
Seems the same to me, Dick. Is there a different approach I can follow to accomplish my task?
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Hello,
No. If I had to do this, I'd do it as suggested.
A similar approach has been used many times when multiple inputs have to be processed as one . . .
janmejay (New User)
Joined: 22 Jun 2007; Posts: 85; Location: Bangalore
Hello Dick,
We finally came to a conclusion: limit the files to 10, create job steps to unzip them one by one, and finally run a step that names the GDG base so that all generations are processed that way.
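A sketch of that fixed-step pattern. One caveat from standard JCL: within a single job, the relative generation number must increase per step (+1, +2, ...), because the catalog is only updated at job end. Whether ZIP/390's OFILE syntax accepts a relative GDG reference is an assumption to verify against its manual; the archive and data set names below are illustrative.

```jcl
//* Steps UNZIP01..UNZIP10, one per daily file (names illustrative)
//UNZIP01 EXEC PGM=ZIP390
//SYSIN    DD  *
ACTION=UNZIP
MODE=BINARY
ARCHIVE=SEQ/AFILE1
OFILE=SEQ/WELLS.TEST.GDG(+1)
/*
//UNZIP02 EXEC PGM=ZIP390
//SYSIN    DD  *
ACTION=UNZIP
MODE=BINARY
ARCHIVE=SEQ/AFILE2
OFILE=SEQ/WELLS.TEST.GDG(+2)
/*
//* ... UNZIP03 through UNZIP10 follow the same pattern ...
//* Final step names only the GDG base, so all generations are read
//FINAL   EXEC PGM=SORT
//SORTIN   DD  DSN=WELLS.TEST.GDG,DISP=SHR
//SORTOUT  DD  DSN=WELLS.TEST.DAILY,DISP=(NEW,CATLG,DELETE),
//             UNIT=SYSDA,SPACE=(CYL,(5,5),RLSE)
//SYSIN    DD  *
  OPTION COPY
/*
```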
dick scherrer (Moderator Emeritus)
Joined: 23 Nov 2006; Posts: 19244; Location: Inside the Matrix
Good to hear you have an approach - thank you for letting us know.
Keep in mind that after the entire set is processed, all of those generations need to be uncataloged.
And no job should create a new +1 while the job that consumes those files is running.
d