Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
The article describes the process pretty clearly. As long as you're using z/OS version 2.2 and the facilities listed in the article, there are no other special requirements.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Thanks a lot.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Hi,
When I try to create a UNIX pipe I am getting an error:
IGD17501I ATTEMPT TO OPEN A HFS FILE FAILED,
RETURN CODE IS (00000081) REASON CODE IS (0594003D)
The link below mentions that this occurs "if attempting to start the CAE Server on USS when the server logs directory has not been created":
www-01.ibm.com/support/docview.wss?uid=swg1PM43917
Can somebody give pointers on how to create the CAE server, or how to resolve this error?
Thanks in advance.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Sorry, I meant: how do I create the server log directory?
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Did you notice that the CAE error message applies ONLY to the DB2 Query Monitor tool? That's what the APAR is about. If you decide to create the CAE server, read the installation guide for the DB2 Query Monitor product and follow its directions.
In general, return code X'81' with reason code X'0594003D' means you are attempting to open a Unix file or directory that does not exist. This could mean you really are trying to open a non-existent file or navigate to a non-existent directory, or it could mean that the user id the batch job runs under does not have appropriate Unix permissions to the file (or to one or more of the directories in the path to the file).
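The diagnosis above can be illustrated from any POSIX shell (the z/OS USS shell behaves the same way). The `check_path` helper below is a hypothetical sketch, not from this thread: it walks a pathname one component at a time and reports the first component that does not exist, which is the usual way to track down an ENOENT-style failure such as reason code 0594003D.

```shell
# check_path: walk a pathname one component at a time and report the
# first component that does not exist. A missing intermediate directory
# is a common cause of "file not found" reason codes such as 0594003D.
check_path() (
  IFS='/'                          # split the argument on '/'
  dir=""
  for part in $1; do
    [ -n "$part" ] || continue     # skip the empty field from the leading '/'
    dir="$dir/$part"
    if [ ! -e "$dir" ]; then
      echo "missing: $dir"
      exit 1
    fi
  done
  echo "ok: $1"
)

check_path /tmp                           # an existing directory
check_path /tmp/no/such/dir || true       # reports the first missing component
```

Running this against the failing path from the job (for example `/u/g138818/...`) pinpoints which directory needs to be created with `mkdir` or opened up with `chmod`.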
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Thanks.
I have created a new FIFO file using TSO OEDIT and I can see read and write permissions. I am trying to copy data from my mainframe dataset to this FIFO file, and these are the permissions I can see:
Command Filename Message Type Permission Audit Ext Fmat
------------------------------------------------------------------------------
new FIFO rw-rw-rw- fff--- ----
Do we need any special permission to use this FIFO UNIX file in z/OS, and if so, how do we get it?
Thanks again.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
What user id does the batch job run under? And you didn't say anything about the directory tree -- if your batch user id doesn't have access to the entire directory tree, that alone could cause the error.
At this point, I recommend working with your site support group. You are not posting anywhere near enough data for us to help you on the forum, and the site support group will be able to see everything and help you resolve the issue.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
This is the directory structure I see, where my ID starts with gID. I am also running the job under my own ID.
Pathname . : /u/g138818
Command Filename Message Type Permission Audit Ext Fmat
-------------------------------------------------------------------------------
. Dir rwxr-xr-x fff--- ----
.. Dir rwxr-xr-x fff--- ----
.sh_history File rw------- fff--- --s- ----
new FIFO rw-rw-rw- fff--- ----
******************************* Bottom of data ********************************
Do I need to get permission on this directory through some commands or updating the permission file?
Thanks for your help.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Quote:
Do I need to get permission on this directory through some commands or updating the permission file?
In Unix System Services, if you are using OMVS you use the chmod command to change permissions; if you are using the ISHELL you use the menus. Since Unix System Services interacts with the site security package (usually RACF, TOP SECRET, or ACF2), you cannot expect to "update the permission file" and have anything work.
And to answer the question: if you're just using your own user id, you should already have the rights you need. What is the precise error message the batch job is giving you? (We shouldn't have to ask for this -- you should have posted it when you first said the batch job was having a problem.) Are you using the fully qualified path in your batch job (/u/g138818/new FIFO)? Are you using quotes around the name in your JCL so the space in the file name isn't misinterpreted?
And terminology is critical in IT, where similar terms may mean very different things. "GID" is group id and "UID" is user id in Unix System Services -- your home directory should be /u/userid, which is NOT the group id; if you're using the group id for anything in your job, you are using the wrong field.
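For reference, here is what the chmod approach looks like from the shell. This is a generic POSIX sketch run against /tmp rather than the poster's home directory; mkfifo and chmod behave the same way under z/OS Unix System Services, subject to what the site security package allows.

```shell
# Create a FIFO and set its permissions with chmod, then verify.
# The leading 'p' in the ls output marks the file as a pipe (FIFO).
cd /tmp
rm -f demo.fifo
mkfifo demo.fifo
chmod 640 demo.fifo                # rw-r----- : owner read/write, group read
ls -l demo.fifo | cut -c1-10       # shows the type and permission string
rm demo.fifo
```

The same permission string shows up as "rw-r-----" in the ISPF z/OS UNIX directory list that was posted earlier in the thread.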
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Thanks for replying so far; this has been very helpful.
The error 'RETURN CODE IS (00000081) REASON CODE IS (0594003D)' turned out to be because Unix is case sensitive and in the JCL I was giving the path in upper case. Now the job completes with RC 00, but no data is copied to the FIFO file. Is there any issue with the job below?
Code:
//G138818L JOB ,,LINES=(9,CANCEL),NOTIFY=&SYSUID,CLASS=D,
// MSGLEVEL=(1,1),MSGCLASS=X,TIME=2
//*
//COPYSTEP EXEC PGM=IKJEFT01
//INMVS DD DSN=G138818.SAMPLE.FILE1,DISP=SHR
//OUTHFS DD PATH='/u/g138818/testx',DSNTYPE=PIPE,
// LRECL=80,BLKSIZE=3200,RECFM=FB,
// PATHOPTS=(OWRONLY,OCREAT,OEXCL),
// PATHMODE=(SIWUSR,SIWUSR),
// PATHDISP=(KEEP,DELETE)
//SYSTSPRT DD SYSOUT=*
//SYSTSIN DD DUMMY
//SYSTSIN DD *
OCOPY INDD(INMVS) OUTDD(OUTHFS) TEXT CONVERT(YES) PATHOPTS(USE)
/*
Also, if I connect to the z/OS Unix directory through RDz and try to edit this testx, I am not able to edit the FIFO-type file, whereas I am able to edit a file defined with type FILE.
Thanks again for helping so far.
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Please use the code tags when posting code, data, and screen shots. Coded for you this time.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
First, this makes no sense:
Code:
// PATHMODE=(SIWUSR,SIWUSR),
PATHMODE defines the file's permission attributes. There are separate attributes for the user, the group, and the other permissions, but specifying the same attribute group more than once gains you nothing: syntactically it is not wrong, but it does no more than specifying the attributes a single time.
Second, I have been reading the manuals and find no indication, either way, on whether it is possible to edit a FIFO file. The very name -- first in, first out -- implies that editing a FIFO file would make no sense. Have you tried running your job and then using a cat command in Unix to list the contents?
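The cat suggestion comes with one caveat worth spelling out: a FIFO holds no data of its own, so a reader and a writer must have it open at the same time. A small POSIX sketch (file names invented for the example):

```shell
# A FIFO stores nothing: the writer blocks until a reader opens the
# pipe, and the reader drains the data as it arrives. This is also
# why a FIFO cannot be "edited" like a regular file.
cd /tmp
rm -f t.fifo
mkfifo t.fifo
printf 'hello\n' > t.fifo &    # background writer blocks until cat opens
cat t.fifo                     # reads "hello" and drains the pipe
wait
rm t.fifo
```

Running cat against the pipe while no writer has it open simply blocks, which matches the behavior seen from editors.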
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Hi,
I copied data to the Unix FIFO file from a Unix file using the 'cp' command. Now when I give this file as input to a COBOL program, the job cannot get past the OPEN INPUT statement; it ran for more than an hour without progressing. In the DD statement I gave the path of the file and PATHDISP=(KEEP,KEEP).
Code:
//G138818D JOB ,,LINES=(9,CANCEL),NOTIFY=&SYSUID,CLASS=D,
// MSGLEVEL=(1,1),MSGCLASS=X,TIME=2
//*
//STEP1 EXEC PGM=TESTPGMA
//STEPLIB DD DSN=OPERN.COBOL.LOADLIB,DISP=SHR
// DD DSN=OPERN.COBOL.LOADLIB1,DISP=SHR
//INPUTDD DD PATH='/u/g138818/testnew',PATHDISP=(KEEP,KEEP)
//SYSOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
If I give a Unix file (defined with type FILE, not FIFO) as input, the same job completes successfully.
How can a Unix FIFO file be read by a job in MVS? Is there any other parameter that needs to be given in the job to locate the file?
Thanks a lot for the responses so far.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
I think you need to describe to us exactly how you expect the pipe to work, and how you would benefit from it.
I've looked at the magazine article, and there is nothing to indicate how an interrelation between USS and z/OS would work for these pipes. You do have z/OS 2.2? Have you tried to use the pipes as described in the article?
What are you actually trying to achieve?
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
I have a few batch jobs where the output file of one step is used as input to other steps, and these jobs run for a long time.
I am exploring whether creating a Unix pipe can reduce the run time of these jobs.
So I am trying to read data from Unix pipes in a COBOL program: whether any changes are needed in the COBOL code, how COBOL programs perform when using Unix pipes as input instead of a PS file, and so on.
So I created a Unix pipe, passed data to it, and am trying to see how to process it through COBOL code.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
And you are on z/OS 2.2?
Where does the "cp" come into it?
Make two example programs: one which reads and writes, and one which reads.
Make two separate JOBs for them, with the output from the first and the input to the second using the "pipe".
Define the dependencies and such as outlined in the article.
Submit the (probably by now three) JOBs.
Once the first record is written in the first application JOB, it will be available to the second application JOB, so the overall elapsed time for the work will be reduced because it can run concurrently.
This should be "transparent" to the application, within any documented limits which may exist. Read the documentation, not just the article.
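The two-JOB setup described above can be mimicked in miniature from a shell: a producer process and a consumer process run concurrently, joined by the FIFO, and the consumer sees each record as soon as it is written. File names here are invented for the sketch.

```shell
# Producer and consumer run concurrently, like two batch jobs sharing
# a pipe: records are consumed as they are produced, so the two steps
# overlap in elapsed time instead of running back to back.
cd /tmp
rm -f work.fifo out.txt
mkfifo work.fifo
( for i in 1 2 3; do echo "record $i"; done > work.fifo ) &  # producer "job"
cat work.fifo > out.txt                                      # consumer "job"
wait
cat out.txt
rm work.fifo out.txt
```

In the z/OS analogue, the shell processes become separate JOBs and the scheduling dependency handling is what the article describes.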
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
You do understand, I hope, that Unix files (such as pipes) do NOT have a record structure, whereas COBOL requires a record structure for its files? Furthermore, have you explored other ways to speed up your processing, such as using FREE=CLOSE and ensuring the output data set is closed as soon as the last record is written? (A terminology note: z/OS has files on tape and in Unix System Services; everything else is a data set, so your use of "output files" is incorrect.) Also note that if the program output is generated throughout the long processing time, you may not be able to reduce the elapsed time without completely rewriting the application.
I found the article Two Kinds of Pipes by using Google for z/os batch pipes. I tested the write and read jobs given there and they worked the way they are supposed to. I did not attempt to keep the pipe around to edit it, but then I don't see any reason to EVER edit a pipe. You'll want to use the jobs exactly as specified, to ensure they work at your site, before you attempt any changes to them.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Thanks for the responses. We have explored z/OS BatchPipes, but it is not available on our mainframe and it is a licensed product.
Thank you again for responding.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
The jobs in the article do NOT rely upon the licensed product z/OS BatchPipes, and they will run on any z/OS system installed in the last several years. I think you have several issues -- such as not understanding the difference between the licensed product z/OS BatchPipes and the Unix pipe mechanism used in the article, and thinking you can edit a pipe file -- and while I think Unix pipes may work for what you want to do, you (or someone AT YOUR SITE) needs to investigate further (about 3 or 4 weeks would be sufficient). We have answered your original question: no special requirements or software are needed to use Unix pipes in a z/OS environment, and no special processing is required.
Virendra Shambharkar
New User
Joined: 26 Aug 2015 Posts: 55 Location: India
Thanks Robert for the responses. The only reason I thought I would edit the pipe was to pass data to it for my testing. Since we have already explored BatchPipes, we wanted to explore Unix pipes as well.
I will investigate further on my own. Thanks again for responding; it has helped a lot.
Willy Jensen
Active Member
Joined: 01 Sep 2015 Posts: 712 Location: Denmark
The jobs in the "Two Kinds of Pipes" article Robert linked work like a charm on a z/OS 1.13 system. As far as I can tell from the documentation they do not require BatchPipes, though of course they do require access to UNIX System Services. What z/OS 2.2 adds is easy control over the parallel batch execution, at least as I understand it...