beeram527
New User
Joined: 19 Apr 2005 Posts: 2
All,
My query is as follows:
I need to run a transformation logic against an input file and create an output file, and I need to do this for around 1000 files. Each time the input file name varies, the logic stays the same, and a new output file is created.
Can anyone assist with how I can do this?
One way I thought of is to write the transformation logic in a PROC and call the PROC from a job once per file, but that involves a lot of repetitive coding.
Is there any way to do this with looping, i.e. each time the same PROC is called but the file name varies? Thank you.
steve-myers
Active Member
Joined: 30 Nov 2013 Posts: 917 Location: The Universe
I don't think JCL can be tricked into doing something like this. One problem is that there is a limit of 255 job steps in a job.
What is the source of the data set names for the input and output data sets? One mechanism would be to prepare JCL and submit a single job for each data set pair. Another mechanism would be to use dynamic allocation to allocate the data sets, then run the program as a subroutine of a controller; when the program completes, use dynamic allocation again to release the data sets. One problem with this approach is that the program runs serially, whereas with the first approach the jobs could potentially run in parallel.
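The first mechanism (prepare JCL and submit a single job for each data set pair) could be sketched in REXX, writing the generated jobs to the internal reader. This is only a hypothetical outline: the job card, PGM=MYPGM, and the QUAL1.QUAL2.QUAL3.IFILn/OFILn naming pattern are placeholders and would need to match your real environment.
Code:
```rexx
/* REXX - hypothetical sketch: build one job per file pair and    */
/* submit it through the internal reader.  The job card, MYPGM,   */
/* and the data set name pattern are all assumptions.             */
do i = 1 to 1000
  queue "//CPY"right(i, 4, '0')" JOB (ACCT),'COPY',CLASS=A,MSGCLASS=X"
  queue "//S1       EXEC PGM=MYPGM"
  queue "//INPUT    DD  DISP=SHR,DSN=QUAL1.QUAL2.QUAL3.IFIL"i
  queue "//OUTPUT   DD  DISP=OLD,DSN=QUAL1.QUAL2.QUAL3.OFIL"i
end
/* Route the stacked JCL to JES via the internal reader           */
"ALLOC FILE(JCLOUT) SYSOUT(A) WRITER(INTRDR) RECFM(F) LRECL(80)"
"EXECIO" queued() "DISKW JCLOUT (FINIS"
"FREE FILE(JCLOUT)"
```
Submitting separate jobs this way also gets you the parallelism mentioned above, since JES can run several of them at once.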
beeram527
New User
Joined: 19 Apr 2005 Posts: 2
Thanks Steve.
With the PROC approach I might need to write 10 or more JCLs of about 100 steps each, because of the 255-step limit.
The data set names would be as follows:
Input data sets:  Qual1.Qual2.Qual3.IFil1
                  Qual1.Qual2.Qual3.IFil2
                  ...
Output data sets: Qual1.Qual2.Qual3.OFil1
                  Qual1.Qual2.Qual3.OFil2
                  ...
We know the data set names well in advance, so we can write them somewhere or pass them accordingly.
I have a doubt about this:
Another mechanism would be to use dynamic allocation to allocate the data sets, then run the program as a subroutine of the controller. When the program completes use dynamic allocation again to release the data sets.
Would this be done through a Cobol program?
steve-myers
Active Member
Joined: 30 Nov 2013 Posts: 917 Location: The Universe
beeram527 wrote:
... I have a doubt about this:
Another mechanism would be to use dynamic allocation to allocate the data sets, then run the program as a subroutine of the controller. When the program completes use dynamic allocation again to release the data sets.
Would this be done through a Cobol program?
Yes, it can be done through Cobol. You can use the BPXWDYN external function to do the dynamic allocations.
I'm not really an expert on BPXWDYN. The basic idea is that you send it a character string that resembles a TSO ALLOCATE or FREE command. For better or worse, its documentation is difficult to find.
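A minimal Cobol sketch of the idea might look like the following. The request string passed to BPXWDYN must carry a halfword length prefix; the data set name and the MYPGM call are placeholders, and the real controller would loop over all 1000 names, rebuilding the strings each time.
Code:
```cobol
       IDENTIFICATION DIVISION.
       PROGRAM-ID. DYNALLOC.
      * Hypothetical sketch: allocate, run, free via BPXWDYN.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-ALLOC.
      *    Halfword length prefix required by BPXWDYN
           05  WS-ALLOC-LEN  PIC S9(4) COMP VALUE 80.
           05  WS-ALLOC-TXT  PIC X(80) VALUE
               'ALLOC FI(INPUT) DA(QUAL1.QUAL2.QUAL3.IFIL1) SHR'.
       01  WS-FREE.
           05  WS-FREE-LEN   PIC S9(4) COMP VALUE 80.
           05  WS-FREE-TXT   PIC X(80) VALUE 'FREE FI(INPUT)'.
       PROCEDURE DIVISION.
           CALL 'BPXWDYN' USING WS-ALLOC
           IF RETURN-CODE NOT = 0
              DISPLAY 'ALLOC FAILED RC=' RETURN-CODE
           END-IF
      *    ... CALL the transformation program here ...
           CALL 'BPXWDYN' USING WS-FREE
           GOBACK.
```
The output data set would be allocated the same way with its own FILE name and disposition before calling the transformation program.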
steve-myers
Active Member
Joined: 30 Nov 2013 Posts: 917 Location: The Universe
Why program anything? This task can be done very easily in TSO in batch.
Code:
//A        EXEC PGM=IKJEFT01
//SYSTSPRT DD SYSOUT=*
//SYSTSIN  DD *
 ALLOC FILE(INPUT) SHR DSN('input data set')
 ALLOC FILE(OUTPUT) OLD DSN('output data set')
 CALL 'program-lib(program)'
 ALLOC FILE(INPUT) REUS DA('input data set')
 ALLOC FILE(OUTPUT) REUS DA('output data set')
 CALL 'program-lib(program)'
 ...
Obviously some of the statements need work, especially the ALLOC FILE(OUTPUT) statements, but this has to be easier than writing Cobol programs.
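For illustration, one way the ALLOC FILE(OUTPUT) statements might be fleshed out for new output data sets is shown below. The space and record-format attributes here are pure assumptions and would need to match what the transformation program actually writes; REUSE lets the same FILE name be reallocated on each iteration.
Code:
```shell
 ALLOC FILE(OUTPUT) DSN('QUAL1.QUAL2.QUAL3.OFIL1') NEW CATALOG +
       SPACE(10,10) TRACKS RECFM(F B) LRECL(80)
 CALL 'program-lib(program)'
 ALLOC FILE(OUTPUT) DSN('QUAL1.QUAL2.QUAL3.OFIL2') NEW CATALOG REUSE +
       SPACE(10,10) TRACKS RECFM(F B) LRECL(80)
```
The SYSTSIN stream itself could be generated mechanically (for example with ISPF file tailoring or a short REXX exec) from the known list of 1000 names, so nothing needs to be typed 1000 times.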