Can we run EXTRACT Job in loop


IBM Mainframe Forums -> JCL & VSAM
rakeshreddy
PostPosted: Thu Dec 04, 2008 1:49 am

Hi Friends,

Here is the scenario:

I have the below job

//SYSIN DD *
EXTRACT LIST(#F2CLIST) OBJECTS USERPROGRAM(DFH$FORA)


The above job needs to run once for each record listed in a PS file.

Ex:

EXTRACT LIST(XXXXXXXX) OBJECTS USERPROGRAM(DFH$FORA)


The PS file has the below content:

000001 #BN2LIST
000002 A#TN2LST
000003 AF11BLST
000004 ASHI3LST
000005 ATN18LST
000006 ATN4ILST
000007 ATST3LST
000008 EN11LIST
000009 GN11LIST
.............. and so on...

Each time, the output needs to go to a new GDG generation, so that I get one generation for each record in the PS file.

Each time the job runs, a new GDG generation needs to be created and the output needs to go into it.

Any idea how we can do this?

Please let me know if you need any other info.

superk
PostPosted: Thu Dec 04, 2008 1:56 am

One possibility ...

Have a job that reads the contents of the dataset, builds the EXTRACT command, and then submits a job for that record. If you keep all of the jobs with the same jobname, then they will execute one at a time, which is what you'd want to do anyway if you're creating GDGs.
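
Something along these lines could do it (a rough sketch only - the jobcard, the PS file name, and the cut-down EXTRACT step are placeholders for your real JCL):

Code:

/* REXX - rough sketch: one EXTRACT job per list name, via INTRDR   */
"ALLOC F(LISTS) DA('O096021.DATACOLL.LISTS') SHR REUSE"
"EXECIO * DISKR LISTS (STEM rec. FINIS"
"FREE F(LISTS)"

"ALLOC F(INTRDR) SYSOUT(A) WRITER(INTRDR) RECFM(F B) LRECL(80) REUSE"
do i = 1 to rec.0
  listname = strip(substr(rec.i, 1, 8))       /* list name, cols 1-8 */
  queue "//EXTRJOB  JOB (ACCT),'EXTRACT',CLASS=A,MSGCLASS=X"
  queue "//UTIL     EXEC PGM=DFHCSDUP,PARM='CSD(READONLY)'"
  queue "//DFHCSD   DD DISP=SHR,DSN=CIT#.C3200803.GSST.DFHCSD"
  queue "//SYSPRINT DD SYSOUT=*"
  queue "//SYSIN    DD *"
  queue "EXTRACT LIST("listname") OBJECTS USERPROGRAM(DFH$FORA)"
  queue "/*"
  "EXECIO" queued() "DISKW INTRDR"            /* submit this job     */
end
"EXECIO 0 DISKW INTRDR (FINIS"
"FREE F(INTRDR)"


Because every generated job carries the same jobname, they run one at a time, so the generations get created in order.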

superk
PostPosted: Thu Dec 04, 2008 1:59 am

... or ...

Have a job that reads the dataset and creates a single multi-step job, with one step for each EXTRACT command and an ever-increasing GDG relative generation number (i.e. +1, then +2, etc.).
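
A rough sketch of such a generator (the jobcard and the dataset that receives the generated JCL are placeholders, and the comment marks where each record's full extract step would be queued):

Code:

/* REXX - rough sketch: build ONE multi-step job whose output DDs    */
/* use an ever-increasing relative generation number                 */
"ALLOC F(LISTS)  DA('O096021.DATACOLL.LISTS') SHR REUSE"
"EXECIO * DISKR LISTS (STEM rec. FINIS"
"FREE F(LISTS)"

queue "//EXTRALL  JOB (ACCT),'EXTRACT ALL',CLASS=A,MSGCLASS=X"
do i = 1 to rec.0
  listname = strip(substr(rec.i, 1, 8))
  /* ...queue the extract step JCL for 'listname' here, ending with  */
  /* an output DD that points at the next relative generation:       */
  queue "//GDGOUT   DD DSN=O096021.DFHCSD.OUTPUT(+"i"),"  /* +1, +2, +3 ... */
  queue "//            DISP=(NEW,CATLG,DELETE),"
  queue "//            UNIT=SYSDA,SPACE=(TRK,(1,1),RLSE)"
end
/* write the generated JCL somewhere it can be submitted from        */
"ALLOC F(JCLOUT) DA('O096021.EXTRACT.JCL') OLD REUSE"
"EXECIO" queued() "DISKW JCLOUT (FINIS"
"FREE F(JCLOUT)"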

superk
PostPosted: Thu Dec 04, 2008 2:04 am

... or ...

Have your job read only the first record of the dataset, build the EXTRACT command, do its thing, then delete that record from the dataset and re-submit itself. Keep going until all the records have been deleted.
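
In outline (the PS file name is a placeholder, and the extract work and the resubmit are left as comments):

Code:

/* REXX - rough sketch: process the first list name, remove it from  */
/* the PS file, then resubmit this same job until the file is empty  */
"ALLOC F(LISTS) DA('O096021.DATACOLL.LISTS') OLD REUSE"
"EXECIO * DISKR LISTS (STEM rec. FINIS"
if rec.0 = 0 then do                          /* nothing left - stop  */
  "FREE F(LISTS)"
  exit 0
end
listname = strip(substr(rec.1, 1, 8))

/* ...run the EXTRACT work for 'listname' here...                    */

do i = 2 to rec.0                             /* drop record 1        */
  j = i - 1
  rec.j = rec.i
end
rec.0 = rec.0 - 1
"EXECIO" rec.0 "DISKW LISTS (STEM rec. FINIS" /* rewrite the PS file  */
"FREE F(LISTS)"
/* ...then push a copy of this job's own JCL to the internal reader  */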

rakeshreddy
PostPosted: Thu Dec 04, 2008 2:11 am

There are a lot of (about 500+) records in the PS file. I want to avoid submitting that many times.

Could you give me an outline JCL of your second idea: "Have a job that reads the dataset and creates a single multi-step job, with one step for each EXTRACT command and an ever-increasing value for the GDG generation value (i.e. +1, then +2, etc.)"?

Or an outline JCL of the first idea.

Thanks for the quick help.

dick scherrer
PostPosted: Thu Dec 04, 2008 2:22 am

Hello,

You need to post the complete jcl for the "extract" step that works for one "list".

superk
PostPosted: Thu Dec 04, 2008 2:27 am

You know, of course, that you should've mentioned the size of the sample dataset up front, since that would influence the approach someone would take. That also negates the option of using just one job with multiple steps, since you'll exceed the maximum number of steps a job is allowed to contain (255), as well as the fact that you can only go to +255 generations for GDG processing in a single job. At the very least you'll have to break this up into a minimum of two jobs.

Can we presume that it's OK to read and process the dataset and then delete the records from it that have already been processed?

It seems to me that you'll need a combination of #2 and #3.

rakeshreddy
PostPosted: Thu Dec 04, 2008 2:48 am

Hi dick scherrer, here is my JCL

Code:

//UTIL   EXEC PGM=DFHCSDUP,PARM='CSD(READONLY)'
//DFHCSD   DD DISP=SHR,DSN=CIT#.C3200803.GSST.DFHCSD       
//CRFINPT  DD *                                             
//CRFOUT   DD SYSOUT=*                                     
//FOROUT   DD DISP=(,CATLG),DSN=O096021.CSDOUT.L2.TEST.WORK,
//         UNIT=SYSDA,SPACE=(CYL,(30,15),RLSE)             
//CBDOUT   DD SYSOUT=*                                     
//SYSPRINT DD SYSOUT=*                                     
//SYSUDUMP DD SYSOUT=*                                     
//SYSIN    DD *                                             
EXTRACT LIST(XXXXXXXX) OBJECTS USERPROGRAM(DFH$FORA)       
/*
//SORT     EXEC PGM=SYNCSORT
//SORTIN   DD  DISP=SHR,DSN=O096021.CSDOUT.L2.TEST.WORK
//SORTOUT  DD  DSN=O096021.DFHCSD.OUTPUT(+1),         
//             DISP=(NEW,CATLG,DELETE),               
//             UNIT=SYSDA,SPACE=(TRK,(1,1),RLSE)       
//SYSOUT   DD  SYSOUT=*                               
//SYSIN    DD  *                                       
  MERGE FIELDS=COPY                                   
  INCLUDE COND=((1,4,CH,EQ,C'TRAN'),AND,               
                (116,5,CH,EQ,C'BELOW'))               
  END                                       


I use O096021.CSDOUT.L2.TEST.WORK to get the data from Step 1 and pass it as input to Step 2, and the final output goes to the GDG.

Both steps need to run for every record in the PS file.

Hi superk,
And yes, we can break up the PS file if we want, or we can delete the records in the PS file.

gcicchet
PostPosted: Thu Dec 04, 2008 2:50 am

Hi,

Since the GDG base can only have a max of 255 entries, a number of them are going to roll off. Is this really what you want?


Gerry

rakeshreddy
PostPosted: Thu Dec 04, 2008 2:58 am

I can split the records in the PS file into batches of 250 or so, and run the job on the remaining records later (once I have the GDG generations for the first 250 records, I will process the data and delete all the generations so that I can run the job for the remaining records).

Did I explain that clearly? Feel free to ask for any info you need.

Thanks all.

dick scherrer
PostPosted: Thu Dec 04, 2008 3:02 am

Hello,

What is the requirement to use a gdg?

What would happen if you wrote the extracted info to members in a pds? The member names could be the same as the "list" names?

rakeshreddy
PostPosted: Thu Dec 04, 2008 3:36 am

I will process the extracted data using REXX code and send the report to the team.

I use a loop in REXX whose counter gets incremented by 1 each time and reads the data in each GDG generation; at the end it sends the report.

I felt it was easy to use a GDG (for incrementing purposes).

If you have an idea how we can get the same result by writing the extracted info to PDS members, I have no problem; we can proceed that way.

rakeshreddy
PostPosted: Thu Dec 04, 2008 4:11 am

My REXX code does the following things:

It reads the first record in the first generation, then picks the word in columns 9 to 15.
Likewise, it reads all the records in the first generation and writes the found word to an output dataset; at the end it writes the generation name.

Then it goes to the second generation and does the same thing.

This goes on until the end of the generations in the GDG.

If we can do the same thing by writing the extracted report to PDS members, please let me know how!

dick scherrer
PostPosted: Thu Dec 04, 2008 7:26 am

Hello,

I'm afraid that I'm becoming more confused rather than understanding more. . .

Is there a way you can post 2 very small samples of the output from the EXTRACT and the next output from your rexx process?

Is there a need for a gdg or a pds or could the final output be created and sent "on the fly"?

I do not yet understand what will be "sent to the team" when this is working correctly. Once we all understand what the required output is, we may be able to offer better suggestions.

rakeshreddy
PostPosted: Thu Dec 04, 2008 9:12 pm

Sorry for the confusion I caused.

Here is the explanation of my whole task:

JOB1

*************************************************
//UTIL EXEC PGM=DFHCSDUP,PARM='CSD(READONLY)'
//STEPLIB DD DISP=SHR,DSN=CICP.PKGE.SDFHLOAD
//DFHCSD DD DISP=SHR,DSN=CIT#.C3200803.GSST.DFHCSD
//CRFINPT DD *
//CRFOUT DD SYSOUT=*
//FOROUT DD DISP=(,CATLG),DSN=O096021.CSDOUT.L2.TEST.WORK,
// UNIT=SYSDA,SPACE=(CYL,(30,15),RLSE)
//CBDOUT DD SYSOUT=*
//SYSPRINT DD SYSOUT=*
//SYSUDUMP DD SYSOUT=*
//SYSIN DD *
EXTRACT LIST(#BN2LIST) OBJECTS USERPROGRAM(DFH$FORA)



After the above step, below is the content of "O096021.CSDOUT.L2.TEST.WORK":


000001 JOURDFHJ01 JCTBN20 DUMMIED OUT USER JOURNAL
000002 JOURDFHLGLOGJCTBN20 LOG OF LOGS
000003 JOURDFHLOG JCTBN20 SYSTEM LOG
000004 JOURDFHSHUNTJCTBN20 SHUNT
000005 LSRPLSRPOOL1LSR4BN20
000006 LSRPLSRPOOL7LSR4BN20
000007 LSRPLSRPOOL8LSR4BN20
000008 TYPELU300D WINSPRT
000009 TYPEPRTSCSP WINSPRT
000010 TERMIO01 WINSPRT
000011 TERMIO09 WINSPRT
000012 TERMIO50 WINSPRT
000013 TERMIO60 WINSPRT WINS PRINTER LUXEMBOURG



//SORT EXEC PGM=SYNCSORT
//SORTIN DD DISP=SHR,DSN=O096021.CSDOUT.L2.TEST.WORK
//SORTOUT DD DSN=O096021.DFHCSD.OUTPUT(+1),
// DISP=(NEW,CATLG,DELETE),
// UNIT=SYSDA,SPACE=(TRK,(1,1),RLSE)
//SYSOUT DD SYSOUT=*
//SYSIN DD *
MERGE FIELDS=COPY
INCLUDE COND=((1,4,CH,EQ,C'TRAN'),AND,
(116,5,CH,EQ,C'BELOW'))


After the above step, below is the content of "O096021.DFHCSD.OUTPUT.G0001V00":


000001 TRANIPCI IPCP43A CICS/IPCP SYSTEM INITIALIZATION
000002 TRANIPCM IPCP43A CICS/IPCP SYSTEM MENU FUNCTION
000003 TRANIPCP IPCP43A CICS/IPCP MAIN TRANSACTION
000004 TRANIPCT IPCP43A CICS/IPCP SYSTEM TERMINATION
000005 TRANHGAD HOURGLASDEACTIVATE AUDIT FACILITY
000006 TRANHGAE HOURGLASACTIVATE AUDIT FACILITY
000007 TRANHGCC HOURGLASHOURGLASS CICS CONTROL
000008 TRANHGCV HOURGLASHOURGLASS VERIFICATION
000009 TRANHGDL HOURGLASHOURGLASS VERIFY DB2 LOCAL



**********************************************************

The above job needs to be run against all the records (list names) which I listed in a PS file.

PS "O096021.DATACOLL.LISTS) file Content(LIST NAMES)

000100 #BN2LIST
000200 A#TN2LST
000300 AF11BLST
000400 ASHI3LST
000500 ATN18LST
000600 ATN4ILST
..................up to 250 records


And it creates one GDG generation for each record (list name) in the PS file.

O096021.DFHCSD.OUTPUT.G0001V00
O096021.DFHCSD.OUTPUT.G0002V00
O096021.DFHCSD.OUTPUT.G0003V00
O096021.DFHCSD.OUTPUT.G0004V00
O096021.DFHCSD.OUTPUT.G0005V00
O096021.DFHCSD.OUTPUT.G0006V00
.................................up to 250 generations

Right now I am submitting the above job manually for every list name in the PS file (I am submitting JOB1 250 times).

CAN IT BE AUTOMATED, so that if I submit the job once, it runs against all the records (list names) in the PS file?
**********************************************************

After I get the GDG with all the generations, I will run the REXX code which does the below:

It reads the first generation of the GDG, picks the characters from columns 5 to 9 and prints them in (A), then picks the characters from columns 12 to 19 and prints them in (B), in a new dataset "O096021.OUTPUT.FINAL" as shown below...

CONTENT OF "O096021.OUTPUT.FINAL"

000001 IPCI IPCP43A
000002 IPCM IPCP43A
000003 IPCP IPCP43A
000004 IPCT IPCP43A
000005 HGAD HOURGLAS
000006 HGAE HOURGLAS
000007 HGCC HOURGLAS
.............................................. so on


Columns 1 to 4  --> characters from columns 5 to 9 of the GDG generation
Columns 12 to 19 --> characters from columns 12 to 19 of the GDG generation


After finishing reading all the records in the first generation, the REXX goes to the second generation, picks the same fields, and appends them to the above output dataset "O096021.OUTPUT.FINAL".

It continues up to the last generation, appending the fields to the same output dataset "O096021.OUTPUT.FINAL".



Let me know if you need any other info.
The automation will save a lot of my time and manual intervention.

Thank you all.

dick scherrer
PostPosted: Fri Dec 05, 2008 12:53 am

Hello,

The "output" from the Syncsort/MERGE did not come from the posted "input". The example data needs to be consistent.

When the "smoke clears", all of the EXTRACTed data goes thru the same "filter" to select the data for the final output, correct?

Is there some reason that the process cannot simply MOD each EXTRACT output into the same file and use that as input to the INCLUDE process? (If the volume is too large, the INCLUDE could be run file by file and that output MODed into a single file.)

The more I believe I understand, the less I believe a GDG or a PDS is needed. . .

rakeshreddy
PostPosted: Tue Dec 09, 2008 12:28 am

Thank you, dick scherrer.

I got this working using REXX. I used two loops: one to read the records in the PS file, and the other to read the JCL and create a new JCL for each record in the PS file (roughly as sketched below).
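
A bare-bones sketch of that two-loop idea, for anyone searching later (this is not the actual exec; the skeleton member name and the <LISTNAME> placeholder are just examples):

Code:

/* REXX - rough sketch: outer loop over the PS file, inner loop over */
/* a JCL skeleton, substituting the list name and submitting         */
"ALLOC F(LISTS) DA('O096021.DATACOLL.LISTS') SHR REUSE"
"EXECIO * DISKR LISTS (STEM rec. FINIS"
"FREE F(LISTS)"

"ALLOC F(SKEL) DA('O096021.EXTRACT.SKEL(JOB1)') SHR REUSE"  /* example skeleton */
"EXECIO * DISKR SKEL (STEM skel. FINIS"
"FREE F(SKEL)"

"ALLOC F(INTRDR) SYSOUT(A) WRITER(INTRDR) RECFM(F B) LRECL(80) REUSE"
do i = 1 to rec.0                             /* loop 1: list names   */
  listname = strip(substr(rec.i, 1, 8))
  do j = 1 to skel.0                          /* loop 2: skeleton JCL */
    line = skel.j
    p = pos('<LISTNAME>', line)
    if p > 0 then,
      line = substr(line, 1, p - 1) || listname || substr(line, p + 10)
    queue line
  end
  "EXECIO" queued() "DISKW INTRDR"            /* submit one tailored job */
end
"EXECIO 0 DISKW INTRDR (FINIS"
"FREE F(INTRDR)"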

Thanks for helping me, and sorry for the confusion I caused previously.

Thanks to this forum and all the friends.

Anyone let me know if you want the REXX code.

dick scherrer
PostPosted: Tue Dec 09, 2008 12:30 am

You're welcome.

Yes, if you post the code it may help someone else with a similar requirement someday.

d