jonna
New User
Joined: 18 Apr 2005 Posts: 9
Hi all,
How do I code a JCL to take daily backups of files? For example, I have 80 files stored in different datasets, and I need to take a daily backup of all of them. Kindly specify how I need to code the JCL for the daily backups, and also how I can keep the backups of these 80 files up to date. Please respond soon.
Thanks in advance.
Thanks & Regards,
mftrigger
New User
Joined: 18 Feb 2006 Posts: 23 Location: chennai
jonna,
A daily backup of files can be done using SORT or any other copy utility.
If this job has to run daily, you can fill in a form to schedule it in the CA7 scheduler, but scheduling is normally done for batch jobs in the various testing regions.
If this is a personal backup not related to any testing region, you can simply submit the job yourself every day.
If you are testing in a particular region, you can make it a time-triggered job that runs daily at a particular time.
Actually, your requirement is somewhat vague, so correct me if I am wrong.
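If plain sequential files are involved, the "SORT or any other utility" idea above could be a simple copy step. A minimal sketch using DFSORT's COPY function (all dataset names here are placeholders):

```jcl
//COPYSTEP EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=YOUR.INPUT.FILE,DISP=SHR
//SORTOUT  DD DSN=YOUR.BACKUP.FILE,
//            DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(10,10),RLSE)
//SYSIN    DD *
  SORT FIELDS=COPY
/*
```

A step like this could be repeated once per file, or the whole job could be put on the scheduler as described above.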
UmeySan
Active Member
Joined: 22 Aug 2006 Posts: 771 Location: Germany
Hi!
Have a look at the IBM utility DFDSS (program ADRDSSU). It is the most widely used tool for backing up data.
Perhaps FAVER is also installed in your company; it is a backup tool as well.
Regards, UmeySan
murali922
New User
Joined: 25 Jul 2005 Posts: 92 Location: India
Code:
//BKUPNAME JOB ,'TX DISC BACKUPS',CLASS=P,MSGCLASS=K
//*
//*-----------------------------------------------------------------
//* BACKUP THE DATASETS
//*-----------------------------------------------------------------
//*
//ASBKUP   EXEC PGM=IDCAMS,COND=(0,LT)
//SYSPRINT DD SYSOUT=*
//ASFILE   DD DSN=YOUR.CURRENT.FILE,DISP=SHR,AMP=('BUFND=23')
//ASCOPY   DD DSN=YOUR.CURRENT.FILE(+1),
//            DISP=(,CATLG,DELETE),
//            UNIT=DASD,SPACE=(CYL,(10,10),RLSE),
//            DCB=(BUFNO=10,LRECL=27994,BLKSIZE=0,RECFM=VB)
//SYSIN    DD *
  REPRO INFILE(ASFILE) OUTFILE(ASCOPY)
/*
Repeat the same backup step for each of the 80 files. The JCL should then be put on the CA7 scheduler to run daily at a particular time.
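Since the backup step above writes to a (+1) generation, a GDG base must already be defined for the output dataset name. A minimal IDCAMS sketch (names are placeholders; in practice the backup GDG base should have a different name from the live file, and LIMIT(7) is just an illustrative choice that keeps a week of daily generations):

```jcl
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD SYSOUT=*
//SYSIN    DD *
  DEFINE GENERATIONDATAGROUP -
         (NAME(YOUR.CURRENT.FILE) -
          LIMIT(7) -
          SCRATCH)
/*
```

This define job is run once, before the first backup run.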
UmeySan
Active Member
Joined: 22 Aug 2006 Posts: 771 Location: Germany
Hi murali922!
Using the ADRDSSU utility you can back up data with a wildcard, depending on the level qualifiers of the dataset names.
Example: IER1711.USER.**
-> backs up all files beginning with that string.
You could also back up the complete set of files on a specified device.
Regards, UmeySan
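As a sketch, that wildcard idea might be coded in an ADRDSSU step like this (the backup output name and unit are assumptions):

```jcl
//WILDBKUP EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//OUT      DD DSN=IER1711.DAILY.BACKUP,DISP=(,CATLG,DELETE),
//            UNIT=TAPE
//SYSIN    DD *
  DUMP DATASET(INCLUDE(IER1711.USER.**)) -
       OUTDDNAME(OUT) -
       COMPRESS
/*
```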
jonna
New User
Joined: 18 Apr 2005 Posts: 9
Hi all,
Thanks for your responses. But I have different files with different dataset names; in this case, how do I take the daily backup? Do I need to run this JCL 80 times, or what? Also, please explain what IER1711 is.
Thanks in advance.
raak
Active User
Joined: 23 May 2006 Posts: 166 Location: chennai
Hi jonna,
I don't understand why you want to run this JCL 80 times.
Take the JCL murali provided above, simply repeat the backup step 80 times with the 80 dataset names, and submit it whenever you need to take the backup.
It is only ONE JCL, with the step repeated 80 times.
Schiavetti
New User
Joined: 24 Jul 2006 Posts: 2 Location: São Paulo, Brasil
I need to change or optimize my backup process.
I use ADRDSSU in my installation.
This is the statement used in the JCL:
DUMP ODD( TAPE ) -
DS(INC(BSPB.VF.CO*.*.*.DR&DATADI, -
BALB.VF.CO00SR00.D000333S.VF1P0318.DR&DATANT, -
BALB.JP.CO*.D1CVT005.JPVPD005.DX&DATADI) -
EXCLUDE(BSPB.VF.CO00SR00.D0RED176.VF1P0001.DR&DATADI)) -
TOL(ENQF) OPT(4) COMPRESS
Do you have any suggestions?
Thanks.
Marcos Eduardo Schiavetti.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
To back up your 80 files, you will be much better off running 1 jobstep, creating 1 backup that contains all 80 of the files you need backed up. If you create 80 separate backup files, you will have to deal with 80 separate restores as well as manage 80 datasets. If you ever need to restore selected files from the combined backup, you may do that also. If the 80 files need to be in sync, you will usually want to restore them all at the same time.
IER1711 is the high-level qualifier for the datasets that will be backed up when the sample qualifier (IER1711.USER.**) is specified. That example says that ALL datasets whose names begin with IER1711.USER. will be backed up.
In your case, if your datasets do not have common high-level names, you can specify as many unique names as you need. You might consider renaming the 80 datasets so that they do share the same high-level qualifier.
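A hedged sketch of that single-step approach: one ADRDSSU DUMP can list as many fully qualified names as needed in the INCLUDE list, so all 80 datasets end up in one backup (all names here are placeholders):

```jcl
//ONEBKUP  EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//OUT      DD DSN=MY.DAILY.BACKUP,DISP=(,CATLG,DELETE),
//            UNIT=TAPE
//SYSIN    DD *
  DUMP DATASET(INCLUDE( -
         FIRST.UNIQUE.NAME -
         SECOND.UNIQUE.NAME -
         OTHER.UNIQUE.NAME)) -
       OUTDDNAME(OUT) -
       COMPRESS OPTIMIZE(4)
/*
```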
nitinsharma_1212
New User
Joined: 21 Feb 2007 Posts: 4 Location: DELHI
Hi,
Can anybody provide me the syntax for using this ADRDSSU utility to copy all the datasets with the same high-level qualifier into a single dataset?
Please reply as soon as possible.
Thanks,
Nitin
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
You'll want to bookmark this page as it is the start of the manual
http://publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/ADR5R104/CCONTENTS
This is from that manual:
Code:
DUMP -
INDDNAME(yourinpt) OUTDDNAME(yourotpt) -
DATASET(INCLUDE(your.qlfr.**)) -
COMPRESS -
OPTIMIZE(4)
Make sure that the dsn you back up to is NOT included in the qualified dataset names you are backing up.
Here's another example:
Code:
//STEP1 EXEC PGM=ADRDSSU,REGION=2M
//SYSPRINT DD SYSOUT=*
//OUT DD DSN=MY.BACKUP,...
//SYSIN DD *
DUMP -
OUTDDNAME(OUT) -
OPTIMIZE(4) -
TOLERATE(ENQFAILURE) -
DATASET( -
INCLUDE( -
GENERIC.DATASET.NAME.** -
))
/*
You need to study the manual and see which options you need. In this example, TOLERATE(ENQFAILURE) would back up the dataset(s) even if they are in use - not a good thing. . . There are many more options than are shown in this post.
A good thing to do is to talk with your storage management people and look at how they do particular backups, which options they specify, and why.
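For completeness, restoring selected datasets from a dump created this way might look like the following sketch (the DD names and the REPLACE option are assumptions; check the manual for the options your shop needs):

```jcl
//RESTSTEP EXEC PGM=ADRDSSU,REGION=2M
//SYSPRINT DD SYSOUT=*
//IN       DD DSN=MY.BACKUP,DISP=SHR
//SYSIN    DD *
  RESTORE DATASET( -
            INCLUDE( -
              GENERIC.DATASET.NAME.** -
          )) -
          INDDNAME(IN) -
          REPLACE
/*
```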