harimca Warnings : 1 New User
Joined: 15 Jul 2006 Posts: 10
Hello all,
I have thousands of files with the qualifiers below:
RTSBT.*.*.IC
XXFC.FE*.BZ*
which I need to uncatalog every week.
I'm unable to get the entire list, as there are thousands of them.
Can someone please suggest a tool I can use in JCL to UNCATALOG datasets matching these masks?
Really appreciate your help on this!! Thanks again
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello and welcome to the forum,
Your topic has been moved from the DFSORT part of the forum, as this does not appear to be related to sort. . .
Suggest you use the forum SEARCH (above in the blue line) and look for examples of DFDSS / ADRDSSU jobs that back up and then delete the backed-up files. . .
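For what it's worth, a minimal sketch of the backup-and-delete approach suggested above might look like the following. The dataset name, DD names, and masks are only illustrative, and note that ADRDSSU DELETE actually deletes the source datasets after dumping them, rather than merely uncataloging - confirm that is acceptable before using anything like this:
Code: |
//BKUPDEL  EXEC PGM=ADRDSSU
//SYSPRINT DD  SYSOUT=*
//TAPE     DD  DSN=MY.WEEKLY.BACKUP,DISP=(NEW,CATLG),
//             UNIT=TAPE
//SYSIN    DD  *
  DUMP DATASET(INCLUDE(RTSBT.**.IC, -
                       XXFC.**)) -
       OUTDDNAME(TAPE) -
       DELETE PURGE
/* |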
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8797 Location: Welsh Wales
First question: do you mean uncatalog, or do you mean delete? Please explain exactly what the problem is, so the most appropriate help can be offered.
Second question: tape or DASD?
harimca Warnings : 1 New User
Joined: 15 Jul 2006 Posts: 10
Hi expat, thanks for the reply!
I'm looking to uncatalog, and the files are on TAPE.
Awaiting your reply.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
When thousands were mentioned, this surely sounded like dasd. . . Good catch, Expat.
Why are there thousands of tape datasets to be uncataloged each week? This should be set up to happen automatically. . . So far, I've not seen thousands of tape datasets managed manually per week.
If your organization is determined to keep these processes running this way, suggest you create the control statements to uncatalog each and every one and run it each week.
harimca Warnings : 1 New User
Joined: 15 Jul 2006 Posts: 10
dick scherrer wrote: |
Hello,
When thousands were mentioned, this surely sounded like dasd. . . Good catch, Expat.
Why are there thousands of tape datasets to be uncataloged each week? This should be set up to happen automatically. . . So far, I've not seen thousands of tape datasets managed manually per week.
If your organization is determined to keep these processes running this way, suggest you create the control statements to uncatalog each and every one and run it each week. |
Hi d..
These are the production captures which are used in another test level. We need to uncatalog them to prepare for the next load.
Can you please provide a solution/approach to handle this through JCL?
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote: |
Can you please provide a solution/approach to handle this through JCL? |
Yes, it was in my last reply:
d.sch. wrote: |
suggest you create the control statements to uncatalog each and every one and run it each week. |
Quote: |
These are the production captures which are used in another test level. We need to uncatalog them to prepare for the next load. |
Many places regularly reload some or all of their production data to a testing environment. None of these need to do wholesale uncatalogs to accomplish this. . . Suggest someone look into why this is necessary.
There are 2 fairly simple ways to accomplish this:
1. Make sure the process that catalogs a new file uncatalogs the dsn before creating the new file.
2. Use a GDG and refer to the (0) generation to load in test.
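If option 2 is viable, a minimal IDCAMS sketch of defining such a GDG might be the following; the base name and limit are only illustrative, not anything from the original poster's environment:
Code: |
//DEFGDG   EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  DEFINE GENERATIONDATAGROUP -
         (NAME(RTSBT.CAPTURE.IC) -
          LIMIT(7) -
          SCRATCH)
/* |
Production would then write DSN=RTSBT.CAPTURE.IC(+1) and the test load would read RTSBT.CAPTURE.IC(0); old generations roll off automatically once LIMIT is exceeded, with no manual uncataloging.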
harimca Warnings : 1 New User
Joined: 15 Jul 2006 Posts: 10
dick scherrer wrote: |
There are 2 fairly simple ways to accomplish this:
1. Make sure the process that catalogs a new file uncatalogs the dsn before creating the new file.
2. Use a GDG and refer to the (0) generation to load in test. |
Once again, you are asking me to make changes in 1000 places where the files are created. I mentioned my requirement is to UNCATALOG using qualifiers; please suggest.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Personally, I have no suggestion other than to make the investment one time to correct/improve the process(es). . .
Also, are these tapes copies of dasd data?
I am still unclear on why there are thousands of datasets to be uncataloged. . . As I mentioned earlier, this is not normal. . .
Possibly I am misreading something, but it sounds very much like a rather unmaintainable/unmanageable procedure is being required. If your situation is like many, this "backloading" of production data will continue for a long time - so I'd encourage implementing a better process.
Quote: |
I mentioned my requirement is to UNCATALOG using qualifiers; please suggest |
There is neither a business nor a technical reason for this; it is not a requirement. It is simply how you chose to get around the poorly implemented process. . .
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8696 Location: Dubuque, Iowa, USA
Your best bet is to generate a data set list (via IDCAMS LISTCAT or whatever tool you have at your site), write a program to parse out the data set names and generate the uncatalog statements you need, then submit a second job into the internal reader to process them. There is a limit on the DD statements per job, so you may have to run multiple jobs to do this.
Your entire "requirement" to uncatalog thousands of data sets a week is nothing less than insane. No site with much of a history of production work would allow any such thing to last long. But then, most sites doing production work have CA-11 or a similar restart product that automatically deletes data sets before running the job again. The fact that multiple people with many years of experience are all telling you that you are going down the wrong path will be taken by a wise person as a sign.
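As a hedged sketch of the first step described above - capturing the candidate dataset names for a generator program to work on - an IDCAMS LISTCAT by high-level qualifier might look like this; the output dataset name and attributes are only illustrative:
Code: |
//LISTC    EXEC PGM=IDCAMS
//SYSPRINT DD  DSN=MY.DSN.LIST,DISP=(NEW,CATLG),
//             UNIT=SYSDA,SPACE=(CYL,(5,5)),
//             DCB=(RECFM=VBA,LRECL=125,BLKSIZE=27998)
//SYSIN    DD  *
  LISTCAT LEVEL(RTSBT) NAME
  LISTCAT LEVEL(XXFC) NAME
/* |
A follow-on program (REXX, SORT, or whatever your site prefers) can then filter the listed names against the masks and generate the uncatalog statements for a second job submitted to the internal reader.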
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8797 Location: Welsh Wales
Click HERE to see a home-written utility posted on the forum.
You will also need to check with your storage management people regarding the TMS status after the uncatalog, to ensure that the volumes are (or are not, as the case may be) returned to the scratch pool.
harimca Warnings : 1 New User
Joined: 15 Jul 2006 Posts: 10
Hi all,
Thanks everyone for discussing the issue. I started listing out all the files and came up with JCL to UNCATALOG the tape files.
Please let me know if you have any SAMPLE JCLs. Thanks once again!!
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Quote: |
Please let me know if you have any SAMPLE JCLs. Thanks once again!! |
Code: |
//BR14U EXEC PGM=IEFBR14
//DD1 DD DSN=MY.FIRST.FILE,DISP=(OLD,UNCATLG),
// UNIT=(whatever,,DEFER)
//DD2 DD DSN=MY.SECOND.FILE,DISP=(OLD,UNCATLG),
// UNIT=AFF=DD1
Etc. . . |
Keep in mind that while there is a 255 DD limit in a step, there is no need to put that many in one step. . .
Pete Wilson
Active Member
Joined: 31 Dec 2009 Posts: 580 Location: London
In later versions of z/OS (1.11 & 1.12, I think) there is a new MASK parameter you can add to an IDCAMS DELETE to allow deletion by dataset mask.
ew.share.org/client_files/callpapers/attach/SHARE_in__Seattle/S2404SD101521.pdf
Ease of Use: Catalog and IDCAMS
IDCAMS Delete Masking
• The IDCAMS DELETE command is enhanced to include a new function called DELETE MASK. It allows users to specify the data set name selection criteria desired with a mask-entry-name and the keyword "MASK".
• A mask-entry-name (also called a filter key) can have two consecutive asterisks (**) or one or more percent signs (%).
• Enhancement: OA29880 - Delete Masking will accept multiple qualifiers
• Delete up to 100 data sets without specifying multiple entry-names
• Enhancement: OA30916/OA31658 - Removes the limitation of allowing deletion of up to only 100 data sets.
• Additional fixes to be provided with OA31658:
• Delete Masking not deleting entries from the master catalog unless the catalog name is specified.
• Delete Masking does not follow normal catalog search order.
• Target date for PTF availability 2Q10.
• Enhancement: TSO Support for Delete Masking
• To be provided with APAR OA31526; target date for PTF availability 2Q10.
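If that DELETE MASK support is available at your installation, a sketch for the original requirement might look like the following. The masks are adapted to the **/% rules above (the original FE*/BZ* per-qualifier wildcards may not be valid in a mask-entry-name, so the second mask is deliberately broader), and NOSCRATCH is intended to remove only the catalog entries without scratching the tape volumes - verify that behavior with your storage management team:
Code: |
//DELMASK  EXEC PGM=IDCAMS
//SYSPRINT DD  SYSOUT=*
//SYSIN    DD  *
  DELETE 'RTSBT.**.IC' MASK NOSCRATCH
  DELETE 'XXFC.**' MASK NOSCRATCH
/* |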