IBM Mainframe Forum Index
 
Deletion of data in 2500 PS in single go


IBM Mainframe Forums -> All Other Mainframe Topics
pankaj_kulkarni111

New User


Joined: 17 Aug 2006
Posts: 13
Location: Pune

PostPosted: Wed Dec 24, 2008 1:39 am

Hi there

Is there any method to delete the data in 2500 PS files in a single go? I have a REXX exec that does this, but is there any other method?

Cheers
Pankaj
Robert Sample

Global Moderator


Joined: 06 Jun 2008
Posts: 8700
Location: Dubuque, Iowa, USA

PostPosted: Wed Dec 24, 2008 2:27 am

What are you looking for -- a program to read the list & generate the deletes, one command to kill all 2500 files, or something else? If the first, which tools do you have at your site that you could use?
pankaj_kulkarni111

PostPosted: Wed Dec 24, 2008 2:45 am

Robert,

I'm looking for a program to read the list & generate the deletes in a single go.

Thanks
Pankaj
dick scherrer

Moderator Emeritus


Joined: 23 Nov 2006
Posts: 19243
Location: Inside the Matrix

PostPosted: Wed Dec 24, 2008 3:02 am

Hello,

Suggest you look into adrdssu/dfdss rather than rexx. . .
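
If the goal really is to get rid of the datasets, DFSMSdss can match them with a generic filter and scratch them all in one step. A sketch only - the high-level qualifier is a placeholder, and it uses the common trick of dumping to a dummy output DD so that DELETE scratches everything matched without keeping a backup:

Code:
//DELDSS  EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//OUT      DD DUMMY
//SYSIN    DD *
  DUMP DATASET(INCLUDE(YOUR.HLQ.**)) -
       OUTDDNAME(OUT) DELETE PURGE
/*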
Anuj Dhawan

Superior Member


Joined: 22 Apr 2006
Posts: 6248
Location: Mumbai, India

PostPosted: Wed Dec 24, 2008 9:49 am

Hi,

Technology won't limit you for such a task; with a little workaround it should be doable. But what is this requirement all about - how will those 2500 empty files be used, and why do you need them in the first place?
dick scherrer

PostPosted: Wed Dec 24, 2008 10:05 am

Hi Anuj,

Quote:
those 2500 empty files
I suspect they are not empty, but are just no longer needed.
Anuj Dhawan

PostPosted: Wed Dec 24, 2008 10:12 am

Hi Dick,

Maybe I misunderstood this then...
Quote:
delete the data in 2500 PS files
once the data has been deleted, what's left in those files?
dick scherrer

PostPosted: Wed Dec 24, 2008 10:17 am

Hi Anuj,

I suspect the goal is to remove the datasets rather than just the data.

Maybe Pankaj will clarify. . .
expat

Global Moderator


Joined: 14 Mar 2007
Posts: 8796
Location: Welsh Wales

PostPosted: Wed Dec 24, 2008 1:12 pm

pankaj_kulkarni111 wrote:
Is there any method to delete the data in 2500 PS files in a single go. I've a REXX which does this but is there any other method?
Pankaj

Do you want to NULL the datasets or delete the datasets ?
pankaj_kulkarni111

PostPosted: Fri Dec 26, 2008 2:32 pm

Well thanks to you all first for your help!

The requirement is clear to me - there are 2500 datasets (PS, not PDS), and one of the project teams needs all of them to be empty because they are going to store their data in them. Since it's the client, I can't go back and ask them to do it on their own, either by emptying the datasets manually or via a program that deletes the old dataset and creates a new one. We are being paid for every piece of technical work we do for them.

I don't want to delete those flat files, just the data in them - I hope I'm clear here. I'm also trying to get this done; if I find a method I'll post it here.

Cheers,
Pankaj
enrico-sorichetti

Superior Member


Joined: 14 Mar 2007
Posts: 10886
Location: italy

PostPosted: Fri Dec 26, 2008 9:16 pm

Quote:
We are being paid for every technical work we are doing for them .

we do not get paid to help You do Your job

Code:
//CLEAR  PROC
//IEB    EXEC PGM=IEBGENER
//SYSPRINT DD SYSOUT=*
//SYSUT1   DD DUMMY,DCB=&DS
//SYSUT2   DD DISP=OLD,DSN=&DS
//       PEND
//*
//       EXEC CLEAR,DS=YOUR.DSNAME.ONE
//       EXEC CLEAR,DS=YOUR.DSNAME.TWO
//       EXEC CLEAR,DS=....
//       EXEC CLEAR,DS=YOUR.DSNAME.TWO.THOUSAND.FIVE.HUNDRED


Edited to correct a glitch: Robert made me notice the limit on the number of steps in a job.

Just split the job into 10 jobs with 250 steps each.

I wonder why the client cannot do it himself, unless it is a deplorable project-manager attitude: if the client asks, make him pay, do it, and hide any ethics. It's from such things that good IT consultants get a bad name.
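
Generating the split jobs could itself be scripted - a REXX sketch (the DD names DSNLIST and JCLOUT and the job card are assumptions, not something from this thread) that writes a fresh job card every 250 datasets:

Code:
/* REXX - generate CLEAR jobs from a dataset list (sketch)        */
/* DD DSNLIST: one dataset name per record; DD JCLOUT: output JCL */
"EXECIO * DISKR DSNLIST ( STEM DSN. FINIS"
DO I = 1 TO DSN.0
  IF (I // 250) = 1 THEN DO            /* new job every 250 steps */
    QUEUE "//CLEAR"(I % 250 + 1)" JOB (ACCT),'EMPTY PS',CLASS=A"
    /* repeat the instream CLEAR proc here in each generated job  */
  END
  QUEUE "//       EXEC CLEAR,DS="STRIP(DSN.I)
END
"EXECIO" QUEUED() "DISKW JCLOUT ( FINIS"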
Robert Sample

PostPosted: Fri Dec 26, 2008 9:26 pm

enrico: don't forget there's a limit on the number of steps in a single job (255). This will have to be done with multiple jobs - even if done with a program that opens each file for output, the limit on DD statements per job would be reached.
enrico-sorichetti

PostPosted: Fri Dec 26, 2008 9:28 pm

Thank You Robert for reminding me
post edited to make a point of it
expat

PostPosted: Sat Dec 27, 2008 3:02 am

I would use (and have used) a REXX that dynamically allocates the output, then drops the allocation and keeps looping.
dick scherrer

PostPosted: Sun Dec 28, 2008 6:20 am

Hello,

I suppose there is something with this that completely escapes me, but i must admit that i'm confused on multiple levels. . .

If the files exist and are to be over-written as OLD files, why do anything to them at all? (oh, yeah, the client wants to pay for this) If there is some process that retains all of the dsns but not the space and the files were not originally allocated with sufficient secondaries to handle whatever the "new" data volume, there may be lots of x37's when data is written into them.

I don't recall a system that constantly "wrote over" existing ps files rather than deleting the old (if they exist) and allocating new with proper space allocations. What is to happen other than the first re-use of these 2500 files?

If some client wants to pay for a "thing", i suppose we should not discourage them (as i've been contracting for longer than most have even been in the field). A sweet deal would be a fixed-price bid for "emptying" these 2500 at $5 or $10 (USD) per file. Pretty good $ return for an hour or 2 of work. . .

And it may be that i am just hopelessly confused. . .

d
pankaj_kulkarni111

PostPosted: Sun Dec 28, 2008 8:59 pm

Thanks for your help, everybody - my problem is resolved by a REXX exec.

I'll post it on Thursday, New Year's Day.

HAPPY NEW YEAR to you all


Cheers
Pankaj
expat

PostPosted: Mon Dec 29, 2008 3:08 pm

Dick, there were one or two times when I have had to do this - actually reset a file rather than delete and define - because of some crazy special-run scenario where the files were catalogued in two different catalogs on two different LPARs.

Some idiot set it up this way to do some testing, and then as the number of files grew, some poor storage guy got stuck with nullifying the files for each test cycle.

REXX worked just fine for this.
Code:

/* REXX *** RESET DSN RATHER THAN DELETE DEFINE *******************/
/* DD DELETES holds the list of dataset names, one per record     */
"FREE  FI(SYSIN)"
"ALLOC FI(SYSIN) DUMMY"
"EXECIO * DISKR DELETES ( STEM DELS. FINIS"
DO A = 1 TO DELS.0
  DSN = STRIP(SUBSTR(DELS.A,1,44))
  "FREE FI(SYSUT1,SYSUT2)"
  "ALLOC FI(SYSUT2) DA('"DSN"') SHR"
  X = LISTDSI(SYSUT2 FILE)
  /* space out the RECFM, e.g. FB -> F B, for the ALLOC keyword   */
  SYSRECF2 = ''
  DO C = 1 TO LENGTH(SYSRECFM)
    SYSRECF2 = SYSRECF2 || SUBSTR(SYSRECFM,C,1) || ' '
  END
  SYSRECF2 = STRIP(SYSRECF2,'T')
  "ALLOC FI(SYSUT1) DUMMY RECFM("SYSRECF2") LRECL("SYSLRECL")"
  "IEBGENER"  /* assumes IEBGENER is invocable as a TSO command   */
END
"FREE  FI(SYSIN,SYSUT1,SYSUT2)"
dick scherrer

PostPosted: Mon Dec 29, 2008 8:21 pm

Hi Expat,

Might this not lead to a flock of x37s?

Might it not be both faster and less error prone downstream to backup a complete set of the "empty" files using adrdssu/dfdss or fdr and then simply restore them as needed?

I've needed to repeatedly "start fresh" in several situations, but forcing the file to empty hasn't (yet) been part of the plan. . .
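
A sketch of that idea (qualifiers and the backup dataset name are placeholders) - dump the pristine set once:

Code:
//BACKUP  EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//DDOUT    DD DSN=YOUR.EMPTY.BACKUP,DISP=(NEW,CATLG),UNIT=TAPE
//SYSIN    DD *
  DUMP DATASET(INCLUDE(YOUR.HLQ.**)) OUTDDNAME(DDOUT)
/*

then restore over the used copies whenever a fresh, empty set is needed:

Code:
//RESTORE EXEC PGM=ADRDSSU
//SYSPRINT DD SYSOUT=*
//DDIN     DD DSN=YOUR.EMPTY.BACKUP,DISP=OLD
//SYSIN    DD *
  RESTORE DATASET(INCLUDE(YOUR.HLQ.**)) -
          INDDNAME(DDIN) REPLACE
/*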
expat

PostPosted: Mon Dec 29, 2008 8:33 pm

Unfortunately not. The files were catalogued in two different catalogs over two different LPARs, so if they were deleted/redefined on one LPAR, it would take ages to manually update the catalog on the other LPAR to keep everything in sync.

It was a diabolical setup, and had they bothered to ask a techie in the first place we could have easily shared a catalog and the DASD.

Unfortunately, the outlook at some sites is rather Luddite, and with the same people being in the same jobs at the same company for 20+ years, they know no other approach. Still, it paid the bills for a good few months.