SVC Dumps

 
IBMMAINFRAMES.com Support Forums -> All Other Mainframe Topics
vasanthz

Global Moderator


Joined: 28 Aug 2007
Posts: 1522
Location: Chennai

PostPosted: Wed Aug 09, 2017 1:44 am    Post subject: SVC Dumps

In our shop we have had SVC dumps enabled from inception. Occasionally an SVC dump fills up real storage and slows down CICS transactions and other workloads.
These SVC dumps are sporadic, occurring for example when a DB2 RUNSTATS fails or when a third-party software product fails.
We have had about 30 SVC dumps in the past 3 months and have not looked at them or sent them to anyone. We roughly know the reason for the failures and have not needed to look at the dump data sets.

Do you have SVC dumps enabled in your shops, or are they needed only on an ad-hoc basis?

Regards,
Vasanth.S

steve-myers

Active Member


Joined: 30 Nov 2013
Posts: 574
Location: The Universe

PostPosted: Wed Aug 09, 2017 4:22 am    Post subject:

It's hard to give you good advice about this.

I used to work for an [unnamed] ISV. As you can imagine, SVC dumps represented a major problem on systems where products were alpha and sometimes beta tested. There the goal was to get rid of them fairly quickly, which sometimes hurt those of us who wanted to analyze them. We often had to make our own copies before automation killed the real dump, and we had to do it quickly.

Now it seems to me you are describing two problems.

The first problem is a performance hit while the dump is being prepared for writing. Here I don't have any advice I trust. Look at the storage limits for hiper spaces; I suspect that's your issue.

The second problem is what to do with the dumps. That's much harder. Someone has to screen the darn things quickly and delete any trash. There's not enough regularity there for automation, so it has to be done manually.
vasanthz

Global Moderator


Joined: 28 Aug 2007
Posts: 1522
Location: Chennai

PostPosted: Thu Aug 10, 2017 5:06 am    Post subject:

Hi Steve, thank you for your thoughts.
Quote:
Look at storage limits for hiper spaces

I had not heard of hiper spaces before. I thought an SVC dump just dumped all of virtual storage into a data set; where does a hiper space come into play?

Regards,
Vasanth.S
steve-myers

Active Member


Joined: 30 Nov 2013
Posts: 574
Location: The Universe

PostPosted: Thu Aug 10, 2017 6:21 am    Post subject:

vasanthz wrote:
I have not heard about hiper spaces before. I thought an SVC dump just dumped all the virtual memory into a dataset, where does hiper space come into play.
Well, yes, but SVC dump first captures an image of the address space into a hiper space and then writes the dump data set from that image. Before this technique, the address space was frozen while the dump data set was being written; the theory is that capturing to a hiper space first frees the non-failing tasks in the address space much more quickly.
vasanthz

Global Moderator


Joined: 28 Aug 2007
Posts: 1522
Location: Chennai

PostPosted: Sat Aug 19, 2017 3:04 am    Post subject:

This link https://www.ibm.com/support/knowledgecenter/en/SSLTBW_2.1.0/com.ibm.zos.v2r1.ieag100/slpsetp.htm

describes the MAXSPACE parameter:
Quote:
SDUMP,MAXSPACE=xxxxxxxxM
Specifies the maximum amount of virtual storage that SVC dump can use to capture volatile virtual storage data, summary dump data, and component-specific data before writing the dump to DASD. The default value is 500 megabytes. The value that can be specified may range from 1 to 99999999 (with, or without, an M suffix). The new value takes effect immediately. If the value specified is lower than the space used, SVC dump will not continue to capture data.


I looked at our SVC dump data sets and they are at most about 1.5 GB. So if we set the MAXSPACE parameter to 2000M, would it help avoid performance issues due to paging?

The current setting is
Code:
SET,SDUMP,MAXSPACE=6000M,BUFFERS=500M,Q=NO
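For reference, a sketch of how the SDUMP options could be displayed and changed dynamically from the console (syntax as I recall it from the MVS system commands; the 2000M value is only the figure discussed above, not a recommendation):
Code:

```
D D,O                           display current dump options, including MAXSPACE
CD SET,SDUMP,MAXSPACE=2000M     change MAXSPACE dynamically (CD = CHNGDUMP)
```

Note that CHNGDUMP changes do not survive an IPL; I believe the usual approach to making the setting permanent is to issue the CD command at IPL time, for example from a COMMNDxx parmlib member.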