pravanjanpatra
New User
Joined: 24 May 2010 Posts: 1 Location: India
This is a query I got from my friend, and I am still puzzling over it.
Suppose I am reading a file of, say, 1000 records, and the job abends when the read reaches the 500th record. The reason for the failure (say, the size of the file) is not the main concern. After correcting it, I want to start reading the file from the 501st record. What are the ways to do this?
Thanks,
Pravanjan
Pandora-Box
Global Moderator
Joined: 07 Sep 2006 Posts: 1592 Location: Andromeda Galaxy
You need to set a checkpoint and restart.
If you are just doing file manipulations, write the count of records successfully processed (a unit of work) to a separate file. If the job abends, then on restart read your input, skipping records until you have passed the count processed so far, and process from there to end of file.
If you are doing table manipulations, you need to apply the same logic in terms of the last committed point.
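The counter-file idea above can be sketched roughly as follows (in Python for brevity, not COBOL; the checkpoint file name, the `process_record` stub, and the commit frequency are all assumptions made for the example):

```python
# Sketch of checkpoint/restart via a separate counter file: after each
# unit of work, the count of records successfully processed is saved;
# on restart, records up to that count are read but skipped.
import os

CHECKPOINT = "job.checkpoint"  # hypothetical checkpoint dataset

def read_checkpoint():
    """Return the number of records already processed; 0 on a cold start."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT) as f:
            return int(f.read().strip())
    return 0

def write_checkpoint(count):
    """Record how many records have been successfully processed so far."""
    with open(CHECKPOINT, "w") as f:
        f.write(str(count))

def run(records, process_record, commit_every=100):
    done = read_checkpoint()
    for i, rec in enumerate(records, start=1):
        if i <= done:                 # processed before the abend: skip
            continue
        process_record(rec)
        if i % commit_every == 0:     # end of a unit of work
            write_checkpoint(i)
    write_checkpoint(len(records))    # normal end of job
```

In a real job the checkpoint file would be deleted on clean completion, so that the next scheduled run starts cold rather than skipping records.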
Nic Clouston
Global Moderator
Joined: 10 May 2007 Posts: 2455 Location: Hampshire, UK
Assuming you are reading a PS file, you have to read all the preceding records, although you need not actually process them, as long as you have all the report lines, downstream interface data, and totals stored away somewhere. With a VSAM KSDS, if you know the key of the 501st record you can position directly there and start sequential processing, BUT the same caveats apply: you must have saved the data from the first 500 records.
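A rough, non-mainframe illustration of the two approaches above (read-and-discard for a plain sequential file versus positioning by key for a keyed dataset); the helper names and in-memory data structures are stand-ins, not real access-method calls:

```python
# Sequential skip vs. keyed positioning, modeled on plain Python data.
import bisect

def restart_sequential(records, skip):
    """PS-style restart: read (and discard) the first `skip` records."""
    it = iter(records)
    for _ in range(skip):
        next(it)               # records are read but not processed
    return list(it)            # the remainder, processed normally

def restart_keyed(sorted_records, start_key):
    """KSDS-style restart: position at `start_key`, then read forward.

    `sorted_records` is a list of (key, data) pairs in ascending key order,
    standing in for a keyed dataset."""
    keys = [k for k, _ in sorted_records]
    pos = bisect.bisect_left(keys, start_key)
    return sorted_records[pos:]
```

The difference is the cost of getting to record 501: the first helper touches all 500 preceding records, the second jumps straight to the key.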
Anuj Dhawan
Superior Member
Joined: 22 Apr 2006 Posts: 6250 Location: Mumbai, India
As indicated, if the program already has check-point/restart logic in place, you might restart it from the last check-point, which might or might not be pointing to the 501st record; that depends on your check-point frequency.
If check-point/restart is not already deployed in the program, you're out of luck: there is no magic wand which will tell the system to start reading a sequential file from the 501st record. As Nic indicates, the only way out is to modify your program in such a way that it knows it has to skip so many records before any further activity coded in the program. Or take a back-up of the input file and create a temporary file containing only the records from 501 onwards. These will get you out of the current mess, but it's not what one would do every day in a PROD environment, and hence you deploy check-point/restart as a good programming practitioner.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
Suggest that if all of the files are sequential, simply restart from the beginning. There is no need to try some "restart in the middle" for something like this.
Indeed, many organizations have stopped implementing restart in any of their jobs that do not already have it.
don.leahy
Active Member
Joined: 06 Jul 2010 Posts: 765 Location: Whitby, ON, Canada
dick scherrer wrote:
Hello,
Suggest that if all of the files are sequential, simply restart from the beginning. There is no need to try some "restart in the middle" for something like this.
Indeed, many organizations have stopped implementing restart in any of their jobs that do not already have it.
A wise policy. I have never coded restart logic in a "sequential file only" program, nor have I ever felt the need to. Sequential I/O is so fast these days that it's not worth the trouble. As always, your mileage may vary.
If you have a "sequential file only" program that takes hours to run (and would therefore seem like a good candidate for restartability) then your time might be better spent tuning the program and/or JCL rather than adding restart logic to it.