athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
The output file structure is:
OUTFILE DD DSN=DATADEV9.OST.FILELIST.AVK,
DISP=(MOD,DELETE),UNIT=DISK,
SPACE=(TRK,(100,50),RLSE),
DCB=(RECFM=FB,LRECL=128,BLKSIZE=0)
We write the header only once, in a separate paragraph, but I can see part of the header appearing after every 1000 records in the output file. TCE-CLASS is part of the header, and it keeps turning up in the data after every 1000 records. Could you please help me out here?
Many Thanks
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10886 Location: italy
You are not telling anything useful...
just that something does not work the way You expect
do not post screen shots, they just clutter the forum,
a plain TEXT cut and paste with the code tags is more than enough
( the screen shot has been deleted )
also because it tells nothing useful
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
Apologies for the screen shot. Thanks for the comments. Let me put it this way.
Code: |
aaa-NAME aaaR aaa-SNO aaa-FLAG aaa-OST aaaaa-FLAG aaa-aaaaD-T
-------- ---- ------- -------- ------- ---------- -----------
001001 0000000000 A R TJ19527 YAA-FLAG 4159305 YAAAA-FLAG 4159305AA-A
001002 0000000001 A - TK06183 Y------- 4280723 Y--------- 4280723---- |
In the above records, R, AA-FLAG and AAAA-FLAG are actually header details, which are written to the output file only once by a separate paragraph. But after every 1000 records in the output file I can see part of the header present along with the detail data. We are using the same record to write both the header and the data, and we are initializing it after every write.
Please let me know if any other info is required from my end.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10886 Location: italy
as I said, showing that the output does not match Your expectations does not give any hint on the why.
would posting a bit of code be so hard ?
Your data, even if useless, has been code tagged to show the look
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
Code: |
***********************
1200-HEADER-BCKFILE.
***********************
*
STRING 'aaa-NAME ' DELIMITED BY SIZE
'aaaR ' DELIMITED BY SIZE
'aaa-SNO ' DELIMITED BY SIZE
'aaa-FLAG ' DELIMITED BY SIZE
'aaa-OST ' DELIMITED BY SIZE
'aaaaa-FLAG ' DELIMITED BY SIZE
'aaa-aaaaa-TIME ' DELIMITED BY SIZE
'aa-FLAG ' DELIMITED BY SIZE
'aaa-aa-IND ' DELIMITED BY SIZE
'aaa-aa-DAYS-OUT ' DELIMITED BY SIZE
INTO BACKOUT-RECORD.
WRITE BACKOUT-RECORD.
PROCESS-PARA
IF DB-STATUS-OK
PERFORM UNTIL DB-END-OF-SET OR WS-LOOP-END = 'Y'
INITIALIZE OUTPUT-PCD-DETAILS BACKOUT-RECORD
PERFORM 4200-WRITE-PARA THRU 4200-EXIT
4200-WRITE-PARA.
*****************
WRITE OUTPUT-PCD-DETAILS.
WRITE BACKOUT-RECORD. |
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
well, posting some code obviously is not too hard,
possibly posting some useful code is
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
1. what is the record count of your input file?
2. unless you are deleting your output file in a previous step
or separate job,
this is sorta stupid
Code: |
DISP=(MOD,DELETE),UNIT=DISK, |
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
Sorry Dbz, I copied some other file's JCL earlier. Please find the file details below. I was wondering whether it is something to do with block or track!!
My record length is 128. The header is getting overlapped after every 1000 records, and 1000 records x 128 = 128000 bytes.
128000 bytes = 5 * 25600 (the current block size).
OUTFILE DD DSN=DATDEV9.OST.FILELIST.CITYDB,
DISP=(NEW,CATLG,CATLG),
UNIT=DISK,SPACE=(CYL,(20,20),RLSE),
DCB=(RECFM=FB,LRECL=128,BLKSIZE=25600)
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
i repeat my question: 1. what is the record count of your input file?
also a tip.
instead of giving us suggestions to a problem that you have been unable to solve,
give us the information that we ask for.
and no, i doubt that the problem stems from
Quote: |
I was wondering whether it is something to do with block or track!! |
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
The record count of the output file is 2280. We are doing an area sweep and writing it to an output file.
Please let me know if any other info is required.
Thanks,
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
i repeat my question: 1. what is the record count of your input file?
and i'll bite: what is an area sweep?
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10886 Location: italy
we are just wasting time here.
what You have looks like some kind of storage overlay
the snippet posted is not enough to do any problem determination
what does the program do every 1000 records ?
what happens when You use a certainly empty/NEW dataset as output ?
robert_wood
New User
Joined: 04 Jan 2011 Posts: 8 Location: Portsmouth, UK
@ dbz
"Area sweep" is an IDMS record access method.
You have to be mighty careful with this access method if accessing more than one record type, since only one record can be CURRENT of AREA.
Too much information...
Cheers Rob
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Rob,
thx
reason I asked is that the TS has provided nothing helpful,
and the obvious suspect is some kind of 1000-iteration loop.
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
Quote: |
128000 bytes == 5*25600(current block size). |
The default for QSAM buffering is 5 buffers. With LRECL=128 and BLKSIZE=25600 each block holds 200 records, so 5 buffers account for exactly 1000 records. It sounds like something is going on in your program such that your header is not cleared from the buffer, and hence recurs every time that buffer comes up for reuse.
Very definitely a programming problem -- you need to start analyzing the code to determine why this is happening.
Try changing your JCL to see if the header location changes:
Code: |
OUTFILE DD DSN=DATDEV9.OST.FILELIST.CITYDB,
DISP=(NEW,CATLG,CATLG),
UNIT=DISK,SPACE=(CYL,(20,20),RLSE),
DCB=(BUFNO=30,RECFM=FB,LRECL=128,BLKSIZE=25600) |
If it does, then for sure you've got a program problem.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
athulvijay wrote: |
[...]
Code: |
aaa-NAME aaaR aaa-SNO aaa-FLAG aaa-OST aaaaa-FLAG aaa-aaaaD-T
-------- ---- ------- -------- ------- ---------- -----------
001001 0000000000 A R TJ19527 YAA-FLAG 4159305 YAAAA-FLAG 4159305AA-A
001002 0000000001 A - TK06183 Y------- 4280723 Y--------- 4280723---- |
In the above record R, AA-FLAG, AAAA-FLAG is actually header details [...] |
I know you are going to tell me that it is due to you re-typing, but aaa-FLAG is not the same as AA-FLAG etc. The data that is allegedly from the header does not even line up with the fields on the detail records. When you post some data from your screen, paste it, don't re-type it. We don't want to spend time on your typos.
If parts of the data on some of your records happen to coincide with parts of the data you have written as a header there is very close to zero chance of it being other than an error in your program, or JCL, or something you have done.
Start by considering what you have been asked already. What happens in your program at "1000"? How about showing the code which is creating the detail record, not showing us the code creating the header (which you have not claimed is a problem)?
Why are you using STRING? Are you using STRING also for the detail records? Are all the fields fixed-length and undelimited? They look so from what you have shown.
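(For illustration, a minimal sketch of the MOVE-based alternative these questions point at. Every name below is invented apart from BACKOUT-RECORD, which the posted snippet uses as its output record; this is a sketch of the general technique, not the TS's actual layout. With fixed-length, undelimited columns the record can be a fixed 128-byte layout populated by plain MOVEs, each of which space-pads its own field, so STRING is not needed at all.)
Code: |
      * Hypothetical layout -- one named field per fixed-width column.
       01  WS-DETAIL-RECORD.
           05  WS-DET-NAME      PIC X(08).
           05  FILLER           PIC X(01) VALUE SPACE.
           05  WS-DET-SNO       PIC X(10).
           05  FILLER           PIC X(01) VALUE SPACE.
           05  WS-DET-FLAG      PIC X(01).
           05  FILLER           PIC X(01) VALUE SPACE.
           05  WS-DET-OST       PIC X(07).
           05  FILLER           PIC X(99) VALUE SPACES.
      * Each MOVE space-pads its own receiving field, so nothing from
      * the header (or a previous record) can survive in the line.
       2200-BUILD-DETAIL.
           MOVE DB-REC-NAME TO WS-DET-NAME
           MOVE DB-REC-SNO  TO WS-DET-SNO
           MOVE DB-REC-FLAG TO WS-DET-FLAG
           MOVE DB-REC-OST  TO WS-DET-OST
           WRITE BACKOUT-RECORD FROM WS-DETAIL-RECORD. |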
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
This info is very helpful. Let me check once.
athulvijay
New User
Joined: 01 Jul 2010 Posts: 17 Location: PUNE
This info was a real help. It is a problem with the default buffer size.
128000 bytes = 25600 * 5
5 is the default number of QSAM buffers. To me it means that the actual write only takes place once 5 buffers (taking one block as one buffer) are filled, so the filler gets displayed only after every 1000 records.
---
I tried running a job with an output dataset having LRECL = 128 and block size = 27904.
Now, 27904 / 128 = 218 and 218 * 5 = 1090.
In the output dataset the header records were repeating after 1090 records.
Similarly, in another try, LRECL was 112 and block size was 27888.
27888 / 112 = 249 and 249 * 5 = 1245.
The header was repeating after every 1245 records in the dataset.
So this shows there is no glitch in our code.
Thanks a lot.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10886 Location: italy
Quote: |
So this shows there is no glitch in our code. |
horse manure....
but, if You are happy we are happy, ( or rather we do not care )
if Your code was correct then no spurious data would appear, whatever the I/O parameters ( buffer setup )
wiser to review the behavior of the WRITE statement the way You are using it ( locate mode )
it has some interesting gotchas,
certainly somebody proficient in COBOL will tell us ( laughing ) what happens when mixing move and locate mode in the same program
( it's for You, Bill )
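(For illustration, a rough sketch of the defensive pattern being hinted at here. Names other than BACKOUT-RECORD are invented, and this is a sketch of the general technique, not of the TS's program: never use the FD record area as a work field; build each line in WORKING-STORAGE, clear it first, and hand it to WRITE ... FROM, so nothing left over in the buffers can leak into the data.)
Code: |
      * Risky pattern (sketch): building each line directly in the FD
      * record area and trusting whatever was already in it.
           STRING WS-FIELD-A WS-FIELD-B DELIMITED BY SIZE
               INTO BACKOUT-RECORD
           WRITE BACKOUT-RECORD
      * Safer pattern (sketch): the FD record is never a work area.
           MOVE SPACES TO WS-OUT-LINE
           STRING WS-FIELD-A WS-FIELD-B DELIMITED BY SIZE
               INTO WS-OUT-LINE
           WRITE BACKOUT-RECORD FROM WS-OUT-LINE |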
Robert Sample
Global Moderator
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
athulvijay, you did EXACTLY what I recommended in my earlier post, and the results were exactly what I thought would happen, yet your conclusion is diametrically opposed to the conclusion I provided.
Contrary to what you think, this particular problem CANNOT be caused anywhere except in your program. And your reaction tells me you are another one of the Immaculate Programmers -- you never wrote a bad line of code, so every problem MUST be a system problem (since your code could never be wrong). Professional programmers see a problem and immediately ask what they could have done in their code to cause it; Immaculate Programmers see a problem and blame the system for days (or in some cases, months) until it is proven that their code is flawed.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10886 Location: italy
and ...
Quote: |
Mainframe Skills: BEGINNER,8 MONTHS OF MAINFRAME EXPERIENCE,KNOWLEDGE IN COBOL,JCL AND IDMS |
but an expert in self-esteem and an I-am-better-than-thou attitude
don.leahy
Active Member
Joined: 06 Jul 2010 Posts: 765 Location: Whitby, ON, Canada
athulvijay wrote: |
This info was a real help. It is a problem with the default buffer size.
|
No, it is NOT. You are not listening to the advice you have been given.
Here's another bit of advice that I am sure you will ignore: STRING is not the same as MOVE. An alphanumeric MOVE pads the receiving field with spaces; STRING does not.
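(To make that point concrete, a small self-contained demo; the program and field names are made up for illustration. The receiving field is pre-filled with asterisks so you can see exactly which bytes each statement touches.)
Code: |
       IDENTIFICATION DIVISION.
       PROGRAM-ID. PADDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-TARGET            PIC X(20).
       PROCEDURE DIVISION.
       MAIN-PARA.
      * MOVE: the 17 bytes after 'ABC' come out as spaces, because an
      * alphanumeric MOVE space-pads the receiving field.
           MOVE ALL '*' TO WS-TARGET
           MOVE 'ABC'   TO WS-TARGET
           DISPLAY '>' WS-TARGET '<'
      * STRING: only the first 3 bytes change; the other 17 asterisks
      * are left exactly as they were.
           MOVE ALL '*' TO WS-TARGET
           STRING 'ABC' DELIMITED BY SIZE INTO WS-TARGET
           DISPLAY '>' WS-TARGET '<'
           GOBACK. |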
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
If you refuse to find out what the problem is yourself, and refuse to let anyone see your code, all I can suggest is this: settle on a particular blocksize. Calculate how many records your glitch-free code can write before somehow causing an error. Make a new version of your program, which takes a parameter. In the parameter, tell your program how many records to read before it starts writing. Write the number-of-records-calculated minus one. Close up and stop. Execute the program, with different parameters each time, until your input is processed. Concatenate the output files as input to the next task.
If you don't fancy that, show us the code that you think is relevant, and any other code that we ask for, and answer any questions we ask. Show us your compile options.
Or, sort it out yourself. But you can't, because you don't accept the possibility that the error can be yours. So, go back to the above, or discover a business reason for a smaller input file.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
From an earlier comment, no, I can't really think of a normal way to do anything too clever with Cobol writing a sequential fixed-length file.
The record shown is not a complete header. It means that the "header" data was either put there, sourced from the normal fields, or not all the data in the record-area was overwritten.
Don's point about the STRING could be relevant, but the TS has already failed to provide the code for the creation of the data records on the file as requested. If using STRING to create the data records, I also suspect the problem will be there. Reference-modification would be the other lazy way to possibly screw it up.
It is probably no co-incidence that the single data record shown has hyphens in the positions that are alleged to be part of the header on the record prior to it.
It is possible that TS feels he is initialising the record, but is not. If using STRING or reference modification only to populate, the definition could be this:
Code: |
01 backout-record.
05 filler pic x(128). |
In which case the INITIALIZE will do diddly-squat.
Perhaps the "header" data is in fields defined as FILLER.
To my mind INITIALIZE is a horrible, lazy thing, invented for people who couldn't code an initialisation of a table correctly. What would have been wrong with
Code: |
MOVE SPACE TO backout-record |
Extra typing? Doesn't look as cool? What?
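(A tiny self-contained demo of both points; the program and names are made up. A record pre-filled with asterisks shows that INITIALIZE only clears the named field and skips FILLER, while MOVE SPACES blanks every byte. With the all-FILLER layout shown above, INITIALIZE changes nothing at all.)
Code: |
       IDENTIFICATION DIVISION.
       PROGRAM-ID. INITDEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  WS-REC.
           05  WS-NAME          PIC X(05).
           05  FILLER           PIC X(05).
       PROCEDURE DIVISION.
       MAIN-PARA.
      * INITIALIZE clears the named field only; the FILLER bytes keep
      * their asterisks, so the record is NOT blank afterwards.
           MOVE ALL '*' TO WS-REC
           INITIALIZE WS-REC
           DISPLAY '>' WS-REC '<'
      * MOVE SPACES blanks every byte of the record, FILLER included.
           MOVE ALL '*' TO WS-REC
           MOVE SPACES TO WS-REC
           DISPLAY '>' WS-REC '<'
           GOBACK. |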
Anyway, TS only came here for confirmation that his code was fine. Everyone disagreed, but TS still managed to use something to prove his code "glitch free". I suspect TS's code is a potential "bug heaven". This needn't be a terminal problem for an 8-month starter, as long as they can realise they've made errors, and learn from them. However, "glitch free" is someone heading for failure.
TS is happy, and it is a bit of a wasted topic for anyone else.