ds Senthil
New User
Joined: 19 May 2011 Posts: 3 Location: India
Hi Friends,
Has anyone worked on extracting data from a COBOL compile listing? My compiled listings are created by ChangeMan.
An impact-analysis requirement means I have to find the actual values [program name(s)] used in dynamic CALLs.
At the moment I do this manually: I open the listing through ChangeMan [CMN>1.L], search for the variable name, and work out what is populated into it.
Sometimes the program name is set when the variable is defined in Working-Storage with a VALUE clause. In other cases it is MOVEd in the Procedure Division. Occasionally it is moved via another variable, and the chain can go up to three levels deep.
I would like to automate this and am looking for practical ways to do it. Browsing the LST library directly does not make sense because it is not in a readable format. Is there any formatter/converter available?
Kindly share your experience if you have done anything similar before.
nigelosberry
New User
Joined: 06 Jan 2009 Posts: 88 Location: Ggn, IN
Whether it is worth creating a tool to extract the program names depends on how many such programs you have. For example, if you have fewer than 50 programs or so, developing an automating tool will take considerably longer than doing the job manually.
The most important feature of the manual process is its reliability. If you go on to build a tool anyway, you will have to test it thoroughly to prove its worth.
Any existing documentation about the system you are working on may also help with the impact analysis.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10889 Location: italy
Quote:
An impact-analysis requirement means I have to find the actual values [program name(s)] used in dynamic CALLs.
Do You realize the complexity of automating the task?
The quicker approach would be to make it a multistep process scanning the SOURCES:
- 1st step: scan for the CALL statements (ISPF SRCHFOR)
- 2nd step: process the SRCHFOR output to normalize the called thing (determine whether it is a static or a dynamic call) and create the SRCHFOR input for the third step (the variable names)
- 3rd step: using the variable names, run SRCHFOR again to find out where those variable names are used
- 4th step: clean up the output of the previous step to get rid of false hits
It should not take more than a couple of days.
The SRCHFOR steps might be substituted by REXX scripts for smarter processing (a bit of analysis might be needed for weird coding styles), but the first advice is ... scan the sources.
Steps 1 and 3 can be a standard ISPF SRCHFOR; steps 2 and 4 must be programmed (I would advise REXX, but I might be biased by my REXX knowledge).
For static calls the ISPF part-listing application is the fastest FREE approach.
A helpful tool might also be:
publibz.boulder.ibm.com/cgi-bin/bookmgr_OS390/BOOKS/ASMTUG20/5.0?SHELF=ASMSH030&DT=20080715202826
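For illustration only, a minimal REXX sketch of the step-2 classification might look like the following. It is not from the post above: the DD name CALLHITS and the member-name-as-first-word record layout are assumptions, so adjust the PARSE to whatever your SRCHFOR output really looks like. Anything whose target does not start with a quote is treated as dynamic, and its variable name becomes the input for step 3; the false hits it lets through are what step 4 is for.
Code:
/* REXX - rough sketch of step 2: classify CALL hits                 */
/* Assumes the SRCHFOR hit lines were copied to a dataset allocated  */
/* to DD CALLHITS, with the member name as the first word of each    */
/* record (an assumed layout - adjust the PARSE to your output).     */
"EXECIO * DISKR CALLHITS (STEM hit. FINIS"
do i = 1 to hit.0
  parse upper var hit.i member src
  parse var src . 'CALL' target .         /* token right after CALL  */
  if target = '' then iterate             /* false hit - see step 4  */
  q = left(target, 1)
  if q = "'" | q = '"' then
    say member 'STATIC  call to' strip(target, 'B', q)
  else
    say member 'DYNAMIC call via variable' target  /* step 3 input   */
end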
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
enrico has a solution, but I'm not so sure that the compile listing is a bad way to do it, provided you have the verb cross-reference as well as the data-name cross-reference.
If you extract all the line numbers where CALL is referenced (from the verb cross-reference), you can then get all the data-names which are referenced on each CALL.
You can locate the definition of each data-name in the Data Division (from the XREF) and extract it from the listing, including the VALUE.
You can identify the first data-name used (you have to go to the listed source for that, but stick it in a table referenced by the line number) and then collect, from the data-name XREF, all the references which modify that data-name, or the definition if there are none.
If you're concerned that the data-name might be redefined, first check whether that changes the data-name XREF (I doubt it); otherwise you'll have to go to the DMAP.
This is only a very rough outline, but it should not be too bad as a task, because the compiler has done a lot of the work for you.
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
Knowing your code base is important.
Many of the dynamic CALLs in my code base are built by combining parts of 3 separate columns from different rows of 4 different tables, and the decision of which part comes from which column is controlled by a matrix in yet another DB2 table.
Now, is that dynamic or what?
I have to combine strips of SRCHFOR output with SQL, and then match against existing modules to determine possible CALL identities.
And I work in a place where they answered my question (when I interviewed): documentation? Ha, ha, we don't have documentation.
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
OK, that is dynamic. And a half. I can imagine the April Fools' Day plans to randomly swap the values (in TEST only). Do you have a "book" on the next data-entry error? You could take bets on whether it gives an 806, some other abend in a module, or "works".
This would be covered by including the DMAP (although obviously you couldn't resolve the value from the compile listing). In the DMAP, you identify any references to the eight bytes starting from the definition of the first data-name on the CALL.
Obviously, the program being inspected can give more than one value to the data-name, so don't forget that.
ds Senthil
New User
Joined: 19 May 2011 Posts: 3 Location: India
Thanks nigelosberry,
But I have more than 8000 COBOL programs to analyze, so the manual process is too time-consuming. I have built a tool which rejects the entries it cannot resolve, and I then had to retrace those entries manually to find the called program.
If I can automate this, at least to some level, the work on unresolved entries will be minimal.
I'm looking at handling the compile listing because the unexpanded source may not actually contain the program value for a dynamic CALL.
enrico-sorichetti
Superior Member
Joined: 14 Mar 2007 Posts: 10889 Location: italy
Quote:
I'm looking at handling the compile listing because the unexpanded source may not actually contain the program value for a dynamic CALL.
Useless remark.
If the <real> called program name is assigned from values that You can see in the listing, then You can also see it somewhere in the sources.
If it is obtained by <reading> an external source (parm, file, database, algorithm), no listing will tell You anything about it.
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
IMHO you are making a mistake by starting with compile listings.
Your analysis tool must start from your code base repository.
From those PDSs, one would use a SRCHFOR to extract all ' CALL '.
I used DFSORT to parse the output to create:
- module name (member name)
- module type
  - batch module
  - cics module
  - batch copybook
  - cics copybook
- call type
- call object
  - literal used
  - working-storage variable
The member name can be picked up from the hits by using the process option IDPFX; I found that I also needed to use ANYC. The full lists of process options and process statements are in the SuperC documentation.
Most of the time you can determine the module type by either the PDS name or the naming convention of the member name.
I work in an environment that uses 1980's technology, so I have to deal with garbage like batch or CICS or dual COBOL (COBC, COBB, COBD) and batchcopy and copybook. Fortunately I work in an ENDEVOR environment, so I just run an ENDEVOR element report and can cross-reference any member to its type.
I used DFSORT to parse and match (JOINKEYS) items to build these lists.
I used batch SPUFI to invoke the SQL necessary to create the varied forms of module names that would/could be invoked by various modules.
Again, I used DFSORT to parse and build the matches - that is, to resolve the possible dynamic CALLs a module could make.
I wrote only one COBOL program, to input all this garbage and output all the possible modules that a given module calls statically/dynamically.
Then I wrote a REXX (because I could use the recursive feature of REXX) to create a matrix of this module calls this module calls this module ........
Now I can run this ridiculous job of 19 steps any time to create the matrix.
The only thing I have to add to it is when someone dreams up another type of dynamic/dynamic/dynamic DB2 module name creation.
Otherwise, new modules and their CALLs (which one way or another fit the existing conventions) are automagically included in the matrix.
Unless you always have all the compile listings, you will never succeed in your task.
And as Enrico mentioned, a dynamic call is based on the value of the variable used in the CALL statement. Unless it is always populated with a VALUE clause, you are not achieving anything by using compile listings which you may or may not have.
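To illustrate just the final REXX piece mentioned above (not the actual exec), here is a toy sketch of the recursive call-chain expansion. The hard-coded caller/callee pairs and the loop guard are illustrative assumptions; in practice the pairs would come out of the earlier SRCHFOR/DFSORT/SQL steps.
Code:
/* REXX - toy sketch of the recursive call-chain expansion.          */
/* The direct caller/callee pairs are hard-coded for illustration;   */
/* in practice they would come from the earlier DFSORT/SQL steps.    */
pairs = 'PGMA PGMB  PGMB PGMC  PGMB PGMD  PGMC PGME'
calls. = ''                        /* calls.<module> = its callees   */
do while pairs <> ''
  parse var pairs caller callee pairs
  calls.caller = calls.caller callee
end
call chain 'PGMA', 'PGMA'          /* print every chain from PGMA    */
exit

chain: procedure expose calls.
  parse arg module, path
  if calls.module = '' then do     /* leaf module: no further calls  */
    say strip(path)
    return
  end
  do i = 1 to words(calls.module)
    next = word(calls.module, i)
    if wordpos(next, path) > 0 then do   /* guard against call loops */
      say strip(path) '->' next '(loop)'
      iterate
    end
    call chain next, path '->' next
  end
return

Recursion makes both the chain expansion and the loop detection nearly trivial, which is presumably why REXX is the comfortable choice for that step.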
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
Senthil,
Do you have the compile listings in a form that you could use as input to a program?
Do you compile with the verb cross-reference?
Bill Woodger
Moderator Emeritus
Joined: 09 Mar 2011 Posts: 7309 Location: Inside the Matrix
Two people I have a lot of time for are saying this way is not good. I'd like to know why, but I don't need to (it's not my requirement :-) ).
The compile listing gives:
"Normalised" line numbers. These can be used to store the program in a table (line number as subscript or source for an index);
Data Division Map (DMAP). Either embedded or separate. Once the "relevant" eight bytes of storage are identified, it can be used to deal with REDEFINES;
Data-name cross-reference. Line number of each verb using a field, plus the line number where it is defined. Indicates the line number(s) on which a field is amended;
Condensed listing. Line number of every verb;
Verb cross-reference. Lists all verbs used and the line numbers they are used on.
Also identifiable: copybook expansion, pre-processor expansion, comments, etc.
So, CALLs are easy to identify - there is a list giving the line numbers; go to the table and get the line of source, and you can even get the whole statement by referring to the condensed listing (which will have the line number of the next statement).
Whether the CALL uses a literal or a data-name is easy: it is the second "word" on the line. If it starts with a quote/apostrophe it is static, subject to the compiler option, which is in the compile listing. If not, it is dynamic.
If dynamic, locate the data-name in the data XREF. If it is modified, locate the statements which modify it. If it is modified by a literal, easy; if by another data-name, look for the next one.
If any of the data-names involved have a VALUE, list those as well. You can also check the DMAP to see whether any of the data-names involved are redefined, and pick up the references to those items.
Mmm. That's pretty much the backbone for extracting the data. The key is the verb cross-reference. It is more difficult without it, but still doable.
Squirt it out. Sort it. Produce "report(s)".
Using this basic method there is a heck of a lot you can get out of the compile listing, without having to re-invent wheels at any stage or try to match the compiler/preprocessor or whatever.
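A very rough REXX sketch of the resolution step described above, under heavy assumptions: the relevant statements have already been pulled out of the listing into a stem (hard-coded here as a toy), one statement per entry, only simple VALUE clauses and MOVE literal/data-name TO data-name forms are recognised, patterns are matched naively, and there is no guard against MOVE loops. A real exec would drive this from the verb and data-name cross-references rather than rescanning every line.
Code:
/* REXX - toy sketch of the value-resolution step.  Assumes the      */
/* relevant statements were already pulled from the listing into a   */
/* stem, one statement per entry (hard-coded here).  Patterns are    */
/* matched naively and there is no guard against MOVE loops.         */
src.0 = 4
src.1 = "01 WS-PGM      PIC X(8) VALUE 'DUMMY   '."
src.2 = "01 WS-PGM-NEW  PIC X(8)."
src.3 = "    MOVE 'PAYROLL1' TO WS-PGM-NEW"
src.4 = "    MOVE WS-PGM-NEW TO WS-PGM"
say 'WS-PGM can hold:' resolve('WS-PGM')
exit

resolve: procedure expose src.
  parse arg name
  values = ''
  do i = 1 to src.0
    stmt = translate(src.i)                      /* fold to upper case */
    if wordpos('MOVE', stmt) > 0 & wordpos(name, stmt) > 0 then do
      parse var stmt . 'MOVE' source 'TO' targets
      if wordpos(name, targets) = 0 then iterate /* not a MOVE to it   */
      source = strip(source)
      if left(source, 1) = "'" then values = values source
      else values = values resolve(source)       /* follow the chain   */
    end
    else if word(stmt, 2) = name & pos(' VALUE ', stmt) > 0 then do
      parse var stmt . ' VALUE ' literal '.'     /* pick up the VALUE  */
      values = values strip(literal)
    end
  end
return strip(values)

Run against the toy data it reports both the VALUE clause and the value arriving through the chained MOVE, which is exactly the "more than one value" point made earlier in the thread.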
Marso
REXX Moderator
Joined: 13 Mar 2006 Posts: 1353 Location: Israel
I have spent a whole lot of time lately scanning tons of COBOL-CICS programs, looking for affinities that prevent a CICS transaction from running dynamically in a CICSPLEX environment.
Because a transaction runs a program that calls other programs, I had to make sure there are (or are not) affinities in the whole "tree" of programs.
Of course, to build this "tree", I must check for EXEC CICS XCTL and EXEC CICS LINK as well as CALL statements...
That is why I decided not to use the compile listing: EXEC CICS statements are commented out there, and it is quite difficult to recognise them.
Instead, I wrote a first utility that scans the programs and creates a copy of each one with all comments, empty lines and extra spaces removed, all COPYs expanded and all statements reformatted (one statement per line).
This is very useful because I can then look for "EXEC CICS LINK", for example; also, the VALUE is always on the same line as the variable name.
This utility is almost like a compiler. The PDS containing the reformatted programs has variable-length records with an LRECL of 23000.
The second utility I wrote loads the reformatted program into memory and looks for all CALL, LINK and XCTL statements.
If the program name is a constant, it's done. If it is a variable, I look for its VALUE, its 88 levels and the MOVEs to it.
It seems like a lot of work, but using the reformatted program also allows me to look for VSAM, TS, TD, CWA and more, as these can be affinities.
My utilities are written in REXX.
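Not the utility described above, but a much simplified REXX sketch of the reformatting pass, just to give the flavour: drop the sequence columns and comment lines, collapse the spacing, and split on periods into roughly one statement per line. The DD names COBIN/COBOUT are assumptions, and COPY expansion, continuation lines, and periods inside literals or PIC clauses are deliberately not handled here.
Code:
/* REXX - much simplified sketch of a reformatting pass: drop the    */
/* sequence columns and comment lines, collapse spacing, and emit    */
/* roughly one statement per line.  DD names COBIN/COBOUT are        */
/* assumptions; COPY expansion, continuation lines and periods       */
/* inside literals or PIC clauses are deliberately not handled.      */
"EXECIO * DISKR COBIN (STEM in. FINIS"
text = ''
do i = 1 to in.0
  line = substr(in.i, 7, 66)            /* keep columns 7-72 only    */
  if left(line, 1) = '*' | left(line, 1) = '/' then iterate  /* comment */
  text = space(text substr(line, 2))    /* drop indicator, squeeze   */
end
out.0 = 0
do while text <> ''
  parse var text stmt '.' text          /* naive split at the period */
  stmt = space(stmt)
  if stmt = '' then iterate
  o = out.0 + 1
  out.o = stmt'.'
  out.0 = o
end
"EXECIO" out.0 "DISKW COBOUT (STEM out. FINIS"

The real utilities described in the post also expand COPYs, handle continuations and write VB records with an LRECL of 23000; none of that is attempted in this sketch.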
Atom.Padhy
New User
Joined: 06 May 2013 Posts: 1 Location: India
<link to site with advertising removed> check this out
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Not a very good start . . .
We do not allow advertising.
You also need to reply to something current - not something over two years dormant - and only when you have something appropriate to post.
If this were not your first post, it would simply have been deleted.
d