In our project, I run DB2 queries and prepare transaction reports based on given details such as client ID, transaction date, etc.
I follow these steps to create the reports:
1. Run the DB2 queries.
2. Execute the COBOL program to separate each column with a delimiter.
3. Download the data from the mainframe to a text file.
4. Export the data to an Excel sheet and format it.
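The delimiting step the COBOL program performs in step 2 can be sketched in Python. The field names and column offsets below are hypothetical stand-ins for whatever the real copybook defines:

```python
import csv

# Hypothetical fixed-width layout: (name, start, end) offsets. The actual
# COBOL copybook defines the real column positions; these are illustrative.
FIELDS = [("client_id", 0, 8), ("tran_date", 8, 18), ("amount", 18, 30)]

def record_to_row(record):
    """Slice one fixed-width record into stripped column values."""
    return [record[start:end].strip() for _, start, end in FIELDS]

def fixed_width_to_csv(in_path, out_path, delimiter=","):
    """Rewrite a fixed-width text file as a delimited file (the COBOL step)."""
    with open(in_path) as src, open(out_path, "w", newline="") as dst:
        writer = csv.writer(dst, delimiter=delimiter)
        writer.writerow([name for name, _, _ in FIELDS])  # header row
        for line in src:
            writer.writerow(record_to_row(line.rstrip("\n")))
```

On the mainframe itself this would stay in COBOL (or a DB2 EXPORT with a delimiter), but the transformation is the same: slice each record at fixed offsets and join the pieces with a delimiter.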
However, much time is wasted downloading the data to a text file and then exporting it to an Excel sheet, so I am looking for a way to export the data from the mainframe to Excel directly.
Can anyone please let me know which tools can export the data to Excel sheets directly? I found a tool called 'Spectrum Mainframe Exporter' which would satisfy my requirement, but its licence is too expensive; we cannot bear that much cost.
Is there any other tool that will satisfy my requirement at a lower price?
Based upon what I see, the Spectrum Mainframe Exporter is quite cheap -- $4995 per year. Most mainframe software packages run anywhere from $10K to $250K depending upon your machine size and what the software is doing.
You are not very likely to find many products that do what you want for less than $5K. You could always write your own, of course, but that's probably a one-to-two-year effort for a four-to-six-person programming team, since the file specifications for the Excel format are well over 1000 pages at this point.
enrico has pointed out one method to simplify the mainframe side.
In your CSV, include a header with the business date and a trailer with record-count and hash-total values.
On the PC, write a macro that lets the user "import" their new file. The macro looks in the place the file is supposed to be, reads the file, prompts the user to confirm the correct business date, loads up the CSV, verifies the values on encountering the trailer, and reports any problems along the way.
So: extract from DB2 direct to a CSV (including the appropriate header and trailer), transfer it via FTP or another method, and give the user an option to load the new file.
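A minimal sketch of that PC-side import check, written in Python rather than a spreadsheet macro. The record layout assumed here — a header record `H,<business-date>`, data records, then a trailer record `T,<count>,<hash-total of the amount column>` — is an illustrative assumption, not a standard; the real layout is whatever your extract job writes:

```python
import csv
from decimal import Decimal

def load_and_verify(path, expected_date):
    """Load the CSV, confirm the business date, and check the trailer totals."""
    rows = []
    count = 0
    hash_total = Decimal("0")
    with open(path, newline="") as f:
        for rec in csv.reader(f):
            if rec[0] == "H":
                if rec[1] != expected_date:
                    raise ValueError(f"business date {rec[1]} != expected {expected_date}")
            elif rec[0] == "T":
                if count != int(rec[1]) or hash_total != Decimal(rec[2]):
                    raise ValueError("trailer count/hash mismatch: file may be incomplete")
            else:
                rows.append(rec)
                count += 1
                hash_total += Decimal(rec[-1])  # amount assumed to be the last column
    return rows
```

If the file is truncated in transit, the running count and hash total won't match the trailer and the load fails loudly instead of silently feeding a short file into the spreadsheet.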
Everything smooth. OK, it wasn't DB2, but this worked 20 years ago, so should still work now.
You could do it all in the same batch job at no additional cost, using the ISPF workstation agent (WSA). Run it on your workstation as a TCP/IP server interacting with ISPF. The workstation agent is an official component of ISPF, can be downloaded from an ISPF installation dataset, and requires no additional or separate licence. Although the product has not been enhanced in recent years, IBM still supports the ISPF WSA.
Let the batch job do the data collection and formatting. Then add an IKJEFT01 step that invokes a small REXX program via ISPSTART, using the connection details of your workstation. The REXX program can use the ISPF services WSCON and WSDISCON to contact your workstation. The ISPF service FILEXFER is designed to transfer your dataset to any directory on your PC network, with various options. After copying the data you can use SELECT WSCMD(%comspec% /C START /MAX c:\filename.xls) MODELESS to invoke your spreadsheet processor and format your output asynchronously as desired.
You'll find good documentation in the ISPF manuals and a comprehensive summary on Gilbert Saint-Flour's website.
Check it out and give it a try.
pullaiah.cts, I guess what I don't understand is why you said that there's a lot of wasted time for this process. Since you and/or your site designed and implemented it, what attempts have you made to either eliminate a step or two, or at least reduce the amount of "time wasted"? Don't you have software designed to accumulate and distribute reports company-wide? Simple spreadsheets are relatively easy to create and distribute in an automated fashion.
Thank you so much for your valuable advice. I just came to know that there is a tool in our company which generates reports from a mainframe flat file to an Excel spreadsheet. It is developed in VB and it fits my requirement.
OK, that's good. It was a year later we got to VB and Access.
The problem was, once the users knew we could extract data for them, they wanted ever more. We still regarded it as "our" data: users aren't generally cognisant of the implications of holding data securely and correctly. Plus, if we ever had to re-run something, we also had to re-supply the data to the user; their spreadsheets became part of our "impact" for failures.
So, we treated all the data we handed off exactly as we would for our systems handing off data to another internal system or an external system, except that we had to be the ones "catching" the data on the PC and ensuring its integrity.
For our data, we ended up with a VB system collecting and logging the files (business date, run date, counts, hash totals, source file name, target file name) and providing data for the spreadsheets. OPS would fire up the collector at around 5am and it would periodically check for files until all were collected. Individual files could be overridden as not available, etc.
Except for system problems, users had their spreadsheets ready to go as they arrived, including business-date and totals reconciliations, which we included in all their spreadsheets. Files were backed up on the PC network.
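The collector idea can be sketched in Python. The feed names, the polling interval, and the use of a content digest standing in for a true hash total are all illustrative assumptions; the original system was VB and logged whatever control fields the site required:

```python
import csv
import hashlib
import os
import time
from datetime import date

# Hypothetical list of feeds the collector waits for each morning.
EXPECTED = ["transactions.csv", "balances.csv"]

def file_stats(path):
    """Record count and a content digest, for the collection log.
    (A real hash total would sum a numeric column instead.)"""
    with open(path, "rb") as f:
        data = f.read()
    return data.count(b"\n"), hashlib.sha256(data).hexdigest()

def collect(inbox, log_path, poll_seconds=60, overrides=()):
    """Poll until every expected file arrives (or is overridden), logging each."""
    pending = [name for name in EXPECTED if name not in overrides]
    with open(log_path, "a", newline="") as log:
        writer = csv.writer(log)
        while pending:
            for name in list(pending):
                path = os.path.join(inbox, name)
                if os.path.exists(path):
                    count, digest = file_stats(path)
                    writer.writerow([date.today().isoformat(), name, count, digest])
                    pending.remove(name)
            if pending:
                time.sleep(poll_seconds)
```

The `overrides` parameter mirrors the "individual files could be overridden as not available" behaviour: operations can mark a feed as missing for the day and the collector stops waiting for it.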
This is a mainframe forum, so why am I saying this?
Because it is, as far as I'm concerned, still mainframe data in the same way as any report or screen. All those same people (accountants, auditors, etc) are going to come and bounce on you if you don't do everything necessary, because no-one (OK, not many) down in the user/PC world is used to the rigours of looking after the data.
When you release data to anyone (another system, an external system, users) you do it at your peril if you don't do it properly. If a user (worse, an accountant, auditor, or compliance staff) says "this line on this spreadsheet doesn't exist on the mainframe", how are you going to sort it out if you haven't done it properly? Bearing in mind that the "line" in question can easily be from many months ago.