Can anyone please give me a job to import an Excel file from a server/desktop into a mainframe flat file? I don't want to use the manual START/option 6 route. I am looking to automate one of the processes in my project.
Joined: 06 Jun 2008 Posts: 8700 Location: Dubuque, Iowa, USA
Unless you have software that handles the Excel format on your mainframe, you cannot directly transfer an Excel file to the mainframe and expect it to be usable. This is usually done by converting the Excel file to .CSV (comma delimited) or tab delimited format and transferring that.
Hi Robert! OK, is the conversion of Excel to CSV a simple process? It may be a very basic question, but I really don't know how to convert it; please help me.
My basic requirement is that I have a row of tabular data with 5 fields. I need to import that into a mainframe file, and it should happen automatically when I run a job. I will take care of everything else in the job.
forget about it!
Since the Excel data is supposedly created by human hands (I said hands, not brain),
the hands that created the data should start the whole shebang!
All the <file transfer> activity which involves a desktop should be started by a <desktop> action:
open an FTP session with your host and transfer the file,
then schedule the job to be run, either by submitting it or by using the trigger facilities of your job scheduling system.
You could minimize the typing by scripting the whole process.
Any other way of proceeding is against all sound IT practices!
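For example, a minimal sketch of that scripted transfer (the host name, userid/password, file names, and dataset name below are placeholders; your shop's values and SITE parameters will differ):
Code:
:: upload.cmd -- run on the desktop after saving the worksheet as CSV
:: -n suppresses auto-login so the script file can supply the credentials
ftp -n -s:upload.ftp mvshost.example.com
where upload.ftp contains the ftp subcommands:
Code:
user MYUSER MYPASS
ascii
quote site recfm=fb lrecl=80
put C:\data\feed.csv 'MYID.FEED.CSV'
quit
The SITE command tells the z/OS FTP server to allocate the target dataset with fixed 80-byte records, and ascii mode makes the server translate the CSV to EBCDIC on the way in.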
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Unless your organization is willing to pay $ for an integrated solution (they may already have done so - suggest you find out), you will need 2 "pieces".
One "piece" to unload the spreadsheet into a text file and FTP it to the mainframe.
The other to process the uploaded text file on the mainframe.
Quote:
since the Excel data is supposedly created by human hands... open an FTP session with your host and transfer the file
You are 100% right, Excel is created by human hands. You are reading between the lines!
I appreciate the attention you give to our posts. But why are you emphasizing opening an FTP session with the host? Are you talking about the START/option 6 route that we generally use?
Quote:
Unless your organization is willing to pay $ for an integrated solution... you will need 2 "pieces".
Yes, Dick, my organization will be happy to pay $ if I am able to optimize their process by automation. It will definitely save manual work and reduce the chance of errors.
But why do you wanna split the work into 2 pieces? Can't we do it in one shot if we have the formatted CSV file in the right place on the server/desktop? What I am thinking is to frame only one job, which will automatically pick up the CSV feed from the specified server location, convert it into a mainframe file, and then start processing once the data is in a PS. And yes, the Excel-to-CSV conversion will be done manually.
I hope I have stated my thoughts clearly. Is this approach feasible?
NO, TSO is NOT needed; TCP/IP connectivity is enough!
Everything is done on the PC.
Search the forums for FTP file transfer and you will find lots of examples.
Speak to your support staff; they are the ones who have the final word on the process,
so getting into too much detail is a waste of time if they do not know/understand it,
and the burden of explanation is on you!
step0 ==> fill in the Excel worksheet
step1 ==> save it as a CSV
step2 ==> from a command prompt, use FTP to do a PUT of the local file to the mainframe dataset
step3 ==> option 1 (the better one): use your scheduler's dataset trigger on the just-uploaded file to automagically schedule the job that processes it
step3 ==> option 2: log on to TSO / use an existing TSO session to submit the job
step3 ==> option 2a: use the same FTP session to submit the job (see the sketch below)
option 1 is better because the job will not have to be stored on your PC
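A sketch of option 2a (again, all names are placeholders): after the PUT, the same ftp script switches the server to JES mode, and the next PUT submits the file's contents as a job, which is why the JCL has to live on the PC:
Code:
user MYUSER MYPASS
ascii
quote site recfm=fb lrecl=80
put C:\data\feed.csv 'MYID.FEED.CSV'
quote site filetype=jes
put C:\jobs\process.jcl
quit
The server replies with the JES job id (JOBnnnnn), so the script output can be checked to confirm the submit.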
In any case, if your organization is willing to spend $ on such an issue...
well, they are idiots.
Also, running everything from a single job is not reasonable:
how are you going to make the JOB aware that the file is ready to be processed?
Thanks a lot for the detailed explanation; now the picture is clear to me. Step 3 options 1 and 2 both look good. I will enquire with the support guys and select the best approach!
In my case, they currently do everything manually: counting the number of rows in Excel, setting the breakpoints to commit, and a lot of other tedious work before finally arriving at the update query. So my duty is to cut out all the manual work and design a job to automate the process, so that the users just supply the input file, we convert it to CSV in the server location, and run the job; everything else will be taken care of by the job. This small automation, if properly tested and implemented, should reduce the room for errors, which seem to happen frequently, as humans are bound to commit errors.
May I ask why you feel as you stated, "in any case, if your organization is willing to spend $ on such an issue...
well, they are idiots"? Is there a better approach to such issues? I would be happy to get your candid feedback.
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
Yes, Dick, my organization will be happy to pay $ if I am able to optimize their process by automation
The $ would be to purchase some existing solution... it would not be for you to develop one.
Quote:
But why do you wanna split the work into 2 pieces?
It has nothing to do with what "I wanna". It has to do with the simple fact that spreadsheet creation/update is most often PC work, while mainframe jobs run on the mainframe. While on the PC, the data needs to be saved in a "mainframe compatible" format (usually a [delimited] text file).
If the users working on these spreadsheets save the files to some server(s) that happen to also be FTP servers, a job on the mainframe could "pull" the file from the "file server" where the spreadsheets are stored and then continue with the mainframe process.
Even then, there is still the coordination of when to run the mainframe process to pull the files, as it may be difficult to know when all of the files are ready to be uploaded.
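As a rough sketch of that pull model (the file server address, login, and dataset names are invented for illustration, and it assumes the file server runs an FTP daemon), the first step of the mainframe job could drive the batch FTP client:
Code:
//GETFEED  EXEC PGM=FTP,PARM='fileserver.example.com (EXIT'
//OUTPUT   DD SYSOUT=*
//INPUT    DD *
myuser
mypass
ascii
get /shared/feeds/feed.csv 'MYID.FEED.CSV' (replace
quit
/*
The (EXIT option makes the step end with a nonzero return code if the transfer fails, so later steps can be conditioned on it.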
This is an ad hoc request, mostly a weekly activity, and we get an email from the users that the Excel file is in place on the server to initiate the process.
But manually it takes around 1-2 hours to do all the work in Excel, check the database, and frame the queries, and a lot of errors happen. That's why my manager asked me to automate it; I have no say here other than to complete the task. I now know it is not a very big issue technically, but the business impact of the errors is significant, so the issue has been highlighted.
Let me explain. In the current scenario, the starting feed (spreadsheet1, supplied by the user) has only one row. We manually evaluate the spreadsheet data and accordingly run around 5 to 6 queries using the data from that spreadsheet to get more information and construct a new spreadsheet2, which may have lakhs of rows. Again, some manual work like counting is required on the constructed spreadsheet2 to find the rows where to commit. After this, we finally arrive at the database update query and hard-code the commit points, and an ad hoc batch SPUFI job is submitted to update the database from spreadsheet2. This process is very messy, cumbersome, and prone to errors.
So I am trying to create a job which will have 2 steps:
Step 1: import the starting CSV feed into a mainframe file
Step 2: a program which will do all the operations (the logic for all the manual work + the update)
So we will ask the users to put the input feed, in a fixed CSV format, on the FTP server and send an email notification. Then we run the job, and the process is complete.
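A skeleton of that two-step job (the program name FEEDUPD and all host/dataset names are made-up placeholders; the real names come from your shop's standards) might look like this, reusing the batch FTP pull step sketched earlier but reading the subcommands from a dataset so the password stays out of the JCL:
Code:
//AUTOJOB  JOB (ACCT),'CSV FEED',CLASS=A,MSGCLASS=X
//* Step 1: pull the fixed-format CSV feed from the FTP server
//GETFEED  EXEC PGM=FTP,PARM='fileserver.example.com (EXIT'
//OUTPUT   DD SYSOUT=*
//INPUT    DD DSN=MYID.FTP.CMDS,DISP=SHR
//* Step 2: runs only if the pull ended with RC 0; the program holds
//* all the row-count/commit logic that is done by hand today
//PROCESS  EXEC PGM=FEEDUPD,COND=(0,NE,GETFEED)
//INFILE   DD DSN=MYID.FEED.CSV,DISP=SHR
//SYSPRINT DD SYSOUT=*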
topic is going to get locked pretty soon...
first... because it belongs to the category
Quote:
the following repetitive topics are no longer eligible for discussion in any of the following Forums:
Quote:
How to transfer/FTP a file. This topic has already been posted many times. Please search for the answer that best suits your needs, or refer to the proper documentation.
second... because it seems to be shifting toward stubbornness/incompetence/bad judgement:
from one automatic Excel automatic row, run an automatic query to produce automatic lakhs of more automatic rows,
to be automatically downloaded to an automatic Excel automatic spreadsheet, and from there again...
for each row, run a query that will produce lakhs of rows to be downloaded... on and on and on...
oops, is it... rows, Excel things, queries that are going to be automatically spread around, or something else???
I dare say that the main issue here is that the process is totally unclear to the TS.
From a simple
save an Excel sheet to a CSV PC file,
FTP the same to a <server>,
upon receipt of the file, schedule some process (**)
we went to an unstructured/unclear/fubar'ed <thingie>!
(**) I hope you noticed that I did not say mainframe!
Whatever the operating system running on the MAINFRAME/SERVER,
the logical approach to the issue does not change:
c /<server>/mainframe/
c /process/job/
and the terminology becomes mainframe friendly.
Even on non-z/OS systems, <job/process> schedulers exist to trigger something based on the presence of a dataset/file.
Yes, you are absolutely right. I am a learner, and this may be a petty thing, but it is new to me. Now I am clear on my approach; many thanks for your support.
Joined: 20 Feb 2009 Posts: 108 Location: Kansas City
Here is the issue:
Any process that is important to an organization should be analyzed and understood, and only then implemented. If it is in fact important to the organization, then management should allocate resources to implement a process that has the feature of being "useful". Here "useful" means the people involved can perform the process with the provided implementation quickly, easily, and accurately.
Your process sounds like an ad-hoc cluster flop, and only poor management would allow any real business to be based on such a thing.
Therefore, everyone here is incredulous and suspicious at the prospect of automating the process you've described, hence the responses you've received.