Import Excel into a mainframe file


Rijit (Active User, Joined: 15 Apr 2010, Posts: 168, Location: Pune)
Posted: Sat Feb 12, 2011 8:10 pm

Please, can anyone give me a job to import an Excel file from a server/desktop into a mainframe flat file? I don't want to use the manual option (START 6). I am looking to automate one of the processes in my project.

Please advise.

Robert Sample (Global Moderator, Joined: 06 Jun 2008, Posts: 8696, Location: Dubuque, Iowa, USA)
Posted: Sat Feb 12, 2011 8:13 pm

Unless you have software that handles the Excel format on your mainframe, you cannot directly transfer an Excel file to the mainframe and expect it to be usable. This is usually done by converting the Excel file to .CSV (comma delimited) or tab delimited format and transferring that.

Rijit (Active User, Pune)
Posted: Sat Feb 12, 2011 8:49 pm

Hi Robert! Is the conversion of Excel to CSV a simple process? It may be a very basic question, but I really don't know how to convert; please help me.

My basic requirement: I have a row of tabular data with 5 fields. I need to import that into a mainframe file, and it should be done automatically if I run a job. Everything else I will take care of in the job.

Phrzby Phil (Senior Member, Joined: 31 Oct 2006, Posts: 1042, Location: Richmond, Virginia)
Posted: Sat Feb 12, 2011 10:49 pm

Try typing CSV into the Excel help window.

Learning the HELP feature of any software is mandatory.

enrico-sorichetti (Superior Member, Joined: 14 Mar 2007, Posts: 10872, Location: Italy)
Posted: Sat Feb 12, 2011 11:03 pm

Quote:
it should be done automatically if I run a job

Forget about it!
Since the Excel data is supposedly created by human hands (I said hands, not brain),
the hands that created the data should start the whole shebang!

All the <file transfer> activity which involves a desktop should be started by a <desktop> action:

open an FTP session with Your host and transfer the file;
schedule the job to be run by submitting it, or by using the trigger facilities of Your job scheduling system.
You could minimize the typing by scripting the whole process.
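
For example, a minimal sketch of such a script on the Windows side (host name, credentials, local path and dataset name are all made up; save it as, say, upload.ftp and run it with "ftp -n -s:upload.ftp"):

Code:
open mvs.example.com
user TSOUSER MYPASSWD
ascii
put C:\feeds\weekly.csv 'HLQ.WEEKLY.FEED.CSV'
quit

The ascii subcommand makes the transfer translate the text to EBCDIC and turn the PC line endings into record boundaries on the way up.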

any other way of proceeding is against any sound IT practices!

dick scherrer (Moderator Emeritus, Joined: 23 Nov 2006, Posts: 19244, Location: Inside the Matrix)
Posted: Sat Feb 12, 2011 11:35 pm

Hello,

Unless your organization is willing to pay $ for an integrated solution (they may already have done so - suggest you find out), you will need 2 "pieces".

One "piece" to unload the spreadsheet into a text file and FTP it to the mainframe.

The other to process the uploaded text file on the mainframe.

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 1:14 am

enrico-sorichetti wrote: (full post quoted above)


You are 100% right, the Excel is created by human hands. You are reading between the lines :)

I appreciate the attention you give to our posts. But why are you emphasizing opening an FTP session on the host? Are you speaking about the START 6 option that we generally use?

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 1:23 am

dick scherrer wrote: (full post quoted above)


Yes, Dick, my organization will be happy to pay $ if I am able to optimize their process by automation :D Definitely it will save manual work and reduce the chance of error.

But why do you wanna split the work into 2 pieces? Can't we do it in one shot if we have the formatted CSV file in the right place on the server/desktop? What I am thinking is to frame only 1 job which will automatically pick up the CSV feed from the specified server location, convert it into a mainframe file, and then start processing it once it gets the data in a PS. And yes, the Excel-to-CSV conversion will be done manually.

I hope I have stated my thoughts clearly. Is this approach feasible?

Thanks.

enrico-sorichetti (Superior Member, Italy)
Posted: Sun Feb 13, 2011 1:28 am

NO, TSO is NOT needed, TCP/IP connectivity is enough!

Everything is done on the PC.
Search the forums for FTP file transfer and You will find lots of examples.

Speak to Your support people, they are the ones who have the final word on the process,
so getting into too much detail is a waste of time if they do not know/understand,
and the burden of explanation is on You!

enrico-sorichetti (Superior Member, Italy)
Posted: Sun Feb 13, 2011 1:36 am

As usual there is a language barrier here....

step 0 ==> fill in the Excel worksheet
step 1 ==> save it as a CSV
step 2 ==> from a command prompt, use FTP to do a PUT of the local file to the mainframe dataset
step 3 ==> option 1 (the better one): use Your scheduler's dataset trigger on the just-uploaded file to automagically schedule the job to process the uploaded file
step 3 ==> option 2: logon to TSO / use an existing TSO session to submit the job
step 3 ==> option 2a: use the same FTP session to submit the job
option 1 is better because the job will not be stored on Your PC.
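
As a sketch, steps 2 and 3 (option 2a) can even go into one ftp command script run from the PC, again with made-up host, credentials, paths and dataset/JCL names (save as put_and_submit.ftp, run with "ftp -n -s:put_and_submit.ftp"):

Code:
open mvs.example.com
user TSOUSER MYPASSWD
ascii
quote site recfm=fb lrecl=80
put C:\feeds\weekly.csv 'HLQ.WEEKLY.FEED.CSV'
quote site filetype=jes
put C:\jobs\procfeed.jcl
quote site filetype=seq
quit

The SITE RECFM/LRECL values let the server allocate the target dataset with fixed-length records if it does not exist yet; after SITE FILETYPE=JES the next PUT is submitted to JES as a job rather than stored as a dataset.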

In any case, if Your organization is willing to spend $ on such an issue...
well, they are idiots :D

In any case, running everything from a job is not reasonable:
how are You going to make the JOB aware that the file is ready to be processed?

I do not see anything to automate!

Quote:
server/desktop

A desktop is not supposed to play a server role!

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 3:09 am

Thanks a lot for the detailed explanation. Now the picture is clear to me. Step 3 ==> options 1 & 2 both look good. I will enquire with the support guys and select the best approach!

In my case, currently they are doing everything manually: counting the number of rows in Excel, setting the breakpoints to commit, and a lot of other tedious work, to finally arrive at the update query. So my duty is to cut out all the manual work and design a job to automate the process, so that the users just give the input file, we convert it into CSV in the server location and run the job, and everything else is taken care of by the job. This small automation, if tested properly and implemented, should reduce the room for errors, which seem to be happening frequently, as humans are bound to commit errors.

Thank you again

:)

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 3:14 am

May I ask why you feel that way, as you stated "in any case, if Your organization is willing to spend $ on such an issue... well, they are idiots"? Is there any better approach to such issues? I would be happy if you give candid feedback :lol:

dick scherrer (Moderator Emeritus, Inside the Matrix)
Posted: Sun Feb 13, 2011 3:26 am

Hello,

Quote:
Yes, Dick, my organization will be happy to pay $ if I am able to optimize their process by automation
The $ would be to purchase some existing solution... It would not be for you to develop.

Quote:
But why do you wanna split the work into 2 pieces?
It has nothing to do with what "i wanna". It has to do with the simple fact that spreadsheet creation/update is most often PC work, while mainframe jobs run on the mainframe. While on the PC, the data needs to be saved in a "mainframe compatible" format (usually a [delimited] text file).

If the users working on these spreadsheets save the files to some server(s) that happen to also be FTP servers, a job on the mainframe could "pull" the file from the "file server" where the spreadsheets are stored and then continue with the mainframe process.
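
As a rough sketch of that "pull" flavor, the z/OS FTP client can run as a batch job step and GET the file from the file server - the FTP subcommands are the same kind of script, just wrapped in JCL (host, user id, password, path, dataset name and attributes below are all invented):

Code:
//FTPGET   EXEC PGM=FTP,PARM='fileserver.example.com (EXIT'
//OUTPUT   DD SYSOUT=*
//INPUT    DD *
ftpuser
ftppass
ascii
locsite recfm=fb lrecl=80
get /feeds/spreadsheet1.csv 'HLQ.WEEKLY.FEED.CSV' (replace
quit
/*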

Even with this, there is still the co-ordination of when to run the mainframe process to pull the files, as it may be difficult to know when all of the files are ready to be uploaded.

dick scherrer (Moderator Emeritus, Inside the Matrix)
Posted: Sun Feb 13, 2011 3:28 am

You were submitting while I was typing... :)

Quote:
would be happy if you give candid feedback
Have no fear - many here are quite candid ;)

d

enrico-sorichetti (Superior Member, Italy)
Posted: Sun Feb 13, 2011 3:29 am

Because if Your organization does not have the skills in-house for such a simple task...
well, meditate on it!

This is an issue that anybody with average skills should be able to solve in about 10 minutes!

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 3:43 am

This is an ad-hoc request, mostly a weekly activity, and we get an email from the users that the Excel is in place on the server to initiate the process.

But manually it takes around 1-2 hours to do all the work in Excel, check the database and frame the queries, and a lot of errors happen. That's why my manager asked me to automate it. I don't have a say here, only the task to complete. Now I know it is not a very big issue technically, but the business impact of the errors is significant, so the issue has been highlighted.

:lol:

Thanks everyone for your inputs!

dick scherrer (Moderator Emeritus, Inside the Matrix)
Posted: Sun Feb 13, 2011 4:22 am

Hello,

Why does the database or the query code need any work after being implemented once?

If the database needs work every iteration, i'm not sure what might be automated... Same with the queries.

It sounds like there may be much more to the situation than simply uploading data from a spreadsheet.

And yes, many of us have experience with "things" that become elevated/highlighted like this :)

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 12:56 pm

Hi Dick,

Let me explain. In the current scenario the starting feed (spreadsheet1, supplied by the user) has only one row. We manually evaluate the spreadsheet data and accordingly run around 5 to 6 queries, using the data from that spreadsheet to get more information and construct a new spreadsheet2, which may have lakhs of rows. Again some manual work, like counting, is required on the constructed spreadsheet2 to find the rows where to commit. After this we finally arrive at the database update query and hard-code the commit points. Then the ad-hoc batch SPUFI job is submitted to update the database from spreadsheet2. This process looks very messy and cumbersome and is prone to errors.

So I am trying to create a job which will have 2 steps:

Step 1: import that starting CSV feed file into a mainframe file
Step 2: a program which will do all the operations (logic for all the manual work + the update)

So we will ask the users to put the input feed in a fixed CSV format on the FTP server and send an email notification. Then we will run the job, and the process is completed.

Thanks,

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 1:00 pm

I hope this gives some insight. Any suggestions are most welcome!

PeterHolland (Global Moderator, Joined: 27 Oct 2009, Posts: 2481, Location: Netherlands, Amstelveen)
Posted: Sun Feb 13, 2011 1:55 pm

Create an Excel macro in VBA to:

1. convert the sheet to a CSV file
2. FTP this file to the host

At the host, let a scheduler trigger a job to process the data when it has arrived.

For the Excel macro, do some googling; there are enough samples to find on the internet. I used, for example, "ftp from excel" as the search argument.

enrico-sorichetti (Superior Member, Italy)
Posted: Sun Feb 13, 2011 3:46 pm

Quote:
spreadsheet2 which may have lakhs of rows.

and who is ever going to look at them?

This topic is going to get locked pretty soon...
first, because it belongs to the category:
Quote:
the following repetitive topics are no longer eligible for discussion in any of the following Forums:
Quote:
How to transfer/FTP a file. This topic has already been posted many times. Please search for the answer that best suits your needs, or refer to the proper documentation.
ibmmainframes.com/viewtopic.php?t=20820&highlight=

second, because it seems to be shifting toward stubbornness/incompetence/bad judgement 8-)

From one automatic Excel automatic row, run an automatic query to produce automatic lakhs of more automatic rows,
to be automatically downloaded to an automatic Excel automatic spreadshit, and from here again...
for each row run a query that will produce lakhs of rows to be downloaded...... on and on and on...

Oops, are... rows, Excel things, queries that are going to be automatically spread around, or something else???

Rijit (Active User, Pune)
Posted: Sun Feb 13, 2011 4:16 pm

enrico-sorichetti wrote: (full post quoted above)



I am sorry, please don't misunderstand me... :(

enrico-sorichetti (Superior Member, Italy)
Posted: Sun Feb 13, 2011 4:24 pm

I dare say that the main issue here is that the process is totally unclear to the TS (topic starter).

From a simple
save an Excel sheet to a CSV PC file,
FTP the same to a <server>,
upon receipt of the file schedule some process (**)

we went to an unstructured/unclear/fubar'ed <thingie>!

(**) I hope You noticed that I did not say mainframe!
Whatever the operating system running on the MAINFRAME/SERVER,
the logical approach to the issue does not change:

c /<server>/mainframe/
c /process/job/
and the terminology will become mainframe friendly.

Even on non-z/OS systems, <job/process> schedulers exist to trigger something based on the presence of a dataset/file.

Rijit (Active User, Pune)
Posted: Mon Feb 14, 2011 1:53 am

Yes, you are absolutely right. I am a learner, and this may be a petty thing but it is new for me. Now I am clear on my approach; many thanks for your support :)

mtaylor (Active User, Joined: 20 Feb 2009, Posts: 108, Location: Kansas City)
Posted: Fri Feb 18, 2011 5:48 am

Here is the issue:

Any process that is important to an organization should be analyzed and understood, and only then implemented. If it is in fact important to the organization, then management should allocate resources to implement a process that has the feature of being 'useful'. Here 'useful' means the people involved can perform the process with the provided implementation quickly, easily, and accurately.

Your process sounds like an ad-hoc cluster flop, and only poor management would allow any real business to be based on such a thing.

Therefore, everyone here is incredulous and suspicious at the prospect of automating the process you've described, hence the responses you've received.