Pradeep Thangapandy
New User
Joined: 22 Feb 2008 Posts: 12 Location: United Kingdom
When I try to read a huge file with millions of records, I cannot do it using the EXECIO command. Is there another approach for reading large files in REXX?
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8797 Location: Welsh Wales
Yes, using EXECIO - but you need to process the file in sections rather than reading it all at once.
There is an example on the forum that I posted some time ago.
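For reference, a minimal sketch of the section-by-section approach. The DD name INDD, the dataset name, and the chunk size of 10000 are placeholders - adjust them to your own allocation:

```rexx
/* REXX - read a large dataset in chunks instead of all at once    */
/* Assumes INDD is already allocated to the input dataset, e.g.:   */
/*   "ALLOC F(INDD) DA('YOUR.BIG.DATASET') SHR REUSE"              */
chunk = 10000
done  = 0
Do Until done
  "EXECIO" chunk "DISKR INDD (STEM rec."
  If rc <> 0 Then done = 1        /* rc = 2 means end of file      */
  Do i = 1 To rec.0               /* process this chunk only       */
    /* ... work with rec.i here ... */
  End
  Drop rec.                       /* release storage before the    */
End                               /* next chunk is read            */
"EXECIO 0 DISKR INDD (FINIS"      /* close the dataset             */
```

Because only one chunk's worth of records is held in the stem at any time, storage stays bounded no matter how many records the dataset contains.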
ofer71
Global Moderator
Joined: 27 Dec 2005 Posts: 2358 Location: Israel
Here's a rule of thumb for you: reading large datasets in REXX is a bad idea. Not only is storage exhausted, the process also consumes a lot of CPU.
O.
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19244 Location: Inside the Matrix
Hello,
FWIW - there is always a better way to process large files than REXX/EXECIO (a utility such as SORT, or a compiled program, will handle millions of records far more efficiently).
Keep in mind that things that perform badly at first will only get worse as volume increases (and it usually does).