sun_job (New User) | Joined: 18 Sep 2007 | Posts: 73 | Location: Bangalore
Hi,
Please find my problem below:
We have a file containing 21 million records, and a business-logic check on it takes 30 minutes. The idea is to split the file dynamically into chunks of 5 million records each and run parallel jobs, but the input file could also grow beyond 21 million records.
I would appreciate suggestions from the veterans on how to proceed with this.
Bill Woodger (Moderator Emeritus) | Joined: 09 Mar 2011 | Posts: 7309 | Location: Inside the Matrix
If you look at SPLIT and its relatives on OUTFIL, you should find something that works the way you want.
Do you get 21-million-ish new records per day that need to be verified?
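For reference, SPLIT on an OUTFIL statement distributes records round-robin across the named output data sets, so a plain copy operation can fan the input out in one pass. A minimal DFSORT sketch along those lines, with all data set names and space allocations as placeholders you would replace with your own:

```jcl
//SPLIT4   EXEC PGM=SORT
//SYSOUT   DD SYSOUT=*
//SORTIN   DD DSN=YOUR.INPUT.FILE,DISP=SHR
//OUT1     DD DSN=YOUR.SPLIT.FILE1,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//OUT2     DD DSN=YOUR.SPLIT.FILE2,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//OUT3     DD DSN=YOUR.SPLIT.FILE3,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//OUT4     DD DSN=YOUR.SPLIT.FILE4,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSIN    DD *
* COPY THE INPUT, ROTATING RECORDS ACROSS OUT1-OUT4
  OPTION COPY
  OUTFIL FNAMES=(OUT1,OUT2,OUT3,OUT4),SPLIT
/*
```

SPLIT rotates one record at a time among the outputs; if you want the rotation in larger blocks, SPLITBY=n rotates n records at a time instead.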
RahulG31 (Active User) | Joined: 20 Dec 2014 | Posts: 446 | Location: USA
Just a small suggestion: rather than splitting the file into chunks of 5 million records each, decide how many split files you want (e.g. 10) and split the input file into that many pieces.
That way you have a fixed number of split JCLs, and the elapsed time stays close to the minimum however large the input file grows.
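This fixed-count approach pairs naturally with OUTFIL SPLIT: with ten output data sets named, round-robin rotation gives each output roughly one tenth of the input no matter how many records arrive that day, so the JCL never needs to change. A sketch of the control statements only (the OUT01-OUT10 DD names are assumptions; define them in the step as in any split job):

```jcl
* TEN-WAY SPLIT: EACH OUTPUT GETS ~1/10 OF THE INPUT,
* WHATEVER THE DAILY RECORD COUNT TURNS OUT TO BE
  OPTION COPY
  OUTFIL FNAMES=(OUT01,OUT02,OUT03,OUT04,OUT05,
                 OUT06,OUT07,OUT08,OUT09,OUT10),SPLIT
```

If the downstream check needs each piece to be a contiguous slice of the input rather than an interleave, SPLIT1R=n writes n records to each output in turn and any remainder to the last one, but then n has to be chosen from the day's record count.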
sun_job (New User) | Joined: 18 Sep 2007 | Posts: 73 | Location: Bangalore
Bill Woodger wrote:
    If you look at SPLIT and its relatives on OUTFIL, you should find something that works the way you want.
    Do you get 21-million-ish new records per day that need to be verified?
21M is just the count for one particular day; the number of records changes every day.