pinakimishra
New User
Joined: 21 Jul 2007 Posts: 24 Location: Phoenix
I have a table with 6 million records. I can't extract all of the records from the table in one shot due to a limitation, so I am planning to pull them in 6 batches of 1 million records each. What are the options for me to do this?
1. ROW_NUMBER(): I can assign ROW_NUMBER() to all the records and then select RN <= 1000000 in the first batch, RN > 1000000 AND RN <= 2000000 in the second batch, and so on (a sketch is below). But will the row numbering always give the same records? Each batch is a new query on a new connection, so the order of the records might change. Can the same record that came back for RN <= 1000000 also come back for RN > 1000000 AND RN <= 2000000? In other words, if I run RN <= 1000000 multiple times, will I get the same 1000000 records from the table, or might the records change?
2. ROW_NUMBER() OVER (ORDER BY) or RANK() OVER (ORDER BY): I can do either of these, but I would like to avoid them because the ORDER BY clause will have performance issues given our requirements.
Any other options we can consider?
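To make option 1 concrete, here is what I mean (table and column names are made up; MYTAB stands for the real table):
Code:
-- Option 1 sketch: ROW_NUMBER() with no ORDER BY inside OVER().
-- DB2 is free to number the rows in any order, and that order can
-- change from run to run, which is exactly my worry.
SELECT X.*
FROM (
      SELECT T.*, ROW_NUMBER() OVER () AS RN
      FROM   MYTAB T
     ) X
WHERE X.RN <= 1000000;                  -- batch 1
-- batch 2: WHERE X.RN > 1000000 AND X.RN <= 2000000, and so on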
Rohit Umarjikar
Global Moderator
Joined: 21 Sep 2010 Posts: 3075 Location: NYC,USA
How about ordering by the key and hard-coding key > last-value-fetched in the next query to fetch the next 1M? A sketch is below.
What limitation? You can extract them in batches.
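Assuming the table has a unique (and ideally indexed) key; EMP_ID below is just a stand-in for your real key column:
Code:
-- Batch 1: take the first million rows in key order.
SELECT *
FROM   MYTAB
ORDER BY EMP_ID
FETCH FIRST 1000000 ROWS ONLY;

-- Batches 2..6: restart just past the last key already extracted.
SELECT *
FROM   MYTAB
WHERE  EMP_ID > :LAST_KEY   -- highest EMP_ID from the previous batch,
                            -- or hard-code the value between runs
ORDER BY EMP_ID
FETCH FIRST 1000000 ROWS ONLY;
If EMP_ID is indexed, DB2 can satisfy the ORDER BY from the index, so there is no sort to worry about, and the batches are repeatable because the key pins the order.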
don.leahy
Active Member
Joined: 06 Jul 2010 Posts: 765 Location: Whitby, ON, Canada
If you UNLOAD the data from an image copy data set, you can minimize the cost of the extract.
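Roughly like this, using the UNLOAD utility's FROMCOPY option (all data set, subsystem, and object names below are placeholders):
Code:
//UNLDCPY  EXEC DSNUPROC,SYSTEM=DSN1,UID='UNLDCPY'
//SYSREC   DD DSN=MY.UNLOAD.DATA,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(CYL,(500,100),RLSE)
//SYSPUNCH DD DSN=MY.UNLOAD.CNTL,DISP=(NEW,CATLG,DELETE),
//            UNIT=SYSDA,SPACE=(TRK,(5,5),RLSE)
//SYSIN    DD *
  UNLOAD TABLESPACE MYDB.MYTS
         FROMCOPY HLQ.IMAGE.COPY.DSN
         PUNCHDDN SYSPUNCH UNLDDN SYSREC
/*
Because the utility reads the copy data set instead of the live table space, the online table takes no hit; the trade-off is that the data is only as current as the image copy.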