I'd say yes. 100% out is too much. If DFSORT cannot estimate the record count itself (because it is not reading the files itself), then the better your estimate of the record count (and perhaps the average record length), the better it can handle the dynamic allocation.
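As a hedged illustration only (the record count and lengths below are made-up values, not from the original job), passing such an estimate to DFSORT is typically done with the FILSZ and AVGRLEN options on an OPTION control statement:

```
* Sketch, not the original job's statements. FILSZ=En tells DFSORT
* the ESTIMATED number of input records (the E prefix marks it as
* an estimate); AVGRLEN gives the estimated average record length
* in bytes. Both help DFSORT size its dynamically allocated work
* data sets. Values here are hypothetical.
  OPTION FILSZ=E27000000,AVGRLEN=80
  SORT FIELDS=(1,10,CH,A)
```

Check the exact syntax against the DFSORT Application Programming Guide for your release before relying on it.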
I had a similar problem a couple of years ago, so will give you the solution that worked for me with help from Frank Yaeger.
A mere 27 million records, though.
S400 abends, with storage-shortage messages, if my memory (no pun intended) serves correctly.
I found at the time that, for some reason, SAS would invoke its own sort routines rather than DFSORT, which is what we have installed here. The options above forced SAS to use DFSORT from the start, bypassing the SAS sort routines entirely.
Can't recall why we came up with NOSORTBLKMODE, but the original thread and responses are on this forum somewhere.
- - - Just read the link posted by Skolusu, so maybe from there.
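For what it's worth, here is a hedged sketch of the kind of SAS system options involved. The exact option names used at my site aren't in front of me, so treat these spellings as assumptions to verify against the SAS documentation for z/OS:

```
/* Sketch, assuming SAS on z/OS. SORTPGM=HOST tells SAS to call   */
/* the host sort product (DFSORT here) rather than its internal   */
/* sort routines. NOSORTBLKMODE is the option mentioned above;    */
/* check both against your site's SAS configuration.              */
OPTIONS SORTPGM=HOST NOSORTBLKMODE;
```

These can also be set at invocation or in the configuration file rather than in the program itself, depending on how your site runs SAS batch jobs.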
Hopefully this is of some use to you. Funnily enough, I'm back at the very same site where this problem arose, and the job is still running just fine with REGION=64M.