manishmittal
New User
Joined: 25 Apr 2008 Posts: 49 Location: Gurgaon
Hi All,
I have a file with a million records. I want to split that file based on a field that contains hex data.
Code:
/a xxxxxxxxxxxxxxx ACT LVLGRMnØ =åÚ
000683AAAAAAAAAAAAAAA000200004CCE4DEDCDD9840074F00
00411C777777777777777000C00000133035379450027E7E00
My field starts at column 22 and is 4 bytes long. I want to split the file on this field so that records where the value of this field is less than X'0000002C' end up in my output file.
Please suggest how to get this done using Syncsort.
expat
Global Moderator
Joined: 14 Mar 2007 Posts: 8796 Location: Welsh Wales
Take a look at INCLUDE or OMIT in the product documentation.
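For the requirement as originally stated (a 4-byte binary field starting in column 22 and a cutoff of X'0000002C'), a minimal sketch of an INCLUDE-based copy step might look like the control statements below; treat it as an illustration of the idea rather than tested code, and check the exact syntax in your Syncsort documentation.
Code:
  SORT FIELDS=COPY
  INCLUDE COND=(22,4,BI,LT,X'0000002C')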
manishmittal
New User
Joined: 25 Apr 2008 Posts: 49 Location: Gurgaon
I am using the sort card below.
Code:
//SYSIN DD *
SORT FIELDS=COPY
INCLUDE COND=(28,02,BI,GE,X'0000',AND,28,02,BI,LE,X'0032')
But the above card only selects records with values from 0 to 50. For the next range, 50 to 100, I have to edit the card and run it again, and similarly for every following range of 50.
Can we automate this so that I get output records for the ranges 0-50, 50-100, 100-150, and so on (say up to 100000)?
Alissa Margulies
SYNCSORT Support
Joined: 25 Jul 2007 Posts: 496 Location: USA
Manish,
Your requirement is not clear to me. Are you trying to produce a separate data set for each range? If so, you can code a single step with multiple OUTFIL statements, each with its own INCLUDE condition, to accomplish this task.
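For instance, a minimal sketch of that approach for the first two ranges (assuming 2-byte binary values at column 22, as in the example ranges) could look like the statements below; each FILES=nn would need a matching SORTOFnn DD in the JCL.
Code:
  SORT FIELDS=COPY
  OUTFIL FILES=01,INCLUDE=(22,2,BI,GE,X'0000',AND,22,2,BI,LT,X'0032')
  OUTFIL FILES=02,INCLUDE=(22,2,BI,GE,X'0032',AND,22,2,BI,LT,X'0064')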
dbzTHEdinosauer
Global Moderator
Joined: 20 Oct 2006 Posts: 6966 Location: porcelain throne
At a range of 50 per OUTFIL you are looking at 2000 output files.
Maybe the range break is unrealistic.
Alissa Margulies
SYNCSORT Support
Joined: 25 Jul 2007 Posts: 496 Location: USA
I think we need clarification from the OP as to what his real requirement is...
manishmittal
New User
Joined: 25 Apr 2008 Posts: 49 Location: Gurgaon
Hi All,
After reading the post again, I realize I did not state the requirement clearly.
My requirement is: I have a file with a million records, and I want to split it based on a field that contains binary (hex) data. The ranges to split on are X'0000' to X'0032' (0 to 50 in decimal), then X'0032' to X'0064' (50 to 100), and so on. The ranges could also be different.
So my output should be a number of files, each containing the records for a different range; e.g., the first output file should contain the records in the 0 to 50 range.
Hope it is clear now.
murugan_mf
Active User
Joined: 31 Jan 2008 Posts: 148 Location: Chennai, India
Hope this may solve it.
Code:
//STEP01  EXEC PGM=SYNCSORT
//SYSOUT  DD SYSOUT=*
//SORTIN  DD DSN=YOUR.INPUT.DS
//SORTOF1 DD DSN=OUT1.XX
//SORTOF2 DD DSN=OUT2.XX
.
.
//SYSIN   DD *
  SORT FIELDS=COPY
  OUTFIL FILES=1,
    INCLUDE=(22,2,CH,GE,X'0000',AND,22,2,CH,LT,X'0032')
  OUTFIL FILES=2,
    INCLUDE=(22,2,CH,GE,X'0032',AND,22,2,CH,LT,X'0064')
.
.
/*
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
Quote:
So my output should be a number of files, each containing the records for a different range
How many different values/ranges might there be?
manishmittal
New User
Joined: 25 Apr 2008 Posts: 49 Location: Gurgaon
Hi Dick,
There are a minimum of 50 ranges.
Hi Murugan,
Thanks for the reply. Yes, it solves the problem, but I have two doubts.
First, how many OUTFIL statements can we add like this? Second, why did you change BI to CH?
dick scherrer
Moderator Emeritus
Joined: 23 Nov 2006 Posts: 19243 Location: Inside the Matrix
Hello,
What is the maximum number of ranges?
manishmittal
New User
Joined: 25 Apr 2008 Posts: 49 Location: Gurgaon
The maximum is 100000/50 = 2000.
Thanks