I know this is actually a huge topic to discuss on a forum, but I need some help automating DB2 table testing:
1. I have a DB2 table, GAA1.STRUCTURE, loaded monthly, which contains 100K+ rows. I need to verify the data (each and every column, row by row) against the input file, to make sure everything loaded into the table correctly.
2. I can do a count check and test some random records, but I need to check the DATA here for all the records.
3. I was thinking of unloading the table into a file and then comparing it line by line in Excel, but if I want to compare the previous load of the table with the new load, how can I do that?
4. Also, since there are 100K+ rows, I guess an Excel compare would not work.
I don't think this is legitimate testing. Why would the data differ between the load file and an unload of the same table, unless it is being altered in real time?
Why do you need Excel? Are you not aware of ISRSUPC, COMPAREX, or DFSORT JOINKEYS?
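For the JOINKEYS route, a sketch might look like the job below. The dataset names, the 80-byte FB record length, and the key positions are all placeholders you would replace with your own; the trick shown here is to make the whole record the key, so any byte that differs causes the record to show up as "unpaired" on one side.

```
//COMPARE EXEC PGM=SORT
//SYSOUT  DD SYSOUT=*
//JNF1IN  DD DSN=your.load.input.file,DISP=SHR
//JNF2IN  DD DSN=your.table.unload,DISP=SHR
//ONLYF1  DD SYSOUT=*                      records only in the input file
//ONLYF2  DD SYSOUT=*                      records only in the unload
//SYSIN   DD *
* Hypothetical 80-byte FB records; the whole record is the join key,
* so any differing byte makes the record unpaired in one file.
  JOINKEYS F1=JNF1IN,FIELDS=(1,80,A)
  JOINKEYS F2=JNF2IN,FIELDS=(1,80,A)
  JOIN UNPAIRED,F1,F2,ONLY
* ? is the match flag: '1' = F1 only, '2' = F2 only
  REFORMAT FIELDS=(?,F1:1,80,F2:1,80)
  OPTION COPY
  OUTFIL FNAMES=ONLYF1,INCLUDE=(1,1,CH,EQ,C'1'),BUILD=(2,80)
  OUTFIL FNAMES=ONLYF2,INCLUDE=(1,1,CH,EQ,C'2'),BUILD=(82,80)
/*
```

If both ONLYF1 and ONLYF2 come out empty, the two files match byte for byte.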
Using Excel is not a good idea. Every extra pass over the data is another chance to mess it up: how would you make sure the data in Excel is the same as the data in the table? You would just be adding one more comparison to verify, and it never ends.
What I would suggest is that you get the input data (the file you loaded the table from) and the output data (an unload, most likely) into the same format. You will then have two files to compare (remember to convert them to the same layout), and you can use an ICETOOL SPLICE job to get the matching/non-matching rows.
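A sketch of the SPLICE approach is below, again with placeholder dataset names and a hypothetical 80-byte record. Each file's records are tagged with an identifier byte, both are copied into one temporary dataset, and SPLICE on the full record overlays the tags: '12' means the record was found in both files, '1 ' or ' 2' means it exists in only one.

```
//TOOL    EXEC PGM=ICETOOL
//TOOLMSG DD SYSOUT=*
//DFSMSG  DD SYSOUT=*
//IN1     DD DSN=your.load.input.file,DISP=SHR
//IN2     DD DSN=your.table.unload,DISP=SHR
//T1      DD DSN=&&T1,UNIT=SYSDA,SPACE=(CYL,(5,5)),DISP=(MOD,PASS)
//MATCH   DD SYSOUT=*
//NOMATCH DD SYSOUT=*
//TOOLIN  DD *
* Tag each file's records, then append both into T1
  COPY FROM(IN1) TO(T1) USING(CTL1)
  COPY FROM(IN2) TO(T1) USING(CTL2)
* Splice on the whole record; WITH(82,1) overlays the second tag byte
  SPLICE FROM(T1) TO(MATCH) ON(1,80,CH) WITH(82,1) -
    KEEPNODUPS USING(CTL3)
/*
//CTL1CNTL DD *
  OUTREC BUILD=(1,80,C'1',C' ')
/*
//CTL2CNTL DD *
  OUTREC BUILD=(1,80,C' ',C'2')
/*
//CTL3CNTL DD *
* '12' = in both files; anything else goes to NOMATCH with its tag
  OUTFIL FNAMES=MATCH,INCLUDE=(81,2,CH,EQ,C'12'),BUILD=(1,80)
  OUTFIL FNAMES=NOMATCH,SAVE
/*
```

Note DISP=(MOD,PASS) on T1, so the second COPY appends rather than overwrites; KEEPNODUPS keeps the unpaired records so they can be routed to NOMATCH.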