I came across a piece of code in a program that takes user input from an online screen and inserts it into a DB2 table.
Now, the input field corresponding to the online map is defined as alphanumeric in COBOL, say:
IPFLD-FID pic x(03).
The corresponding column in DB2, where this value will be inserted, is defined as SMALLINT.
When numeric input is keyed through the online screen, the insert works as expected; but if character input is given (for testing purposes, which ideally should not be the case), I see the value getting changed to numeric and inserted. For example, 'abc' gets converted to 123 and inserted.
Could you throw some light on this?
It's because of the collating sequence that has the values 0 through 9.
Raise a defect and ask to validate before inserting.
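In the COBOL program that validation would typically be a class test (IF IPFLD-FID IS NUMERIC) before the EXEC SQL INSERT. A minimal sketch of the same guard, written in Python only for illustration (the function name and the "insert" step are made up; only the field name comes from the question):

```python
def safe_insert(ipfld_fid: str) -> bool:
    """Reject non-numeric screen input instead of letting the
    zoned-decimal conversion silently mangle it."""
    if not ipfld_fid.isdigit():      # COBOL: IF IPFLD-FID IS NUMERIC
        return False                 # raise a validation error instead of inserting
    value = int(ipfld_fid)           # a 3-digit value always fits in SMALLINT
    # ... the actual EXEC SQL INSERT would go here ...
    return True

print(safe_insert("123"))  # True  -> row would be inserted
print(safe_insert("abc"))  # False -> rejected before reaching DB2
```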
Hi Rohit and Akatsukami,
Thanks for your help. Today I tried with both 'abc' and 'ABC'; in both cases it took the value 123. I also tried random characters like 'xyz', and it took 789. I think, as Rohit said, it is due to the collating sequence 0 through 9.
However, I want to know more about how this actually works. I am looking it up on Google, but if you have any more information, please pass it on.
You of course know that the characters (as well as everything else) are represented in memory and on mass storage as binary patterns. If you were to edit a data set using the ISPF editor, and set HEX ON, you might see something like this:

    abc ABC 123 xyz
    ---------------
    8884CCC4FFF4AAA
    123012301230789

(Note that space is a character, and occupies one byte.)
The first line is glyphs (displayed or printed characters), one per byte. The third line is the high nybble of each byte, and the fourth is the low nybble.
The high nybble is called, for historical reasons, a "zone"; the three leftmost bytes in the example are "zoned decimal" (COBOL DISPLAY, PL/I CHAR or PIC). When converting to binary (COBOL COMP-5, PL/I FIXED BIN), the code doing the conversion ignores the zones and interprets the low nybbles as decimal digits.
Knowing this (and the EBCDIC mapping of glyphs to binary patterns), you can see why, if each of those three-character fields is moved to a binary field, the resulting values will be 123, 123, 123, and 789, respectively.
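The mechanism can be demonstrated off the mainframe. A minimal sketch, assuming Python's cp037 codec as the EBCDIC mapping, that does what the zoned-to-binary conversion does: keep only the low nybble of each byte and read the result as decimal digits:

```python
def zoned_low_nybbles(text: str) -> int:
    """Mimic a zoned-decimal-to-binary conversion that ignores zones:
    encode to EBCDIC, then keep only each byte's low nybble."""
    ebcdic = text.encode("cp037")   # e.g. 'a' -> 0x81, 'A' -> 0xC1, 'x' -> 0xA7
    digits = "".join(str(b & 0x0F) for b in ebcdic)  # drop the zone nybble
    return int(digits)

for field in ("abc", "ABC", "123", "xyz"):
    print(field, "->", zoned_low_nybbles(field))
# abc -> 123, ABC -> 123, 123 -> 123, xyz -> 789
```

This reproduces exactly what the original poster observed: 'abc', 'ABC', and '123' all carry the low nybbles 1-2-3, and 'x', 'y', 'z' carry 7-8-9.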