Having spent a few hours this week reading protocol messages in hexadecimal, I've been reflecting on the use of hex (base 16) and how big a part of my life it has been. Some of the earliest computers that I saw, like the Science of Cambridge MK14, had only keypads and displays capable of the most rudimentary character set, which meant that you had to program the things in hex. The manuals had program listings in hex, and the only built-in (ROM) software on the machine was a simple monitor that allowed you to read/write parts of memory and set a program running.

So pretty quickly computer users had to immerse themselves in base 16, and learn how to split numbers into groups of 4 bits, each encoded as a hex digit. Numbers containing ABCDEF as well as 0-9 no longer hold any strangeness for me, but I've never reached the same level of comfort with octal. In octal (base 8), numbers are split into groups of 3 bits, each of which can be expressed as a digit from 0 to 7. (I've always wondered how anyone came up with the extraordinary plan of taking computer words of 32, 16 or 8 bits and dividing them into threes.)

So the number 255 (0xFF in hex, i.e. 8 bits all set to 1) gets rendered as ‘377’. Octal for me belongs to the generation of users before the microcomputer revolution, when programming languages like Algol 68 and BCPL were common in universities. Long-haired, sandal-wearing hippies could perhaps be found in the early 1970s gazing at numbers in octal, but the fad soon passed and hex moved in to take its place.
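
If you want to see the three spellings side by side, printf will render the same value in decimal, hex and octal with the %u, %X and %o conversions. A minimal sketch (my own illustration, not from any period listing):

#include <cstdio>

int main() {
    unsigned n = 255;  // binary 1111 1111
    // %X groups the bits in fours, %o groups them in threes
    printf("decimal %u = hex %X = octal %o\n", n, n, n);
}

This prints decimal 255 = hex FF = octal 377.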

If you look carefully you can see the legacy of the octal “summer of love” in C and C++. Even now, if you write some C++ code like this, you might receive a surprise:

#include <cstdio>

int main() {
    int c1 = 077;  // note the leading zero
    int c2 = 77;

    printf("c1 = %d, c2 = %d\n", c1, c2);
}

The results are c1 = 63 and c2 = 77. Why? If you write a leading zero in front of a number, the compiler assumes that you are specifying a number in octal, not decimal. Therefore 077 is binary 000 111 111 (three octal digits, three bits each), which is 0x3F in hex, or 63 in decimal.
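
If you're ever unsure, the equivalence is easy to check at compile time; here's a quick sketch of my own using static_assert:

#include <cstdio>

int main() {
    static_assert(077 == 63, "a leading zero means octal, not decimal");
    static_assert(077 == 0x3F, "the same six set bits, spelled in hex");
    printf("077 as decimal: %d\n", 077);  // prints 63
}

Both assertions compile cleanly, which is the compiler's way of confirming that 077, 0x3F and 63 are all the same number.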

I think I’ll stick to hex.