Here’s a good joke:
“Why do programmers always mix up Christmas and Halloween? Because Dec 25 is Oct 31.”
Get it? If you do, then you should probably stop reading here because this is likely to be old news to you. If you’d like to know why it’s so damned funny, and learn a little computing basics for good measure, read on!
If you’re not a programmer, you (hopefully!) recognise the abbreviations for December 25th (Dec 25) and October 31st (Oct 31), and (hopefully!!) recognise them as Christmas and Halloween. So far, so good, but not very funny – even to a programmer.
To a programmer, the abbreviations ‘Dec 25’ and ‘Oct 31’ can also mean something entirely different. Dec 25 is an abbreviation for “Decimal 25” and “Oct 31” is an abbreviation for “Octal 31”, and for reasons that I’d like to explain, these are exactly the same thing.
To see why, let’s start by looking at the normal decimal number system – the number system we use in day-to-day life. Decimal is simply a way of counting – or, more precisely, of representing numbers – in blocks of ten. In day-to-day use we use the arabic numerals 0 through to 9 to represent the English numbers “zero” through to “nine”. After nine, we start counting in units of ten, so ten is 10, eleven is 11, twelve is 12 and so forth. After ninety-nine, we start on another block, so one hundred is 100, one hundred and one is 101, and so on and so forth.
This is, of course, fairly elementary stuff – it’s the kind of thing we’re all taught in primary school. As an aside, it’s unfortunately also one of those things that everyone knows so instinctively that they stop thinking about how numbers actually work. 10 is ten, 12 is twelve, and that’s all there is to it. It’s so ingrained that it’s easy to forget what an inspired invention the arabic numbering system actually is, and that counting in blocks of ten is just one convention among many.
Okay, so when we say “twenty five”, we write out the arabic numerals 2 and 5, representing 2 x 10 + 5. When we say “thirty one”, we do the same thing: 3 x 10 + 1.
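If you’d like to see that arithmetic spelled out, here’s a quick sketch in Python (my choice of language for illustration, nothing in the joke depends on it):

```python
# Decimal place value: each column is a multiple of ten.
# "twenty five" is written with the numerals 2 and 5:
twenty_five = 2 * 10 + 5
# "thirty one" is written with the numerals 3 and 1:
thirty_one = 3 * 10 + 1

print(twenty_five)  # 25
print(thirty_one)   # 31
```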
But why stop at 10? Why should the arabic numerals ’31’ necessarily represent the number thirty-one? What if – for reasons that I will shortly explain – I want each column to represent a multiple of a different number? What if I wanted each column to represent multiples of 8?
There is absolutely nothing stopping us from doing that, and that is precisely what Octal is. In Octal, the numbers zero through to seven are represented in the normal way: 0, 1, 2, 3, 4, 5, 6 and 7. When we get to eight though, things get slightly skewed. Eight in Octal is represented by the arabic numerals 10. Remember, each column is representing multiples of eight, not multiples of ten; so 10 in Octal is 1 x 8 + 0 – eight. And we carry on from there: nine is 11, ten is 12, eleven is 13, and so forth; 20 is sixteen, and 31 is 3 x 8, which is 24, plus 1, which gives us twenty five.
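If you happen to have Python to hand, its built-in oct() function shows exactly where the counting “skews” (the 0o prefix is just Python’s way of marking a numeral as octal):

```python
# Counting from zero to ten, showing each number's octal spelling.
# Python marks octal numerals with a "0o" prefix.
for n in range(11):
    print(n, oct(n))
# Up to seven the two spellings agree; at eight, octal rolls over to 0o10.
```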
(Incidentally, I bet you’re reading that and your mind is reading it as, “so ten is [twelve], eleven is [thirteen]”, and so forth. Try to think of each arabic numeral as an independent figure: “so ten is [one, two], eleven is [one, three]”.)
So that’s the joke in a nutshell: to a programmer, the number represented by the arabic numerals ’25’ (two, five) in decimal is the same as the number represented by the arabic numerals ’31’ (three, one) in octal, because 2 x 10 + 5 is the same as 3 x 8 + 1. It’s simply an amusing coincidence that, to the programmer, Dec can mean both “December” and “Decimal”, and Oct can mean both “October” and “Octal”; and that Dec 25 is just happily, and coincidentally, Oct 31.
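You can check the coincidence for yourself: Python’s int() accepts a base, so the punchline is a one-liner:

```python
# The joke, verified: the numerals "31" read in base 8
# name the same number as the numerals "25" read in base 10.
christmas = int("25", 10)  # Decimal 25
halloween = int("31", 8)   # Octal 31 = 3 * 8 + 1
print(christmas == halloween)  # True
```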
But that’s not a truly satisfying answer; the obvious question is why would anybody want to do that? What’s the point? If 10 is “ten”, then why would anyone find any use in 10 being “eight”? We all have ten fingers, it’s easy to count in tens, so why change things and count in eights?
The answer lies in how computers store numbers. Computers, you see, don’t count in tens. Computers count in “on” and “off”.
This will be the subject of part II 🙂