From the full body of Steve Yegge's article:
Candidates should know what bits and bytes are. They should be able to count in binary; e.g. they should be able to tell you what 2^5 or 2^10 is, in decimal. They shouldn't stare blankly at you when you ask what 2^16 is. It's a special number. They should know it.
I was thrown off by the bit you quoted in the question; it sounded like a candidate should be able to describe its significance, but in context he's saying that candidates should know, off the top of their head, what the decimal conversion of 2^16 is.
The significance of this is that since we humans still count in decimal, especially in our heads (in most circumstances), we need to know the rough capacities of the common bit widths we use for storage, memory, or even character encoding. Since a byte is 8 bits, the most common widths are 8, 16, 24, 32, and 64 bits.
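To make those capacities concrete, here's a small sketch (class name is mine) that computes 2^n for the common widths with a left shift, which is how you'd check these numbers quickly in code:

```java
public class BitCapacities {
    public static void main(String[] args) {
        // The capacity of an n-bit value is 2^n, computed with a left shift.
        // Use a long so the 32-bit width doesn't overflow the shift.
        int[] widths = {8, 16, 24, 32};
        for (int n : widths) {
            long capacity = 1L << n;
            System.out.println(n + "-bit: " + capacity + " distinct values");
        }
        // 2^64 itself doesn't fit in a long; print the max unsigned
        // 64-bit value (2^64 - 1) instead.
        System.out.println("64-bit max unsigned: " + Long.toUnsignedString(-1L));
    }
}
```

Running it prints 256, 65536, 16777216, and 4294967296 for the first four widths, which are exactly the "special numbers" Yegge is talking about.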
At the present time I would say 2^32 is the most commonly occurring capacity a developer deals with. I am suspicious of developers who don't know that 2^32 is roughly 4 billion (with a max value of roughly 2 billion if signed), since it means they've never bothered to find out roughly how many records can be stored in their databases that use 32-bit ints for primary keys, or when old code using 32-bit ints for IDs, dates, etc. will need to be refactored to 64-bit.1
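The "roughly 2 billion if signed" limit is easy to demonstrate: a Java `int` is a signed 32-bit value, and incrementing past its maximum silently wraps around. A minimal sketch:

```java
public class IntLimits {
    public static void main(String[] args) {
        // A signed 32-bit int tops out just past 2.1 billion...
        System.out.println(Integer.MAX_VALUE);  // 2147483647 (2^31 - 1)
        // ...while the full 2^32 range covers about 4.29 billion values.
        System.out.println(1L << 32);           // 4294967296
        // Incrementing past MAX_VALUE silently wraps to negative:
        int id = Integer.MAX_VALUE;
        System.out.println(id + 1);             // -2147483648
    }
}
```

That silent wraparound is exactly why a table whose auto-incrementing primary key is a 32-bit int breaks after about 2 billion rows.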
2^16 is the total capacity of the Java short. (It holds values between -2^15 and 2^15-1.)
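You can confirm that range directly from the constants on Java's `Short` wrapper class:

```java
public class ShortRange {
    public static void main(String[] args) {
        System.out.println(Short.MIN_VALUE);  // -32768  (-2^15)
        System.out.println(Short.MAX_VALUE);  //  32767  (2^15 - 1)
        // Total distinct values: 2^16 = 65536
        int total = Short.MAX_VALUE - Short.MIN_VALUE + 1;
        System.out.println(total);            // 65536
    }
}
```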
A developer should also know the 8-bit range by heart. Among its many common uses is ASCII character encoding.
I wouldn't expect a programmer to know 2^14 or 2^18 offhand, but I would expect them to know 2^16, since it comes up constantly and the full number (65,536) is short enough to remember easily.
1: If you browse the leaderboards of Call of Duty: MW2 or the iPhone Game Center you'll often see cheaters at the top with scores of 2,147,483,647, which is 2^31-1, the maximum value of a signed 32-bit integer.