In which I take a photo every day that I'm 50, and post it here on this blog, with a bit of related blurb.

Friday 31 January 2014

Day 48 - Superb Owl

XLVIII


Roman numerals...what are they about, then?

Seriously, isn't it a weird way of writing numbers?  If you must insist on doing everything in base 10, why have written symbols only for 1, 5, 10, 50, 100 etc?  It seems an overly complex way of expressing a relatively simple idea. 
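To show just how clunky the system is, here's a little sketch (in Python, my choice of language) that builds a Roman numeral the way the Romans effectively did: greedily spending the biggest symbols first. The symbol table and the subtractive pairs (IV, XL and friends) are the whole system.

```python
def to_roman(n):
    """Convert a positive integer to Roman numerals, largest symbols first."""
    symbols = [
        (1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
        (100, "C"),  (90, "XC"),  (50, "L"),  (40, "XL"),
        (10, "X"),   (9, "IX"),   (5, "V"),   (4, "IV"), (1, "I"),
    ]
    out = ""
    for value, letters in symbols:
        while n >= value:   # spend this symbol as many times as it fits
            out += letters
            n -= value
    return out

print(to_roman(48))  # XLVIII - this blog's day number
```

Thirteen table entries just to write down whole numbers: hardly elegant.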

And whilst we're on the subject, why base 10?!  Clearly, the Romans were heavily influenced by the number 10...but why?  ("Why do I care?", I hear you cry...)

I've just done a quick bit of internet research (and you know how reliable that is, of course - I looked on Wikipedia and everything), and it really does seem to be simply because we have 10 fingers - decimal is by far the most common numeral system in human history. 

There are a few exceptions.  The Mayans used base 20, using all fingers and toes, for example.  And some ancient Californian/Mexican cultures used base 8, counting the gaps between the fingers, rather than the fingers themselves. 

Mainly, we humans seem to choose base 10, when given a free choice.

Maybe it's just me, but it seems to me that decimal is the least efficient system to use with 10 fingers...allow me to explain my reasoning...

How high can you count on your fingers, using decimal?  10.

But if we use any base lower than 10, we can count higher.  The lower the base, the higher the possible count. 

For example, if we used base 9, and use our left thumb to represent "10" (in base 9, indicating 1 x 9 plus 0 x 1 - i.e. nine), then the other nine fingers count the units, and we can easily count to 18...try it!
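A quick sanity check of that scheme, sketched in Python (my language choice): the thumb is worth nine, each of the other nine fingers is worth one.

```python
thumb = 9            # left thumb raised = "10" in base 9, i.e. nine
other_fingers = 9    # the remaining nine fingers, one unit each
print(thumb + other_fingers)  # 18 - the highest count in this scheme
```

Already beats plain ten-finger counting, and we've only dropped one below base 10.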

If we used base 2 we could count to 1023 on our fingers, easily.   It's no more difficult than decimal, other than that you've been conditioned to think in decimal for your whole life. 
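Where does 1023 come from? Each finger is a binary digit - raised is 1, lowered is 0 - so ten fingers give ten binary places. A sketch in Python (again, my choice of language):

```python
# Ten fingers as ten binary digits: finger i raised contributes 2**i.
fingers = [1] * 10   # all ten fingers raised - the maximum count
total = sum(bit * 2**i for i, bit in enumerate(fingers))
print(total)  # 1023, i.e. 2**10 - 1
```

Counting up is just binary incrementing: raise the rightmost lowered finger and lower everything to its right.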

Like if you've spoken English exclusively, then French might seem strange and complicated.  But if you were French, it might be the other way around.   Neither is objectively more difficult than the other, except for your familiarity with it.

So it is with numeral systems.  Perhaps I'm biased from a few decades in IT, where hexadecimal (base 16) was prevalent, back in the day.  I used to be able to do multiplication in hexadecimal, once upon a time.  It's a great system, and is essentially a compact shorthand for binary (base 2): each hex digit stands for exactly four binary digits.
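That hex-to-binary correspondence is easy to demonstrate in Python (my choice of illustration):

```python
# One hex digit packs exactly four binary digits,
# which is why hex reads as shorthand for binary.
n = 0b10110001      # eight binary digits...
print(hex(n))       # 0xb1 - ...become two hex digits: 1011 -> b, 0001 -> 1
```

So a long binary string collapses to a quarter of the length in hex, with no arithmetic needed - just a 16-row lookup.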

Ah, base 2,  the purest of all numeric systems!  I could explain that grand claim, but it'd take a while, so I'll spare you the tedium.

So why didn't we just adopt binary?   We have two hands and two feet.  We walk left, right, left, right, 1, 2, 1, 2...our hearts beat in rhythmic couplets.

It seems so much more natural, and would have been much more expansive, logical and useful, had we adopted binary instead of decimal. 

It feels (in my gut) as though there must be some underlying evolutionary effect whereby decimal is an emergent fundamental property of nature under certain circumstances.  There must be some sort of evolutionary advantage to it.   

Maybe it's that until you've exhausted all 10 fingers, you don't need another digit - being able to count to 11 doesn't matter if you're only just learning to count past 5, for instance.   So by the time we were advanced enough to need to count higher than 10, base 10 was already established.

Yeah, that'll probably be it, it's not like I'm just pulling this stuff out of my arse, eh?!

Seriously, it's all abject speculation...I've no idea of the veracity of any of that, I'm just making (sh)it up as I go along, late on a Friday night after a particularly taxing week at work...

B-) 

He is a cool and funky little Owl, though, isn't he?   He lives on our bookcase, perched on a picture frame.

Superb Owl XLVIII!
