Computers 101. It’s all magic, right?
You’ve probably heard that deep down inside, computers are just 1s and 0s. Maybe someone told you that when you write a text or make a phone call, everything is coded into this magical digital language of ones and zeros, those ones and zeros are sent along some cable at (nearly) the speed of light, and then it’s all decoded on the other end. In a non-literal way, there is some truth in that, but it’s actually not the whole story.
Thinking that the computer’s language is ones and zeroes and leaving it at that makes computers seem magical… because we can’t envision doing anything useful with just ones and zeros. The truth of the matter is that computers don’t actually operate with 1s and 0s, but they do operate logically.
Computers Don’t Actually Understand Ones and Zeros
To be precise, computers don’t understand anything, as they’re not self-aware (yet?), but even in a non-literal sense of how computers work as machines, they don’t actually operate using ones and zeroes. They operate using Boolean logic, which reduces every value to either TRUE or FALSE.
Logically, Computers are Basically Light Switches
If you reduce a computer down to its most basic elements, a computer is essentially just millions of extremely fast switches turning on and off. Like a standard single-pole light switch, each one can exist in two states: ON and OFF. Every operation in a computer can be reduced to a sequence of ON/OFF states.
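To make that concrete, here’s a toy sketch in Python (not how real hardware is built, just the idea): if we treat ON and OFF as Boolean values, the basic building blocks of a computer are simple rules for combining two switches.

```python
# A toy model: light-switch states as Boolean values.
ON, OFF = True, False

def and_gate(a, b):
    return a and b   # ON only when BOTH switches are ON

def or_gate(a, b):
    return a or b    # ON when AT LEAST ONE switch is ON

def not_gate(a):
    return not a     # flips the switch

print(and_gate(ON, OFF))  # False — one switch off means the output is OFF
print(or_gate(ON, OFF))   # True  — one switch on is enough
print(not_gate(OFF))      # True  — OFF flipped becomes ON
```

Every operation a computer performs, from adding numbers to decoding your text message, is ultimately built out of enormous numbers of combinations like these.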
Any two symbols could represent the two states a computer uses, even emojis.
Boolean logic calls these states TRUE and FALSE. The binary number system uses 1 and 0, but it’s important to realize that at the simplest level, computers turn electricity on and off, and it’s humans who use the symbols 1 and 0 to represent these two states. In reality, you could use any two symbols to represent the two states a computer uses, even emojis. They’re just representations of an electrical state inside a piece of silicon. However, 1 and 0 are much easier to type than 👍 and 👎 or 🐶 and 🐱.
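You can see this for yourself with a few lines of Python. This is a hypothetical illustration, not anything hardware actually does: the same three ON/OFF states can be written out with whatever pair of symbols we choose.

```python
# The "raw" switch states: ON, OFF, OFF.
states = [True, False, False]

# The same states written with two different symbol pairs.
as_digits = "".join("1" if s else "0" for s in states)
as_emoji = "".join("🐶" if s else "🐱" for s in states)

print(as_digits)  # 100
print(as_emoji)   # 🐶🐱🐱
```

Both strings carry exactly the same information; 1 and 0 are just the conventional choice.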
Logic Helps Understand Binary?
Understanding how computers work is not essential for learning binary. However, understanding how computers work on a basic level will help you understand why the binary number system is a good fit for computers. It may also help you overcome the familiarity bias of seeing the binary number 100 and thinking one hundred, when we could also choose to write that as 🐶🐱🐱.
When it comes to learning binary, we have to fight some of our biases and break down the meaning of what we are seeing.
Fight Familiarity Bias!
When you see the number 100, what do you think of? Perhaps you think of a perfect 100% on a test, a common speed limit (100 km/h) on Canadian highways, 100¢ making up $1, or 100 cm in a metre. Whatever comes to mind, you probably don’t immediately think of no ones, no tens, and one hundred. Although you know that is essentially true, your brain has internalized those three symbols, 1-0-0, to have a specific meaning… one hundred.
Position is everything
If you rearranged the symbols 100 to be 010, you no longer think one hundred. Rearranging those symbols into a different order will more than likely have you automatically drop that first 0 and read it as 10, rather than assume that it’s just a mixed-up 100. You may think it’s weird that someone left a 0 in front and simply discard it as insignificant.
And that is what is significant. Position is the MOST important part of the decimal number system. This may seem obvious, but it really becomes everything when you move to a different number system with a different number of symbols. Understanding that the position of a numeral matters more than the numeral itself makes it easier to overcome your biases.
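The positional idea above can be sketched in a few lines of Python. The only assumption here is the helper name `value`; the rule itself is standard: in base 10 each position is worth ten times the one to its right, and in base 2 each position is worth twice the one to its right.

```python
def value(digits, base):
    """Read a string of digit symbols in the given base, position by position."""
    total = 0
    for d in digits:
        # Each step shifts everything one position left (multiply by the base),
        # then adds the new digit in the ones place.
        total = total * base + int(d)
    return total

print(value("100", 10))  # 100 — the symbols read as decimal: one hundred
print(value("100", 2))   # 4   — the SAME symbols read as binary: four
```

Same three symbols, completely different value, purely because the base changes what each position is worth.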
So… given that this subject is kind of esoteric and a bit complex, can we really expect a 6-year-old to understand binary? YES! Yes we can… and I’ll be writing another article on the approach that I used to teach my 6-year-old binary using beeps, boops, and chocolate squares. More on that later.