Computer Science 101: How Computers Work
(Dec 12th, 2013 at 01:30:48 AM) Welcome back! If it's alright with you, I'm just going to dive in...
Back to Binary
So at this point, you (hopefully) know how binary represents numbers and how it can be used to represent "other things." However, the (arguably) most important use for binary is something called machine code, which is the key to making computers actually do things.
Computers are built out of components that are essentially either on or off (in reality it's more like 'high' or 'low'), which we treat as a 1 and a 0 respectively. By putting a handful of these together we can store more and more data (8 bits = 1 byte; 1024 (2^10) bytes = 1 kilobyte, etc.). But data is useless unless you can do something with it. That's where machine code comes in.
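To make that concrete, here's a quick sketch in Python (just for illustration) of how the number of distinct values grows as you add bits:

# Each extra bit doubles the number of distinct values you can represent.
for bits in [1, 2, 4, 8, 16]:
    print(f"{bits:>2} bits -> {2 ** bits} possible values")

# 8 bits (1 byte) can hold 256 different values; 2**10 bytes is 1 kilobyte.
print(2 ** 10, "bytes = 1 kilobyte")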
Instruction Set Architectures
Machine code is binary which acts as instructions for computers (this is the binary that compilers turn programming languages into, as we discussed last time). The number one thing that computer hardware does is load machine code into its circuits so that it can do whatever the machine code says.
An instruction set architecture (ISA) is a description of what a computer can do and how it keeps track of what it's doing. For example, it might say that a computer is capable of adding, subtracting, loading data from memory, and storing data in memory. The ISA will supply opcodes (operation codes) which map binary numbers to the operations built into the hardware. In our example the following four opcodes could be assigned:
00  add
01  subtract
10  load
11  store
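If you want to play with that mapping, here's a tiny sketch in Python; the opcode table is just the made-up one from above, not a real ISA:

# Hypothetical 2-bit opcode table from the example above (not a real ISA).
OPCODE_NAMES = {
    "00": "add",
    "01": "subtract",
    "10": "load",
    "11": "store",
}

print(OPCODE_NAMES["00"])  # -> add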
Every instruction in the machine code for your computer would then start with one of these opcodes. Of course, that raises the question: how long is each instruction? As with nearly everything in computers, the answer is "a power of 2." If your instruction length is 8 bits, we say that you have an 8-bit computer. If each instruction is 64 bits, you have a 64-bit computer. Depending on the number of operations your hardware supports, the number of bits you need to represent your opcode will change (in the same way that you can't count eight people on only one hand). The rest of the bits in an instruction typically specify what data to perform that operation on.
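As a quick check on the "how many opcode bits do I need" question, here's a small sketch in Python; the operation counts are hypothetical:

import math

# To tell n operations apart you need ceil(log2(n)) opcode bits.
for n_ops in [4, 8, 9, 64]:
    print(f"{n_ops} operations need {math.ceil(math.log2(n_ops))} opcode bits")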
On our computer (assuming it's 8-bit), say the following is an instruction from a program (I've added vertical bars to show the separate pieces):
00|01|10|01
The first chunk "00" is our opcode to perform addition. The second chunk "01" is the value 1 and might mean something like "the first number the user typed." The next chunk, "10" is 2 in decimal, which might refer to the second value the user entered. So we know we are adding the first two numbers we found. The last chunk "01" is 1 again. This might mean that after we determine what #1 + #2 is, we want to store that result on top of #1. Obviously modern computers are much more complicated than this, but on some level this is still basically what's happening.
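Here's a minimal sketch of that decode-and-execute step in Python. The "memory" slots and the meaning of each field are just the made-up ones from this example, not how a real CPU lays things out:

# The example instruction, with the vertical bars removed.
instruction = "00011001"

# Split the 8 bits into four 2-bit fields: opcode, operand 1, operand 2, destination.
opcode, src1, src2, dest = (instruction[i:i + 2] for i in range(0, 8, 2))

# Pretend "memory": slots 1 and 2 hold the two numbers the user typed.
memory = {1: 40, 2: 2}

if opcode == "00":  # add
    memory[int(dest, 2)] = memory[int(src1, 2)] + memory[int(src2, 2)]

print(memory[1])  # -> 42, the result stored on top of slot #1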
Compilers
While all of this is cool from a "getting to know computers" perspective, it's not necessarily the most actionable knowledge. Almost no one (if anyone) actually writes programs in binary anymore. Even someone creating the most basic software for a new kind of computer hardware would still use something known as assembly code. Assembly still works instruction-by-instruction just like machine code, but instead of memorizing the binary, a programmer might do the same thing we did above (add #1 and #2 and store the result over #1) with something like:
add 1 2 1
It's still not great, but it can save a good amount of time if you're doing it a lot.
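Turning lines like "add 1 2 1" into the binary is exactly what a program called an assembler does. Here's a toy sketch in Python using only our made-up opcode table and 8-bit format (real assembly syntax and real assemblers are more involved):

# Toy assembler for the hypothetical format above: a 2-bit opcode followed by three 2-bit operands.
OPCODE_BITS = {"add": "00", "subtract": "01", "load": "10", "store": "11"}

def assemble(line):
    op, *operands = line.split()
    # Look up the opcode, then turn each operand into a 2-bit binary field.
    return OPCODE_BITS[op] + "".join(format(int(x), "02b") for x in operands)

print(assemble("add 1 2 1"))  # -> 00011001, the instruction we decoded earlier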
As I mentioned last time, most programmers use programs called compilers to convert programming languages into binary (machine code) for them. To add two numbers, x and y, they will often just type "x + y" and do what they want with the result. Modern computer programming can get very complicated, but it's nothing compared to what it once was. Keeping in mind that all of this is still happening with every program that is written, and every time someone runs it, makes all technology seem that much more impressive to me.
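If you'd like to peek at this kind of translation yourself, Python's built-in dis module shows the bytecode (Python's own instruction-like form, a level above machine code) that a simple x + y turns into. The exact instruction names vary between Python versions:

import dis

def add(x, y):
    return x + y

# Prints the instructions for add(); depending on your Python version you'll see
# something like LOAD_FAST x, LOAD_FAST y, BINARY_OP (+) or BINARY_ADD, RETURN_VALUE.
dis.dis(add)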
Not to mention that understanding all of this makes me at least a little more valuable in a post-apocalyptic society should that ever come about. :)
Regardless, over the final two days I plan to talk in more depth about the differences between the various programming languages, what makes "good code," and (on the last day) how I feel about several current trends and "hot topics" in the programming world.