CPU Components – Adders

This post is part of a series in which I am following the structure of J. Clark Scott’s book But How Do It Know? (Affiliate Link – commission supports my knowledge quest). You can check out his website at http://www.buthowdoitknow.com/ (not an affiliate link). It’s written with no assumption of knowledge or background in computers, which has been great for me 😉

How Does an Adder Work?

As its name suggests, an adder takes two binary values and adds them together. My first thought was "that's cool, but how do you make it 32- or 64-bit?", which is a reasonable question. We'll get to that, and it's fairly straightforward. But understanding the simple (half) adder is key to understanding how the more complex ones work.

So if A = 1 and B = 1, this contraption, built from an XOR gate and an AND gate, will add them together to get O (output) = 0 and C (carry) = 1. Binary 10, aka 2. 1 + 1 = 2, right? You'll find that if you swap in other values for A and B, the results come out correct as well.
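If you want to play with this without wiring up real gates, here's a tiny Python sketch of the same contraption (the function name is mine, not the book's; Python's `^` and `&` operators stand in for the XOR and AND gates):

```python
def half_adder(a, b):
    o = a ^ b  # XOR gate: the sum bit
    c = a & b  # AND gate: the carry bit
    return o, c

print(half_adder(1, 1))  # (0, 1) -> binary 10, aka 2
```

Feeding in the other combinations of A and B gives the correct sums too: 0 + 0 = 00, 0 + 1 = 01, 1 + 0 = 01.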

This also serves to illustrate that there are 10 kinds of people in the world, those who understand binary and those who don’t! Haha! ………ahem.

The Full Adder

A full adder is just a simple adder with one more input. It's called "carry in", because the idea is to string together however many adders you want, so you can add 2, 4, 8, 32, 64 or some other number of bits. The full adder's logical implementation is more complicated, but it works (I don't show it here; I didn't find it particularly important for the discussion. Also, I'm a bit lazy).

So when you hook two together, the carry out of the first adder in the chain becomes the carry in of the second adder in the chain.

So if the two A's were the two bits of the first number you want to add, and the two B's were the two bits of the second number you wanted to add to the first, for example 10 and 11, the first O would be 1, the second would be 0, and the Co at the far right would be 1.

2 + 3 = 5 (binary 10 + 11 = 101), am I right?
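The same chaining can be sketched in Python (my own illustration; the full adder here is the common construction from two XORs, two ANDs and an OR, and bits are listed least-significant first so the carry ripples left to right through the loop):

```python
def full_adder(a, b, carry_in):
    # Two half-adder stages plus an OR gate combining the carries.
    partial = a ^ b
    o = partial ^ carry_in                      # the sum bit
    carry_out = (a & b) | (partial & carry_in)  # the carry bit
    return o, carry_out

def ripple_add(a_bits, b_bits):
    # Chain the adders: each carry out becomes the next carry in.
    # Bits are given least-significant first.
    carry = 0
    out = []
    for a, b in zip(a_bits, b_bits):
        o, carry = full_adder(a, b, carry)
        out.append(o)
    return out + [carry]  # the final carry out is the top bit

# 10 (2) + 11 (3), least-significant bit first:
print(ripple_add([0, 1], [1, 1]))  # [1, 0, 1] -> binary 101, aka 5
```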

J. Clark Scott (affiliate link, supports my knowledge quest) makes a good point – it's an amazingly simple device for what it does. Although as it turns out, simply chaining adders together (a "ripple-carry" adder) produces a fairly slow device, because each adder has to wait for the previous one's carry to arrive. Modern adders use more complex carry logic, such as carry-lookahead, to speed this up. It's a pity life can't be simple.

Understanding Logic Gates


What Do Logic Gates Actually Look Like?

A computer is essentially a rat's nest of fairly straightforward contraptions called logic gates. The two components that make a computer a computer – the CPU and RAM (see link for my post on how they communicate) – are full of them. Gates operate on electrical wires (turning them on and off to represent 1's and 0's) that serve as "inputs" (going into the gate) and "outputs" (coming out of the gate). There are lots of circuit diagrams out there that use symbols, but it's kind of tough to get a look at what a logic gate might physically look like. The best photo I could find of a real device was on Wikipedia (where all good things come from), on the page for the NAND gate. I like it because it juxtaposes the symbol diagram alongside the actual chip. (There was no attribution information for this photo – please let me know if you find it. I'd be more than glad to give credit where it's due.)

YouTuber LPG‘s “redstone” computer.

J. Clark Scott's book purposefully avoids talking about the physical construction of such devices, as it's outside the scope of a discussion of logic. Since you can't really see anything of the internal workings of the 7400 chip shown above, perhaps the next best thing is a really amazing YouTube video of a computer constructed inside Minecraft (a computer game focused on building stuff with blocks, in case you're not familiar). My hat's off to the creator; I'm in awe of his creation. In any case, suffice it to say that logic gates are constructed in some physical medium using chemistry and physics that I don't yet understand. For now, at least, that awesome YouTube video will have to do.

Universal Gates

What struck me in learning about logic gates was that a number of gates are actually just combinations of other gates. There seems to be a bit of confusion out there on various sites regarding which gates are used to construct which, but the general consensus seems to be that NAND (NOT-AND) and NOR (NOT-OR) are the universal building blocks used to construct the other gates, especially ones that are a little easier to understand like AND, OR, and NOT. In any case, most places purport the existence of 7 gates (AND, OR, NOT, NAND, NOR, XOR, XNOR), but the best site I found was an article at All About Circuits on logic gates that shows a total of 16 (although some of them aren't really gates, or aren't binary).
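For reference, here's a quick Python sketch of how those 7 gates behave on bits 0 and 1 (my own one-liners, not anything from the book), plus a little demonstration of NAND's universality:

```python
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def NOT(a):     return 1 - a
def NAND(a, b): return NOT(AND(a, b))  # NOT-AND
def NOR(a, b):  return NOT(OR(a, b))   # NOT-OR
def XOR(a, b):  return a ^ b
def XNOR(a, b): return NOT(XOR(a, b))

# NAND really is universal: NOT, AND and OR can all be rebuilt from it.
assert NOT(1) == NAND(1, 1)                 # tie both inputs together
assert AND(1, 0) == NOT(NAND(1, 0))         # NAND then invert
assert OR(1, 0) == NAND(NOT(1), NOT(0))     # invert inputs, then NAND
print("all NAND constructions check out")
```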

The reason NAND and NOR are used to build everything else seems to be that they are easy to build physically. If I understood more about chemistry and physics I could probably give you a more specific reason based on physical and chemical properties, but most descriptions I can find just say you take NAND and NOR as given, and the rest are assembled from multiple interconnected NAND and NOR gates. For example, an AND implementation using two NAND gates is shown below:

This contraption will turn O on only if both A and B are turned on, just like an AND gate.
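In Python terms, that same two-NAND trick looks like this (my own sketch; the point is that a NAND with both inputs tied together acts as a NOT, which un-inverts the first NAND):

```python
def nand(a, b):
    # The "trusted" primitive gate: NOT (a AND b).
    return 1 - (a & b)

def and_from_nand(a, b):
    x = nand(a, b)     # first NAND inverts the AND result
    return nand(x, x)  # second NAND, inputs tied together, inverts it back

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", and_from_nand(a, b))  # on only when a and b are on
```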

How Does It All Fit Together?

Another shameless plug for J. Clark Scott's book But How Do It Know? (Affiliate Link – commission supports my knowledge quest), because he does a really good job explaining how you would put these gates together to store 8 bits, also called a byte. Typically the register size matches the CPU's computational "width". Nowadays most CPUs are 64-bit, so their registers are 64-bit as well, although in some cases there may be smaller ones. This tutorial on howstuffworks.com also does a good job explaining registers (I didn't like its section on the gates themselves, though, heads up), and uses some terminology that you'll definitely come across when looking at logic gates, such as flip-flop and feedback. Storing a collection of on-or-off bits in a collection of fancily interconnected logic gates, using a "set" wire as input, lets you capture the state of the bits at the moment the set wire was activated. Another wire, called "enable", allows you to access, or read, those bits. Such a collection is called a register, and it lets you write bytes and read bytes. Amazing.
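Here's a rough Python sketch of the set/enable idea (a deliberate simplification of mine; a real register is built from flip-flops with feedback, not Python lists):

```python
class Register:
    """A toy 8-bit register: stores bits on "set", reveals them on "enable"."""

    def __init__(self, width=8):
        self.bits = [0] * width  # all wires start off

    def write(self, bus, set_wire):
        # Only while the set wire is on do the bus bits get captured.
        if set_wire:
            self.bits = list(bus)

    def read(self, enable_wire):
        # Only while the enable wire is on do the stored bits reach the bus.
        return list(self.bits) if enable_wire else None

r = Register()
r.write([0, 1, 0, 0, 1, 0, 0, 0], set_wire=1)  # store a byte
print(r.read(enable_wire=1))  # [0, 1, 0, 0, 1, 0, 0, 0]
```

Notice that writing with the set wire off does nothing, and reading with the enable wire off gives you nothing – the wires gate all access to the byte.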

You'll see registers all over the place, usually with a word or two in front to specify their purpose. For example, the Memory Address Register says you want to access a certain address in memory and transfer the contents stored there (also in a register, called the Memory Data Register) to the CPU, where it will likely store said contents temporarily in a CPU register. All of the actions I have described are executed using combinations of these gates, with voltage rising and falling on the wires as signalling to represent 1's and 0's. Some of the circuit diagrams and logical collections of gates in modern boards, memory and CPUs can be intimidating, but I just try to remember how simple the gates themselves are, and it makes me feel better.

3 Ways the CPU and RAM Communicate


CPU/RAM Communication – An Elevator Pitch

The CPU and RAM of a computer communicate using a system of wires called a bus. This bus usually has three parts: the address bus, the control bus, and the data bus. Since RAM holds data, and the CPU performs some action (processing) on that data, the CPU first sends an address to RAM by turning on (I say "lighting") wires to indicate an address number in binary. It then uses the control bus to indicate whether it's reading from, or writing to, RAM. Finally, if it's reading, RAM sends the data across the data bus; if it's writing, the CPU sends processed data back to RAM using that same data bus.

A Cook in the Kitchen

I've seen quite a few explanations of the relationship between CPU and RAM; usually it's likened to the human brain. I suppose that works, but what helped me understand the basics of this communication was thinking of a cook retrieving and following recipes in the kitchen. It's an OK analogy, although you probably wouldn't store food in a cookbook. So pretend this cook's cookbook is really magical.

  1. Get Recipe -> Address Bus
  2. Read Recipe or Store Food -> Control Bus
  3. Follow Recipe or Place Food in Cookbook (magic) -> Data Bus

The cook follows the three basic steps above. First she finds the Scrambled Eggs recipe by its name (the recipes are in alphabetical order, of course). Then she decides whether she wants to read the recipe, or store some eggs she just made according to it. In this case, she'll read it. Finally, she follows the recipe and makes Scrambled Eggs.

In my diagram of a simple CPU, the buses are arbitrarily 2 bits wide and the number of address slots in RAM is 4, mostly because I ran out of room and 4 seemed like a nice easy number. In reality, you would have many, many more slots and bus wires.

1. Address Bus

The CPU first sends an address to RAM using the address bus wires. Each wire represents a bit (a 1 or a 0). In this example there are 2 of them, for a total of 4 possible addresses (00, 01, 10, and 11). If the CPU wants the contents of RAM address 2, it lights up the first wire and keeps the second dark to represent 10, which is 2 in binary. RAM would then send the word "Picked", as requested, across the data bus if the enable wire is lit. Or, if the set wire were lit, data would be written into RAM's slot 3 instead, perhaps something like "A Peck".
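Here's a toy run of that 2-bit bus in Python (my own sketch; the slot contents and function name are mine, just for illustration of one read cycle and one write cycle):

```python
# Four RAM slots, addressed 00, 01, 10, 11.
ram = ["Peter", "Piper", "Picked", "Pickled"]

def ram_cycle(address_bus, enable, set_wire, data_bus=None):
    # address_bus is the pair of wires, most-significant first: [1, 0] -> 2
    addr = address_bus[0] * 2 + address_bus[1]
    if enable:                # control bus says "read": RAM drives the data bus
        return ram[addr]
    if set_wire:              # control bus says "write": the CPU drives it
        ram[addr] = data_bus
    return None

print(ram_cycle([1, 0], enable=1, set_wire=0))              # Picked
ram_cycle([1, 1], enable=0, set_wire=1, data_bus="A Peck")  # write slot 3
print(ram_cycle([1, 1], enable=1, set_wire=0))              # A Peck
```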

2. Control Bus

This bus controls receiving and sending. Enable means "receiving"; set means "sending". You can also think of this as read or write. In my simple CPU above, lighting the enable wire transfers data from RAM to the CPU, while lighting the set wire writes data already processed by the CPU back to RAM.

3. Data Bus

This is where all the goods travel, in binary. I just picked some funny words to represent data because it seemed like a nice way to illustrate the point, but the data could be anything. Every kind of data out there has some interesting way of encoding it. For example, letters use a system called ASCII or Unicode to translate them to binary. Pictures use a variety of formats, like JPEG or PNG, etc. But it's all binary in a computer.
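You can see the letters-to-binary step for yourself in Python, which exposes each character's ASCII number and its bit pattern – those bits are what would actually travel the data bus:

```python
for ch in "Hi":
    # ord() gives the character's ASCII/Unicode number,
    # format(..., '08b') shows it as 8 binary digits.
    print(ch, ord(ch), format(ord(ch), '08b'))
# H 72 01001000
# i 105 01101001
```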