How to Build a Computer 19: Logic Gates
Welcome back to How to Build a Computer. You recall where we’re at, right? Hah! Trick question. As if I’d stick to a rational sequence. Today we’re going over some of the details of how you take electrical circuits doing whatever it is that electrical circuits do and turn that into logic. We’re talking logic gates.
Logic gates are transistor circuits that you can use to modify a signal. Let’s take the NOT gate as an example. If the input is on, the output is off. If the input is off, the output turns on. Whatever you put in, you get not-that coming out. Simple enough. Except the part where you’re creating energy out of nothing; what’s up with that? Well, not pictured, you’ve got a five-volt source and a grounded drain. When it looks like you’re creating energy out of nothing you’re actually stealing it from that source. Those details make laying out your circuits more complicated but can generally be ignored when you’re drawing logic gates. Heck, as long as I’m recycling my drawings, have another:
There are eight basic logic gates. To keep track of them you use something called a truth table: a little chart describing what you put in and what you get out. The NOT gate is easiest to understand. Here’s its truth table:
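The drawing doesn’t survive the trip to text, but the NOT table is easy enough to sketch in a few lines of Python (my own illustration, not the original figure):

```python
# A NOT gate flips its single input: 1 -> 0, 0 -> 1.
def NOT(a):
    return 1 - a

# Print the truth table: input on the left, output on the right.
for a in (0, 1):
    print(a, NOT(a))
```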
Actually, there’s one gate that’s simpler. It’s called a buffer. You ready for it?
So, a wire, right? I mean, if the output just matches the input then why the fancy symbol and all? Actually it’s not just a wire; typically these are built by sequencing two NOT gates together. The old double-negative. Fiendish! Why? It’s in the name ‘buffer’. It isolates one circuit from another. Keeps pesky physics things from interfering with your neat and clean logic. Speaking of which, let’s go over the other gates.
The rest of the gates transform two input signals into one output signal. Makes things more complicated. We’ll try an AND gate next:
Because you’ve got two input wires, each of which could be on (1) or off (0), you’ve got four possible combinations of ons and offs. An AND gate, as the name implies, only turns its output on when input A is on and input B is on. Makes sense. Slap a NOT gate on the end of that and you get a NAND gate (“not and,” in case you hadn’t figured that part out).
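For the curious, those four combinations spelled out in a quick Python sketch (mine, not the post’s drawing):

```python
# Two inputs, four combinations. AND is on only when both inputs are;
# NAND is the opposite (a NOT bolted onto an AND).
def AND(a, b):  return a & b
def NAND(a, b): return 1 - (a & b)

# Columns: A, B, AND, NAND
for a in (0, 1):
    for b in (0, 1):
        print(a, b, AND(a, b), NAND(a, b))
```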
As long as both inputs aren’t on at the same time the output will be on. I think you’re getting this, so I’m going to do the remaining four all at once.
If you turn either input on in an OR gate you’ll get the output on. It’s not picky. The XOR gate is on a diet; it’ll take either input, but if you offer it both it’s out. And again, if you slap a NOT gate on the front of either of those you get the opposite effect. Matter of fact, if you want the opposite effect on any of these gates you put one of those stylish little circles on the front.
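A quick Python sketch of those last four, same spirit as before (my illustration):

```python
# The remaining four gates: OR (either input), XOR (either but not both),
# and their negations NOR and XNOR.
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b
def NOR(a, b):  return 1 - (a | b)
def XNOR(a, b): return 1 - (a ^ b)

# Columns: A, B, OR, XOR, NOR, XNOR
for a in (0, 1):
    for b in (0, 1):
        print(a, b, OR(a, b), XOR(a, b), NOR(a, b), XNOR(a, b))
```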
Okay, we’ve got all eight logic gates. Could we make more? Sure, but the extras are either useless (two input wires, and whatever combination of on and off you feed them the output is always off; dead end) or they’re redundant. Lemme draw you a quick truth table:
A B | Out
0 0 |  0
0 1 |  1
1 0 |  0
1 1 |  1
That one’s different, right? No, not really. Actually, only the second input matters; when it’s on the result is on, and when it’s off the result is off. What you’re actually looking at here is a digital buffer on the second input, never mind the first. This one is a combination of redundant and useless, much like some people I could name.
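You can check that claim against the table above in a couple lines of Python (my sketch):

```python
# The "mystery" gate from the table just echoes its second input.
def mystery(a, b):
    return b  # a buffer on B; input A is ignored entirely

# Every row of the table matches input B exactly.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, mystery(a, b))
```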
Okay, the eight gates though. We can see how you’d be able to run electricity through them, and hey, we’re probably also good with actually figuring out the transistors for these gates. Just one thing: how do we actually get logic out? Join us next week when we cover that in “Crazy Eights and Even Crazier Eights” or “The Eights are on Prozac Now, They’re Feeling Much Better.”
This is part nineteen of my ongoing series on building a computer, the brandy old-fashioned way. You may find previous parts under the tag How to Build a Computer. This week’s post has been brought to you by Sam Elliott. Sometimes you eat the bar, and sometimes the bar, well, he eats you.
[First – Silicon] [Previous – Quantum Mechanics] [Next — Digital Watches]
Published in Science & Technology
This reminds me of a kids’ computer game from the mid-’80s, Robot Odyssey. It was fun but we never were able to beat it, probably because we were illogical kids.
Hank Rhody gives brains a good name, and it’s about time someone did. This man could take a handful of sand and turn it into a billion transistors that allow us to see Scarlett Johansson’s naked selfies. That’s a key evolutionary skill that must never be lost.
Seriously, what a series. Good for a membership all by itself. Everyone who writes about science or engineering should study Hank’s style. I know I do.
Ignorant question: I vaguely recall that neural networks were once considered to have the hopeless flaw of being unable to execute the Exclusive-OR (XOR) function. True? Or propaganda from the Von Neumann wing of history?
I know! I like these even though the only words I understand are “the” and “and.”
The different symbols you use to represent the different types of gates… are those by convention, or do they represent the physicality of how they are constructed? Or both? Or, I guess, neither?
From my perspective, this all falls into the category of ‘hardware problem’.
Those symbols are in the game I played so I wonder about that too.
We were promised no backdoor sneak to Boolean algebra, and there is no cheese… who moved our cheese?
-NYT review
If it isn’t illustrated with cheese, I’m not reading it.
How about thin slices of Wiener Schnitzel mit Ei, alternating with slices of pickle? Base/acid, should work.
Yeah, the symbols are by convention. If I were making it up I probably would have carved them out of cheese or something.
I don’t know why that would be. Neural networks move computation from binary to analog (between 0 and 1).
A few comments on gates. As implied, the negation gates (NOT, NOR, NAND) require fewer transistors. Gates also draw very little power when steady (just leakage) and a bit more with each transition. Gates also scale horizontally efficiently: NAND3 (3-input), NAND4, NAND5, …
Just this once, for you, pal.
I assume it’s kinda like a diode? The point is to make sure the current is only going in one direction?
At last I can add to this discussion! Because I found this clever life hack! You can thank me later!
Sort of. Transistors enforce the direction of current too, and any time you’re talking gates you’re going to have a lot of transistors hanging about. The reason given to me (and I’m not an expert in this; you might have better luck asking DonG or The Major) is that it functions to isolate one device from another. Semiconductors, after all, are only semi-conductors; they have a higher resistance than you’d expect out of a copper wire. You’d expect some attenuation over a number of devices. Since a NOT gate doesn’t actually transmit the incoming signal (it shunts a new one down from the 5V source), you don’t get the signal attenuation.
You should probably point out–just for functional completeness–that you don’t need 8 gates. Any function you want can be done with one or more NAND gates, or–if you’re out of those–one or more NOR gates. Simple logic. And fewer inventory headaches.
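The NAND-only construction is easy to verify in a few lines of Python (a quick sketch of the standard trick, not anything from the thread):

```python
# Functional completeness: build NOT, AND, and OR out of nothing but NAND.
def NAND(a, b): return 1 - (a & b)

def NOT(a):    return NAND(a, a)                    # tie both inputs together
def AND(a, b): return NOT(NAND(a, b))               # NAND, then invert
def OR(a, b):  return NAND(NOT(a), NOT(b))          # De Morgan in action

# Check every input combination against the native operators.
for a in (0, 1):
    for b in (0, 1):
        assert AND(a, b) == (a & b)
        assert OR(a, b) == (a | b)
```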
Myself I usually think in terms of AND, OR, and NOT. Gets the job done.
Sensible. It’s a difference between design notation and implementation. Reading (or drawing) a schematic with NAND (or NOR) equivalents would be a major pain in the ass. Unless you’re implementing the circuit in discrete transistors as our primitive ancestors did, you’re probably going to want to take advantage of the equivalence. If you’re manufacturing Programmable Logic Devices you are almost certainly going to do it as a sea of uniform gates on a chip and provide compilers for whatever notations your customers want. I think that’s how they do it, anyway.
There’s a similar thing in propositional calculus (Boolean algebra, zero-order logic; mathematicians just love changing up names for the same thing). A NOT and one of AND or OR gets you there, but you’re probably going to want a nice symbol for material implication (NOT a OR b) if you’re doing proofs. And also remember what the nice symbol really means.
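Sketched in Python, the implication really is just (NOT a) OR b (my illustration):

```python
# Material implication: "a implies b" is (NOT a) OR b.
def IMPLIES(a, b):
    return (1 - a) | b

# Only the 1 -> 0 row comes out false; every other row is true.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, IMPLIES(a, b))
```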
Hey!
Yeah hey!
Hey Hey Hey!
Hahaha Minnesooota
What little circuit work I did back in the late ’70s was with “discrete transistors”. I lost track so fast trying to follow the outputs when we started in on the integrated circuits. My “analog thinking” mind just rebelled at the magic that came from a string of ICs.
Forget Heisenberg, the world we live in is analog, not digital.
Go Newton!