Boolean Logic: How Does it Work?

      Computers are Boolean logic machines. By using

      • binary arithmetic
      • relational operators
      • bit string manipulation
      • binary decision diagrams

      computers can actually compute things. Not bad for something invented about 100 years before the first real-life computer.

      Binary Arithmetic

      Computers add using logic gates.

      In the case of computers and calculators, logic gates are used to make Half Adders. No, not a snake cut in half; computer scientists as a whole are against animal cruelty—even when it comes to the slithery kind. In the Key Parts section, we added binary numbers and included a carry in the next column over when adding 1₂ and 1₂—just like the carry digit you include when adding, say, 5₁₀ and 5₁₀.

      With the Half Adder, a computer can do the same exact thing with binary numbers. This time, though, you wouldn't need to add the numbers manually. The computer computes one bit at a time for each of the two numbers to be added. It's just like normal addition, when you line everything up by the tens place, except using the twos place instead.

      The truth table for a Half Adder has four columns:

      • Two for the inputs (bit 1 and bit 2). 
      • One for the Sum, which holds the 2⁰ place of the two-bit sum.
      • One for the Carry, which holds the sum for the 2¹ place.

      Because we're going one bit at a time, the Carry will only ever hold a value if we add 1₂ and 1₂. It'll add to 2₁₀, which means we need a new place (a carry, you could say). If that's confusing, Shmooper, head back to that Key Parts section to refresh yourself on addition in binary.

      Here's a table of what happens for each possible value combination of Bit 1 and Bit 2.

      Bit 1 | Bit 2 | Sum | Carry
        1   |   1   |  0  |   1
        1   |   0   |  1  |   0
        0   |   1   |  1  |   0
        0   |   0   |  0  |   0

      Looks a bit like a logic table, no?

      A half-adder built from logic gates looks something like this.
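      In code, those gates boil down to two operators: Xor for the Sum column and And for the Carry column. Here's a quick Python sketch (the function name is our own, just for illustration):

```python
def half_adder(bit1, bit2):
    """Add two single bits the way the gates do it."""
    sum_bit = bit1 ^ bit2  # Xor gate: 1 when exactly one input is 1
    carry = bit1 & bit2    # And gate: 1 only when both inputs are 1
    return sum_bit, carry

# Matches the truth table above: 1 + 1 gives Sum 0, Carry 1.
print(half_adder(1, 1))  # (0, 1)
print(half_adder(1, 0))  # (1, 0)
```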

      If you smash two half-adders together with an OR gate, you get a full-adder, which adds two bits together (and takes care of any Carry left over from additions before that point). All together, it's going to look like this:

      Bit 1 | Bit 2 | Carry In | Sum | Carry Out
        1   |   1   |    1     |  1  |     1
        1   |   1   |    0     |  0  |     1
        1   |   0   |    1     |  0  |     1
        1   |   0   |    0     |  1  |     0
        0   |   1   |    1     |  0  |     1
        0   |   1   |    0     |  1  |     0
        0   |   0   |    1     |  1  |     0
        0   |   0   |    0     |  0  |     0

      The Carry In holds any Carry from the previous addition. The Carry Out is any Carry that needs to go into the next addition. A logic gate diagram of all that looks like this.
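      In Python, the two half-adders and the OR gate look something like this sketch (function names are ours, not standard):

```python
def half_adder(bit1, bit2):
    return bit1 ^ bit2, bit1 & bit2  # (Sum, Carry)

def full_adder(bit1, bit2, carry_in):
    """Two half-adders plus an OR gate."""
    s1, c1 = half_adder(bit1, bit2)         # first half-adder adds the two bits
    sum_bit, c2 = half_adder(s1, carry_in)  # second one folds in the Carry In
    carry_out = c1 | c2                     # OR gate: a carry from either half-adder
    return sum_bit, carry_out

# Matches the top row of the table: 1 + 1 + 1 gives Sum 1, Carry Out 1.
print(full_adder(1, 1, 1))  # (1, 1)
```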

      Adding two two-bit numbers is going to need two full-adders to get the right answer. Guess how many it takes to add two three-bit numbers? Hint: it's three. For any pair of n-bit numbers, you're going to need n full adders.

      No matter how many adders you use, addition's going to go in the same order as base ten addition. The ones place goes first, and its Carry feeds into the next place. String enough full adders together and your computer can add some really big numbers, like those in the hundreds of thousands or millions—and do it in a fraction of a second.
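      String the full adders together in code and you get what's often called a ripple-carry adder, because each column's Carry ripples into the next. A minimal sketch (`ripple_add` is our own made-up name), with one full adder's worth of gate logic per column:

```python
def ripple_add(a_bits, b_bits):
    """Add two equal-length bit lists (most significant bit first)."""
    carry = 0
    result = []
    # Start at the ones place, just like base ten addition.
    for a, b in zip(reversed(a_bits), reversed(b_bits)):
        s1, c1 = a ^ b, a & b                   # first half-adder
        sum_bit, c2 = s1 ^ carry, s1 & carry    # second half-adder
        result.append(sum_bit)
        carry = c1 | c2                         # this Carry feeds the next column over
    result.append(carry)                        # a final Carry becomes a new top bit
    return list(reversed(result))

print(ripple_add([1, 0], [1, 0]))  # 2 + 2 = 4, which is [1, 0, 0] in binary
```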

      But enough about addition. Let's talk about strings of data.

      Bit String Manipulation

      You can do a lot more with bit strings (collections of bits) than just adding them together. Just off the top of our head, we can think of

      • changing all the 1s to 0s to clear out the data.
      • making sure that the first and last bits are both set to 1 for a specific encoding.
      • removing all the 0s just because.

      Most programming languages include what are called bitwise operations, which let you apply Or, And, or Xor to two bit strings.

      Bitwise operations let the computer set certain bits to 1 or 0 or flip the bits from 0 to 1 and vice versa (otherwise known as negating them). Don’t use them on binary numbers unless you want to permanently mess up the magnitude of the number, like changing 0101 (5₁₀) into 1010 (−2₁₀ in sign-magnitude notation).

      When you take the Or of two bit strings, it's going to put a 1 any place either of the strings has a 1 in that position. So if you have these strings:

      00101001 Or 10110100

      You're going to get:

      10111101

      That operation's great when you want to set the bits to 1.

      When you take the And of two bit strings, each bit will be 1 when both strings have a 1 in that place. Otherwise, it'll be a 0. If you have these two strings:

      00101001 And 10110100

      You're going to get:

      00100000

      Use it when you want to clear certain bits to 0, or mask off the bits you don't care about.

      The Xor function will turn a bit to 1 if one of the strings—but not both—has a 1 in that place. These numbers:

      00101001 Xor 10110100

      Are going to give:

      10011101

      This one's nice for flipping bits.
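      You can check all three of those results in Python, where `|`, `&`, and `^` are the bitwise Or, And, and Xor operators:

```python
a = 0b00101001
b = 0b10110100

# format(..., '08b') prints the result as an eight-bit binary string.
print(format(a | b, '08b'))  # 10111101 — Or: a 1 wherever either string has a 1
print(format(a & b, '08b'))  # 00100000 — And: a 1 only where both strings have a 1
print(format(a ^ b, '08b'))  # 10011101 — Xor: a 1 where exactly one string has a 1
```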

      After you know those basic operations, you can do an infinite number of complex things with the strings.

      Relational Operators

      Computers can evaluate mathematical propositions to look for equality (or lack thereof) between two numbers or variables. And they wouldn't be able to do it without Boolean logic. If a computer evaluates an expression as False, it's going to return a 0. If it finds the expression to be True, then it's going to return "Hero" by Enrique Iglesias.

      Just kidding; it's going to return 1.

      Statements like

      • 4 = 5 
      • 62 < 12 
      • 7 > 100

      are all going to return 0, just like in real-life math. Statements like

      • 10 ≤ 10
      • (20 + 5) = (24 + 1)
      • 3 > 0

      are all going to return 1s. (Sometimes just non-zero values in general; it kind of depends on the language and what you're doing.)
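      Python makes the connection explicit: its True and False are literally the integers 1 and 0 wearing name tags.

```python
# Python's Booleans are a subclass of int: True is 1 and False is 0.
print(int(4 == 5))            # 0
print(int(7 > 100))           # 0
print(int(10 <= 10))          # 1
print(int(20 + 5 == 24 + 1))  # 1
print(True == 1, False == 0)  # True True
```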

      Relational operators can control just about anything in a computer program by deciding whether

      • a specific piece of code should run.
      • a user gave the right input.
      • a loop of code should keep going.

      The relational operators themselves rely on some important properties of Boolean algebra:

      • Transitivity
      • Reflexivity
      • Symmetry

      Any time you want to alphabetize your vintage K-pop collection, you're actually using transitivity to do it. By saying that A comes before L and L comes before Q, you can generalize to say that A also comes before Q. It goes both ways: since Q comes after L and L comes after A, Q also comes after A.

      Every time you add two numbers together, you're relying on addition's symmetric property. An expression's symmetric if it doesn't matter what the order is. Writing 4 + 1 is the same as writing 1 + 4. Division would also be considered symmetric if 5 / 4 were the same thing as 4 / 5.

      But it isn't, so we don't need to worry about the implications.

      Reflexivity says that anything is always equal to itself. Compare your K-Pop collection to your K-Pop collection and you'll find the exact same albums every time, because they're literally the same collection.

      We're going to go ahead and assume that you got all of those albums legally.

      All of these properties may seem elementary, but they're crucial to a computer’s understanding of the world and the inputs it receives. Also for math.
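      All three properties are easy to poke at in Python; here's a quick sanity check:

```python
# Transitivity: 'A' comes before 'L', 'L' comes before 'Q', so 'A' comes before 'Q'.
assert 'A' < 'L' and 'L' < 'Q' and 'A' < 'Q'

# Symmetry of addition: order doesn't matter.
assert 4 + 1 == 1 + 4

# ...but division doesn't get that luxury.
assert 5 / 4 != 4 / 5

# Reflexivity: anything is equal to itself.
collection = ['Album 1', 'Album 2']
assert collection == collection
print('all properties check out')
```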

      Binary Decision Diagrams

      If you've ever seen one of these flowcharts, you were looking at a human version of a binary decision diagram. Each question has a Yes or No option (the English equivalent of True or False) that leads you to another Yes/No question. At the end you get an answer to the original question/statement at the top, like whether or not you understand flowcharts.

      Just like people use flowcharts to answer the big questions, computers use binary decision diagrams (BDDs) to represent Boolean functions and build the truth tables that answer their big questions. Their big questions being whether something is True or False.

      Binary decision diagrams are a kind of tree. Trees are diagrams used in computer science to show the relationships between objects that connect to other objects. A binary tree—which is what a BDD is—has no more than two nodes below every node. The node at the top of the tree is called the root. It's where the tree begins.

      In BDDs, the root is the same thing as the first variable in the Boolean function. Let's call it X for funsies. X splits into two different things: X (usually represented by a solid arrow) and ~X (represented by a dotted arrow). Both X and ~X each point to their own separate node for the next variable—let's call it Y. Each Y node splits just like X does (into Y and ~Y). This keeps going until the last variable can be evaluated.

      At the very bottom of the tree are 1 and 0 nodes, which represent the Truth or Falsehood of the proposition. By following the arrows that point to each of these last nodes, you can determine what combinations of X/~X and Y/~Y make the proposition True or False.

      The simple BDD of XY looks like this. From the BDD, a computer can make the matching truth table (or vice versa). Both will give you the same answer, though.
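      To make that concrete, here's one way to sketch the BDD of XY in Python. The tuple layout and names are our own invention, not a standard library: each internal node holds a variable name, the node its solid (True) arrow points to, and the node its dotted (False) arrow points to, with 1 and 0 as the leaves.

```python
# BDD for XY: if X is false, the whole thing is false; otherwise the answer is Y.
BDD_X_AND_Y = ('X',
               ('Y', 1, 0),  # solid arrow: X is true, so check Y
               0)            # dotted arrow: X is false, answer is 0

def evaluate(node, values):
    """Follow the arrows from the root down to a 1 or 0 leaf."""
    while isinstance(node, tuple):
        var, if_true, if_false = node
        node = if_true if values[var] else if_false
    return node

# Rebuilding the truth table from the diagram:
for x in (0, 1):
    for y in (0, 1):
        print(x, y, evaluate(BDD_X_AND_Y, {'X': x, 'Y': y}))
```

Only the bottom row (X = 1, Y = 1) reaches the 1 leaf, exactly as the truth table for And says it should.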