Boolean Algebra

The History of Computing - A podcast by Charles Edge

Welcome to the History of Computing Podcast, where we explore the history of information technology. Because understanding the past prepares us to innovate (and sometimes cope with) the future! Today we're going to talk a little about math. Or logic. Computers are just a bunch of zeroes and ones, right? Binary. They make shirts about it. You know, there are 10 types of people in the world. But where did that come from?

After centuries of trying to build computing devices that could help with math using gears that had lots of slots in them, armed with tubes and then transistors, we had to come up with a simpler form of logic. And why write your own complicated math when you can borrow it and have instant converts to your cause? Technical innovations are often built from blocks borrowed from different fields of scientific or scholastic study. The 0s and 1s that make up the flip-flop circuits computers are so famous for are made possible by the concept that all logic can be broken down into either true or false.

And so the mathematical logic that trillions of dollars in industry have been built on began in 1847 with a book called The Mathematical Analysis of Logic, by George Boole. He followed that up with An Investigation of the Laws of Thought in 1854. He was the father of what we would later call Boolean algebra, once the science of an entire mathematical language built on true and false had matured enough that Charles Sanders Peirce could write The Simplest Mathematics and a piece titled Boolian Algebra with One Constant. By 1913, there were many more works using the name, and it became Boolean algebra. This was right around the time the electronics research community first started experimenting with using vacuum tubes as flip-flop switches.

So there's elementary algebra, where you can have any old number with any old operation. Those operators can be addition, subtraction, multiplication, division, and so on. But in Boolean algebra, the only values a variable can take are 0 and 1. Later we would get abstract algebra as well, but for computing it was far simpler to just stick with those 0s and 1s, and in fact, ditching the gears of the old electromechanical computing paved the way for tubes to act as flip-flop switches, and then for transistors to replace those.

And the evolutions came, both in the efficiency of flip-flop switches and in the increasingly complex uses for mechanical computing devices. But they hadn't all been mashed up together yet. Set theory and statistics were evolving. And Huntington, Jevons, and Schröder basically perfected Boolean logic, paving the way for M. H. Stone to prove, in 1936, that every Boolean algebra is isomorphic to a field of sets.

And so it should come as no surprise that Boolean algebra would be key to the development of the basic mathematical functions used in the Atanasoff-Berry Computer. Remember that back then, all computing was basically used for math. Claude Shannon would help apply Boolean algebra to switching circuits, and later, binary decision diagrams would be used for synthesizing and verifying the design of logic circuits. And so we could analyze and design circuits using algebra to define logic gates. Those gates would get smaller and faster and be combined using combinational logic until we got LSI circuits and, later, with the automation of chip design, VLSI.

So to put it super-simple, let's say you're trying to do some maths. First up, you convert your values to bits, which are binary digits, each represented as a 0 or a 1. There's a substantial amount of information you can pack into those bits, with all the major characters easily fitting into a byte, which is 8 of those bits. So let's say you also map your algebraic operators onto 0s and 1s, another byte. Now you can add the numbers in those bytes. To do so, though, you basically need to translate the notation of classical propositional calculus into its expression in Boolean algebra, typically done in an assembler. Much, much more logic is required to apply quantifiers. The simple truth values are just 0 and 1, but small truth tables define AND (also known as a conjunction), OR (also known as a disjunction), NOT (a negation), and XOR (an exclusive or). This allows for an exponential increase in the amount of logic you can apply to a problem.
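To make that concrete, here's a minimal sketch in Python. It's not from the episode itself; the gate names and the add_bytes helper are just illustrative. It shows gates defined as literal truth tables and then combined into a circuit that adds two bytes, bit by bit:

```python
# Gates written as literal truth tables: an input pair maps to an output bit.
AND_TABLE = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}  # conjunction
OR_TABLE  = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 1}  # disjunction
XOR_TABLE = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}  # exclusive or

def AND(a, b): return AND_TABLE[(a, b)]
def OR(a, b):  return OR_TABLE[(a, b)]
def XOR(a, b): return XOR_TABLE[(a, b)]
def NOT(a):    return {0: 1, 1: 0}[a]   # negation: flips a single bit

def to_bits(n, width=8):
    """Convert a number to a list of bits, least significant bit first."""
    return [(n >> i) & 1 for i in range(width)]

def full_adder(a, b, carry_in):
    """Add two bits plus a carry bit, using only the gates above."""
    partial = XOR(a, b)
    total = XOR(partial, carry_in)
    carry_out = OR(AND(a, b), AND(carry_in, partial))
    return total, carry_out

def add_bytes(x, y):
    """Ripple-carry addition of two bytes, one bit at a time."""
    carry, result = 0, 0
    for i, (a, b) in enumerate(zip(to_bits(x), to_bits(y))):
        s, carry = full_adder(a, b, carry)
        result |= s << i
    return result

print(add_bytes(42, 27))  # prints 69, computed entirely from truth tables
```

Chain eight of those one-bit full adders together and you have a byte-wide adder, which is exactly the kind of combinational logic that LSI and VLSI circuits are built up from.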
The act of deciding whether a given Boolean formula can be satisfied, that is, whether some assignment of trues and falses makes it come out true, is known as the Boolean satisfiability problem, or SAT. At this point, though, most problems really do seem solvable under some model of computation, given the amount of complex circuitry we now have. So the computer interprets the input, the functions set the state of each switch accordingly, and the computer then combines all those trues and falses into the necessary logic and outputs an answer. Because keying in raw 0s and 1s took too much time, input got moved to punch cards, and modern programming was born. These days we can also push Boolean logic up into higher functions, such as using AND in Google searches.

So ultimately the point of this episode is to explore what exactly all those 0s and 1s are. They're complex thoughts and formulas, expressed as true and false and constructed with Boolean algebra. Now, there's a chance that some day we'll find something beyond the transistor. And then we can bring a much more complicated expression of thought, broken down into different forms of algebra. But there's also the chance that Boolean algebra, sitting on transistors or whatever the next evolution of Boolean gates or transistors turns out to be, is really, well, kinda' it.

So from the Atanasoff-Berry Computer came Colossus and then ENIAC in 1945. It wasn't obvious yet, but nearly 100 years after the development of Boolean algebra, it had been combined with several other technologies to usher in the computing revolution, setting up the evolution to microprocessors and the modern computer. These days, few programmers are constrained to programming in Boolean logic. Instead, we have many more options. Although I happen to believe that understanding this fundamental building block was one of the most important aspects of studying computer science, and that it provides an important foundation for computing in general.

So thank you for listening to this episode. I'm sure algebra got ya' totally interested and that you're super-into math. But thanks for listening anyways. I'm pretty lucky to have ya'. Have a great day!