Boolean Algebra
Part of Boolean Logic and Gates
Boolean algebra is the mathematical foundation of digital logic — a formal system for reasoning about two-valued variables that enables systematic circuit analysis, simplification, and proof.
Why This Matters
Boolean algebra turns circuit design from an art into a science. Without it, complex logic circuits must be designed by intuition and verified by exhaustive testing. With Boolean algebra, any logic function can be expressed as an equation, manipulated according to precise rules, simplified to use fewer components, and proved equivalent to another implementation — all on paper before any hardware is built.
George Boole published his algebra of logic in 1854. Claude Shannon connected it to electrical circuits in his 1937 master’s thesis, proving that the algebra of relay switching circuits is isomorphic to Boolean algebra. That single proof made digital computing theoretically grounded: any Boolean function has a physical realization, and any physical switching circuit implements some Boolean function.
For practical circuit builders, Boolean algebra is the tool for minimizing gate count, identifying equivalent circuit transformations, and deriving NAND-only or NOR-only implementations from mixed-gate designs.
The Algebraic System
Boolean algebra operates on the set {0, 1} with three operations: AND (·), OR (+), and NOT (overbar). These obey a small set of axioms, each verifiable by checking every combination of input values.
Commutativity: order of operands doesn’t matter.
- A + B = B + A
- A · B = B · A
Associativity: grouping doesn’t matter.
- (A + B) + C = A + (B + C)
- (A · B) · C = A · (B · C)
Distributivity: two forms (unlike ordinary algebra, where only multiplication distributes over addition, each Boolean operation distributes over the other).
- A · (B + C) = A·B + A·C (AND distributes over OR — like ordinary algebra)
- A + (B · C) = (A+B) · (A+C) (OR distributes over AND — unique to Boolean algebra)
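The second distributive form is the one with no arithmetic counterpart, so it is worth checking by brute force. A minimal sketch in Python, treating 0/1 integers as Boolean values:

```python
from itertools import product

# Exhaustively check the OR-over-AND distributive law, which has no
# counterpart in ordinary arithmetic: A + (B·C) = (A+B)·(A+C).
for a, b, c in product([0, 1], repeat=3):
    lhs = a | (b & c)
    rhs = (a | b) & (a | c)
    assert lhs == rhs, (a, b, c)
print("OR distributes over AND: verified for all 8 input combinations")
```

Eight cases cover every possibility, which is exactly why truth-table proofs are valid in a two-valued algebra.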
Identity: operations with 0 and 1.
- A + 0 = A (OR with false is identity)
- A · 1 = A (AND with true is identity)
- A + 1 = 1 (OR with true absorbs to true)
- A · 0 = 0 (AND with false absorbs to false)
Complement: a variable and its negation.
- A + NOT A = 1 (a variable OR its complement is always 1)
- A · NOT A = 0 (a variable AND its complement is always 0)
- NOT(NOT A) = A (double negation)
These axioms are sufficient to derive all other theorems.
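Because the domain is just {0, 1}, each axiom can be confirmed mechanically. A sketch that encodes several of the laws above as Python lambdas and checks them over all inputs (the names in the table are for readability only):

```python
from itertools import product

# The axioms are small enough to verify by brute force: check every
# assignment of 0/1 to the variables (a truth-table proof).
axioms = [
    ("commutativity of OR",   lambda a, b: (a | b) == (b | a)),
    ("commutativity of AND",  lambda a, b: (a & b) == (b & a)),
    ("identity A + 0 = A",    lambda a, b: (a | 0) == a),
    ("identity A · 1 = A",    lambda a, b: (a & 1) == a),
    ("absorb   A + 1 = 1",    lambda a, b: (a | 1) == 1),
    ("absorb   A · 0 = 0",    lambda a, b: (a & 0) == 0),
    ("complement A + Ā = 1",  lambda a, b: (a | (1 - a)) == 1),
    ("complement A · Ā = 0",  lambda a, b: (a & (1 - a)) == 0),
]
for name, law in axioms:
    assert all(law(a, b) for a, b in product([0, 1], repeat=2)), name
print("all axioms verified")
```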
Important Derived Theorems
Idempotent laws: A + A = A; A · A = A. Repeating an OR or AND with itself changes nothing.
Absorption laws: A + (A·B) = A; A · (A+B) = A. In each case the compound term is redundant: A·B can only be 1 when A already is, and A+B is always 1 whenever A is. These are among the most useful simplification identities.
De Morgan’s laws:
- NOT(A · B) = NOT A + NOT B
- NOT(A + B) = NOT A · NOT B
De Morgan’s laws are perhaps the most frequently used Boolean theorems. They show how negation distributes through AND and OR, converting each to the other. Applications:
- Converting AND-OR networks to NAND-NAND (push inversions inward, convert ORs to ANDs with double inversion)
- Converting OR-AND to NOR-NOR
- Deriving NAND-only or NOR-only circuits from any mixed design
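Both the laws and the AND-OR to NAND-NAND conversion they justify can be checked exhaustively. A sketch, using a two-variable SOP expression A·B + C·D as the example function:

```python
from itertools import product

# De Morgan's laws, checked over all inputs, plus the NAND-NAND
# conversion they justify: A·B + C·D == NAND(NAND(A,B), NAND(C,D)).
nand = lambda x, y: 1 - (x & y)

for a, b in product([0, 1], repeat=2):
    assert (1 - (a & b)) == ((1 - a) | (1 - b))   # NOT(A·B) = Ā + B̄
    assert (1 - (a | b)) == ((1 - a) & (1 - b))   # NOT(A+B) = Ā · B̄

for a, b, c, d in product([0, 1], repeat=4):
    sop = (a & b) | (c & d)                # two-level AND-OR
    nn  = nand(nand(a, b), nand(c, d))     # same function, NAND-only
    assert sop == nn
print("De Morgan and NAND-NAND equivalence verified")
```

The outer NAND undoes the inner inversions by De Morgan, which is why the two-level structure survives the conversion unchanged.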
Consensus theorem: A·B + NOT A·C + B·C = A·B + NOT A·C. The term B·C is redundant: whenever B·C = 1 (so B = 1 and C = 1), either A = 1, making A·B = 1, or A = 0, making NOT A·C = 1. So B·C never produces a 1-output that the first two terms don't already cover, and removing it simplifies the circuit.
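The covering argument above can be confirmed directly by comparing both sides over all eight input combinations:

```python
from itertools import product

# Consensus theorem: A·B + Ā·C + B·C  ==  A·B + Ā·C  (B·C is redundant).
for a, b, c in product([0, 1], repeat=3):
    with_consensus    = (a & b) | ((1 - a) & c) | (b & c)
    without_consensus = (a & b) | ((1 - a) & c)
    assert with_consensus == without_consensus, (a, b, c)
print("consensus theorem verified")
```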
Canonical Forms
Any Boolean function can be expressed in two canonical (standard) forms:
Sum of Products (SOP): OR of AND terms (minterms). Each AND term (product) contains every variable either complemented or uncomplemented. Write one minterm per row where output=1 in the truth table.
For F = 1 at inputs (A,B,C) = (0,1,0), (1,0,1), (1,1,1): SOP = Ā·B·C̄ + A·B̄·C + A·B·C
Product of Sums (POS): AND of OR terms (maxterms). Each OR term contains every variable. Write one maxterm per row where output=0.
For the same F, output=0 at (0,0,0), (0,0,1), (0,1,1), (1,0,0), (1,1,0): POS = (A+B+C)·(A+B+C̄)·(A+B̄+C̄)·(Ā+B+C)·(Ā+B̄+C)
SOP maps naturally to two-level AND-OR (or NAND-NAND) circuits. POS maps to two-level OR-AND (or NOR-NOR) circuits. Choosing which is smaller depends on the function — some functions are more compact in SOP, others in POS.
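The construction of both canonical forms can be sketched in code for the example function above. The helper names (`sop`, `pos`, `ones`) are illustrative, not standard terminology:

```python
from itertools import product

# The example function: F = 1 at (A,B,C) = (0,1,0), (1,0,1), (1,1,1).
ones = {(0, 1, 0), (1, 0, 1), (1, 1, 1)}
F = lambda a, b, c: int((a, b, c) in ones)

def sop(a, b, c):
    # OR of minterms: one AND term per row where F = 1.
    # A literal is uncomplemented where the row bit is 1, complemented where 0.
    return int(any(all(v if bit else 1 - v for v, bit in zip((a, b, c), row))
                   for row in ones))

def pos(a, b, c):
    # AND of maxterms: one OR term per row where F = 0.
    # A literal is uncomplemented where the row bit is 0, complemented where 1.
    zeros = [row for row in product([0, 1], repeat=3) if row not in ones]
    return int(all(any(v if not bit else 1 - v for v, bit in zip((a, b, c), row))
                   for row in zeros))

for a, b, c in product([0, 1], repeat=3):
    assert sop(a, b, c) == pos(a, b, c) == F(a, b, c)
print("SOP and POS both realize F")
```

Note the duality in the literal rule: minterms complement where the row bit is 0, maxterms where it is 1.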
Simplification by Boolean Algebra
Manual simplification applies theorems to reduce the number of terms and literals. Example:
F = A·B + A·B·C + A·NOT B + A·NOT B·C
Step 1: Absorption: A·B + A·B·C = A·B (A·B absorbs A·B·C), giving F = A·B + A·NOT B + A·NOT B·C
Step 2: Absorption again: A·NOT B + A·NOT B·C = A·NOT B, giving F = A·B + A·NOT B
Step 3: Factor out A: F = A·(B + NOT B)
Step 4: Complement law: B + NOT B = 1, so F = A·1 = A
The function simplifies to just A — all that complexity collapses to a single variable. This is the power of Boolean algebra: verifiable algebraic proof that a complex circuit and a single wire are logically identical.
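The claimed equivalence is easy to double-check by evaluating both the original four-term expression and the simplified result over all inputs:

```python
from itertools import product

# Check the algebraic simplification: the four-term expression
# F = A·B + A·B·C + A·B̄ + A·B̄·C collapses to just A.
for a, b, c in product([0, 1], repeat=3):
    nb = 1 - b  # NOT B
    f = (a & b) | (a & b & c) | (a & nb) | (a & nb & c)
    assert f == a, (a, b, c)
print("F simplifies to A: verified")
```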
In practice, Karnaugh maps (a visual grouping tool) are faster than algebraic manipulation for functions up to 5–6 variables. For larger functions or automated design, the Quine–McCluskey algorithm or modern EDA (Electronic Design Automation) tools perform algorithmic minimization. But understanding the algebraic basis makes all of these tools comprehensible rather than magical.