Logic Gates


Abstractions

In this lesson we will look at some additional examples of how the Big Idea of abstraction is used in computing. We will focus on low-level hardware abstractions, in particular, on logic gates, the fundamental computational building blocks of electronic circuits. We'll take a first look "under the hood," so to speak, to see how computers process binary information.

Integrated Circuits

In order to have the modern computer age, in which people can carry extremely powerful computers in their pockets (their cell phones), circuits had to become much smaller and cheaper. The invention of the integrated circuit made this possible. Integrated circuits combine ("integrate") millions or billions of very tiny electrical parts (transistors, resistors, capacitors, and others) packaged into a small plastic box.

An integrated circuit ("IC" or "chip") is a single physical device that contains millions or billions of basic electrical parts. A processor is an IC, but not all ICs are processors; there are also special-purpose chips inside a computer.

This is the 64-bit Intel Core i7-8700K, © 2017 Intel

The fundamental enabling technology for the computer age was the transistor. In digital circuits, transistors are used as switches.

Transistors

A wire can either have a voltage or not have a voltage on it. The reality is more complicated. The on-or-off picture of a wire, a transistor, or a logic gate output is a simplification—an abstraction.

This is a rough graph of the actual input-output behavior of a transistor. Don't worry about the details; just notice the two blue flat parts of the graph. Within the "cutoff" region, small changes to the input voltage do not change the output voltage at all; the output is always zero volts. Likewise within the "saturation" region, small input changes hardly impact output voltage; the output is interpreted as a one. This is how transistors are used as switches in a computer.

Transistors are versatile devices. When used in the middle, linear (pink) part of the graph, they're amplifiers; a small variation in input voltage produces a large variation in output voltage. That's how they're used to play music in a stereo.

The transistor is the fundamental building block of electronic circuits, where they are used as on/off switches.

The lower region of the curve is called "cutoff" because the transistor's output is cut off (the output is zero) for any input in that region. The upper region is called "saturation" because, like a sponge that can't get any wetter, the transistor can't give more output no matter how big the input. The central region is called "linear" because it behaves like a linear (straight) function.
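The way a digital circuit reads a voltage as a 0 or a 1 can be sketched in a few lines of Python. This is only an illustrative model: the threshold voltages below (0.8 V and 2.0 V) are made-up values for the sake of the example, not the specifications of any real transistor.

```python
# Illustrative model of reading a transistor's output digitally.
# The thresholds are assumed example values, not real device specs.
CUTOFF_MAX = 0.8      # at or below this, the output reads as 0
SATURATION_MIN = 2.0  # at or above this, the output reads as 1

def digital_value(output_volts):
    """Interpret an output voltage as a digital 0 or 1.

    Voltages in the linear region between the thresholds are not
    valid digital values, so we return None for them.
    """
    if output_volts <= CUTOFF_MAX:        # cutoff region
        return 0
    elif output_volts >= SATURATION_MIN:  # saturation region
        return 1
    else:                                 # linear region
        return None

print(digital_value(0.1))  # 0
print(digital_value(4.5))  # 1
```

Circuit designers' job, in this picture, is to make sure every wire settles into one of the two flat regions, so that `digital_value` never has to answer `None`.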

The digital domain is not a law of nature; circuit designers have to work at ensuring that each wire in a circuit is always either fully on or fully off. The digital domain is an abstraction.

Moore's Law

In 1965, Gordon Moore, one of the pioneers of integrated circuits, predicted that the number of transistors that could be fit on one chip would double every year. In 1975, he revised his estimate to doubling every two years. This prediction is known as Moore's Law.

It turns out that other important measurements have also shown roughly the same doubling behavior, such as processor speed and the amount of memory that fits in a computer. Doubling hardware speed increases the size of the problems that you can efficiently handle.

The importance of Moore's Law isn't just that computers get bigger and faster over time; it's that engineers can predict how much bigger and faster, which helps them plan the software and hardware development projects to start today, for use five years from now.
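The doubling arithmetic behind such predictions is easy to sketch in Python. The starting transistor count and time span below are made-up illustrative numbers, not historical data:

```python
# Moore's Law sketch: the count doubles every two years.
def projected_transistors(start_count, years):
    """Project a transistor count forward, doubling every 2 years."""
    doublings = years // 2
    return start_count * 2 ** doublings

# A hypothetical chip with one million transistors today would be
# projected to hold 32 times as many in ten years (5 doublings).
print(projected_transistors(1_000_000, 10))  # 32000000
```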

Limitations to Moore's Law

For transistor counts to keep growing, the size of a transistor must keep getting smaller. But chip density and processor speed have run up against an important limit: denser chips and faster signal processing both generate increased heat. Current technology is right at the edge of generating enough heat to melt the chips, destroying the computer. This is why processor chips are surrounded by metal heat sinks, which conduct heat away from the chip and into the air.

Logic Gates - Video

Logic Gates - Slides

D03-Hardware Abstractions: Gates and Logic

Truth Tables

Given two inputs, A and B, a truth table displays the result of the given operation for every possible pair of A/B values: TT, TF, FT, FF.
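Those four rows can also be generated in Python. This small sketch prints the AND and OR columns for every input pair, in the same TT, TF, FT, FF order:

```python
import itertools

# Enumerate all four input pairs: (T,T), (T,F), (F,T), (F,F).
for a, b in itertools.product([True, False], repeat=2):
    print(a, b, a and b, a or b)
```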



Make a copy of this spreadsheet, and toggle the two checkboxes for A and B to simulate the different logic gates.



AP CSP Pseudocode Logical Operators

In Python and in the AP CSP pseudocode, the logical operators AND, OR, and NOT can be used to combine boolean expressions, and they behave the same way that the AND, OR, and NOT logic gates behave in computer hardware. The exam reference sheet provides definitions for these logical operators, where the condition can be a single boolean value or a boolean expression made up of other values and operators.
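A quick illustration of these operators in Python, which spells them in lowercase (`and`, `or`, `not`):

```python
a, b = True, False

print(a and b)  # False: AND is true only when both inputs are true
print(a or b)   # True:  OR is true when at least one input is true
print(not a)    # False: NOT flips its single input

# The operands can themselves be boolean expressions:
x = 7
print(x > 0 and x < 10)  # True
```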


OR confusion

The word OR has different meanings in the two sentences "Would you like soup or salad?" and "This insurance covers you in case of accident or illness." Which meaning corresponds to the Boolean OR gate?

The first sense of OR (soup or salad: one or the other, but not both) is known as exclusive OR, and the second sense (accident or illness: either one, or both) is known as inclusive OR. Inclusive OR is the same as Boolean OR, which is what is used in Python and most other programming languages.
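In Python, inclusive OR is the `or` keyword. There is no exclusive-OR keyword, but for boolean values the `!=` comparison (or the `^` operator) gives exclusive-OR behavior, since it is true exactly when the two inputs differ. A small sketch comparing the two:

```python
# Compare inclusive OR (or) with exclusive OR (!=) on all input pairs.
# They differ only in the True/True row.
for a in (True, False):
    for b in (True, False):
        print(a, b, a or b, a != b)
```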

Still Curious?

Still curious about logic gates? There is much written about logic gates and lots of material available online.