Electrons, Units, and Semiconductors
CS 441/641 Lecture, Dr. Lawlor
A Brief History of Computing, starting in 150 BC
Folks have been using physical devices to perform computations for a long time.
Mechanical Devices
- 150 BC: Greeks built clockwork-like chains of gears, such as the Antikythera mechanism, to predict astronomical events such as eclipses, and to measure time and convert between calendars. (This tradition of geared devices is often traced back to Archimedes, who died in 212 BC.)
- 1640's: Blaise Pascal built a series of adding machines, which used hand-cranked cogs to add (similar to a car's mechanical odometer); to subtract, via complement arithmetic; or to multiply, via repeated addition.
- 1820's: Charles Babbage designed (but never built) a fully-mechanical polynomial evaluator, the difference engine, via the method of finite differences. He also started work on a fully programmable model, the analytical engine, but building the thing with rod logic would have taken a huge amount of labor.
- 1948: CURTA, a mass-produced fully-mechanical pocket calculator.
- 1949: MONIAC, a hydraulic computer, models the United Kingdom's national economy using water.
- 1950's: the automatic transmission, a hydraulic computer, becomes cheap enough for ordinary people to buy.
Of course, there are serious limitations to mechanical devices: it's hard to even turn a corner with a rotating axle, just like it's hard to make a leak-free joint with liquids. One huge advantage of electronics is that wires are very easy to route, bend, and join.
Electromechanical Devices
- 1890: Herman Hollerith's patented electromechanical tabulator (mercury switches and relays) counts up the punched cards that represent the 1890 census results; his Tabulating Machine Company later merged into what became IBM. The 1891 Electrical Engineer raved: "This apparatus works unerringly as the mills of the gods, but beats them hollow as to speed."
- 1941: Konrad Zuse builds the world's first fully-programmable computer, the Zuse Z3. Sadly, it used scavenged telephone switching relays, and was built in wartime Germany, so it was ignored for years.
Fully Electronic: Vacuum Tubes and Transistors
- 1944: John von Neumann proposes using the same memory to store both program and data, a design now known as a "von Neumann machine". Previous designs used separate memories for program and data, known as the "Harvard architecture".
- 1946: ENIAC, the first vacuum-tube electronic automatic computer, built for the US military. ENIAC is fully programmable. Vacuum tubes can switch in nanoseconds, like transistors, rather than milliseconds, like relays.
- 1957: IBM releases Fortran, the first successful programming language. Prior to Fortran, machines were typically programmed using a soldering iron, patch cables, machine code, or assembly.
- 1960's: IBM's System/360, which adds microcode and binary backward compatibility using those newfangled transistors.
- 1964: Seymour Cray's CDC 6600 achieves amazing performance using superscalar processing, caching, newfangled transistors, liquid cooling, and offloading I/O to dedicated "peripheral processors", which were hyperthreading-style barrel processors.
Integrated Circuits
- 1971: Upstart Intel creates a single-chip CPU, the 4004, which computes 4-bit values at up to 0.74MHz, 0.1MIPS. 2,300 transistors.
- 1972: HP-35, the first electronic pocket calculator good enough to replace the slide rule, for only $395.
- 1972: Intel's 8008, 8-bit values at up to 0.5MHz. 3,500 transistors.
- 1978: Intel's 8086, 16-bit values at up to 10MHz, 1 MIPS. 29,000 transistors. Instruction set is "x86", still in use today!
- late 1970's: "micro" digital computers, like the Apple I, become cheap enough for dedicated hobbyists to buy and solder together.
- 1981: digital computers, like the IBM PC, become cheap enough for ordinary people to buy pre-assembled. The notion of selling software is popularized by the upstart "Micro-soft" corporation.
- 1984: Apple releases a 32-bit personal computer, the Mac 128K.
- 1985: Intel's 80386, 32-bit values at up to 40MHz. 10 MIPS. 275,000 transistors.
- 1985: The notion of specialized hardware for graphics is popularized by Silicon Graphics corporation. RISC instruction sets are pushed by MIPS corporation.
- 1990: IBM introduces a superscalar RISC processor, POWER, in its RS/6000 workstations; it later evolves into the PowerPC used in personal computers.
- 1994: Intel releases a 100MHz Pentium (P5) CPU. 120 MIPS (superscalar). 3 million transistors.
- 1990's: graphics hardware for personal computers takes off with GLQuake and other 3D games.
- 2000: Intel releases a 1 GHz Pentium III CPU. 1000+ MIPS. 10 million transistors.
- 2002: Intel releases a 3 GHz Pentium 4 CPU, with hyperthreading. 55 million transistors.
- 2002: Graphics cards become programmable in assembly language (ARB_fragment_program), and support dozens of threads.
- 2003: NVIDIA releases "Cg", a C++-like language for programming graphics cards. Limitations include a single write per program.
- 2003: AMD corporation introduces chips with a 64-bit extension to the x86 instruction set, which Intel later adopts.
- 2004: Intel abandons plans for a 4GHz Pentium 4 chip.
- 2006: Intel releases dual-core and quad-core CPUs at around 2GHz. The great multithreaded programming model panic begins.
- 2007: Intel announces "V8" eight-core systems. Transistor counts reach billions.
- 2007: NVIDIA releases CUDA, a very C-like language for programming graphics cards for non-graphics tasks. Supports arbitrary reads and writes.
- 2008: Graphics hardware now supports between thousands and millions of threads, and uses billions of transistors.
Electrical Units
OK, so we're going to use electricity. That means we need to know the units used.
Electronics       | Plumbing
------------------|------------------
Electrons         | Water
Wire              | Pipe
Battery           | Pump
Resistor          | Clog
Voltage (volts)   | Pressure (psi)
Current (amps)    | Flow rate (gpm)
Capacitor         | Accumulator Tank
Transistor        | Valve
Electron: it's a fundamental particle with negative charge (drawn as e-). As far as we know, there's nothing inside an electron (other than... electron!). Electrons are the charge carriers in most solid-state conductors. For example, metals conduct electricity because they don't mind trading their spare "valence" electrons with their neighbors.
Current: one amp is a coulomb of moving electrons per second (6.24x10^18 e-/sec). Typical microcontroller output signals are measured in milliamps; on-chip logic signals might only be a few picoamps. Typical wall plug currents are up to a few dozen amps, and typical arc welding current is about a hundred amps. A desktop PC CPU also typically uses about a hundred amps!
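To get a feel for these magnitudes, here's a small C sketch (the sample currents are just illustrative) converting amps into electrons per second:

    #include <stdio.h>

    int main(void) {
        const double E_PER_COULOMB = 6.24e18;   /* electrons in one coulomb */
        /* illustrative currents: on-chip picoamp, milliamp signal,
           one amp, and a ~100 A CPU or arc welder */
        const double amps[] = { 1e-12, 1e-3, 1.0, 100.0 };
        for (int i = 0; i < 4; i++)
            printf("%g A = %g electrons/sec\n", amps[i], amps[i] * E_PER_COULOMB);
        return 0;
    }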
Voltage: measures how much electrons want to be somewhere, in volts. Electrons will try to flow from a lower voltage to a higher voltage region (conventional current is defined to flow the other way, from high voltage to low). We define the planet Earth's voltage as "ground", or zero volts, and connect to it with a metal rod stuck into the dirt (usually outside near your electrical pole!). Back in the 1980's, it was common to use 5 volts direct current (5vdc) to represent "true", and 0v as false; in the 1990's designs switched to 3.3v == true; now most desktop CPUs use only about 1v == true internally, to save power. For example:
- PC power supply, black wire = 0vdc
- PC power supply, red wire = 5vdc
- PC power supply, yellow wire = 12vdc
- PC power supply, orange wire = 3.3vdc
- One AA battery = 1.5vdc
- Lead-acid car battery = 12vdc
- A typical Fairbanks electrostatic charge = 10,000v or more(!)
Resistance: V = I R (volts = amps * ohms). A 1 ohm resistor will drop 1 volt when conducting 1 amp of current; a 10 ohm resistor will drop 10 volts when conducting one amp (the sketch after the list below walks through this arithmetic). Ohm's law is more of a guideline than a law: it assumes linearity, which is only valid for "resistive" materials (OK for most metals; poor for most insulators, liquids, or semiconductors). The beautiful part about semiconductors is that their resistance can be varied electronically (typically depending on nearby voltages). For example:
- Typical wire resistance is milliohms.
- A typical "pull-up" logic resistor is kilohms.
Power: P = I V (watts = amps * volts, at least for DC circuits). A 0.1 ohm piece of wire will drop 1 volt if you push 10 amps through it, which takes 10 watts: the wire might get fairly warm, but will still be there. The same wire carrying 100 amps will drop 10 volts, so must dissipate 1000 watts: the wire is going to feel some serious heat (see the sketch after this list). Generally:
- Anything over a megawatt needs a dedicated liquid cooling *tower*.
- Anything over a kilowatt is going to glow red to white hot without dedicated liquid cooling, like a radiator. Your car is a few hundred kilowatts. Ordinary personal computers haven't hit this point yet, although supercomputers and data centers have been liquid cooled since the Cray-2 in 1985.
- Air cooling with a heatsink works well between a few dozen and a few hundred watts. This is the thermal regime of most modern PCs.
- Air cooling without a heatsink is fine below a dozen or so watts. This is the design range of most cellphone and some portable processors.
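Combining P = I V with Ohm's law reproduces the wire-heating numbers above; a little C sketch:

    #include <stdio.h>

    /* Power dissipated in a resistance: P = I * V = I * (I * R) */
    double wire_watts(double amps, double ohms) {
        double volts = amps * ohms;   /* Ohm's law: drop across the wire */
        return amps * volts;          /* DC power: watts = amps * volts */
    }

    int main(void) {
        printf("0.1 ohm wire at 10 A dissipates %g W\n",  wire_watts(10.0, 0.1));  /* 10 W: warm */
        printf("0.1 ohm wire at 100 A dissipates %g W\n", wire_watts(100.0, 0.1)); /* 1000 W: serious heat */
        return 0;
    }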
As a worked example, a current-signaling network might represent individual byte values as groups of 0 to 255 electrons. 255 electrons per sample at 100 million samples per second is 25.5x10^9 e-/sec. One coulomb of charge is 6.24x10^18 e-, so that's a current of about 4x10^-9 C/s, or 4 nano-amps. At 1V signal strength, a one-ohm wire will lose only 4 nano-volts. Not much! Of course, in practice we usually use voltage-signaled networks, and single electrons have a bad habit of obeying only the funky quantum laws instead of ordinary classical dynamics, which makes things much more complicated.
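Here's the same arithmetic as a C sketch (the network parameters are the hypothetical ones above, not a real protocol):

    #include <stdio.h>

    int main(void) {
        const double e_per_sample  = 255;      /* max byte value, in electrons */
        const double samples_per_s = 100e6;    /* 100 million samples/sec */
        const double E_PER_COULOMB = 6.24e18;  /* electrons in one coulomb */
        const double ohms          = 1.0;      /* one-ohm wire */

        double amps = e_per_sample * samples_per_s / E_PER_COULOMB;
        printf("current: %g A\n", amps);                        /* about 4e-9 A = 4 nA */
        printf("drop across %g ohm: %g V\n", ohms, amps * ohms); /* about 4 nV */
        return 0;
    }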
Semiconductors and the "Depletion Region"
Silicon doesn't conduct well: it's not a conductor like copper, or an insulator like plastic; it's a semiconductor. Silicon doped with a few extra electrons ("n-type") conducts current because the electrons move. Silicon doped with a few electron holes ("p-type") conducts current because the holes between electrons move. Silicon without extra electrons or holes is a reasonably good insulator, because it's "depleted" of mobile charge carriers.
If you put p-type silicon next to n-type silicon and apply a voltage across the two types, something very interesting happens: the electrons and holes move in opposite directions. With the voltage in one direction, both electrons and holes leave the boundary, making an insulating "depletion region". With the voltage in the other direction, the electrons and holes both converge on the boundary and cancel each other out, conducting current. This is a semiconductor diode! An analogy might be a static front in a war zone, like World War I: as long as soldiers from both sides keep flowing toward the front, they kill each other off and the war continues; if the soldiers of both sides start moving *away* from the front, the front is deserted, and the war's over!
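In software terms, an idealized diode is a one-way valve; here's a trivial C sketch (an idealized model, ignoring the roughly 0.7 V forward drop of a real silicon diode):

    #include <stdio.h>

    /* Idealized diode: conducts only under forward bias, when the p-type
       side sits at a higher voltage than the n-type side, so electrons
       and holes converge on the junction. */
    int diode_conducts(double v_p, double v_n) {
        return v_p > v_n;
    }

    int main(void) {
        printf("forward bias (p=1V, n=0V): %s\n", diode_conducts(1,0) ? "conducts" : "insulates");
        printf("reverse bias (p=0V, n=1V): %s\n", diode_conducts(0,1) ? "conducts" : "insulates");
        return 0;
    }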
Modern transistors are FETs (field-effect transistors): you charge up a small electrode called the "gate", and that electrostatic charge pulls carriers into the depletion region, allowing current to conduct between the other two terminals. With the gate uncharged, the depletion region insulates the two terminals, so no current flows. In an "n-channel FET", a positive gate voltage attracts electrons into the channel, narrowing the depletion region and allowing it to conduct. In a "p-channel FET", you open the gate with a negative voltage. I like Wikipedia's pictures for these: n-channel is a positive logic input, and p-channel is an inverting input, drawn with an inverting circle.
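Treating each FET as an ideal voltage-controlled switch, the two flavors compose naturally. This C sketch wires one of each into a CMOS inverter; the inverter pairing is my illustration (it isn't described above), but it's the standard use of complementary FETs:

    #include <stdio.h>

    /* Idealized FETs as switches; logic levels: 0 = 0V (ground), 1 = +V. */
    int nfet_on(int gate) { return gate == 1; } /* n-channel: conducts with a positive gate */
    int pfet_on(int gate) { return gate == 0; } /* p-channel: conducts with a low gate */

    /* CMOS inverter: the pFET connects the output to +V when the input is
       low; the nFET connects it to ground when the input is high. */
    int cmos_inverter(int in) {
        if (pfet_on(in)) return 1; /* pull-up path to +V conducts */
        if (nfet_on(in)) return 0; /* pull-down path to ground conducts */
        return -1; /* can't happen: exactly one of the pair is on */
    }

    int main(void) {
        printf("in=0 -> out=%d\n", cmos_inverter(0)); /* 1 */
        printf("in=1 -> out=%d\n", cmos_inverter(1)); /* 0 */
        return 0;
    }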