The Chip That Changed the World


Sixty years ago, Jack Kilby built the first integrated circuit. Soon we'll need a new Moore’s Law.


By Andy Kessler




Sixty years. But how much longer? In 1958 Jack St. Clair Kilby—from Great Bend, Kan.—created one of the greatest inventions, a great bend, in the history of mankind. Kilby had recently started at Texas Instruments as an electrical engineer. Most everyone left on a mandated summer break, but he stayed in the lab and worked on combining a transistor, capacitor and three resistors on a single piece of germanium. On Sept. 12, he showed his boss his integrated circuit. At a half-inch long and not very wide, it had ugly wires sticking out, resembling an upside-down cockroach glued to a glass slide.


In January 1959 Bob Noyce, another Midwesterner, was keeping busy at Fairchild Semiconductor in Palo Alto, Calif. He deployed a photographic printing technique—the planar process, which uses glass as insulation—to deposit aluminum wires above silicon transistors. Without the messy cockroach-leg wires, the integrated circuit, or chip, became manufacturable.


In March 1960, TI introduced the Type 502 Flip Flop—basically one bit of memory for $450. A few weeks later, Fairchild announced its own. The U.S. Air Force used them in 1961. So did new computer companies and even NASA in its Apollo rockets. One bit turned into four, then 16, then 64. This started the shrink, integrate, shrink, integrate, rinse, repeat motion that’s still going strong today. This relentless cost decline creates new markets out of nothing.


In 1965 Fairchild research chief Gordon Moore wrote a now-famous article for Electronics magazine, “Cramming More Components Onto Integrated Circuits.” Its thesis, known today as Moore’s Law, was that chip density would double every 18 to 24 months. By 1969, Intel, which Noyce and Moore had founded the year before, was shipping the 3101 64-bit memory chip at $1 a bit. We’ve come a long way. Your iPhone probably has a trillion bits at picocents a bit. Kilby received the Nobel Prize in Physics in 2000 and credited Noyce, who had died a decade earlier, in his acceptance speech. You have to wonder what took the Nobel Committee so long.
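A rough back-of-the-envelope check of that pace (my arithmetic, not the article's, assuming one bit around 1960 and roughly a trillion bits in a phone today):

\[
2^{n} \approx 10^{12} \;\Rightarrow\; n \approx 40 \text{ doublings}, \qquad
\frac{60 \text{ years}}{40 \text{ doublings}} \approx 1.5 \text{ years} \approx 18 \text{ months per doubling},
\]

right in line with Moore's 18-to-24-month estimate.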


Integrated circuits are the greatest invention since fire—or maybe indoor plumbing. The world would be unrecognizable without them. They have bent the curve of history, influencing the economy, government and general human flourishing. The productivity unleashed from silicon computing power disrupted or destroyed everything in its path: retail, music, finance, advertising, travel, manufacturing, health care, energy. It’s hard to find anything Kilby’s invention hasn’t changed.


Now what? Despite the routine media funeral for Moore’s Law, it’s not dead yet. But it is old. We are starting to count finicky atoms. Intel and Samsung are making chips with 7-nanometer features, and silicon atoms are about 0.2 nanometer wide, so a 7-nanometer feature spans only some 35 atoms. Atoms and the speed of light are hard limits. Three-dimensional structures might buy some time. There are even lab demos of nanotechnologies that spray on integrated circuits. Still, I give Moore’s Law another decade.


Brace yourself. When Moore’s Law finally gives up the ghost, productivity and economic growth will roll over too—unless. The world needs another Great Bend, another Kilbyesque warp in the cosmos, to drive the economy.


One hope is quantum computing, which isn’t limited to binary 1s and 0s but instead uses qubits (quantum bits) based on Schrödinger’s quantum mechanics. I’ve seen 17-, 50- and now 128-qubit processors—a familiar progression? Quantum computers are wicked fast but operate just above absolute zero (minus 459 degrees Fahrenheit), a little cold for iPhones.


Maybe architecture will keep the growth alive. Twenty years ago, Google created giant parallel computer systems to solve the search problem. The same may happen for artificial intelligence, which is still in its infancy. It’s only been, what, three years since voice and facial recognition really started working? And that’s thanks to machine learning and neural networks, mere baby steps. The economics are uncertain, but I’d bet a Moore’s Law for AI gets postulated soon.


Energy is being disrupted, but not fast enough. Where is that battery breakthrough? Nuclear fusion has been 30 years away for the past 30 years. But among tokamaks, stellarators, JET and Z machines, perhaps some fusion technique will finally put out more energy than it takes in.


Biocomputing is another fascinating area. We already have gene editing in the form of Crispr. New food supplies and drugs may change how long and how well humans live, bending the curve again. But I’ve learned over the years that anything involving biology is painfully slow. Computing takes nanoseconds; biology takes days, weeks, even years. Breakthroughs may still come, but experiments take so long that progress lags. Still, I’d watch this space closely.


Something better work. Humanity will need a new stimulus soon. Let’s hope the next Jack Kilby skipped this summer’s vacation.


