NEW YORK — A small electronic component that has set off one of the most dramatic outpourings of technological progress in human history turned 50 last week.
The integrated circuit, better known as the semiconductor chip, has unleashed change comparable to the Industrial Revolution by making the computer revolution and the digital age possible.
Today there are far more chips on earth than people. All around us, millions of them are tirelessly at work in computers, phones, television sets, printers, copiers, CD players, PlayStations, cars, trains and airplanes – in almost all electronics.
Without them, there would be no iPod, no BlackBerry. There would be no laptop computers, no International Space Station, no Hubble Space Telescope. There would be no Apple, Intel, Samsung, Nokia, Microsoft or Google; no Silicon Valley; no multibillion-dollar semiconductor industry. There would be no Internet.
When Jack Kilby invented the integrated circuit at Texas Instruments on Sept. 12, 1958, electronics still meant mostly vacuum tubes. Transistors – small electronic switches capable of amplifying current – had been invented about a decade earlier, but were still not widely used.
In 1906, Lee de Forest had discovered that an electrified mesh placed between two electrodes in a vacuum could amplify electrical current and act as a switch. Vacuum tubes were born and soon became indispensable in radios and the rapidly expanding telephone system. During World War I, Western Electric manufactured half a million vacuum tubes a year; by the end of the war, that figure had doubled to 1 million.
In the mid-1940s, scientists at AT&T’s Bell Labs, foreseeing the limitations of vacuum tubes – they were bulky, power-hungry and prone to burning out – created a team to find a replacement. The goal was to make a solid-state device that would have no vacuum, no filaments and no moving parts. The team bet on semiconductors – novel materials whose physical properties were just beginning to be understood.
By December 1947, the Bell Labs researchers had struck gold: the first working transistor. Transistors drastically reduced the power needed to run electronic circuits. But a circuit still had to be assembled from individual transistors and other components, such as resistors and capacitors, all connected with wires and solder. A single faulty connection meant that the circuit would not work.
Kilby’s revolutionary idea was to make all the different components of a circuit out of the same flat block of semiconductor material. Not only would this get rid of wires and faulty connections, it would make the entire circuit much more compact. Kilby demonstrated his first “integrated circuit” on Sept. 12, 1958. It worked perfectly.
Six months later, in California, another engineer, Robert Noyce, then at Fairchild Semiconductor, independently came up with the idea of making an integrated circuit. Noyce’s silicon chip was better suited to being manufactured in large numbers, and he went on to co-found Intel.
Thus was launched a revolution. The first computer to use chips was an Air Force computer built in 1961. The true potential of the integrated circuit became clear when Texas Instruments unveiled the pocket calculator. Previously, calculators had been bulky desktop machines that had to be plugged into the electrical mains. The pocket calculator, small enough to hold in one’s palm, had a chip inside and ran on batteries.
Progress was rapid thereafter. Many have heard of Moore’s law, which has become a mantra of the digital age. First put forward by Gordon Moore, a co-founder of Intel, in 1965, it says that the number of transistors that can be packed onto a chip – and with it the chip’s processing power – doubles roughly every two years, while the price falls by half. For more than four decades, Moore’s law has held, driving incredible growth and miniaturization – and wealth.
As Moore once pointed out, “If the auto industry advanced as rapidly as the semiconductor industry, a Rolls-Royce would get half a million miles per gallon, and it would be cheaper to throw it away than to park it.”
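To put that doubling in concrete terms, here is a minimal back-of-the-envelope sketch in Python. It assumes Intel’s 1971 4004 processor, with roughly 2,300 transistors, as a starting point and takes the simple two-year doubling period at face value; the numbers are illustrative, not exact industry figures.

```python
# Back-of-the-envelope illustration of Moore's law:
# transistor counts doubling roughly every two years.

START_YEAR = 1971          # Intel 4004, the first commercial microprocessor (assumed baseline)
START_TRANSISTORS = 2_300  # approximate transistor count of the 4004
DOUBLING_PERIOD = 2        # years per doubling, per the simple form of Moore's law

def projected_transistors(year: int) -> float:
    """Project a chip's transistor count for a given year under ideal doubling."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1981, 1991, 2001, 2008):
    print(f"{year}: ~{projected_transistors(year):,.0f} transistors")

# 37 years of doubling every two years is a factor of about 2**18.5,
# i.e. several hundred thousand-fold growth -- the same order of magnitude
# as the billion-plus components on a state-of-the-art chip.
```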
Today, a state-of-the-art chip contains more than a billion components packed into an area the size of a human fingernail.
The question is whether the semiconductor industry can sustain this pace. Squeezing still more processing power out of chips is proving difficult as fundamental physical barriers approach: transistors can be made only so small, and densely packed circuits give off heat that is increasingly hard to dissipate. At the same time, new frontiers are opening up. The quest is on to make chips powered by light instead of electricity, which could enable much faster computers.
As you switch on the television or answer your mobile phone or sit at your computer to surf the Internet today, stop for a second to fete the amazing chip that has made it all possible.