Computers On Verge of a Quantum Leap
3D transistor could transform computing
- Computer scientists working at the nanoscale to keep up with technology demands
- Silicon chips in all our electronic devices now contain billions of transistors
- Tiny nano-sized switches are building blocks of modern computing
- Quantum computing could supersede traditional computers soon
By Matthew Knight and Nick Glass, CNN

(CNN) -- We swipe, we tap, we scroll and click, but rarely do we pause to think about what goes on in the maze of electronics beneath our fingertips.
But next time you are marveling at the computer hardware in your hands, spare a thought for the tiny transistors in our computer chips. Without them, none of our modern gizmos would work.
"I think transistors really are the unsung heroes of the information age," says Kaizad Mistry, vice president at the world's leading chip maker, Intel. "These tiny little switches ... these are the things that our computers, our servers, our smartphones and laptops (depend on)."
Since Intel introduced the first commercial microprocessor in 1971, all chip manufacturers have been striving to increase processor speeds by cramming more and more transistors -- the tiny switches that control electrical signals -- onto surfaces no bigger than the size of a fingernail.
Intel's landmark 4004 chip contained 2,300 transistors, each measuring a few micrometers (a millionth of a meter) across. Today, the most advanced silicon chips contain billions of nano-sized switches controlling the flow of electrical currents.
Measuring one billionth of a meter across, objects at the nanoscale are almost impossible to imagine.
By way of comparison, the U.S. government's National Nanotechnology Initiative suggests imagining a marble that measured one nanometer across. If it did, then a meter would be equal to the diameter of the Earth.
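The arithmetic behind that analogy roughly checks out. The short Python sketch below is illustrative only, assuming a marble about one centimeter across and an Earth diameter of about 12,700 kilometers (both round figures, not from the article):

```python
# Sanity-checking the nanoscale analogy: if a 1 cm marble stood in
# for a single nanometer, how big would a meter become?
marble_m = 0.01            # a marble is roughly a centimeter across
earth_diameter_m = 1.27e7  # Earth's mean diameter, in meters (approx.)

# Scale factor: marble size divided by the nanometer it represents
scale = marble_m / 1e-9
print(f"1 meter at marble scale: {scale:,.0f} m")           # 10,000,000 m
print(f"Earth's diameter:        {earth_diameter_m:,.0f} m") # ~12,700,000 m
```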
"Nanoscale devices are important because our societies have gone crazy about information," says professor Peter J. Bentley from University College London and author of "Digitized: The science of computers and how it shapes our world.
"The amount of data produced and consumed every day is reaching unthinkable levels, and it increases every day. But that data has to be stored somewhere, and processed by something," Bentley told CNN via email.
"So basically, we either keep getting smaller so that we can store and process more information in the same small size for about the same power cost ... or we will have to start rationing data usage per person because the computing power needed will eventually use more energy than the planet can support," he added.
In a bid to keep pace with Moore's Law -- Intel co-founder Gordon E. Moore's assertion that the number of transistors on a chip should double roughly every two years -- Intel has developed new "3-D" or "Tri-Gate" transistors, each one measuring 22 nanometers across.
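To see what a strict two-year doubling implies, here is a minimal back-of-the-envelope sketch in Python. The starting figures come from the article; the projection is an idealized curve, not Intel's actual roadmap:

```python
# Moore's Law as pure arithmetic: start from the 4004's 2,300
# transistors in 1971 and double the count every two years.
def transistors(year, start_year=1971, start_count=2300, doubling_years=2):
    """Ideal transistor count under a strict two-year doubling."""
    doublings = (year - start_year) / doubling_years
    return start_count * 2 ** doublings

for year in (1971, 1991, 2012):
    print(year, f"{transistors(year):,.0f}")
# 2012 comes out around 3.4 billion -- the right order of magnitude
# for the roughly 1.4 billion transistors of Ivy Bridge, below.
```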
"For the last 40-50 years, the transistors we've made have conducted electricity along a planar surface of a silicon wafer. A 3-D transistor is a new concept, a new architecture for making tiny transistors ... it's just a fundamentally better switch," Mistry says.
"What we've done is create these pillars or fins on the surface of the wafer and now the current can flow on all three sides of that fin so in any given footprint you can have more of a current conduction."
Intel's Ivy Bridge chip contains an astonishing 1.4 billion transistors that switch on and off more than 100 billion times a second, running 4,000 times faster and using 5,000 times less energy than the 4004 microchip.
"The Tri-Gate transistors are a very nice redesign of the traditional planar transistor that sits inside the chips," says Bentley.
"One of the key problems with keeping up with (Moore's Law) is heat dissipation. Traditional chips are getting way too hot. The Tri-Gate transistor will almost certainly help in that respect as it can operate at lower voltages."
Smaller 14 nanometer transistors are currently being developed, with Intel planning a release in 2014. But it won't be long before computer chip manufacturers have to think even smaller.
"We have just about hit the limits now," Bentley says. "Already we are so small that quantum tunneling -- where electrons magically zip through solid objects because of quantum effects -- can cause real problems in chip design. Go smaller and quantum effects will stop the transistors working at all."
Our only choice is to learn how to perfect the science of quantum computing, he says.
Quantum computers work in a different way from conventional computers, manipulating objects like electrons and photons to perform processing tasks.
Rather than using transistors that switch between the zeros and ones -- the binary digits (bits) of information -- quantum computers work with quantum bits (qubits), which can represent all combinations of zeros and ones simultaneously.
"In theory, (quantum computers) would be able to solve difficult problems by finding one elusive solution out of gazillions because they kind of look at all the solutions at the same time," Bentley says.
In May, Google and NASA announced they were teaming up to share one of the world's first commercial quantum computers.
The machine made by Canada's D-Wave Systems will be installed at the new Quantum Artificial Intelligence Lab at NASA's Ames Research Center in California.
Crunching enormous amounts of data is expected to spur advances in climate and economic modeling, as well as improve our understanding of human genetics. But that's all still quite a long way off, says Bentley.
"In practice, there are bound to be many practical limitations, so we've got to spend the next few decades perfecting this radical new technology before we'll really know how far we can push it."