
What will come after the computer chip?

TEMPE, Ariz. — Intel co-founder Gordon Moore famously observed that the number of transistors on a silicon chip doubles roughly every two years, a prediction now known as Moore’s Law. But even as Intel pushes into nanotechnology, computing is approaching the limits of that law.
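
To see how quickly that doubling compounds, consider a minimal sketch in Python. The 1971 starting point, Intel’s 4004 chip with roughly 2,300 transistors, is a widely cited figure; the perfectly clean two-year doubling is an idealization for illustration, not a claim from this article.

```python
# Illustrative projection of Moore's Law: an idealized doubling of
# transistor count every two years, starting from the Intel 4004
# (1971, roughly 2,300 transistors).

def moores_law_projection(start_year=1971, start_count=2_300, end_year=2021):
    """Yield (year, transistor_count) pairs under a strict two-year doubling."""
    year, count = start_year, start_count
    while year <= end_year:
        yield year, count
        year += 2
        count *= 2

for year, count in moores_law_projection():
    print(f"{year}: ~{count:,} transistors")
```

By 2021 the idealized curve reaches about 77 billion transistors, the same ballpark as the tens of billions found on the largest real chips of recent years.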

We are literally running out of atoms with which to make individual transistors. Nanotechnology has recently produced many new and exciting materials, such as semiconductor nanowires, graphene and carbon nanotubes. But as long as computing is based on digital logic, turning individual transistors on and off by moving electronic charge around to represent ones and zeros, these new materials will extend Moore’s Law by only two or three more generations. The fundamental size limits remain, not to mention the limits imposed by heat generation. New paradigms of non-charge-based computing may emerge; for example, the spin of an electron or nucleus could theoretically be used to store or encode information. However, many obstacles stand in the way of a viable, scalable technology based on “spintronics” that can keep us on the path of Moore’s Law.
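
To put the “running out of atoms” claim in rough numbers: the lattice constant of crystalline silicon is about 0.543 nanometers, so a feature only a few nanometers wide spans just a handful of atomic spacings. The feature widths in this sketch are illustrative round numbers, not values from the article.

```python
# Rough estimate of how many silicon lattice spacings fit across a
# transistor feature. The lattice constant is a textbook value; the
# feature widths are illustrative assumptions.

SILICON_LATTICE_NM = 0.543  # lattice constant of crystalline silicon, in nm

for feature_nm in (45, 22, 10, 5, 2):
    spacings = feature_nm / SILICON_LATTICE_NM
    print(f"{feature_nm:>2} nm feature ≈ {spacings:.0f} silicon lattice spacings")
```

Below about five nanometers, a feature is fewer than ten lattice spacings wide, which is the sense in which the supply of atoms runs out.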

It’s important to remember, though, that Moore’s Law can be viewed not merely as a doubling of the density of transistors every two years, but as a doubling of information-processing capability as well. While raw number-crunching is performed most efficiently with digital logic, new developments in digital imagery, video, speech recognition and artificial intelligence require processing vast amounts of data. Nature has much to teach us about handling such floods of sensory information the way the brain does: in a highly parallel, analog fashion that is fundamentally different from conventional digital computation. Such “neuromorphic” computing systems, which mimic neural-biological functions, may be realized more efficiently with new materials and devices that are not presently on the radar screen.

Similarly, quantum computing may offer a way to attack specialized problems that involve large amounts of parallel information processing. The most likely scenario is that the computer chip of the future will marry a version of our current digital technology to highly parallel, specialized architectures inspired by biological systems, with each doing what it does best. New computational paradigms and architectures, together with improved materials and device technologies, will likely allow our information-processing capability to keep doubling long after we reach the scaling limits of conventional transistors.
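
As a hint of what a “neuromorphic” element computes, here is a minimal sketch of a leaky integrate-and-fire neuron, a standard textbook model and a common building block in neuromorphic designs. The parameters are illustrative, real neuromorphic hardware implements such dynamics directly in (often analog) circuitry rather than in a software loop, and nothing here is drawn from the article itself.

```python
# Minimal leaky integrate-and-fire neuron, a textbook neuromorphic
# building block. All parameters are illustrative.

def simulate_lif(inputs, tau=10.0, threshold=1.0, dt=1.0):
    """Integrate an input current with leak; emit 1 on each threshold crossing."""
    v, spikes = 0.0, []
    for current in inputs:
        v += dt * (-v + current) / tau  # leak pulls v toward 0; input drives it up
        if v >= threshold:              # fire and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

# A strong steady input produces periodic spikes; a weak one never fires.
print(simulate_lif([1.5] * 40))
print(simulate_lif([0.5] * 40))
```

The point of the example is the style of computation: state is carried in an analog quantity (the membrane voltage) and communicated in sparse spikes, rather than in clocked digital arithmetic.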

Stephen Goodnick is a professor of electrical engineering at Arizona State University, the deputy director of ASU LightWorks, and the president of the IEEE Nanotechnology Council.
