By Sharon Gaudin, Computerworld | July 11th, 2014
Cognitive, quantum and even wearable computers will need non-silicon processors.
It’s no secret that computing is changing.
This is no longer a world of mainframes and desktop computers. Laptops, tablets and smartphones are ubiquitous. And soon they’ll have company from wearables, smart homes, smart cars, cognitive computers and perhaps even quantum computers.
“What comes after the mobile phone and the smart phone? It’s not a clear-cut answer,” said Jon Erensen, a research director with Gartner Inc. “It’s an array of simple to complex products. We’ll need chips that are much more power efficient and much more powerful. We’ll need chips with whole new architectures that work in different ways…. Chips need to keep changing with the applications they’re going into.”
IBM said yesterday that it will spend $3 billion over the next five years on research and development into computing and chip materials. In essence, this powerhouse in computing research is looking to rethink computer design.
As it looks to the future of the semiconductor industry, IBM sees a lot of changes. What it may not see is more silicon.
“IBM engineers and scientists, along with our partners, are well suited for this challenge and are already working on the materials science and device engineering required to meet the demands of the emerging system requirements for cloud, big data, and cognitive systems,” said John Kelly, senior vice president of IBM Research, in a statement. “This new investment will ensure that we produce the necessary innovations to meet these challenges.”
IBM and industry analysts say researchers should be working not only on new materials but on new architectures for future processors. Silicon, the basis for today’s chips, can only go so far.
As semiconductor manufacturers move from the current 22 nanometer (nm) architecture down to 14nm and then 10nm in the next several years, it’s going to be increasingly difficult to build these shrinking chips. Seven nanometers may be the lower limit.
When manufacturers shrink chips to that size, the gates that act as electronic switches are only a handful of atoms thick. At that thinness, the gates increasingly allow electrons to leak through. That leakage means more heat and the need for more error-checking processes.
In other words, scientists see the coming end to Moore’s Law.
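The appeal of each node shrink can be put in rough numbers. As a back-of-the-envelope sketch (not from the article, and node names are marketing labels rather than literal feature sizes), if features scaled in step with the node name, each step in the 22nm → 14nm → 10nm → 7nm progression would pack roughly twice as many transistors into the same area:

```python
# Back-of-the-envelope illustration: if feature pitch scaled with the node
# name, each shrink would multiply transistor density by (old / new) ** 2.
# Node names are marketing labels, so treat these figures as rough only.
nodes = [22, 14, 10, 7]  # nm, the progression described above

for old, new in zip(nodes, nodes[1:]):
    density_gain = (old / new) ** 2
    print(f"{old}nm -> {new}nm: roughly {density_gain:.1f}x the transistor density")
```

That steady doubling per generation is Moore’s Law in miniature; once the shrinks stop at around 7nm, so does this source of free performance.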
To continue building more powerful computers, or smaller devices that slip in a pocket or can be worn, or for cars that communicate with homes, we’re going to need a better computer chip.
This is exciting news for Yehia Massoud, head of the electrical and computing engineering department at Worcester Polytechnic Institute.
“This industry has been moving very incrementally with silicon,” he said. “Now, it’s going to be very difficult to move further with it. We need to get to the post-silicon era. Silicon is running out of steam. We’re hitting the physical limits of it. I don’t think we can go past, probably, 7nm with silicon.”
Massoud estimated that there are likely just five years’ worth of advances left in silicon. And since it takes five to 10 years to move a new chip technology from the lab into production, it’s more than time for researchers to be working on something new.
So in what direction is IBM likely to go?
Analysts and academics agree that researchers will be focused on carbon nanotubes — for flexibility and energy efficiency — and nanophotonics, which relies on light instead of electrical current.
Analysts like Dan Olds, with The Gabriel Consulting Group, and Patrick Moorhead, with Moor Insights & Strategy, say researchers will be looking for new materials to replace silicon. But they’ll also be looking at new chip architectures.
“We’ll see more hybrid systems and we might even see them doing things like stacking chips on top of each other, reducing the distance between them and increasing performance,” said Olds.
Of course, researchers at IBM and elsewhere also are looking at whole new types of processors to power quantum computers and cognitive computers, which function more like the human brain.
“We have tremendous amounts of data surrounding us and if we can actually have a computer that is fast enough and powerful enough to handle it, there’s so much we can do,” said Massoud. “Your cell phone is not as fast as a computer. But let’s assume it gets 10 or 100 times stronger. It could almost act as your personal doctor. It could handle your medical data, monitor your blood pressure, monitor your activity level and your heart rate. It can analyze the data and send it to the doctor. All of this means you need a faster, smaller computer that can handle all of that.”
Erensen noted that semiconductor research under way now will have implications across all semiconductor categories. Researchers might, for example, develop processes specifically for cognitive computing, but then they might be able to use those techniques for different chips.
Olds noted that new chips, and especially new chip designs, will be disruptive, calling for a lot of software to be rewritten to take advantage of them.
“Today’s operating systems and applications will have to be drastically re-architected to take advantage of some of these new system designs,” said Olds. “The size and speed of these new systems will certainly result in big advances in science, medical research — and they’ll impact our everyday lives in a variety of ways.”
Sharon Gaudin covers the Internet and Web 2.0, emerging technologies, and desktop and laptop chips for Computerworld. Follow Sharon on Twitter at @sgaudin, on Google+ or subscribe to Sharon’s RSS feed.