What happened to clockless computer chips?


Solution 1

Here's an article from a few years ago that's gung-ho on the technology, but I think the answer can be found in this quote:

Why, for example, did Intel scrap its asynchronous chip? The answer is that although the chip ran three times as fast and used half the electrical power as clocked counterparts, that wasn't enough of an improvement to justify a shift to a radical technology. An asynchronous chip in the lab might be years ahead of any synchronous design, but the design, testing and manufacturing systems that support conventional microprocessor production still have about a 20-year head start on anything that supports asynchronous production. Anyone planning to develop a clockless chip will need to find a way to short-circuit that lead.

"If you get three times the power going with an asynchronous design, but it takes you five times as long to get to the market—well, you lose," says Intel senior scientist Ken Stevens, who worked on the 1997 asynchronous project. "It's not enough to be a visionary, or to say how great this technology is. It all comes back to whether you can make it fast enough, and cheaply enough, and whether you can keep doing it year after year."

Solution 2

There's some information on this subject available both here (Asynchronous CPU) and here (History of general purpose CPUs), including a list of (some not so) recent implementations.

Looking at some of the benefits (power consumption, speed) and disadvantages (increased complexity, more difficult to design), it seems logical that recent development has focused on embedded designs (a toy sketch of the clockless handshake idea follows the list below):

  • Epson ACT11
  • SEAforth® 40C18
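
Neither answer spells out what "clockless" means at the circuit level, so a rough illustration may help. The sketch below is a toy software model only (Python rather than an HDL, not based on the Epson or SEAforth parts above, and every name in it is made up for illustration): a Muller C-element, the basic state-holding gate of asynchronous control logic, plus a 4-phase request/acknowledge handshake, the local signalling that takes the place of a global clock edge.

    # Toy model of two building blocks behind clockless (asynchronous) logic.

    class CElement:
        """Muller C-element: the output switches only when both inputs agree,
        otherwise it holds its previous value. Async control circuits use it
        to wait for several events to complete before moving on."""

        def __init__(self, state=0):
            self.state = state

        def update(self, a, b):
            if a == b:
                self.state = a      # both inputs agree: output follows them
            return self.state       # inputs disagree: hold the previous value


    def four_phase_handshake(data_items):
        """Trace a 4-phase request/acknowledge handshake between a sender and
        a receiver. Each item takes four signal transitions instead of a
        clock edge: req up, ack up, req down, ack down."""
        req = ack = 0
        received, trace = [], []
        for item in data_items:
            req = 1; trace.append((req, ack))   # sender: data is valid
            received.append(item)               # receiver: latch the data
            ack = 1; trace.append((req, ack))   # receiver: acknowledge
            req = 0; trace.append((req, ack))   # sender: release
            ack = 0; trace.append((req, ack))   # receiver: ready for next item
        return received, trace


    if __name__ == "__main__":
        c = CElement()
        print([c.update(a, b) for a, b in [(1, 0), (1, 1), (0, 1), (0, 0)]])
        # -> [0, 1, 1, 0]: holds on disagreement, switches on agreement
        data, trace = four_phase_handshake([10, 20, 30])
        print(data)       # -> [10, 20, 30]
        print(trace[:4])  # -> [(1, 0), (1, 1), (0, 1), (0, 0)] for the first item

The sketch also shows where the extra design difficulty mentioned above comes from: correctness no longer rests on "everything settles before the next clock edge" but on every pair of communicating blocks getting its local handshake right, which is what makes the tooling and verification harder.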

Author: GeoffreyF67

Updated on February 28, 2020

Comments

  • GeoffreyF67 (about 4 years ago)

    Several years ago, the 'next big thing' was clockless computers. The idea behind it was that without a clock, the processors would run significantly faster.

    That was then, this is now, and I can't find any info on how it's been coming along or if the idea was a bust...

    Anyone know?

    For reference:

    http://www.cs.columbia.edu/~nowick/technology-review-article-10-01.pdf

  • pjc50 (over 13 years ago)
    This. In order to make async chips you have to develop a whole new set of tools, train new design engineers, develop new working practices, and probably along the way have a few spectacular design failures. This costs a hell of a lot, compared to relying on the current process scaling.
  • Darren Ringer (about 7 years ago)
    Eight years later I am sure more could be said on this subject; even this answer doesn't really paint the concept as a dead end as far as I can tell.
  • Mark Ransom (about 7 years ago)
    @DarrenRinger as implementing Moore's Law becomes harder and harder, it certainly seems like there'd be an opportunity for an alternate approach. If you have information about any recent attempts it would be worth leaving your own answer; I haven't heard of any.