Chip Designers Search for Life After Silicon

By JOHN MARKOFF

It was a chance meeting between a self-described "physicist gone bad" and a chemist. And it may someday lead to the creation of a new breed of computers based on tiny electronic switches, each no thicker than a single molecule.

Three years ago Phil Kuekes, a Hewlett-Packard physicist with three decades of experience designing computers, was pondering new ways to use a computer he had developed using massively parallel architecture -- technology that attacks computing problems by breaking them into hundreds or even thousands of pieces to be simultaneously processed by many individual chips.

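For readers who want to see the divide-and-conquer idea in miniature, here is a rough sketch in ordinary Python rather than anything resembling Hewlett-Packard's hardware; the task, the chunk sizes and the worker count are illustrative assumptions.

```python
# Illustrative only: split one large problem into many pieces and
# process them simultaneously, the way a massively parallel machine
# spreads work across many individual chips.
from multiprocessing import Pool

def process_piece(piece):
    # Stand-in for the real work one processor does on one fragment.
    return sum(x * x for x in piece)

def solve_in_parallel(data, pieces=8):
    # Break the problem into roughly equal fragments.
    chunk = max(1, len(data) // pieces)
    fragments = [data[i:i + chunk] for i in range(0, len(data), chunk)]
    # Hand each fragment to a separate worker process.
    with Pool(processes=pieces) as pool:
        partial_results = pool.map(process_piece, fragments)
    # Combine the partial answers into the final result.
    return sum(partial_results)

if __name__ == "__main__":
    print(solve_in_parallel(list(range(1_000_000))))
```
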
At about this same time, Kuekes (pronounced KEE-kus) happened to make the acquaintance of James Heath, a chemist at the University of California at Los Angeles whose lab had been experimenting with tiny structures based on molecules of a synthetic substance called rotaxane. It seemed that these molecular structures might be able to function as digital switches -- the basic, binary on-off information gateways of modern computing.

Soon the two scientists were brainstorming about how it might be possible to blend Kuekes' computer design with Heath's Lilliputian switches. In the fashioning of tiny switches, or transistors, from small clusters of molecules a single layer deep, the researchers see a coming era of computers that would be 100 billion times as fast as today's fastest PCs.

The work, detailed in a paper published Friday by the Hewlett-UCLA teams in Science magazine, is a noteworthy example of the groundbreaking research that suddenly seems to be flourishing at computer laboratories around the country -- a flowering of ideas that some leaders in the field have begun to consider a renaissance in computer science and design.

In corporate and academic labs, researchers appear to be at or near breakthroughs that could vastly raise the power and ubiquity of the machines that have insinuated themselves into almost every facet of modern society. And since many of these efforts, like the Hewlett-UCLA research, are focused at the microscopic level, computing could become an increasingly invisible part of everyday life.

"A lot of exciting stuff is happening outside the mainstream PC market," Marc Snir, a senior research manager at IBM Corp., said. "We are entering a world where there will be billions of small devices spread everywhere."

The Hewlett-UCLA team is just one of six groups around the country working under an ambitious program of the federal Defense Advanced Research Projects Agency that is trying to create a new kind of molecular-scale electronics -- known as moletronics -- that researchers hope will one day surpass the power and capacity of the silicon-based technology used in today's computers. Last year researchers at Yale and Rice universities took early steps toward the same goal of assembling computers from molecular components.

Meanwhile, separate from the military research program, researchers at the Massachusetts Institute of Technology's Laboratory for Computer Science are trying to meld the digital with the biological by "hacking" the common E. coli bacterium so that it would be able to function as an electronic circuit -- though one with the marvelous ability to reproduce itself.

It may all sound esoteric. And even the most enthusiastic researchers concede that practical applications of their theories and methods may be a decade or more away. But researchers seem intent on renewing the emphasis on the science in computer science, venturing beyond electrical engineering and physics to draw upon diverse disciplines like biochemistry as they form their hypotheses and test them with empirical evidence.

MIT's computer scientists, for example, are pursuing the idea of building -- or possibly growing, if the E. coli experiments pan out -- vast numbers of almost identical processors that might act as sensors or even as the task-performing devices called actuators.

"We would like to be able to make processors by the wheelbarrow-load," said Harold Abelson, an MIT computer scientist.

Abelson and his colleagues, who call their approach amorphous computing, are experimenting with the idea of mapping circuitry onto biological material. That might let living cells function, for example, as digital logic circuits. Such circuits are information pathways that, whatever their complexity, ultimately involve a multitude of quite simple, binary choices: 0 or 1, on or off, this but not that.

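To make that 0-or-1 idea concrete, here is a minimal sketch in plain Python -- not anything running in a cell or a molecule -- showing how more elaborate circuits are stacked up from those simple on-off decisions.

```python
# Every digital circuit, however complex, reduces to simple binary
# gates like these: each takes 0/1 inputs and yields a 0/1 output.
def AND(a, b): return a & b
def OR(a, b):  return a | b
def NOT(a):    return 1 - a

def XOR(a, b):
    # Built from the simpler gates: (a OR b) AND NOT (a AND b).
    return AND(OR(a, b), NOT(AND(a, b)))

def half_adder(a, b):
    # Two gates are already enough to add two one-bit numbers.
    return XOR(a, b), AND(a, b)   # (sum bit, carry bit)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "->", half_adder(a, b))
```
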
Biological cells, of course, would be able to compute only as long as they remained alive. But the premise is the same as in the molecular-scale work: Pack as many millions or billions of these tiny decision switches as possible into the smallest spaces conceivable.

The resulting "smart" materials might be used for new types of paints or gels, for example, that would enable a highway to be "painted" with computerlike sensors to issue traffic reports, or let floors be given a surface of infinitesimal actuators that could detect dirt and dust and silently and instantly whisk it away.

In the case of the Hewlett-UCLA work, researchers have successfully created digital logic circuits, but not yet any in which the molecular switches can be restored to their original state -- returning to the off position, for example, after having switched to on. And still to be developed are the molecular-scale wires that would be needed to interconnect the switches.

The most significant implication of the Science article is that for the first time researchers have built molecular-scale computing components using chemistry rather than the time-honored technology of photolithography, the ultraviolet-light etching of circuitry onto silicon that is the process for making today's chips. The chip industry has not yet reached the theoretical limits of photolithography, but the day may come when it is no longer possible to etch circuits any closer together. That is where molecular chemistry could take over -- and possibly wring more computing power from a single chip than exists today in the biggest, fastest supercomputers.

Last week, Kuekes said his team had been in contact with the MIT group and was now discussing the possibility of combining the Hewlett-UCLA molecular switching technology with the MIT lab's biological processor work with an eye toward future computer designs.

"Think of us as the Sherwin-Williams of the Information Age," Kuekes said, referring to the vision of suspending billions of tiny processors in a paint. "This is the raw material for super-intelligent materials."

The scientists acknowledge that their projects are gambles and that any practical applications may be a decade or more away. But the work on both coasts indicates the breadth of the renaissance now sweeping through computer-design circles.

To some extent, the most recent work is a continuation of efforts that began about five years ago and quickly grew into the commercial field of microelectromechanical systems, or MEMS. MEMS are microscopic mechanical structures etched into the surface of silicon chips; they have spawned a multibillion-dollar business, largely around silicon accelerometer chips that are now standard equipment as sensors for air-bag collision systems in cars.

But as the Hewlett-UCLA and MIT research makes clear, even the conventional circuitry of a silicon chip -- or the silicon, for that matter -- can no longer be assumed. Indeed, computer architects are rethinking the whole concept of the microprocessor, the master chip that begat the PC and that has been the building block of modern computers for a quarter of a century.

"It's time to do something different. It's time to look at the things we've ignored," said David Patterson, a computer scientist at the University of California at Berkeley.

In the early 1980s, Patterson helped pioneer one of the most significant computer design innovations of its era, a technology known as reduced instruction set computing, or RISC.

Building on ideas first advanced during the 1970s by a team of researchers working under the computer scientist John Cocke at IBM's Thomas J. Watson Research Center, Patterson and his Berkeley graduate students proved that sharp increases in the speed of processor chips could be achieved by simplifying computer hardware and shifting many functions to software.

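The trade-off is easiest to see side by side. As a hypothetical illustration -- not the actual Berkeley RISC instruction set -- one elaborate memory-to-memory instruction can be re-expressed as a short run of simple register operations that a leaner chip can execute very quickly.

```python
# Hypothetical illustration: a complex "add memory to memory" operation
# re-expressed as the simple load/add/store steps a RISC chip prefers.
memory = {"X": 7, "Y": 5}
registers = {}

def LOAD(reg, addr):   registers[reg] = memory[addr]    # memory -> register
def ADD(dst, a, b):    registers[dst] = registers[a] + registers[b]
def STORE(reg, addr):  memory[addr] = registers[reg]    # register -> memory

# CISC style, one instruction:  ADDM X, Y   ; X = X + Y, all in memory
# RISC style, several simple instructions the hardware can run fast:
LOAD("r1", "X")
LOAD("r2", "Y")
ADD("r3", "r1", "r2")
STORE("r3", "X")

print(memory["X"])  # 12 either way; the complexity moved into software
```
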
For almost a decade after the success of the Berkeley RISC project, many experts in the computer industry believed that RISC would ultimately displace Intel Corp.'s X86 computer chip design, on which the industry-standard PC was based.

Hoping to unseat Intel, dozens of new RISC-inspired companies sprang up from the mid-'80s through the mid-'90s. But beginning with its 486 chip, introduced in 1989, and continuing in its Pentium series, Intel was already incorporating the best RISC ideas into its chips, keeping their performance close enough to RISC's potential to let the company retain its commanding market lead.

The combination of Intel's hardware and Microsoft's software proved invincible, and one by one the RISC challengers collapsed. The end of the RISC era almost a decade ago left computer designers with few fundamentally new ideas for improving computer performance.

But even as the Intel juggernaut moved on, the rising importance of the Internet and a growing consensus that computing's future lies in inexpensive consumer-oriented devices helped fuel the renaissance in chip design.

Patterson, the RISC pioneer, has now embarked on a new design approach, known as intelligent RAM, or IRAM, that has generated great interest among consumer electronics companies.

RAM stands for random access memory, the semiconductor memory that is used as a kind of temporary scratch pad by software programs. Patterson and his Berkeley colleagues have noted that the largest performance constraint in computer systems today is the mismatch in speed between the microprocessor and the slower memory chips. The Berkeley researchers predict that in the next decade processors and memory will be merged onto a single chip.

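A back-of-the-envelope calculation, using made-up but plausible round numbers rather than figures from the Berkeley group, shows why that mismatch dominates.

```python
# Illustrative arithmetic only; the latency figures are assumptions,
# not measurements from the IRAM project.
cpu_cycle_ns = 1.0        # ~1 GHz processor: one cycle per nanosecond
dram_access_ns = 60.0     # a trip out to an off-chip memory chip
miss_rate = 0.02          # fraction of operations that must go to DRAM

# Average time per operation = the work itself + occasional long waits.
avg_ns = cpu_cycle_ns + miss_rate * dram_access_ns
print(f"average: {avg_ns:.2f} ns per operation")
print(f"share of time spent waiting on memory: "
      f"{miss_rate * dram_access_ns / avg_ns:.0%}")
# Even a 2 percent miss rate means roughly half the time goes to waiting,
# which is why merging processor and memory on one chip is attractive.
```
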
The IRAM chips would embed computer processors in vast seas of memory transistors. Instead of stressing pure processor speed, these new chips would place the emphasis on avoiding bottlenecks that slow the data traffic inside a processor. Such an approach would be especially attractive to makers of memory chips -- companies eager to find new ways to distinguish themselves in what is now a commodity market.

At the same time, some consumer electronics companies are pursuing ideas similar to those behind the IRAM project. For example, Sony Corp.'s Emotion Engine chip, which the company is designing in cooperation with Toshiba Corp. for the coming Sony Playstation II game console, blends memory and processor logic as a way to create faster, more vivid on-screen game action.

But no single computer architecture is optimal for every kind of problem. That is why many researchers are exploring the idea of reconfigurable chips -- chips whose circuitry can be reorganized for each specific computing problem.

One of the most extreme efforts in this direction is a processor approach known as RAW, for raw-architecture workstation, that is being pursued by a group of researchers at MIT's Laboratory for Computer Science, working separately from the E. coli project.

RAW pushes the idea of RISC to a new extreme. The processor would not have an "instruction set" in the conventional sense of the term, which refers to the types of instructions that let software programmers direct a chip to do things like add or subtract, or compare and move blocks of data. Instead of using preordained instruction sets, RAW would present the entire chip to the programmer as a veritable blank slate on which tens of millions of transistors might be individually programmed.

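One toy way to picture that blank slate is borrowed from the lookup-table cells used in today's reconfigurable, FPGA-style chips rather than from the RAW design itself; the class and configurations below are purely illustrative.

```python
# Toy model of "programming the hardware itself": a cell has no fixed
# instruction; its behavior is whatever truth table is loaded into it.
class ConfigurableCell:
    def __init__(self, truth_table):
        # truth_table maps each (a, b) input pair to a 0/1 output.
        self.table = dict(truth_table)

    def __call__(self, a, b):
        return self.table[(a, b)]

# The same cell becomes an AND gate or an XOR gate purely by configuration.
AND_CFG = {(0, 0): 0, (0, 1): 0, (1, 0): 0, (1, 1): 1}
XOR_CFG = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

cell = ConfigurableCell(AND_CFG)
print(cell(1, 1))   # 1: behaves as AND
cell = ConfigurableCell(XOR_CFG)
print(cell(1, 1))   # 0: the same "hardware," reprogrammed as XOR
```
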
Such unprecedented flexibility would mean that the same basic chip might be used for an infinite variety of purposes -- like creating three-dimensional animated graphics, performing supercomputer calculations or carrying out tasks not yet conceived.

"We're trying to determine whether this is simply a challenge for programmers or a complicated nightmare," said Anant Agarwal, the MIT computer designer who heads the RAW project.

As all these research efforts proceed, computer scientists are toiling against the perceived limits of Moore's Law -- a guiding principle for chip design since the Intel co-founder, Gordon Moore, observed in 1965 that the number of transistors that could fit onto a silicon chip doubles approximately every 18 months.

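The observation is easy to state as arithmetic. Here is a minimal sketch; the starting transistor count is an arbitrary example, and the 18-month doubling period comes straight from the rule of thumb above.

```python
# Moore's Law as arithmetic: transistor counts double every 18 months.
def transistors(start_count, years, doubling_period_years=1.5):
    return start_count * 2 ** (years / doubling_period_years)

# Example: a chip with 10 million transistors today, projected forward.
for years in (0, 3, 6, 9, 12):
    print(f"after {years:2d} years: {transistors(10_000_000, years):,.0f}")
# Twelve years of doubling every 18 months is 2**8, a 256-fold increase.
```
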
But ultimately, there are physical limits to how closely together circuits can be etched with light onto the surface of a silicon chip and still function -- a reality that is expected to mean the end of the Moore's Law paradigm sometime around the year 2012. That is why forward-looking work like the molecular-scale circuitry research of the Hewlett-UCLA team is so important.

"Clearly a technology this radically different won't tip over a trillion-dollar industry" like today's computer-chip industry, Kuekes said. "But we're looking considerably ahead of where silicon runs out of steam."