Claude Shannon, Mathematician, Dies at 84
By GEORGE JOHNSON
Dr. Claude Elwood Shannon, the American mathematician and computer scientist whose theories laid the groundwork for the electronic communications networks that now lace the earth, died on Saturday in Medford, Mass., after a long fight with Alzheimer's disease. He was 84.
Understanding, before almost anyone, the power that springs from encoding information in a simple language of 1's and 0's, Dr. Shannon as a young man wrote two papers that remain monuments in the fields of computer science and information theory.
"Shannon was the person who
saw that the binary digit
was the fundamental element
in all of communication,"
said Dr. Robert G. Gallager,
a professor of electrical
engineering who worked with
Dr. Shannon at the
Massachusetts Institute
of Technology. "That was really
his discovery, and from
it the whole communications
revolution has sprung."
Dr. Shannon's later work on chess-playing machines and an electronic mouse that could run a maze helped create the field of artificial intelligence, the effort to make machines that think. And his ability to combine abstract thinking with a practical approach — he had a penchant for building machines — inspired a generation of computer scientists.
Dr. Marvin Minsky of M.I.T., who as a young theorist worked closely with Dr. Shannon, was struck by his enthusiasm and enterprise.
"Whatever came up, he
engaged it with joy, and
he attacked it with some
surprising resource — which
might be some new kind
of technical concept or
a hammer and saw with some
scraps of wood," Dr. Minsky
said. "For him, the harder
a problem might seem, the
better the chance to find
something new."
Born in Petoskey, Mich., on April 30, 1916, Claude Elwood Shannon earned a bachelor's degree in mathematics and electrical engineering from the University of Michigan in 1936. He received both a master's degree in electrical engineering and his Ph.D. in mathematics from M.I.T. in 1940.
While at M.I.T., he worked with Dr. Vannevar Bush on one of the early calculating machines, the "differential analyzer," which used a precisely honed system of shafts, gears, wheels and disks to solve equations in calculus.
Though analog computers like this turned out to be little more than footnotes in the history of the computer, Dr. Shannon quickly made his mark with digital electronics, a considerably more influential idea.
In what has been described as one of the most important master's theses ever written, he showed how Boolean logic, in which problems can be solved by manipulating just two symbols, 1 and 0, could be carried out automatically with electrical switching circuits. The symbol 1 could be represented by a switch that was turned on; 0 would be a switch that was turned off.
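The correspondence is easy to sketch in modern terms. The short Python illustration below is an editorial aside, not Shannon's own notation; the function names are invented for the example. Switches wired in series behave like logical AND, and switches wired in parallel like logical OR.

    # A minimal sketch, assuming each relay is modeled as a Boolean value:
    # a closed switch conducts current (1, True); an open one does not (0, False).

    def series(a: bool, b: bool) -> bool:
        # Switches in series conduct only if both are closed: logical AND.
        return a and b

    def parallel(a: bool, b: bool) -> bool:
        # Switches in parallel conduct if either is closed: logical OR.
        return a or b

    # A circuit that conducts when x is closed and at least one of y, z is.
    x, y, z = True, False, True
    print(series(x, parallel(y, z)))  # True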
The thesis, "A Symbolic Analysis
of Relay and
Switching Circuits," was
largely motivated by the
telephone industry's need
to find a mathematical
language to describe the
behavior of the increasingly
complex switching circuits
that were replacing human
operators. But the implications
of the paper were far
more broad, laying out a
basic idea on which all
modern computers are built.
George Boole, the 19th-century British mathematician who invented the two-symbol logic, grandiosely called his system "The Laws of Thought." The idea was not lost on Dr. Shannon, who realized early on that, as he once put it, a computer is "a lot more than an adding machine." The binary digits could be used to represent words, sounds, images — perhaps even ideas.
The year after graduating from M.I.T., Dr. Shannon took a job at AT&T Bell Laboratories in New Jersey, where he became known for keeping to himself by day and riding his unicycle down the halls at night.
"Many of us brought our lunches
to work and played
mathematical blackboard
games," said a former
colleague, Dr. David Slepian.
"Claude rarely came. He
worked with his door closed,
mostly. But if you went
in, he would be very patient
and help you along. He
could grasp a problem in
zero time. He really was quite
a genius. He's the only
person I know whom I'd apply
that word to."
In 1948, Dr. Shannon published his masterpiece, "A Mathematical Theory of Communication," giving birth to the science called information theory. The motivation again was practical: how to transmit messages while keeping them from becoming garbled by noise.
To analyze this problem properly, he realized, he had to come up with a precise definition of information, a dauntingly slippery concept. The information content of a message, he proposed, has nothing to do with its meaning but simply with the number of 1's and 0's that it takes to transmit it.
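That number can be estimated directly. The sketch below is an editorial illustration under simplifying assumptions (symbol frequencies standing in for true probabilities), not a formula quoted from the 1948 paper; it computes the average bits per symbol a message requires — the quantity Shannon called entropy.

    from collections import Counter
    from math import log2

    def entropy_bits(message: str) -> float:
        # Average bits per symbol, estimated from observed frequencies.
        counts = Counter(message)
        total = len(message)
        return sum((n / total) * log2(total / n) for n in counts.values())

    # A perfectly predictable message carries no information;
    # four equally likely symbols require two bits each.
    print(entropy_bits("aaaa"))  # 0.0
    print(entropy_bits("abcd"))  # 2.0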
This was a jarring notion to a generation of engineers who were accustomed to thinking of communication in terms of sending electromagnetic waveforms down a wire. "Nobody had come close to this idea before," Dr. Gallager said. "This was not something somebody else would have done for a very long time."
The overarching lesson was that the nature of the message did not matter — it could be numbers, words, music, video. Ultimately it was all just 1's and 0's.
Today, when gigabytes of movie trailers, Napster files and e-mail messages course through the same wires as telephone calls, the idea seems almost elemental. But it has its roots in Dr. Shannon's paper, which may contain the first published occurrence of the word "bit."
Dr. Shannon also showed that if enough extra bits were added to a message, to help correct for errors, it could tunnel through the noisiest channel, arriving unscathed at the end. This insight has been developed over the decades into sophisticated error-correction codes that ensure the integrity of the data on which society now depends.
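The simplest illustration of the idea is the repetition code — a toy far cruder than the codes used in practice, offered here only to show the principle: send each bit three times, and a majority vote at the receiving end repairs any single flipped bit per triple.

    def encode(bits: str) -> str:
        # Redundancy: repeat each bit three times.
        return "".join(b * 3 for b in bits)

    def decode(coded: str) -> str:
        # Majority vote within each triple corrects one flipped bit.
        triples = [coded[i:i + 3] for i in range(0, len(coded), 3)]
        return "".join("1" if t.count("1") >= 2 else "0" for t in triples)

    sent = encode("101")   # '111000111'
    noisy = "110000111"    # noise flips the third bit
    print(decode(noisy))   # '101' -- the message arrives unscathed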
In later years, his ideas spread beyond the fields of communications engineering and computer science, taking root in cryptography, the mathematics of probability and even investment theory. In biology, it has become second nature to think of DNA replication and hormonal signaling in terms of information.
And more than one English graduate student has written papers trying to apply information theory to literature — the kind of phenomenon that later caused Dr. Shannon to complain of what he called a "bandwagon effect."
"Information theory has perhaps
ballooned to an
importance beyond its actual
accomplishments," he
lamented.
After he moved to M.I.T. in 1958, and beyond his retirement two decades later, he pursued a diversity of interests — a mathematical theory of juggling, an analog computer programmed to beat roulette, a system for playing the stock market using probability theory.
He is survived by his wife, Mary Elizabeth Moore Shannon; a son, Andrew Moore Shannon; a daughter, Margarita Shannon; a sister, Catherine S. Kay; and two granddaughters.
In the last years of his life, Alzheimer's disease began to set in. "Something inside him was getting lost," Dr. Minsky said. "Yet none of us miss him the way you'd expect — for the image of that great stream of ideas still persists in everyone his mind ever touched."