For a decade or more Microsoft’s monopoly of PC operating systems gave
it a
stranglehold on computing. The technologies now shaping the Internet mean
that
a single software company is unlikely again to attain such dominance.
WHEN Microsoft fought its way out of IBM’s shadow in the early 1980s, it
was hailed
as a monopoly-killer par excellence. The PC counterculture would liberate
ordinary
people from mainframe computers run by corporate IT departments—most of
them
made by Big Blue. As it turned out, this was only half right. The transition
to the PC
almost destroyed IBM as a single company; but Microsoft gained a share
of the market
for PC operating systems that even IBM never achieved for mainframes.
Now it is Microsoft that feels threatened by a technological shift. But
this time the firm
that dominates the old technology is unlikely to be pushed aside by a single
newcomer.
Instead the Internet promises to create a world in which no firm ever again
has the
power of a Microsoft or an IBM.
Judge Jackson described the threat to Microsoft in his findings. The Internet
could
“oust the PC operating system from its position as the primary platform
for applications
development and the main interface between users and their computers.”
If so, Windows would become just one not-very-important piece of computing plumbing.
Yet, as the judge also made clear, this vision lies far in the future.
Although the Internet
is already affecting people’s lives, a further transformation is essential
if the hype
about what you can do online is not to fall flat. These are still early days for the Internet—it is about as developed as commercial flying was before the DC-3—and yet
the transformation is already on its way. It will not come from a single
dramatic
technological breakthrough, like the jet engine, but from lots of connecting
technologies
coming of age more or less together.
These technologies are quite unlike the PC operating system. That is the
intellectual
property of Microsoft, but the Internet employs standards and protocols
that are
“open”, meaning they can be freely used by anyone. They have mostly been
hammered
out in public forums, and they are beyond the control of any single firm.
The operating
system is the control centre of the PC, but the Internet is managed at
many levels, some
within devices, some on the network, others at the abstract level of a
“language”.
All this adds hugely to the complexity of the system, but it is the sort of complexity that distinguishes a market economy from a centrally planned one. The PC has to evolve to
to evolve to
the drumbeat of Microsoft’s programmers in Redmond, Washington. The Internet,
by
contrast, derives its adaptability precisely from its amorphous nature.
If a firm creates
and exploits a monopoly over one bit of technology, somebody, somewhere, will think of a way around it.
Open Sesame
The best way of looking at the Internet of the future is to ask what is
wrong with
today’s. Instead of one single, huge failing, the Internet has three middling,
but
infuriating, ones. First, it is still not free from Microsoft’s monopoly.
To gain full
access to the network, most people use a PC, a machine that is prone to
crashing
without warning and is awkward if you are on the move. Indeed, thanks to
the PC, the
Internet has a level of inconvenience that would be unacceptable in any
other
mass-market medium. Second, the Internet itself is hard to use. Consumers
have trouble
finding what they want, and companies intent on doing e-business struggle
to integrate
new technology with their existing systems. Third, the network’s security
and
reliability are both inadequate.
What are the technologies that will fix these problems? Much of the answer
lies in new
standards, or protocols, that will let computers, handheld devices and
mobile phones
talk to each other in new ways. A protocol called POP (Post Office Protocol),
for
example, lays down the rules for delivering e-mail. Another, called HTML
(Hypertext
Markup Language), defines the layout of web pages. Rather as conventions
in the game
of bridge let partners use bids to convey information about their hands,
protocols tell
the designer of a piece of machinery or software how to format information
coming
from or going to the outside world.
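To make this concrete, here is a minimal sketch of a POP conversation, written in Python with its standard poplib module. The server name and credentials are placeholders, not part of the protocol itself; the point is how small and rigid the protocol's vocabulary is.

```python
import poplib

# A hypothetical POP3 session: host and credentials are placeholders.
server = poplib.POP3("pop.example.com")
server.user("alice")
server.pass_("secret")

# The protocol's vocabulary is tiny: STAT summarises the mailbox,
# RETR fetches one message, line by line.
count, size = server.stat()
print(f"{count} messages, {size} bytes in all")

if count:
    response, lines, octets = server.retr(1)  # fetch message number 1
    print(b"\n".join(lines).decode("ascii", errors="replace"))

server.quit()
```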
Protocols are dull, unremarkable things when written down on paper—like
recipes or
magic spells. Yet they make the Internet work. Software wizards have conjured
up a set
of new protocols, each designed to make a particular Internet problem vanish
in a puff
of acronyms. It is largely these protocols, combined with some clever new
software,
that are destined to transform the Internet into a medium that is fast,
convenient, and
reliable.
Perhaps the Internet’s most dramatic shift will be to extend computing
beyond the PC.
The first to log on will be smart phones and wireless Palm Pilots, but
other devices
will follow. Engineers may take a few years to come up with transmitters,
receivers
and sensors cheap and tiny enough to put into the appliances of the networked
home.
But almost everything that can usefully transmit data will eventually be
able to do so,
including refrigerators that automatically reorder what has been taken
out of them and
car keys that tell the Internet (and you) where you left them.
Some devices may use an operating system from Microsoft, but the majority
will not.
Most of their intelligence will lie on the network, which means that operating
systems
will matter less than standards. These standards are mostly already defined.
They tend
to be open, precisely because millions of devices from thousands of firms
will be
connected to the Internet.
One of the most important, called Bluetooth, is the work of more than 500
firms, led by
Ericsson, Nokia, Toshiba, Intel and IBM. Bluetooth defines how devices
should transmit
data to each other. The idea is to replace the cables that tie devices to
one another with
a single short-range radio link. Within two years, about 80% of mobile
phones will
carry a $5 Bluetooth chip that can connect them to similarly equipped notebook
computers, printers and, potentially, any other digital device within about
ten metres.
As well as defining how these devices find and talk to each other, Bluetooth
also ties
into existing data networks, including the Internet. In future, you might
tap out an e-mail
on your Palm Pilot or Psion, tell it to make the Internet connection through
your mobile
phone, print a copy on the printer upstairs and store another on your PC.
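By way of illustration, here is a rough sketch of what such a radio link looks like to software, assuming a Linux machine whose Python build exposes Bluetooth sockets (via BlueZ). The device address, channel and command are placeholders; the point is that, once established, the link carries ordinary bytes just as a cable would.

```python
import socket

# Placeholders for a real paired device.
PHONE = "00:11:22:33:44:55"   # hypothetical Bluetooth device address
CHANNEL = 1                   # hypothetical RFCOMM channel

# Open a stream socket over the RFCOMM protocol, Bluetooth's
# serial-cable replacement.
sock = socket.socket(socket.AF_BLUETOOTH, socket.SOCK_STREAM,
                     socket.BTPROTO_RFCOMM)
sock.connect((PHONE, CHANNEL))

# Ask the phone to identify itself, as a serial modem would.
sock.sendall(b"AT+CGMI\r")
print(sock.recv(1024))
sock.close()
```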
Standards are also open because customers and suppliers have learnt to
be wary of
being locked in to someone else’s technology. Several firms have defined
their own
standards to adapt the Internet for the small, monochromatic screens of
handheld
computers and smart phones. Such gadgets are not suited to the lively,
colour graphics
on most web pages—let alone bandwidth-devouring multimedia and animation.
Worse,
they have neither keyboard nor mouse, and lack the computing power and
memory they
need to run modern browsers.
But proprietary efforts have not got far. Wireless Internet access will
take off only if
there is a single, open standard for delivering suitable content to wireless
devices.
That is the job of something called WAP (Wireless Application Protocol).
This allows
mobile devices to gain access to the Internet using a “microbrowser”, which
displays
web pages specially formatted for tiny screens.
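As a sketch of the idea, the fragment below serves an invented WML "deck", WAP's counterpart of a web page, using Python's standard HTTP server. A real deck would carry a DOCTYPE and richer content; what matters is that WML has its own MIME type, so a microbrowser knows it is getting content cut down for a tiny screen.

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

# A skeletal, invented WML deck: one "card" fits one small screen.
DECK = """<?xml version="1.0"?>
<wml>
  <card id="prices" title="Shares">
    <p>ACME 12.40 (+0.15)</p>
  </card>
</wml>
"""

class WapHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = DECK.encode("utf-8")
        self.send_response(200)
        # The WML MIME type tells the microbrowser what it is getting.
        self.send_header("Content-Type", "text/vnd.wap.wml")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

HTTPServer(("", 8080), WapHandler).serve_forever()
```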
Nokia’s first WAP telephone will be available soon; Ericsson has just launched
a
WAP-based wireless notebook computer. Handelsbanken, a Swedish bank, has
extended its Internet banking service to mobile devices. Using WAP, the
bank’s
customers will soon be able to look up stockmarket prices, buy and sell
shares, consult
their accounts, transfer money and pay bills—any time, anywhere. Within
a few
months, Motorola will be selling WAP telephones in America, where firms
such as
AirTouch and Sprint PCS have announced new services. As wireless bandwidth increases, the only limit on what such devices can do will be the size of their screens.
Like Bluetooth, WAP is supported by the largest firms in the business.
It will work with
any of the mobile networks across the world, and eventually with proprietary
operating
systems likely to be in smart-phone handsets, such as the Symbian joint
venture’s Epoc
(from Psion) or Microsoft’s own Windows CE. In effect, WAP lays down the
rules for
the wireless incarnation of the web. Like the web, it is not owned by anyone,
yet it is
the standard everyone will follow.
A kinder, gentler Internet
By escaping from the desktop, the Internet will both escape Microsoft and
become a
bigger part of everyday life. But this will not be enough to deliver its
promise. For that,
the Internet must also become easier to use.
John Patrick, a founding member of the World Wide Web Consortium who leads
IBM’s
efforts to create web technologies, calls what comes next “Natural Net”.
Today’s
Internet is a powerful way to communicate, thanks to e-mail, instant messaging
and
chatrooms. Even so, people cannot really work together over the public
Internet. Again,
each of many different technologies has a part to play in making collaboration
feasible.
And again, none of them holds out the prospect of domination in the way
that the
operating system dominates the personal computer.
Take the infrastructure of the Internet. Videophones have failed to take
off, and
videoconferencing is still a minority sport, mostly because the Internet
is still so
clunky. The Internet needs more capacity, or bandwidth, both in its main
backbone and
in its connections to schools, houses and offices. The network must also
be reliable, so
that connections will not only be fast, but also available whenever needed.
Broadband technology can carry the four channels needed for online discussion: text,
voice, video and graphics. Once broadband is common, people can work together,
wherever they are, on anything from the design of a new car to a patient’s
MRI scan.
Forrester Research, a consultancy, reckons that, by 2003, live collaboration,
now used
only in one-off projects between companies, will have become as much part
of the
Internet as e-mail is now.
But who will own the broadband connections? On the Internet backbone, where
there is
no risk of a natural monopoly, collaboration is already the pattern. The
backbone is
struggling to keep pace with demand that at present appears to double every
six months.
Relief may come from the “Abilene Project”, or I2 (Internet2), as it is
known. Some
150 universities and firms such as Cisco Systems, Nortel, Qwest and IBM
are together
designing a backbone advanced enough to solve today’s Internet gridlock.
In local connections, where the risk of monopoly is higher, fast connections
are
arriving thanks to both cable modems and DSL (Digital Subscriber Line),
a technology
that turns traditional copper wires into fat data pipes. Each approach
has strengths, and
each offers “always-on” Internet connections as much as 30 times faster
than the
speediest dial-up modems. This will hugely increase both the range of Internet
applications—with high-quality streamed video to the fore—and the convenience
of
gaining access to Internet-based services.
The key thing is that DSL and cable represent the claims of two separate
and hostile
industries to be the standard-bearers of the data revolution. There is
thus the promise
both of rapid investment (after many delays) and genuine price competition.
Not all areas are free from the threat of domination. This is partly because
high-tech
industries are prone to “increasing returns to scale” and high barriers
to entry. One
example is speech-recognition and machine-translation technology. This
could, in Mr
Patrick’s words, turn the Internet into a “multilingual real-time intercom”.
It will, he
suggests, soon be possible to hold a conversation in English with a friend
in Tokyo
who is reading your words in Japanese. As speech-recognition algorithms
improve, the
spoken word could become the main interface to the network. This will be
especially
valuable with mobile devices, because it overcomes the limitations of a
fiddly keypad
and a screen-tapping stylus.
Yet speech recognition depends on dictionaries of words and sounds that
are hugely
expensive to compile. Because most costs are upfront, the more software
a
speech-recognition firm can sell, the lower the average cost for each copy.
A firm that thrives in this area will make a lot of money; Lernout & Hauspie, in Belgium, stands a good chance of doing well.
However, few of the core technologies of the Internet are easily owned.
Take the
example of what might be called dumb data. A mass of information resides
on millions
of websites, but how do you turn it into something useful? Most of it is
unstructured and
stored in a non-machine-readable form (eg, English), which means that it
can be read
only by humans. Because data are dumb, web search-engines often fail to
find a
sensible answer to queries—a page containing the word “Apollo” might be
about
mythology or about moon landings. Similarly, although companies can display
online
catalogues or details of their stock, this information means nothing to
other computers.
At the root of this problem is HTML, which defines web pages. This instructs
a
web-browser how to lay out the contents of a page, but does not tell the
computer what
the page is describing. A solution is to hand, in the form of a markup language related to HTML, called XML (eXtensible Markup Language). This adds invisible labels (called “metatags”)
describing the objects contained in the web page, so that computers know
what they are
dealing with. The specifications, price and availability of each product
in an online
store could, for example, be labelled with metatags. A computer would then
be able to
compare prices across several stores and make a recommendation, no matter
how much
various websites differ from each other in appearance.
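A toy example shows the principle. The two store "feeds" below are invented, and their layouts differ, but because both label the same facts with the same tags, a few lines of Python can shop across them.

```python
import xml.etree.ElementTree as ET

# Two invented store feeds with different layouts but shared labels.
FEEDS = [
    "<catalogue><product><name>Camera X100</name>"
    "<price currency='USD'>299.00</price><stock>4</stock></product></catalogue>",
    "<inventory><product><name>Camera X100</name>"
    "<price currency='USD'>279.50</price><stock>0</stock></product></inventory>",
]

best = None
for feed in FEEDS:
    root = ET.fromstring(feed)
    for product in root.iter("product"):          # find labelled products
        price = float(product.findtext("price"))  # read the labelled price
        if int(product.findtext("stock")) > 0:    # skip stores with none left
            if best is None or price < best:
                best = price

if best is not None:
    print(f"Best in-stock price: ${best:.2f}")
else:
    print("Nothing in stock")
```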
Once information “knows what it is”, Internet searches will become faster
and more
accurate. Metatags will make it easier to extract information from a page
to suit a
particular use, or the limitations of a particular device. In addition,
XML will help firms
to strengthen their online business relations with suppliers and customers,
by knitting
together business processes and software applications both within and across
companies.
It is inconceivable that a lingua franca such as XML, which allows previously
incompatible computing platforms to understand each other, could be owned
by
anyone. For a start, the egalitarian ethos of the Internet resists an authority
“imposing”
its will—cyber groups have, for example, used peer pressure to quell the
bureaucracy
that manages the Internet’s domain names. Since no single organisation
controls the
Internet, XML could not be imposed in any case. Instead, the Internet muddles
through:
XML has been on the drawing-board for several years, and it has won almost
unanimous
support from Internet standards bodies, such as the World Wide Web Consortium, and from
industrial firms. Lastly, in all its ramifications, XML is astonishingly
complicated. It
will not be complete until each industry has the electronic schemas that describe its common processes—and the world is still waiting for its
various Dr
Johnsons to compile the necessary dictionaries.
But can you trust it?
And when there is no protocol to hand? The Internet grinds to a halt, until
someone
comes up with an answer that users feel comfortable with. That, at least,
is what has
happened with Internet security.
This is essential if firms are to throw open their electronic doors to
the world.
Consumers are, if anything, even more wary of the lack of security on
the Internet
than are firms. According to Mr Patrick of IBM, e-commerce and business
collaboration
will take off only if several conditions are met: authentication (checking
that someone
is who he claims to be, or a website what it seems to be); authorisation
(controlling
access in a sophisticated, “granular” way); confidentiality (keeping private
information private); integrity (ensuring that information has not been
tampered with);
and non-repudiation (being sure the terms of a transaction are binding
and legitimate).
In theory, the solution is obvious. Encryption so powerful that it cannot
be broken
without mammoth computing power or the theft of a digital “key” from one
of the
parties to a transaction would pass all Mr Patrick’s tests. Reality is
more complicated,
mainly because no single open standard prevails on the wider Internet.
Various
proprietary systems are competing for attention, but no single one has
taken off,
because they are complicated and users fear that they will be lumbered
with a
particular technology. What is needed is another open protocol.
Coming to the rescue is PKI (Public Key Infrastructure). By presenting “digital
“digital
certificates”, the online equivalent of a handwritten signature, those
who register with a
“certification authority” will be able to use the public Internet to transmit
data securely,
make payments and prove that they are who they say they are. This will
make it
possible to check, for example, that an e-mail really does come from the person it claims to. Certification authorities manage digital “keys” that authenticate
the
identity of each party to a networked transaction. These services are already
on offer
from firms such as telephone operators and banks.
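The machinery beneath such certificates is public-key cryptography. The sketch below, which uses the third-party Python package called cryptography, shows only signing and verification; a real certification authority would also vouch that the public key belongs to a particular person, and the message is invented.

```python
# A sketch of the public-key machinery beneath digital certificates,
# using the third-party "cryptography" package. A real certification
# authority would also bind the key to an identity.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"Transfer SKr500 to account 12345"

# The sender signs with the private key, which never leaves his hands...
signature = private_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

# ...and anyone holding the public key can confirm both origin and
# integrity: verify() raises InvalidSignature if either check fails.
public_key.verify(
    signature,
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
print("Signature checks out: authentic, intact and hard to repudiate")
```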
Net scope
With a lab-full of new technologies and protocols, the Internet is likely
to overcome its
current drawbacks and continue its march into every corner of modern life.
No single
technology on its own can solve the Internet’s failings. Moreover, since
no single
organisation controls the Internet, every new protocol will have to prove
its worth if it
is to be widely adopted. Such an apparently haphazard way of doing things
might seem
like the Internet’s fatal weakness. In fact, it is its greatest strength.
Monopolies will still occur in computing, as they do in many industries. But, thanks to the
nature of the Internet, they are likely in future to be modest, and limited
in scope. When
Bill Gates concluded that Netscape had to be stopped, by fair means or
foul, he could
be accused of ruthless bullying—but not of paranoia.