The Fourth Law of Robotics

By John Markoff

Sequence: Volume 29, Number 2


Release Date: March/April 1994

Recently I used an Internet Relay Chat (IRC) client for the first time
in more than six months. IRC is one of those wild, anarchic worlds that
have emerged out of the Internet soup in recent years. Like CB radio, it
offers casual visitors hundreds of simultaneous, stream-of-consciousness
conversations from a global electronic coffeehouse that tends to tilt
toward the lurid and sexual.

Nevertheless I was unprepared for the warning that greeted me on
the Stanford University IRC server:
*******************************************************************
*** -- Do not run bots on this server. If you must run a bot, use w6yx-2.
*** -- This also means that you should restrict yourselves to one (1) client on this server. Additional clients should be run on w6yx-2.
*** -- Absolutely NO floodbots or tsunami bots on any Stanford server.
*** -- Violation of this rule may lead to loss of access for your entire site or domain.
*******************************************************************

What's going on here? I wondered. It's simple and it's ominous.
Proto-artificially intelligent creatures are already loose in the net,
and in the future they will pose vexing ethical dilemmas that will
challenge the very survival of cyberspace. Computational ecologies are
beginning to emerge.

The irony is that computer science's most compelling vision is on
the verge of becoming the greatest threat to the net's existence. Going
back to Vannevar Bush, Douglas Engelbart, and Alan Kay (and dare I add
John Sculley!), the shared vision of intelligent software agents has
inspired generations of researchers. Even Bill Gates has gotten into the
act, with his twin visions of softer software and information at your
fingertips.

But in the emerging information space of the Internet, who will
govern how our agents behave? The World Wide Web, Gopher, and Wide Area
Information Server are wonderful resources, but in the future they won't
be searched by humans using Mosaic or some other quaint information
browser. No, we will all be tempted to turn our agents and knowledge
robots loose on the net to prospect around the clock for information.
Robert Kahn and Vint Cerf at the Corporation for National Research
Initiatives (CNRI) have already built the first knowbots that have the
capacity to relentlessly prowl the net looking for information morsels.
Why would I have only one knowbot? Why not a hundred? Why not a
thousand?

The sorcerer's apprentice comes to mind.

But the Internet is already groaning under the load. The scarce net
resource is not bandwidth; rather, it's central processing unit (CPU)
cycles. And as more fiber-optic bandwidth is rapidly added as part of
the national infrastructure vision, the imbalance is certain to grow.
Indeed, adding bandwidth only compounds the problem by making it easier
to get to already overloaded servers. In recent months, every new
information resource placed on the net has been immediately
overwhelmed--today largely by humans. An example: Late last year, the
World Wide Web server of the National Center for Supercomputing
Applications was handling 600,000 queries a week, and the rate was
growing exponentially.

Moreover, the remarkable growth of the net appears to be
accelerating even without the advent of software creatures. America
Online is preparing to bring 400,000 PC and Mac users onto the net, and at
the recent Western Cable Show in Anaheim, California, the hot new
technology wasn't video-on-demand, it was PC interfaces to receive high-
bandwidth feeds to the Internet over the cable network. (Is the net more
entertaining than Beavis and Butt-Head? We're about to find out. . . . )

On the horizon are a series of technologies that will serve to
dramatically amplify already exponential growth:

* Convenience scripts are at work on the net, systematically
looting FTP sites by walking directories, downloading every file in an
entire tree. Blend this with agent technology and you get the cyberspace
equivalent of the mad hatter auto dialer on steroids.

* A multitude of bots are now loose in the MUDs, wandering over a
text-based cyberscape. Their intelligence is minimal but growing.
Several years ago, a MUD artificial intelligence developed at
Carnegie Mellon University placed second in the first (would-be) Turing
Test held at the Boston Computer Museum.

* Most of the advanced computer science schools are hard at work
perfecting agent technologies, using the Internet and the Web as their
laboratories. Who in these research centers is thinking about the
ethical questions associated with automated information-seeking tools?

* Robert Morris's worm was an early warning of the unpredictable
nature of the net's ecology. Whose next clever hack will run amok, this
time with far greater consequences?

* It's not just the Internet. Telescript--General Magic's
telecommunications language--is really just a virus design lab in
disguise, permitting me to launch a program that runs on your computer.
What does it do? Can I control it? AT&T's executives have already
acknowledged that they're not certain they can contain the technology.

In their defense, Telescript's designers say that they have done a
tremendous amount to cripple their new technology so it can be
controlled. They have made it an interpreted language, have stripped
away dangerous language attributes and the ability to speak directly to
hardware, and have put in limiting mechanisms to prevent sorcerer's
apprentice scenarios. But they acknowledge that they still don't have
real-world experience to back up their design work.
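
To make the idea concrete, here is a minimal sketch in Python of the kind
of limiting mechanism the designers describe. It is not Telescript itself,
and every name in it is hypothetical: each operation an untrusted agent
performs draws down a finite budget, so a runaway sorcerer's-apprentice
loop halts itself rather than consuming the host.

    # Toy illustration of a metered interpreter (not Telescript); all names
    # are hypothetical. Every step an agent takes is charged against a finite
    # budget, and expensive operations such as network fetches cost more.

    class BudgetExhausted(Exception):
        """Raised when an agent tries to compute past its allowance."""

    class MeteredInterpreter:
        def __init__(self, budget):
            self.budget = budget          # total work the agent may perform

        def charge(self, cost=1):
            self.budget -= cost
            if self.budget < 0:
                raise BudgetExhausted("agent exceeded its resource allowance")

        def run(self, program):
            """Interpret a list of (op, arg) pairs, charging for each step."""
            stack = []
            for op, arg in program:
                self.charge()             # every step costs something
                if op == "push":
                    stack.append(arg)
                elif op == "add":
                    stack.append(stack.pop() + stack.pop())
                elif op == "fetch":       # a simulated remote request
                    self.charge(100)      # remote operations cost far more
                    stack.append("<data from %s>" % arg)
                else:
                    raise ValueError("op %r is outside the restricted set" % op)
            return stack

    # A well-behaved agent finishes; a greedy one is cut off mid-run.
    print(MeteredInterpreter(budget=250).run(
        [("push", 2), ("push", 3), ("add", None)]))
    try:
        MeteredInterpreter(budget=250).run([("fetch", "archive")] * 10)
    except BudgetExhausted as error:
        print("halted:", error)

The point of the sketch is simply that resource limits can be enforced by
the interpreter rather than trusted to the agent's author, which is the
spirit of what General Magic claims to have built.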

The information market theorists respond that it is just an
artifact of today's reality that the net is still largely free
(subsidized) and that there is no cost associated with the use of its
resources. Once a true information economy emerges, the abuse of
resources will be moderated by cost. Researchers at Xerox PARC have
already experimented with systems that generate CPU-based market
economies in which network nodes can profit by freeing unused CPU cycles
to the net. Similarly, simple market economics should curb some of the
worst excesses of the overwhelming of popular net servers.
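
As a cartoon of that idea (and nothing more; it is not the PARC system),
imagine spare CPU time being sold to the highest bidders. Scarce cycles
then go to whoever values them most, so a swarm of low-value agents is
rationed while an interactive human user is still served. The names and
numbers below are invented for illustration.

    # Toy market for CPU time: sell `supply` units to the highest bidders.
    # bids is a list of (agent, units_wanted, price_per_unit) tuples.

    def clear_market(supply, bids):
        allocation = {}
        for agent, units, offer in sorted(bids, key=lambda bid: -bid[2]):
            if supply <= 0:
                break              # cycles exhausted; lower bids go unserved
            granted = min(units, supply)
            allocation[agent] = granted
            supply -= granted
        return allocation

    # Light demand: everyone gets what they ask for.
    print(clear_market(1000, [("knowbot-1", 300, 0.01),
                              ("human-user", 50, 0.05)]))

    # A swarm of cheap, greedy knowbots: the human user is still served,
    # and only a handful of the swarm get any cycles at all.
    swarm = [("knowbot-%d" % i, 400, 0.01) for i in range(50)]
    print(clear_market(1000, swarm + [("human-user", 50, 0.05)]))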

But market mechanisms will control only some of the abuses. Indeed,
traditional market behavior may not be suited for an information economy
in which things of great value can be endlessly replicated.

Moreover, what will be lost in the shift to the market economy? It
seems the culture of the Internet is worth preserving, and that culture
derives from the free exchange of information.

Perhaps there is another alternative. We need a fourth law of
robotics.

Isaac Asimov's I, Robot (1950) is well known as the science fiction
collection in which he introduced his famous Three Laws of Robotics, which
govern the relation of robots to their human masters: robots may not
injure a human or, by inaction, allow a human to be harmed; robots must
obey humans' orders unless doing so conflicts with the first law; robots
must protect their own existence unless doing so conflicts with the
first two laws.

Let me add a fourth law: Information robots may not abuse the
computational resources of the net and must defer to human network
users.

It's clearly a programming challenge, but it may be worth taking up
if we are to preserve the culture of the Internet.
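
What might that look like in practice? Here is one minimal sketch in
Python, with numbers and a server-load signal that are entirely my own
assumptions: a knowbot that never exceeds a modest request rate and that
backs off further whenever the server it is visiting reports that it is
busy serving people.

    import time

    class PoliteKnowbot:
        """A hypothetical agent that obeys the fourth law by throttling itself."""

        def __init__(self, min_interval=10.0):
            self.min_interval = min_interval  # seconds between requests, at best
            self.backoff = 1.0                # grows whenever the server is busy
            self.last_request = 0.0

        def wait_turn(self):
            """Block until this bot may politely issue its next request."""
            earliest = self.last_request + self.min_interval * self.backoff
            delay = earliest - time.monotonic()
            if delay > 0:
                time.sleep(delay)
            self.last_request = time.monotonic()

        def note_server_load(self, busy):
            """Defer to human users: back off sharply when the server is busy."""
            if busy:
                self.backoff = min(self.backoff * 2, 64)
            else:
                self.backoff = max(self.backoff / 2, 1)

    # Usage sketch; fetch_page stands in for whatever the knowbot actually does.
    # bot = PoliteKnowbot()
    # for url in worklist:
    #     bot.wait_turn()
    #     page, server_busy = fetch_page(url)
    #     bot.note_server_load(server_busy)

The design choice worth noting is that restraint is built into the agent's
main loop rather than left to the goodwill of whoever launches it.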

John Markoff writes for the New York Times.



