Creative Destruction

By Robert C. Heterick, Jr.

Sequence: Volume 32, Number 3


Release Date: May/June 1997

To speak about the Internet is really to talk about a set of protocols -
the Transmission Control Protocol (TCP) and the Internet Protocol (IP).
Computer folklore (not accurate, according to those involved) has it that
the TCP/IP protocols grew out of defense research efforts to design
networks that were relatively bomb-proof. ARPAnet was the original testbed
for those protocols. The basic idea was to provide a way for a researcher
to log into another researcher's computer system and overcome the
constraints of geography with a robust wide area network. Of course, we
were fooled - as we often are - by the increasing pervasiveness of e-mail
and file-sharing, incidental by-products of this new form of robust
networking.
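The essential idea — a reliable, ordered byte stream between two machines, layered over IP's best-effort packet delivery — is what made remote login and, incidentally, e-mail and file-sharing possible. A minimal sketch in modern Python of that TCP service model (the loopback addresses and echo behavior are illustrative assumptions, not the original ARPAnet software):

```python
import socket
import threading

# TCP provides an ordered, reliable byte stream on top of IP's
# best-effort packet delivery; this loopback echo pair shows the idea.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind(("127.0.0.1", 0))   # port 0: let the OS pick a free port
srv.listen(1)
port = srv.getsockname()[1]

def echo_once():
    # Accept one connection and echo whatever arrives back to the sender.
    conn, _ = srv.accept()
    with conn:
        conn.sendall(conn.recv(1024))

t = threading.Thread(target=echo_once, daemon=True)
t.start()

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as c:
    c.connect(("127.0.0.1", port))
    c.sendall(b"hello over TCP/IP")
    reply = c.recv(1024)

t.join()
srv.close()
print(reply.decode())  # the message comes back intact and in order
```

Everything above the socket interface — telnet then, the Web later — relies on exactly this guarantee of intact, in-order delivery.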

We probably should have known this would happen. Ada Augusta, Countess of
Lovelace, observed more than a century earlier when discussing Babbage's
Analytical Engine, that "In considering any new subject there is frequently
a tendency, first to overrate what we find to be interesting or remarkable,
and secondly, by a sort of natural reaction, to undervalue the true state
of the case."

After realizing that we had undervalued the true state of the case, we
became very interested in making internetworking more ubiquitous. NSFnet
(circa 1984) was an attempt to spread the benefits of networking more
broadly in the research and education communities. Those with an Orwellian
bent will find some significance in that date. The Merit, IBM and MCI
collaboration culminated in the creation of ANS and grew out of the effort
to manage and operate NSFnet under contract to the National Science
Foundation. The incredible success of NSFnet and the underlying desire for
ubiquity led, rather predictably, to the privatization of NSFnet. And as
you may know, my favorite definition of privatization is: the sale of
assets of questionable value to commercial interests that really don't want
them.

It is abundantly clear that neither the telecommunications carriers,
interexchange and local, nor the major hardware/software companies - IBM,
Apple and Microsoft, to name a few - believed in the TCP/IP protocols or
ubiquitous internetworking. All were unprepared with solutions or
strategies to carry the Internet to a new level. The current shortage of
high-bandwidth circuits is but one measure of that lack of preparedness.
The World Wide Web and the Mosaic browser burst upon the scene and
underscored the lack of preparation on the part of our major strategic
partners for either the increase in demand or the plethora of innovative
new uses we intended to make of the Net. The newly privatized commodity
Internet has simply been unable to keep up with demand. In addition to
making the Net bomb-proof, we appear to have succeeded in making the Net
relatively management-proof.

The moral of this story may be to be careful what you wish for, as you may
get it.

Envisioning Internet 2

The struggle to keep up with rapidly growing demand is clearly a problem
for the commercial service providers and they will move - are moving -
expeditiously to resolve it. The solution will not come quickly enough for
anyone, but it will come. It is unlikely, however, that it will come solely
from the historic communications/hardware/software providers, as we can
certainly predict that new solutions rooted in cable, wireless, satellite
and even electric power distribution will coexist with those proffered by
the telephony industry. The high visibility accorded new players such as
Netscape and @Home is testimony to the high level of innovation and
entrepreneurship in this industry. While prices for mature services can be
expected to continue to decline, new service offerings will be slow to
emerge.

Let's think for a moment about the characteristics of these new sets of
service offerings. First, and quite obviously, we need a way to retain
local traffic locally. The overhead of sending packets from Atlanta to
Washington and back to Atlanta is simply unacceptable. Any mission-critical
function - and I think we all understand that internetworking is
mission-critical to our organizations - needs to have backup. We all need
connections to multiple backbones and Internet Service Providers.
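The route-selection logic behind keeping local traffic local can be reduced to a toy model. The hop counts and site names below are invented for illustration; real routers compare BGP path attributes rather than a single cost number:

```python
# Toy model of why a local exchange point matters: among candidate
# routes to the same destination, prefer the one with the fewest hops,
# so Atlanta-to-Atlanta traffic never detours through Washington.
routes = [
    {"via": "Washington backbone", "hops": 3},
    {"via": "Atlanta exchange", "hops": 1},
]

def best_route(candidates):
    # Real routers weigh BGP attributes; hop count stands in for them here.
    return min(candidates, key=lambda r: r["hops"])

print(best_route(routes)["via"])  # → Atlanta exchange
```

Multi-homing fits the same model: connections to several backbones simply add more candidate routes, so the loss of any one provider leaves the list non-empty.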

The advanced applications we are all thinking about, whether for learning,
research or information retrieval, will require higher bandwidth and
something more than best effort (the mantra of the current Internet)
service levels. Vendors will need to provide us with performance metrics
(something we haven't had since the advent of privatization) and guaranteed
service levels. Most of all, we need the benefits of robust, high bandwidth
networking to expand beyond our places of business into our homes. ISDN
(integrated services digital network) has proved too late, too slow, and
too expensive. If the history of ISDN is any indicator, the telco roll-out
of digital subscriber loop technology will also be too late and likely too
expensive. This problem is most likely to be solved through a combination
of new entrepreneurs and enlightened regulatory leadership from the Federal
Communications Commission and public utility commissions. Absent some
affordable solutions to this problem coming quickly to the marketplace, we
may expect to see many communities form public service authorities and
treat community networking as they have other infrastructure services like
water and sewer.

Higher Education's Changing Role

To the relief of many and to the chagrin of some, these issues have moved
far beyond the control of the historic higher education/corporate
research/federal agency collaboration that created the original Internet.
The systems theorist Jerry Weinberg once observed that "cucumbers get more
pickled than the brine gets cucumbered." In the shuffle of international
telecommunications Goliaths, higher education is a small player indeed. We
must carefully select when and how we choose to engage these issues -
making sure that we can make a difference and not be trampled by the
giants.

The higher education community has selected a salient effort to enter this
fray that seems both achievable and useful. The project is called Internet
2. It is called that because it seems certain that there will be an
Internet 3 and successors beyond that. Let me simply sketch the dimensions
here. The icon of Internet 2 will be something called a GigaPOP. At least
initially, GigaPOPs are likely to be shared by geographically proximate
institutions. They will provide the capacity for high-bandwidth,
desktop-to-desktop communication between participating institutions and
thereby facilitate a new set of applications for learning, research and
information retrieval. These applications are certain to be collaborative,
multimedia-rich and bandwidth-intensive.

The retention of local traffic locally will certainly be a major feature in
the operation of GigaPOPs as will connectivity to multiple backbones and
Internet Service Providers. Initially, the backbone services will be
provided over the National Science Foundation's vBNS. GigaPOPs will be
engineered to provide Quality of Service features in addition to best
effort packet delivery.
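One concrete mechanism for offering more than best effort is to mark packets so that cooperating routers can queue them preferentially. The sketch below uses DSCP marking via the IP TOS byte — a DiffServ technique standardized slightly after this article, shown here only as an illustration of the Quality of Service idea, not as Internet 2's actual design:

```python
import socket

# DSCP "Expedited Forwarding" (EF, value 46) requests low-latency
# treatment from routers that honor QoS markings; best-effort traffic
# carries DSCP 0. The port number here is arbitrary.
EF_DSCP = 46

s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# The DSCP field occupies the upper six bits of the former IP TOS byte.
s.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, EF_DSCP << 2)

# Any datagram sent on this socket now carries the EF marking.
tos = s.getsockopt(socket.IPPROTO_IP, socket.IP_TOS)
print(tos >> 2)  # the DSCP value in effect
s.close()
```

The marking itself is cheap; the hard part — the part GigaPOPs were meant to demonstrate — is getting every network along the path to honor it.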

The fundamental premise driving Internet 2 is not ubiquity, but rather
technology transfer. The object will be to resolve the chicken-egg dilemma
by demonstrating the value of the types of new applications that could be
delivered over high-bandwidth connections and to encourage the transfer of
the underlying technologies and protocols into the commodity Internet as
rapidly as possible. Internet 2 is not a solution to current shortcomings
of the commodity Internet, but rather a research project to build and
demonstrate applications that will change the very nature of higher
education.

I have an economist colleague who once observed that the purpose of
forecasting was to make astrology look scientific by comparison. In order
to think about the future we ought to take note of the environment. Is it
relatively stable or is it changing rapidly? In times of rapid change, we
must learn to interpolate rather than extrapolate.

Extrapolation looks at the past and assumes, in the words of the systems
theorist Jerry Weinberg, that the future will be like the past, because in
the past, the future was like the past. In most situations confronted by
our organizations this is a reasonable approach. However, when a revolution
is underway, such a focus on the past can be very misleading. A better
approach is to create a scenario of what you believe the operating climate
for your organization is likely to be in the future and attempt to set a
course based upon interpolating between where you are now and that future
scenario. Such an interpolation will suggest courses of action that look
very different than those derived from extrapolation.

Touchstones for the Future

Somewhere in the not-too-distant future we will have secure transactions,
micro-cash and differential service offerings. The impact of these, and a
host of other new service offerings, will require that we rethink a whole
set of issues including liability, intellectual property, universal service
and taxation of Net-delivered services. So far, we have done pretty well in
contending with the international character of the Net. While we have
maintained a generally reasonable attitude toward Net content in this
country (the so-far-successful challenge to the Communications Decency Act
serving as a case in point), other countries (Germany and China being two
glaring examples) have not been so fortunate. It will take another decade
to populate our legislatures with officials who truly understand that the
Net is different and that we will not be able to extrapolate industrial age
case law into the world of the Net.

As more and more commerce moves onto the Net, we will experience a similar
lack of understanding from legislators as they attempt to turn Net commerce
into a source of revenue. The success of the Net will ultimately force
governments to totally rethink the existing crazy quilt of taxation
policies that are based on the exchange of tangible, physical commodities.
It may well turn out that the Net will be regulation-proof.

There are some lessons here for our education communities as well. First
Sale and Fair Use doctrines served us reasonably well in an industrial age
economy. They simply will not extrapolate to the emerging world of the Net.
A world of secure transactions and micro-cash will open a whole new avenue
for securing intellectual property on the Net. Explicit or implied contract
transactions will dominate the future of the Net. In this instance, the
future will most certainly not be like the past.

The access/security trade-off extends well beyond the issue of intellectual
property. Recent legislative actions in this and other countries have
raised the specter of an unmanageable set of third-party liabilities. The
object seems to be to make intermediate transit locations also liable for
infringing or non-conforming transactions. A liability waiver, embedded in
common carriage for telephony and surface mail, clearly will need to be
extended to the Net. While they are at it, the federal government will also
need to make sense of the Universal Service Fund. It is refreshing to see
that the Joint Board, in its report to the Federal Communications
Commission, has at least proposed that recipients be means-tested. This is
a not insignificant issue for our institutions of higher education as they
tend to be net contributors to the Fund, which goes to subsidize those who
have no need for the subsidy. Feeding the horses in order to feed the
sparrows will be the result of continued extrapolation of industrial age
regulation in the world of the Net.

In the 1970s, the Carnegie Commission raised the paramount questions for
higher education: Who pays? Who benefits? These are good touchstones as we
think about our future on the Net. Differential service charging will
clearly be a result of differential services on the Net. A great deal of
visceral fog has obscured the issues of who pays and who benefits when it
comes to the Net. Folks continue to confuse unpriced services from an
aggregator (such as universities or libraries) with "free" services. The
Net is likely to be much more disaggregated than were our industrial age
models that featured broad service suites. Anyone who has had to wrestle
with health benefit programs lately will observe that even the health and
insurance industries have fallen prey to disaggregation. So in addition to
being bomb-proof, management-proof and regulation-proof, we are likely to
discover that the Net is relatively aggregation-proof as well. Differential
service suggests that there will be differential benefits and
correspondingly differential payments. This is quite at variance with the
one-size-fits-all, best effort characteristics of the current commodity
Internet.

The economist Joseph Schumpeter once described capitalism as a process of
"creative destruction." That could well be an apt description of the Net:
creative destruction. The old is being replaced in a very non-linear
fashion by the new. The route to planning our future on the Net is
interpolation, not extrapolation.

If all this is troublesome to you, you might take heart from a comment made
by Mark Twain upon leaving his first Wagnerian opera: "It's better than it
sounds."

Robert C. Heterick, Jr., is president of Educom. [email protected]



