CAUSE/EFFECT

The paper content is the intellectual property of IBM Corporation. Permission to print out copies of this paper is granted provided that the copies are not made or distributed for commercial advantage and the source is acknowledged. To copy or disseminate otherwise, or to republish in any form, print or electronic, requires written permission from the author and CAUSE. For further information, contact CAUSE at 303-449-4430 or send e-mail to [email protected].


Information Technology as a Transformation Agent

by Charlie Tuller and Diana Oblinger

The essential process of higher education is the transformation of information into knowledge, and knowledge into insight. With technology catalyzing such massive changes in how we manage information, and with cognition, communication, and collaboration helping us transform information into knowledge and insight, the implications for higher education are immense.

Information technology (IT) is a defining force that will affect all areas of society well into the next century, changing every institution, every business, and every individual in profound ways. Technology itself has changed dramatically in the past fifteen years. Anticipate even more rapid change in the future, change that will affect organizations and society alike.

Within fifteen short years we have on the order of one thousand times better algorithms, five hundred thousand times more computing power per individual and five hundred million times more mobility of information. We do not begin to understand the technical significance of all this, let alone the societal change it has unleashed or the institutional change it demands.1

During the next decade, a variety of enabling technologies will be important, as will advances in their form and function. More important, however, is the ability of these technologies to transform individuals and institutions.

Three shifts in computing

For most of us, computing today is radically different from our early experiences. In fact, we have seen three major waves of computing: host-centric, client/server, and network-centric.

Host-centric, or "tops-down" computing, dominated the environment for the past twenty years. The focus was on the physical enterprise, with a specific behavior pattern:

We still see a dominance of host-centric computing among administrative applications in higher education. Many institutions operate their institutional research or registration offices on a dedicated computer, using applications written in-house. The creation of reports is fixed by the application: a new program must be written to generate a new report.

Today a distributed client/server model dominates. The focus is the distributed enterprise, with a different behavior pattern.

The client/server architecture has enabled organizations to do a great deal of "mixing and matching" to suit individual needs. The purchase of client machines is one example of this pattern: people buy CPUs separately from memory, order specific hard drive sizes, choose differing memory modules, and mix connectivity options. Customization is the rule rather than the exception. Applications are no longer written in-house; they are purchased. Information is viewed through a windowing system, whether Windows, OS/2, the Mac operating system, or something else, and users tile their windows to view what they need.

The next step is network-centric computing, where the behavior pattern changes again.

Hardware comes from many sources. The network contains your applications as well as the data. The network is, in fact, your application. Software need no longer be purchased and installed on your computer. When you connect to the network, you access the latest version of the software for which you have a subscription.

The network's technical characteristics will be of little concern to its users. Its presence will be assumed. The vendors and the technologies that enable service will be in the background. It will not matter what the topology is, where the server is located, or whether the connection is facilitated by wire or wireless technology. From your computer, you will have access to the resources you want and need.

In spite of its importance, the network is "dumb": it is unaware of individual computing needs and preferences. It is merely a transport vehicle. Yet one individual's computing needs are different from another person's. The "personalization" of your computing interface will reside in the software.

Advances in computing technologies, such as high-resolution displays, 3-D graphics and animation, handwriting and speech input, and natural language understanding, will be used to improve the end-user interface, making interaction with computers more personal and customizable. This will enable new interaction models.

Speech recognition technologies, an example of "hands-free" computing, can recognize more than 30,000 words at a rate of seventy words per minute. In addition, researchers are currently developing a large-vocabulary, speaker-independent continuous speech recognition system for multiple languages.

These paradigm shifts in computing are not restricted to hardware platforms and networks (mainframe, client/server, or network-centric). As a technology matures, there is an evolution in its use. When a new technology is introduced, its early uses are likely to be found in niche areas. For example, when computers were introduced, they were first applied to accounting functions. Calendar and mail functions were the showcase uses when laptops were introduced. The early applications of voice recognition showcase the use of voice but do not yet reach everyday activities.

The second phase of technology introduction is characterized by a migration to general-purpose uses. Personal computers are in this phase today. PCs are used for many purposes: word processing, electronic communications, spreadsheets, graphics, multimedia, and so on. The PC is highly adaptable because of the range of applications used to tailor its functionality. Yet to obtain this functionality, users must purchase specific software packages, upgrade hardware and operating systems, and keep up with new versions of the applications. Over the next few years, the PC will move into the third phase of evolution, that of being an "appliance."

The "appliance" phase is characterized by a "thin-client, fat-server" model. In this scenario, code does not permanently reside on the client, it resides on a server. When the user needs an application, it is accessed through the network and executed on the client machine.

This move will be motivated by a growing frustration with the fundamental limitation of general-purpose use: constant upgrading. Today's general-purpose computer use can be described as a "fat-client" model. The client machine must have ever-increasing memory and processing capacity, and more and more software with a seemingly endless procession of upgrades. As computers and software improve, fatter and fatter clients will be required. From an individual user's perspective, this can be a frustrating experience. Installing and integrating new versions of software, reconfiguring hardware, and ensuring compatibility require time and skill that the average user may not possess.

For large organizations with hundreds or thousands of users, the fat-client model is becoming insupportable. Many in higher education experience the inherent limitations of the fat-client model for even simple activities. Have you used one version of a word processor at home, only to find you cannot print that version at the office? Has a colleague sent you a file that you must spend hours converting into a format you can read or use?

A much more efficient model is that of the "thin client, fat server." An organization would invest its resources in ensuring that the network is up to date and that its users have information appliances. Changes would be made to the network, not to individual client machines. Indeed, this evolution is already under way: Java, Internet plug-ins, and compound documents mark the emergence of the appliance model.

A "thin" client does not imply anemic personal computers. It means that the code does not have to reside on the local hard disk, instead it is delivered just in time to be executed dynamically. The logistics of keeping thousands of individual systems at the same level of platform, word processor, spreadsheet, messaging, graphics, etc., software is daunting. This "thin" model takes advantage of the power of the personal system without the burdens of systems management and logistics that affect every user community, from individuals through large enterprises.

Rate of change

Over the next decade, profound and inexorable advances in technology, particularly in computing capability, connectivity, bandwidth, software development, and digitized content, will continue to be change agents. To help you plan for the rate and direction of change, anticipate the trends outlined in the sidebar to this article.

Microprocessor performance has been increasing at a relatively constant rate, doubling approximately every eighteen months. This trend is expected to continue. Its impact, however, is a perceived time compression, which will cause a change in the business model of computing.

A steady rate of growth (2x per period of time, as in 2, 4, 8, 16, 32 ...) will yield progressively larger increments of growth as the doubling continues. The result is that it takes less time to cover the same increment of technological advancement. This perceived time compression stresses our established models.
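
The arithmetic is easy to verify. Below is a small illustrative calculation (not from the original article) that assumes the eighteen-month doubling period cited in the text and prints the widening year-over-year increments.

    # Constant-rate doubling (2x per period) produces ever-larger
    # absolute increments, even though the rate itself never changes.
    DOUBLING_PERIOD_MONTHS = 18  # performance doubles roughly every 18 months

    def relative_performance(months):
        """Performance relative to a baseline of 1.0 after `months` months."""
        return 2 ** (months / DOUBLING_PERIOD_MONTHS)

    previous = relative_performance(0)
    for year in range(1, 11):
        current = relative_performance(year * 12)
        print(f"Year {year:2d}: {current:7.1f}x baseline, "
              f"gain over prior year: {current - previous:6.1f}x")
        previous = current

After year one the gain is a fraction of the baseline; after year ten the gain in a single year is many times the entire baseline. The rate is constant, but the increments are not, which is exactly the perceived time compression described above.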

Consider word processing as an example. We have had word processors since the 1980s. Initially, it did not seem that word processing changed much as we moved to new generations of technology. The word processor used on an 8088 machine was not radically different from the one used on a 286 machine. By the time 386 machines became common, users began to see a point-and-click interface instead of keystroke commands. Microprocessor performance was doubling every eighteen months to two years: the rate of change was the same as it is today. But the base was relatively low, so the increment of change remained small, and function did not appear to change radically or quickly.

With today's technology (Pentium and above), it is possible to "type" using voice dictation as well as to open and close applications through voice commands. This is a sizable change from a manual interface. The next generation of technology will double the capacity of a Pentium Pro MMX processor. What functionality will we have available after the next doubling of microprocessor capacity? The pace of change, as we have moved from keystrokes to a point-and-click interface to voice commands, seems to be increasing. It is not. The rate of change is still 2x. Our perception, however, is that the expanding capabilities of technology are arriving in a shorter and shorter time frame. Imagine extrapolating processor performance for four years; it will have doubled twice. Will word processing have migrated from voice recognition to mind reading?

The result is that we sense a breathless pace of change. The impact is felt by organizations as well as individuals. Asset volatility is high. Neither organizations nor individuals can purchase the "right" machine: as soon as it is bought, it is out of date. No one can keep up with the current version of software. Organizations are spending enormous energy, time, and money churning hardware and software in an attempt to remain current. In spite of large investments, institutions find themselves with two or three levels of technology that are now obsolete but which they cannot afford to discard.

The increment of change between generations of technology is becoming too large for this purchase-based model to survive. The emerging model is that of network station management -- a subscription model. For both individuals and organizations, it makes better business sense to pay a monthly "subscription" or "rental" fee for access to an information appliance, along with the software and storage needed on the network, than to continue the upgrade cycle.

Social changes

In addition to the exponential changes in technology, two social forces will drive change: (1) the increase in the value of time, and (2) the recognition that information technology is a competitive differentiator.

Information is being digitized. The significance is that the conversion of text, graphics, images, and video into bits gives information a digital passport to travel across global networks. Powerful new communications technologies are giving networks the bandwidth needed to handle rich but space-consuming content like video, MRI (magnetic resonance image) scans, or great works of art. Networks are developing the speed to support interaction, enabling two-way communication and collaboration. Together, digital content and high-speed networks make the once-improbable entirely possible.

In this decade, we will move beyond client/server computing and packet-based Internet connections to global connectivity, which will be embodied in a worldwide, highly distributed computing infostructure. Connectivity enabled by the infostructure will profoundly change access to content, services, and communications. Consider the explosive growth of the Internet in just the past few years. In May of 1996, the Alta Vista search engine had indexed more than 33 million articles and Web pages. Today, it would take more than five years to read the new listings added each month.2 Today, there are approximately 30 million Internet users. This number is expected to grow to 200 million users by the year 2000. The Internet enables individuals to get connected and stay connected.

There are likely to be three paradigm shifts that result from the infostructure: (1) Everyone will become a technology user because costs will be low enough and compatibility will be high. New software will allow the broader population of users to easily deal with ever more complex systems. (2) Inter-enterprise integration will become pervasive. We already see this in the form of electronic links among suppliers, distributors, and customers. (3) We will process and transport bits, instead of things and people; information will displace the physical. Working this way will be faster and less costly, as well as less harmful to the environment.

Technology enables the transmission of information. But fundamentally, the critical process is people interacting with other people. Technology enables us to develop a much more participatory and collaborative society.

The societal implications of participation and collaboration could be immensely powerful. Drawing on research in collaborative learning, we know that there are significant cognitive and non-cognitive effects of collaboration. For example, delivery of education through a collaborative, computer-mediated environment alters the relationship of the instructor, the students, and the course content. The many-to-many, asynchronous nature of the medium democratizes access and encourages student input.3

The basis of the non-cognitive benefits appears to be that cooperative learning is a social method in which learners work together as equals to accomplish something of importance to all of them.4 There are positive effects on students' self-esteem, social relations, cross-cultural relationships, and attitudes.5 Retention is improved.6 If the power of technology can be harnessed to bring such benefits to society as a whole, it may engender a new era.

Transformations

Technology is a transformation agent. Through technology, higher education can enhance information access that can be used to develop better relationships. Alternatively, institutions can enhance their competitiveness by finding "relationships" in the information they collect. Information empowerment, getting connected, and the apparent compression of time will impact individuals, institutions, and society. Our analysis is that higher education will be transformed by several factors, one of which is the influence of consumers.

Consumers

One of the most significant transformations resulting from these technological changes will be in the consumer market. In many ways, networked consumers are beginning to drive information technology.

First, the consumer market is enormous. By the year 2000 there may be more than 400 personal computers for every thousand people in the seven major industrialized countries. At present, 37 percent of U.S. households and 26 percent of households worldwide have PCs. To put this into perspective, 26 percent of all PC shipments go to consumers.

It is not only PCs; it is access to the global network. Eleven million homes have Internet access; predictions are that forty million will have it in five years. We will see similar rates of growth outside of the United States; other countries are about two years behind us. Projections are for a billion wired consumers by 2010.

Consumers are influential. The trend is for individuals to purchase the latest technology. As a testimony to their influence, recall that it was the consumer market that drove the acceptance of the Pentium and Windows 95, not the business or the academic community. Increasingly, the de facto architectural standards will be determined by consumer adoption. The result is that consumers will influence higher education, both directly and indirectly.

First, as technology becomes more prevalent among consumers, entering students will expect technology to be part of their environment. We are already seeing students behave as more astute consumers. For example, institutions that have adopted policies requiring computers for all students have seen their applicant pools double or triple; their faculty report that students work harder and learn more; and their students and employers believe graduates are better prepared for careers.7

This consumer movement also implies that students entering colleges and universities will have grown up on computers and the network. Sometimes called "Generation Y," these children will soon be on campus. As the first generation to take the Internet for granted, Generation Y's very orientation in space and time will be different from that of its predecessors. Some are growing up with online pen pals in Europe or Asia. Far more than today, their world will be global, connected, and around-the-clock.8 Generation Y views computers and the network as basic equipment, no more puzzling or remarkable than a refrigerator. Are our campuses, our personnel, or our processes ready?

As consumers acquire and use technology more extensively, they will use it to shop online, make payments, seek advice, and eliminate wasted time and travel. With a population comfortable doing comparison shopping over the network, how will students and their parents "shop" for a college or university? The explosion of Web pages for colleges and universities is an early indicator of this trend. But the use of the network does not stop with looking for information. Students are able to apply to college via the network, order books, get course packs, and pay tuition via the Web. Why wait until college? Why not begin preparing for college by seeking advice from prospective institutions in the eighth or ninth grade?

Distributed instruction, the explosive growth of networks, and the trend to move bits instead of people and things will continue to erode the geographic hegemony of higher education institutions. One potential impact will be on the competition for students. Students will be increasingly likely to select educational institutions based on offerings, convenience, and price rather than on geography. This competition will not be limited to the United States or North America; it will be global.

Getting connected

Another impact may be on the sharing of courses and instructional content. As common formats emerge and network bandwidth and rights management improve, institutions with educational content are increasingly likely to share courses and content. The sharing of authentic information brings students closer to the level of scholarship that faculty experience. Working with authentic material, coupled with learning the "way of thinking" of a particular scholarly community, allows students to enhance their learning.

The ability of students to connect with experts around the world, as well as with their peers, opens new opportunities for learning and enrichment. Most students and faculty find these opportunities motivating. In addition to the uniqueness of the experience, contact with other cultures and with individuals from the workplace tends to broaden cross-cultural awareness and fosters an appreciation of the real-world, complex issues with which students will wrestle upon graduation.

Expanding options

In addition, communication, computing, and networking technologies expand the reach and range of traditional residential colleges and universities, enabling students to blend on-campus and online experiences. Some learners seek a mixture of face-to-face experiences and network-based education. For example, the on-campus student who wants a more individualized, self-paced, self-directed learning experience can achieve it through technology.

With a goal of reducing the time to degree, students may choose to complete courses in residence while simultaneously fulfilling other graduation requirements online. The network expands options for interaction among faculty and students. External experts are more easily accessed, and opportunities for faculty to individualize and personalize contact with students are increased.9

Risk to the middleman

Information technology places pressure on the "middleman." Computer networks offer the possibility of the consumer accessing services and information directly rather than going through an intermediary. We have already seen these pressures in business. A common thread among automated teller machines, travel information, and online stock transactions is that the network makes the "middleman" potentially superfluous.

One of the challenges to higher education will be to identify those "transactions" where humans are merely "in the middle" as opposed to those in which they add value. For example, some higher education institutions are finding that a large percentage (up to 60 percent) of student inquiries for information (e.g., financial aid, overdue parking tickets, etc.) can be handled by an information system; only a modest percentage requires human intervention. "One-stop shopping" service centers allow students to get the information they need through a convenient, customer-oriented approach. Institutions using such philosophies are discovering that the result is a reduction in the amount of time students need to conduct business transactions as well as an improvement in customer service. In addition, staff time can be refocused on higher-value activities such as establishing a long-term financial plan for the costs of a student's education or determining how to incorporate an internship into a student's program of study.
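
A first approximation of such a service center is a simple triage rule: route routine informational requests to the self-service system and escalate only those that need judgment. The sketch below is hypothetical; the inquiry categories and the escalation rule are illustrative, not drawn from any institution's actual system.

    # Hypothetical triage for student inquiries: self-service handles
    # routine lookups; humans handle requests that need judgment.
    SELF_SERVICE_TOPICS = {
        "financial_aid_status", "parking_tickets", "account_balance",
        "registration_deadlines",
    }

    def route_inquiry(topic):
        """Return the channel that should handle an inquiry on `topic`."""
        if topic in SELF_SERVICE_TOPICS:
            return "information system"   # the routine majority
        return "staff member"             # the cases where a human adds value

    for topic in ("parking_tickets", "financial_planning"):
        print(topic, "->", route_inquiry(topic))

Even this crude rule captures the point: the routine majority of transactions never needs a person in the middle, freeing staff for the higher-value advising work named above.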

Enables rethinking

The final point to remember about information technology as a transformation agent is that it is a critical enabler for reengineering. Many institutions are finding ways to enhance efficiency, reduce cost, and apply the freed human and financial resources to the core activities of higher education. A common approach is to reengineer administrative processes, not only to reduce bureaucracy but also to identify cost savings that can be applied to instruction or research.

New ways of conceptualizing a process can reduce time and expense. For example, consider a typical travel-expense reimbursement process with an average cycle time of three weeks. Problems within the process include mathematical calculation errors, currency conversion problems, missing signatures, travel-expense coding errors, and incorrect routing. Analysis of the travel-expense reimbursement process often reveals a high degree of "non-value-added" activity -- steps in the process that have no value in the eyes of the customer. This process can be redesigned by eliminating the steps that contribute no value, and by introducing new technology and work policies to expedite the process. The redesigned travel-expense reimbursement process reduces cycle time from three weeks to three days, reduces errors, eliminates unnecessary reviews and approvals, and places the money directly in the employee's account.10
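
To make the redesign concrete, here is a hedged sketch of the automated validation step such a process implies: checking the arithmetic, currency, and coding errors listed above at submission time, before any human sees the claim. The expense codes and exchange rates are invented placeholders.

    # Hypothetical validation of a travel-expense claim at submission time.
    VALID_EXPENSE_CODES = {"AIR", "HOTEL", "MEALS", "GROUND"}     # placeholders
    RATES_TO_USD = {"USD": 1.00, "GBP": 1.60, "DEM": 0.60}        # illustrative

    def validate_claim(line_items, claimed_total_usd):
        """Return a list of problems; an empty list means the claim can go
        straight to direct deposit with no manual review."""
        problems, computed_total = [], 0.0
        for code, amount, currency in line_items:
            if code not in VALID_EXPENSE_CODES:
                problems.append(f"unknown expense code: {code}")
            if currency not in RATES_TO_USD:
                problems.append(f"unknown currency: {currency}")
                continue
            computed_total += amount * RATES_TO_USD[currency]
        if abs(computed_total - claimed_total_usd) > 0.01:
            problems.append(f"math error: computed {computed_total:.2f} USD, "
                            f"claimed {claimed_total_usd:.2f}")
        return problems

    # A clean claim: 450.00 USD airfare plus 200.00 GBP hotel = 770.00 USD.
    print(validate_claim([("AIR", 450.00, "USD"), ("HOTEL", 200.00, "GBP")], 770.00))

Every check the function performs, including currency conversion, math verification, and code validation, replaces a manual review step that contributed to the three-week cycle time.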

Change is imminent

Skeptics still question whether technology will have an impact on higher education. Even though many are calling for the transformation of higher education, an equal number see no reason to change. These "traditionalists" often cite how stable higher education has been for hundreds of years. Since the Gutenberg Bible was printed in 1456 using movable type, the technology of information storage, retrieval, and transmission -- the university�s basic technology -- has remained essentially constant until the current era. Indeed, the use of written records to supplement oral teaching goes back to the fifth century B.C. Since their inception, universities and colleges have relied upon lectures, discussions, and the written word because these were the only technologies available.11

Information technology has opened new, fundamentally different options for higher education, both in how to run "the business of higher education" as well as in teaching and learning. History demonstrates that fundamental technological change ultimately begets significant structural change, regardless of whether the affected participants choose to join or resist the movement. The changes that universities have weathered over the centuries did not upend their basic technology. Information technology does.12

Within the next decade, information technology and its effects as a transformation agent will have a dramatic impact on our lifestyles and work styles. Technology will become ubiquitous. Its presence and power will be taken for granted. We will have an increasing capacity to enhance a variety of relationships because of the meaning we derive from information. We will transform data into information and knowledge; knowledge will lead to insight. Our world will be transformed by information technology -- the insight we gain will lead to a world of enormous opportunity.


Sidebar:

Storage

The capacity of data storage devices is growing by 60 percent each year, and data access rates are increasing dramatically. Digital magnetic storage is already cheaper than paper and will continue to be the dominant medium for the storage of active data. The trend is toward smaller, less costly, more rugged disks, with more bits per unit area. For example, in 1995 IBM researchers demonstrated a new world record in magnetic data storage density -- 3 billion bits per square inch -- using advanced magnetoresistive recording heads. At this density, the text of 375 average-sized novels could be stored in a single square inch of disk surface. CD-ROMs are rapidly becoming the preferred storage and publishing medium for text, images, full-motion video, electronic catalogues, games, and software. Current prototypes of multilayer optical disks -- high-density CDs with ten layers -- can store 6 billion bytes of information, equivalent to more than one million pages of text. In addition, small form factors will enable CD-ROMs to be incorporated in mobile computers.
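
The novels-per-square-inch figure can be checked with back-of-the-envelope arithmetic, assuming roughly one megabyte of text per average novel (the assumption the comparison implies):

    # Checking the sidebar's density arithmetic.
    bits_per_sq_inch = 3_000_000_000            # 3 billion bits per sq. inch
    bytes_per_sq_inch = bits_per_sq_inch / 8    # 375 million bytes
    novel_bytes = 1_000_000                     # assume ~1 MB per average novel

    print(bytes_per_sq_inch / novel_bytes)      # -> 375.0 novels per sq. inch

The same arithmetic supports the optical-disk claim: 6 billion bytes at a few kilobytes of text per page does come to more than a million pages.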

Microprocessor Performance

There will be continued rapid growth in microprocessor performance, which is expected to double every eighteen months. Clock rates will continue to move ahead rapidly, exceeding 1 GHz -- on the order of a billion instructions per second -- in the next decade. By the year 2000, a typical client microprocessor will have the capability of today's large servers. This performance improvement is enabled by advances in technology such as continued linewidth reduction, increased chip density, and architectures such as Very Long Instruction Word (VLIW).

Application Development

In application development, framework-based composition will replace monolithic design. This trend is driven by object technology, which facilitates the design of application frameworks and reusable parts. The availability of domain-specific parts will enable faster and less costly development of new applications, and this will be done by domain experts, instead of programming experts.

Wireless Communications

Wireless communications and increasing bandwidth will change the landscape, enabling transparent access across all networks. Things will work together without being wired together. Personal area networks (PANs) will be pervasive. With a bandwidth of about 10 Mbps, these transceivers will be built into most products. The only "wire" required for a personal workstation will be the electrical cord. Local area networks, or LANs, will be ubiquitous, with integrated voice and data. Radio frequency will be used within individual buildings or small campuses, and infrared will be used within rooms or other enclosed spaces. Wide area networks (WANs) will provide worldwide universal access through technologies such as Cellular Digital Packet Data, satellite networks, and two-way paging. Communication device form factors, i.e., the size of the "computer," will decrease, while function increases, resulting in wearable computers such as wrist watches that can receive messages and send an acknowledgment.

Networking

Asynchronous transfer mode (ATM) will be the predominant networking technology, enabled by standard interfaces. Its use will be driven primarily by multimedia applications, which require high bandwidth to the desktop for digital video. It will also integrate voice and data over the same network. The cost of ATM adaptors is expected to drop, making them cost effective for office and campus use. High-speed ATM to the desktop will transform computing, since for many applications, it will no longer matter where data and the computing resources are located.

Display Technology

Display technology will change significantly within the next decade. Liquid crystal displays (LCDs) will become dominant across a wide range of sizes and will displace the CRT for most applications. Thin-film-transistor LCDs will allow tremendous power savings over CRTs. Projection technology will allow very large, very high resolution displays, reaching sizes of nearly 50 inches in diagonal and exceeding eight million picture elements. At the other end of the spectrum, very small, very high resolution displays will be used in personal viewing systems, such as those used today in immersive virtual reality applications. In addition, we will see the emergence of paper-like displays that will be reflective, very high in contrast and resolution, and low in power consumption.

Appliance Phase

By 2000 or 2005, the network will have changed from a connectivity and transport mechanism to a destination in its own right. Software and some content will become embedded in the network instead of only in the connected end points.


Endnotes:

1 Dee Hock, "The Birth of the Chaordic Century: Out of Control and Into Order" (paper presented at the Extension National Leadership Conference, Washington, DC, 11 March 1996).

2 Marshall Van Alstyne, "Applying a Theory of Information and Technology to Higher Education" (paper presented at the Stanford Forum for Higher Education, 16-18 October 1996).

3 Linda Harasim, "Collaborating in Cyberspace: Using Computer Conferences as a Group Learning Environment," Interactive Learning Environments 3(2): 119-130.

4 Robert Slavin, Cooperative Learning: Theory, Research and Practice (Boston: Allyn and Bacon Publishers, 1990).

5 Mary Hamm and Dennis Adams, The Collaborative Dimensions of Learning (Norwood, N.J.: Ablex Publishing Corporation, 1992).

6 Dennis Adams, Helen Carlson, and Mary Hamm, Cooperative Learning in Educational Media: Collaborating with Technology and Each Other (Englewood Cliffs, N.J.: Educational Technology Publications, 1990).

7 Diana Oblinger, Mark Resmer, and James Mingle, "Student Mobile Computing," in The Future Compatible Campus, Diana Oblinger and Sean Rush, eds. (Bolton, Mass.: Anker Publishing, 1998).

8 Ellen Graham, "Generation Y: When the Terrible Twos Hit Their Terrible Teenage Years," Wall Street Journal, 5 February 1997.

9 Diana Oblinger, "High Tech Takes the High Road: New Players in Higher Education," Educational Record 78 (Winter 1997): 30-37.

10 Kris Hafner and Diana Oblinger, "Transforming the Academy," in The Future Compatible Campus, Diana Oblinger and Sean Rush, eds. (Bolton, Mass.: Anker Publishing, 1998).

11 William Massy, "Life on the Wired Campus: How Information Technology Will Shape Institutional Futures," in The Learning Revolution, Diana Oblinger and Sean Rush, eds. (Bolton, Mass.: Anker Publishing, 1997).

12 Ibid.


Charlie Tuller ([email protected]) is Technical Advisor, IBM.

Diana Oblinger ([email protected]) is Manager, Academic Programs and Strategy, IBM.
