Copyright 1998 CAUSE. From CAUSE/EFFECT Volume 21, Number 1, 1998, pp. 24-27, 32-34, 47. Permission to copy or disseminate all or part of this material is granted provided that the copies are not made or distributed for commercial advantage, the CAUSE copyright and its date appear, and notice is given that copying is by permission of CAUSE, the association for managing and using information resources in higher education. To disseminate otherwise, or to republish, requires written permission. For further information, contact Jim Roche at CAUSE, 4840 Pearl East Circle, Suite 302E, Boulder, CO 80301 USA; 303-939-0308; e-mail: [email protected]

The Technical Realities of Virtual Learning: An Overview for the Non-Technologist

by Kenneth J. Klingenstein

In our enthusiastic embrace of distance education and virtual learning, it is easy to overlook the technological infrastructure necessary to support the effective delivery of services in such environments. This article explores the set of technological issues that underpin the future development of virtual education.

Almost all pedagogy conceived under the rubric of virtual learning seeks to leverage information technology (IT) and so requires the implementation of powerful and readily accessible computer and networking systems. What are the technical and financial implications of upgrading network infrastructure to support virtual learning environments? How will we support advanced applications using such technologies as virtual reality and streaming video? The production of distance learning materials is currently quite expensive, in terms of both courseware development costs and faculty time. What are the prospects for lowering these cost barriers? Will administrative systems provide the tools for access and customization to handle large numbers of remote students? Can we establish a common "middleware" infrastructure on and between our campuses (for example, standards-based authentication, digital signatures, electronic authorization, directory services) necessary for collaboration and resource sharing?

First Questions

At the outset, it is important to sort through the clutter of terms such as distance education, virtual learning environments, and online instruction. For the purposes of the analysis in this article, we consider the broad range of IT-leveraged educational activities, regardless of location of the student (on or off campus) and instructor. IT-leveraged learning involves more than just delivering content. Other factors need to be considered, particularly providing multiple modalities for learning and creating communities of learners. People respond to different kinds of learning stimuli. We need to address the characteristics of asynchronous learning, issues about delivery mechanisms, and distance-insensitive motivational approaches (i.e., how we get people to respond to distance learning the way they do to in-class education). While the delivery of content is an important component of virtual learning, the delivery of the atmosphere, the nurturing of inquiry, and the building of a community of learners are equally critical. We need to ask not only how technology can help deliver content in a virtual learning environment, but if and how it can provide the broader environment that is necessary for effective learning.

As university employees we tend to be institution-centric and to forget other points of view. It is important to remember that in the reeducation markets that will be major drivers for distance education, customers (students) may affiliate with multiple institutions. This significantly raises the importance of interoperability and standards between and among institutions. Beyond common tools, we education providers will also need common practices, using the tools in standard ways. For example, students' limited time to learn the tools (separate from the content) will make conservation of interface a design criterion for learningware. The points of view of the "supply side" -- faculty and teachers -- will need to be understood as well. Teachers will want to use modules and instructional objects from a variety of sources to build their courses. This suggests standards at the educational-object level, along with creation and indexing tools that manipulate such objects. To some degree, teachers will move from being composers to being conductors, assembling materials and motivating students more than writing new scores from scratch. While this disaggregation of education allows consumers to pick a variety of learning experiences and customize course timing, content, and interface, it also makes packaging, continuity, and assessment difficult. It is important that higher education institutions preserve their critical role in the programmatic sequencing of courses and in assessment, even as their other "middleman" role of overall educational broker is reduced.

Living with Technology

A key challenge in leveraging technology to support a virtual learning environment is the volatility of emerging technologies. Some relevant technologies that are needed for an effective infrastructure have not yet reached maturity (for example, see the discussion below on authentication, authorization, encryption, and other security technologies). On the other hand, even wise investments in relatively mature technologies have at best limited life spans in a world where performance doubles every eighteen months.

We are also currently working with some seriously "broken" technologies. For example, hypertext markup language (HTML) is the simplified stepchild of the Standard Generalized Markup Language (SGML). It was an experiment gone wild, put out on the Internet on a trial basis; yet within a matter of months it became a global de facto standard. Now, retrofitting the more powerful SGML into the Internet is made difficult by the embedded HTML base. This is just one example of a technology that became established before it was refined.

There are a couple of technological axioms to keep in mind as we sort all this out. Interoperability -- open standards that allow a variety of creative entities (both academic and corporate) to build separate components that work together and leverage each other -- is perhaps the most important principle underlying the rapid development of technology. At the same time, it makes the standards processes themselves byzantine and occasionally inconclusive. Indeed, as networking has moved from an academic activity to a major industry, collegial standards processes have been replaced by competitive forums that often do not lead to a single consensus. (There is a well-known saying that technology standards are so useful that we should have a lot of them.) Managing complexity is the greatest challenge we face as the technology builds on itself in this layered fashion. And scaling is an eternal consideration: technologies must be able to accommodate not only orders-of-magnitude increases in usage, but orders-of-magnitude differences in the variety and performance characteristics of the environments in which users apply them.

While higher education was the wellspring in the rise of computing and networking, for the most part the torch now has been passed to the commercial sector. It remains for us to focus on the factors that make us different from the corporate world, and concentrate our energies there. For example, unlike corporate workers, who tend to stay at a single computer all day, many of our workers (students) will work at several different computers during the course of their day. This creates a mobility requirement for services such as authentication and customization that we in higher education will likely need to address ourselves. Similarly, our directory services requirements, as public institutions, have aspects that differ from the corporate sector, particularly related to the Family Education Rights and Privacy Act (FERPA), and will need somewhat distinct engineering.

A Taxonomy of Infrastructure Technologies

As we sort through the technological issues in building effective widespread virtual learning environments, it is useful to categorize technologies into four groups:

Delivery systems and basic instructional tools -- technologies such as on-campus and residential networking, the Internet, video servers, e-mail, and the Web.

Middleware -- the layer of services, such as authentication, authorization, and directories, that turns raw network capacity into useful services.

Creationware -- advanced instructional environments, such as virtual realities and multimedia courseware, and the authoring tools that produce them.

Administrative systems -- the systems that manage student access and the curricular objects and metadata of virtual education.

Some of these areas, such as delivery systems, are farther along than others, such as creationware, but all are still relatively immature. Even technologies we consider proven today, such as electronic mail, may work well in their current mode, but scaling and "industrial strength" implementations are unproven. For example, many of us are finding that e-mail, once a critical productivity tool, has become almost overwhelming in volume. That problem is exacerbated in virtual learning environments; several online universities have reduced their student-to-faculty ratio expectations as teachers reported their inability to respond in a timely fashion to student e-mail correspondence. Industrial-strength e-mail, with encryption, digital signatures, and standard means for attachments, is also still not commonplace in the online world.

Delivery systems and basic instructional tools

Two areas to consider here are physical delivery options and the communications and information tools available with those options. It should be noted that the people to support such services are a critical component as well; other CAUSE/EFFECT articles and a CAUSE professional paper have examined the crisis in support services.1

Physical delivery systems

In general, higher-speed lines and frame relay services have made delivering virtual education to businesses and schools easier, but delivery to the home is still a major challenge. An emerging family of telephony extensions known as digital subscriber line (DSL) technologies (ADSL being the most promising) can offer high-speed access to the home, but much as with the earlier ISDN approach, these digital subscriber services are limited to homes fairly close to central telephone offices. (Note that both ADSL and cable technologies are deployed with asymmetric bandwidth, so that home users have little upstream bandwidth compared to the into-the-house flow. Whether this is an impediment to virtual learning depends on the tools in use; streaming video into a house would be viable, but video conferencing between a house and a remote instructor needs symmetric bandwidth.) Satellite technology has proven effective in a one-to-many broadcast environment, that is, reaching great numbers of learners from single points of dissemination. There are low-earth-orbit technologies that promise to enable the delivery of two-way Internet services (including video) to rural environments, but they are not moving along as rapidly as was hoped.

What we see today in the Internet is pretty much what will be available for the next year or two. The Internet2 project promises potential solutions to the challenge of enabling new and innovative network applications, especially those that require high bandwidth or other committed transmission characteristics. The biggest advance promised by Internet2 over the current Internet is quality of service (QoS), that is, the ability to send some packets with a guaranteed level of service. In some sense, the current Internet is about connectivity, while Internet2 is about differentiation. (It should be noted that another of the stated goals for Internet2 is a scalable multicast approach, which is necessary to permit viable ad hoc one-to-many video communication and distance education applications.) The technical challenges here are considerable, and solutions will not reach broadly into higher education for some time.

Dedicated video links are expensive and do not scale well; they are an interim component of the distance learning technology infrastructure. While economics will continue to justify one-to-many video via satellite, most multicast video services will ultimately be carried on top of conventional data networks.

CD-ROMs are another alternative for delivery of content, and the cheapest medium for bulk distribution. They can be used in conjunction with the Internet, making it easy to incorporate hot links and to deliver updates online. They should not be overlooked in the move to fancier delivery schemes.

As we get deeper into the provision of virtual education, we will develop good practices around the servers that house and distribute content. For example, we may situate information servers outside internal firewalls to permit distance education, or within our Internet protocol (IP) address space to satisfy database licensing terms. Video servers (with their exceptional network bandwidth requirements) that are intended to serve external communities may well be located at gigaPoP sites or other external locations, so that their traffic does not congest our own Internet links.

Information and communication tools

Electronic mail is already in widespread use. While e-mail delivery mechanisms will improve somewhat, this medium is relatively mature (technologically, if not sociologically). Better security and multimedia for e-mail will finally become viable in the next year or two.

Two other communications tools, Net News and IRC (also known as "chat"), can create an electronic agora. While we have improved these tools over the years, they still have limited filtering and archival mechanisms and very little structure. They serve some role as general discourse tools, especially in asynchronous classes, but cannot deliver content well.

In terms of information tools, the Web is obviously powerful. Java applets offer particular promise, especially to facilitate simulations, but this technology is not without challenges. For example, for a short time one could be confident that a program written in Java would be executable in every Web "browser," but Java is now fragmented and less a standard. Other issues, such as applet authentication, are also unresolved.

The University of Colorado (CU) is one of several campuses experimenting with desktop video across the Internet, commonly referred to as "video over IP." Our early experience indicates that it is truly a "killer" application (both in its appeal and its impact on the network). It's not just that it's video; it's that it is video to a computer rather than to a television, fundamentally changing the use of video because of the appliance it is plugging into. Today we can cobble together off-the-shelf technologies to run coordinated conferencing -- with video, voice, and data -- across IP networks (albeit lightly loaded networks). In one current CU journalism class being delivered to a computer lab one hundred miles away across the Continental Divide, the student computers display a real-time video of the instructor. When she opens a Web page, it simultaneously opens on the students' browsers. In yet another part of the screen (large screens are very helpful in virtual education), the instructor can play a video tape through a VCR connected to her computer; the students see the video and hear her voice-over describing salient aspects of the scene. This technology may be costly today, in terms of equipment and network capacity, but these costs will drop considerably in the future.


Middleware

Middleware is a term for the evolving set of software tools needed to turn raw capacities into useful services. Today, these areas include network and individual security, customization (for example, bringing your folder of browser bookmarks, e-mail aliases, and other preferences to your current location), and access to personal files. In the future, new applications such as calendaring and video may add new middleware requirements.


Authentication

Currently, the single most important middleware gap is in the realm of electronic identification -- proving that individuals are who they claim to be, either by something they know (for example, a password), something they have (for example, a smart ID card), or something they are (the new field of biometric authentication). To enable a virtual learning environment, two needs are immediate: a campuswide authentication scheme for students of an institution, and interoperability of such schemes between institutions.

We need an effective campuswide authentication scheme for two reasons: (1) to allow students to access site-licensed materials, such as databases, digital libraries, or local courseware, and (2) to allow students to use administrative support services, such as online forms and queries of official records. Current instructional software controls, such as IP-address checks, are inadequate, and current administrative support services use inherently insecure access schemes.

Institutions will form consortia to purchase instructional materials in bulk; the vendors of those materials want to have a common way to grant the consortium privileges. Individuals will affiliate with multiple institutions and want a common means to confirm those separate relationships.

There are several candidate technologies for authentication, and the limits of each illustrate the complexity of the task.

Kerberos was developed at MIT some fifteen years ago as a mechanism for authenticating users without sending passwords in the clear across the network. Its simplicity is both a strength and a weakness: it helps in deployment and low-cost operation, yet Kerberos does not provide interoperability between authentication authorities, e-mail authentication, or other advanced features.
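
The essential idea behind such shared-secret systems -- proving knowledge of a password without transmitting it -- can be illustrated with a small sketch. The following toy challenge-response exchange is in the spirit of Kerberos-style authentication, not the actual Kerberos protocol; all names and secrets here are invented for illustration.

```python
# Toy challenge-response authentication: the secret never crosses the wire.
# This is an illustrative sketch only, NOT the Kerberos protocol.
import hashlib
import hmac
import os

# Server-side table of shared secrets (hypothetical user and secret).
SHARED_SECRETS = {"student42": b"correct horse battery staple"}

def make_challenge():
    """Server issues a fresh random nonce for each login attempt."""
    return os.urandom(16)

def client_response(secret, challenge):
    """Client proves knowledge of the secret by keying a MAC over the
    challenge; only this MAC, not the secret, is sent to the server."""
    return hmac.new(secret, challenge, hashlib.sha256).hexdigest()

def server_verify(user, challenge, response):
    """Server recomputes the expected MAC and compares in constant time."""
    expected = hmac.new(SHARED_SECRETS[user], challenge,
                        hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, response)
```

An eavesdropper who captures the challenge and response learns nothing reusable, since the next login uses a fresh challenge; real systems such as Kerberos add ticket granting, expiration, and mutual authentication on top of this basic idea.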

X.509 is a set of protocols that permits authentication through digital signatures and encryption of e-mail. Unlike Kerberos, this technology has a hierarchical structure that will permit an authenticated exchange of authentication credentials among disparate authorities. However, it is not a low-cost technology, either in deployment or operation. Moreover, the current generation of X.509 implementations depends on a user's credentials being stored on the hard drive of the local machine. This model works for the stationary workers of a typical corporate setting, but does not accommodate the roving student-user on a campus. (This is an example of one of the uniquely higher education challenges mentioned above, where industry may not readily develop academically appropriate solutions.)

Smart card technology offers a long-term promise through the use of computer chips that can embed digital credentials in credit cards. However, it is unlikely that home computers, the platform for many distant learners, will soon have readers for smart cards.

It is likely that different universities will choose different alternatives, based upon their embedded technology base and the urgency of their need to deploy authentication and other security schemes. The issue is less which option an institution chooses and more that it does choose an institution-wide authentication approach, with a plan to interoperate with other institutions' approaches. Institutions should follow activities such as the CNI authentication project2 to stay abreast of developments in this area.


Authorization

A key follow-on to authentication, authorization refers to the provision of a set of attributes and characteristics to authenticated individuals to permit certain electronic interactions by those individuals. Indeed, electronic authorization is where the real payoffs exist. In some sense, authorization extends the Access Control Lists (ACLs) of computing lore into a general set of permissions and restrictions that govern the kinds of actions that an individual can take. Authorization will determine who can modify what data, what path workflow documents will take for approvals, and which students can access particular reserved materials.

Authorization is a very difficult challenge for several reasons. First, the enabling technologies are not evident at this point. Generally each application has its own internal authorization mechanisms, and they do not interoperate with other applications. Second, the maintenance of permissions on an individual basis is extremely time-consuming; it would be far better to establish group characteristics and define an individual as the intersection of various groups (and perhaps some individual "negative permissions" to allow for exceptions). For example, a departmental secretary working on a research project would have a set of permissions representing membership in the department, a job classification, and a project code. But a group-based approach, while administratively viable, raises the hardest of issues -- establishing the classifications and associated metadata that define the classifications. In preparation for electronic workflow, one major university spent two years establishing some ninety distinct relationships that an individual could have with the institution, only to discover during implementation that a president emeritus of the university was not in any category.
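
The group-based scheme described above -- an individual's permissions derived from group memberships, with individual negative permissions for exceptions -- can be sketched in a few lines. The group names and permission strings below are hypothetical examples, not any institution's actual classification scheme.

```python
# Sketch of group-based authorization: an individual's effective permissions
# are the union of the grants of the groups to which he or she belongs,
# minus any individually assigned "negative permissions" (exceptions).
# All group and permission names here are invented for illustration.
GROUP_PERMISSIONS = {
    "chemistry-dept": {"read-dept-files", "submit-timesheet"},
    "staff":          {"submit-timesheet", "update-address"},
    "project-1234":   {"read-project-data", "charge-project-account"},
}

def effective_permissions(groups, negative_permissions=frozenset()):
    """Combine group grants, then subtract individual exceptions."""
    granted = set()
    for g in groups:
        granted |= GROUP_PERMISSIONS.get(g, set())
    return granted - set(negative_permissions)

# The departmental secretary from the example above: department membership,
# a job classification, and a project code, with one individual exception.
secretary = effective_permissions(
    ["chemistry-dept", "staff", "project-1234"],
    negative_permissions={"charge-project-account"},
)
```

The sketch makes the administrative appeal clear -- adding a person to a project is one line, not dozens of individual grants -- and also makes the hard part clear: someone must define and maintain the group table itself, which is exactly the classification problem the university in the example above spent two years on.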

Customization and directory services

One of the characteristics of the information age is the concept of "mass customization," that is, developing applications that permit wide variation in preferences and features and that build up valuable personal datasets such as e-mail aliases, bookmarks, and so forth. With the increasing complexity and volume of software, we have come to rely heavily on these personalizations to make the world tractable.

Two further developments in mass customization could have positive effects on virtual learning. We need to learn how to make personalization portable, so that we see the same networked world regardless of location and computer. And, as contradictory as it sounds, customization must become standardized. We will need ways to move our preferences between applications and interoperable tools to manage these personal data. The core technology on which to build these services lies in middleware called directories. Directories are standard repositories for storing these data; protocols such as LDAPv3 can access and manage these data, providing portability and interoperability. We have a fair distance still to go to refine these tools and put them into effective use.
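
The portability that a directory makes possible can be illustrated with a toy sketch. A real deployment would store these entries in an LDAPv3 directory server; in the sketch below an in-memory dictionary stands in for the directory, and the entry names (in the style of LDAP distinguished names) are invented for illustration.

```python
# Toy sketch of directory-backed "portable customization": preferences are
# stored centrally, keyed by the user's directory entry, so the same
# personalization appears at any machine the user logs in from.
# A dictionary stands in here for a real LDAPv3 directory server.
DIRECTORY = {}

def save_preferences(user_dn, prefs):
    """Write (or update) the user's preference attributes in the directory."""
    DIRECTORY.setdefault(user_dn, {}).update(prefs)

def load_preferences(user_dn):
    """Fetch preferences at login time, from any machine, on or off campus."""
    return dict(DIRECTORY.get(user_dn, {}))

# A student sets bookmarks and a mail alias at a campus lab machine...
save_preferences(
    "uid=student42,ou=people,dc=example,dc=edu",
    {"bookmarks": ["http://www.cause.org/"], "mail_alias": "s42"},
)

# ...and later, from a home computer, sees exactly the same environment.
prefs = load_preferences("uid=student42,ou=people,dc=example,dc=edu")
```

The second requirement named above, standardized customization, is what lets two different applications (say, two browsers, or a browser and a courseware package) read the same bookmark attribute rather than each keeping its own incompatible copy.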


Creationware and advanced instructional environments

Beyond the basic information and communication tools described above, some virtual learning is based on advanced instructional environments such as virtual realities and multimedia courseware. Creation technologies for these systems, be they Web-development tools, Java applets, or multimedia authoring systems, are still immature, both as tools and in the marketplace. There is little interoperability and much volatility in key pieces of the technology, and the management mechanisms are limited.

The primary tools for Web development are a mix of homebrew systems and proprietary packages. Most take the user fairly far but leave some magic to local systems administrators, and they lack the scalability necessary for the large volumes of volatile online information to come. The current lack of standards for instructional objects, coupled with still-complex developer interfaces, makes multimedia authoring a difficult task that locks a developer into a proprietary environment. Java applets show much promise, but the recent destandardization of Java, along with some nagging technical problems about security and categorization, is of concern.

The target courses for virtual learning are not clear. On one hand, it is well known that a relative handful of core academic courses (for example, Basic Chemistry, English 101, American History) account for the bulk of college education credits; there is clearly the highest payoff (academic and economic) in using virtual tools to support these large lecture classes. Yet some of these courses are, by their nature, poor candidates for virtual education. Another driver for virtual education is the need for technical classes that help corporate workers stay current. Yet the very volatility that drives the need for reeducation makes investment in such course development risky. A faculty member is not going to make a huge commitment to putting a course online if 50 percent of that course is going to erode in a year.

While not strictly a "technical" issue, the current realities of intellectual property greatly compound the complexity of creating educational systems. Few professors understand the ongoing changes in copyright law and their consequences for the display of materials; the resulting permission processes are byzantine. Technical solutions to copyright are not being pursued as aggressively as containment. On the other side, few faculty understand their ownership rights and limitations, especially with regard to institutional rules governing the electronic materials they create.

Administrative support: systems upgrades and metadata creation

Two technical areas will be essential to the effective administration of virtual learning: opening up our current administrative systems (especially student information systems) to student access regardless of location, and developing new systems to administer and manage the instructional objects of virtual education.

Many institutions have started to extend access to core administrative systems, generally through Web interfaces. These efforts will need to include authenticated updates and transactions as well as the inquiry mode usually deployed. The prospect of opening up institutional financial systems to student users, in a manner consistent with that student's access to academic resources, requires strong technologies and partnerships with internal auditors as well as faculty.

The greater challenge lies in the development of computing systems to manage the curricular objects and student profiles that represent the components of virtual learning interactions. Curricular objects include the academic modules, their subcomponents (be they MPEG videos, audio streams, simulations, or applets), workflow and homework submission systems, database access tools, and museum reference systems. We need to be able to find such objects, associate objects together, pass data among objects, enable distributed change control, and monitor the history of all these activities. Interacting with all these objects are sets of student users, bearing profiles and permissions to perform actions on the objects. Those actions may include reading an object or submitting homework to it. And in turn, both those objects and those profiles, filled out by educational institutions and users across the world, will need a consistent interpretation of their many variables and parameters. This metadata may well be the toughest area of all; to date, we have neither the tools nor the predilection for such cooperation on meaning.
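
What such a catalog of curricular objects might look like can be suggested with a small sketch. The descriptive fields below follow the Dublin Core style (title, creator, format, subject) mentioned in connection with the IMS work; the field names, the access-metadata field, and the example object are all assumptions made for illustration, not any published standard.

```python
# Illustrative record for a curricular object: Dublin Core-style descriptive
# fields plus a hypothetical field for permitted student actions. All names
# here are invented for the sketch, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class CurricularObject:
    identifier: str
    title: str
    creator: str
    format: str                                         # e.g., "video/mpeg"
    subjects: list = field(default_factory=list)        # topical keywords
    allowed_actions: set = field(default_factory=set)   # e.g., {"read", "submit"}

CATALOG = {}

def register(obj):
    """Add an object to the shared catalog, keyed by its identifier."""
    CATALOG[obj.identifier] = obj

def find_by_subject(subject):
    """Discovery: the 'find such objects' requirement from the text above."""
    return [o for o in CATALOG.values() if subject in o.subjects]

register(CurricularObject(
    "chem101-lab3", "Titration Simulation", "J. Doe",
    "application/x-simulation",
    subjects=["chemistry"], allowed_actions={"read", "submit"},
))
results = find_by_subject("chemistry")
```

The hard problem the text identifies is not writing such a record format but agreeing on its meaning: "chemistry" must mean the same thing to every institution and tool that fills in or searches the subjects field, which is precisely the metadata consensus that efforts like IMS and the Dublin Core pursue.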

One recent development in particular deserves considerable attention. The Instructional Management System (IMS) project3 is attempting to develop a broad range of standards in support of virtual learning. Led by Educom, it is a joint effort of higher education, K-12, and training organizations seeking to enable interoperability among institutions, software, and users. It has already created draft standards and the metadata to characterize both learning materials and users, and has been informed by solid scholarly work in this area, such as the Dublin Core and other digital library research. If it holds its consensus together and continues to implement best-of-breed research, IMS will make a considerable contribution to virtual learning.

Right Level of Investment

There are several key factors for institutions to consider in evaluating the technological investments for virtual learning.

How important is virtual learning to the institution's role and mission? It is clear that not all schools will find it strategically or economically appropriate to pursue virtual learning; indeed, many may be adversely affected by the virtual worlds to come.

What investments should be made? The inventory of technological needs described above is long and costly. Focus and priorities are essential. One rule of thumb is that the pieces that are also germane to the broader academic enterprise, such as authentication and Web/e-mail/video servers, are clear wins. Tolerance for volatility may affect when and how the monies are spent. The leading edge is always more expensive and frequently leaves early adopters with implementations that are inconsistent with final standards.

Where will the funding come from? While virtual learning is often, and perhaps inappropriately, touted as a cost saver, much of the infrastructure is not yet in place at many institutions, so for now it will mean spending dollars more than saving them. Return on investment will not be immediate. It appears that the savings will not be nearly as dramatic as anticipated until we have better tools for interactions between faculty and students that are less consumptive of faculty time. It may also be the case that twenty-five students will always be the optimum number for a "community of learners," regardless of the technology tools available. On the other hand, there is great promise for decreasing costs in areas where human interaction is not required. For example, in the area of student registration, it should be possible to reduce the cost per transaction dramatically.

Next Steps

Despite this litany of technical realities, we should not despair. Virtual education is inexorable, not only for the power and economies that it may afford, but for the changing base of customers and their orientation to online activity.

There are several steps that individual campuses should consider, regardless of their long-term commitment to virtual learning. As mentioned above, there are a number of basic investments in networking, campuswide authentication, and administrative system interfaces that should be considered now. Robust desktop computers on campus, and standardization of the software on those desktops, are obvious needs. Beyond these concrete steps, there is the large cultural education that is required to raise the pedagogical issues within the academy, to initiate the discussions and evaluations that will provide incentives for the faculty to build the grist of the next generation of learningware.

On the national level, we need to move towards interoperability and support the continued development of tools. Initiatives such as the CNI authentication project and the IMS project are important efforts. Federal agencies need to promote the core technologies, assessment approaches, and intellectual property structures that are still needed to move us closer to the promise of virtual learning.

The last twenty years have been a breathtaking ride on the beasts of technology, lurching fitfully towards an uncertain future in our life and our learning processes. As we conquer the technical challenges that lie ahead, the ride will only accelerate.


1 See Polley A. McClure, John W. Smith, and Toby Sitko, The Crisis in Information Technology Support: Has Our Current Model Reached Its Limit?, CAUSE Professional Paper #16 (Boulder, Colo.: CAUSE, 1997), and J. Michael Yohe, "Information Technology Support Services: Crisis or Opportunity?" CAUSE/EFFECT, Fall 1996, 6-13.


2 The Coalition for Networked Information has developed a white paper on authentication issues.


3 See the Educom-led Instructional Management System project.


Ken Klingenstein ([email protected]) is director of Information Technology Services at the University of Colorado, Boulder.
