Glimpses of Our IT Future: What's Green, Mobile, and Regulated All Over?

No, it's not a corny joke your six-year-old might tell. "What's green, mobile, and regulated all over?" is a fairly succinct description of today's evolving IT environment. More than ever, we find ourselves dealing with a new interpretation of Moore's Law. As technical capabilities continue to double regularly, energy, economic, and regulatory constraints on those capabilities are increasing exponentially as well. The 2008 EDUCAUSE Evolving Technologies Committee attempted to capture a glimpse of the collective future for IT professionals in higher education, anticipating not only the evolving technologies but also ways to address the ever-evolving constraints placed on our ability to provide the technologies that our institutions have come to expect.

Each year, the members of the EDUCAUSE Evolving Technologies Committee identify and research the evolving technologies that are having the most direct impact on higher education institutions. The committee members choose the relevant topics, write white papers, and present their findings at the EDUCAUSE annual conference.

2008 EDUCAUSE Evolving Technologies Committee

Beth Forrest Warner, Committee Chair
Assistant Vice Provost, Information Services
University of Kansas

Kelvin William Bentley
Director of Online Learning
Northampton Community College

A. Michael Berman
Senior Vice President/Chief Technology Officer
Art Center College of Design

Sharon Collins
Project Manager
East Carolina University

Anthony D. Conto
Office of Information Technology
University of Maryland

Susan M. Lewis
Special Assistant to the Dean
University of Illinois at Urbana-Champaign

John W. McGuthry
CIO
Armstrong Atlantic State University

Mary Christine McMahon
Manager, Research Application Development
Saint Louis University

Tina Meier
Director, Server Administration
Oklahoma State University

John S. Moses
Director, Technology Planning, Biological Sciences
University of Chicago

This year, under the leadership of Committee Chair Beth Forrest Warner, the committee selected five evolving technologies, presenting a brief overview at EDUCAUSE 2008. Published below are excerpts from the white papers on each topic, written by individual members of the committee: green enterprise computing, by Tina Meier and John S. Moses; location-aware computing, by A. Michael Berman, Anthony D. Conto, and Susan M. Lewis; virtual worlds, by Kelvin William Bentley, Sharon Collins, and Anthony D. Conto; business process management, by John W. McGuthry; and regulatory compliance, by Mary Christine McMahon.

The full white papers can be found on the Evolving Technologies Committee website (http://www.educause.edu/EvolvingTechnologiesReports). These white papers address many other strategic areas for each evolving technology: key questions to ask; the implementation challenges; the major vendors and how to judge among them; how to proceed and the issues to be addressed; and the likely impacts in the next three to five years.

Green Enterprise Computing

Green initiatives are under way on many campuses today. The most prevalent driver appears to be senior administration's focus on greater operational efficiency in an age of rising energy costs. For example, at Oklahoma State University (OSU), the Green Computing Initiative has taken a look at desktops and computing lab environments. This is not an isolated instance. On June 20, 2008, one hundred CEOs worldwide endorsed a statement signaling serious intentions for "green IT."1

Why Is Green Enterprise Computing Important to Higher Education?

The makeup of systems within data centers has changed over the past twenty-plus years. For example, OSU's data center was built in the 1960s, with several follow-up phases of improvements and expansions. Each improvement or expansion provided what was needed at that time. However, these improvements and expansions were often carried out without much thought given to energy reduction—mostly because systems were not in place to enable doing so. The situation is different today. For those of us involved in the design of new data centers or perhaps the renovation of existing ones to meet capacity demands, what areas might we examine to reduce energy use? If we had a fresh sheet of paper and crayons to draw plans from the ground up, what would we look at, and where might we start?

How Is Green Enterprise Computing Evolving?

Fee-based services, such as Gartner's Best Practices for Data Center Design and Burton Group's Data Center Practice, offer up-to-date information on green enterprise computing, as do also free websites such as TechTarget's Data Center (http://searchDataCenter.techtarget.com) and Server Virtualization (http://searchservervirtualization.techtarget.com). Also, colleges and universities that have recently gone through this process might be willing to share their knowledge (e.g., RFPs, evaluations, assessments). Finally, numerous resources can be found through the EDUCAUSE "Green Computing" topic area (http://www.educause.edu/Resources/Browse/Green%20Computing/35635).

Three areas worth focusing on immediately are (1) power and cooling, (2) server consolidation and virtualization, and (3) systems management. Regarding power and cooling, it is estimated that the energy cost of a server has outstripped—or will soon outstrip—the capital cost associated with it. But reducing and consolidating the number of servers in the data center is not the only answer. Working with the facilities office to come up with benchmarks and possibly contracting with an architect or consulting firm that specializes in green data center design will help optimize efforts. Vendors too may finally be stepping up to this challenge. For example, the Networked Embedded Computing group at Microsoft Research is working on a coordinated network of sensors and software algorithms that help start up or shut down servers as needed. And VMware allows for the automatic powering off of VMware guest systems based on certain parameters.
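To put the power-and-cooling point in rough numbers, the back-of-the-envelope calculation below compares a server's lifetime electricity bill with its purchase price. Every figure in it (purchase price, power draw, facility overhead, electricity rate, refresh cycle) is an illustrative assumption, not a measurement from any particular data center.

```python
# Hypothetical comparison of a server's lifetime energy cost with its capital
# cost. All figures are illustrative assumptions, not measurements.

capital_cost = 4000.0   # purchase price of a midrange rack server, USD (assumed)
power_draw_kw = 0.5     # average draw under typical load, kW (assumed)
pue = 2.0               # power usage effectiveness: total facility power / IT power (assumed)
rate_per_kwh = 0.10     # electricity rate, USD per kWh (assumed)
years_in_service = 4    # typical refresh cycle, years (assumed)

hours = years_in_service * 365 * 24
energy_cost = power_draw_kw * pue * hours * rate_per_kwh

print(f"Capital cost:         ${capital_cost:,.0f}")
print(f"Lifetime energy cost: ${energy_cost:,.0f}")
# With these assumptions the energy bill (about $3,500) approaches the purchase
# price, which is exactly the point the estimate above is making.
```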

The second focus area is server consolidation and virtualization. The leading solution for data center virtualization continues to be VMware, but Microsoft's Hyper-V, Citrix XenServer, Virtual Iron, and Parallels are other options. Choosing a solution depends on the particular needs of the institution, such as high availability, a unified management interface, multiple operating system support, snapshots, and automatic powering down of non-utilized servers. An institution should conduct an assessment of the data center, determine the potential cost savings to be had from virtualization, identify service-level dependencies across servers, and confirm the cost savings at the end of the project. The benefits will be both quantitative and qualitative.

The third focus area involves systems management. An institution should start by examining the services being provided and the systems that provide those services. For example, are servers dedicated to a single application? Do servers stay powered on even when unused? A review of existing data centers might shine some light on single-purpose servers and those with utilization at less than 40 percent. Virtualization can and should help in most cases.
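As a concrete illustration of that review, the sketch below flags single-purpose or under-utilized servers as consolidation candidates. It assumes a hypothetical CSV export (columns server, avg_cpu_percent, dedicated_app) from whatever monitoring tool the institution already runs; the file name and column names are placeholders.

```python
# A minimal sketch of the utilization review described above. The CSV layout
# and the 40 percent threshold mirror the rule of thumb in the text.

import csv

UTILIZATION_THRESHOLD = 40.0  # percent

def consolidation_candidates(path):
    """Return servers that are single-purpose or under-utilized."""
    candidates = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            avg_cpu = float(row["avg_cpu_percent"])
            single_purpose = row["dedicated_app"].strip().lower() == "yes"
            if avg_cpu < UTILIZATION_THRESHOLD or single_purpose:
                candidates.append((row["server"], avg_cpu, single_purpose))
    return candidates

if __name__ == "__main__":
    for name, cpu, dedicated in consolidation_candidates("server_inventory.csv"):
        print(f"{name}: {cpu:.0f}% average CPU, dedicated={dedicated}")
```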

Vendors offer many tools to assist with green data center design. Most of these tools focus on server consolidation, server virtualization, and intelligent facility power and cooling design. Non-vendor-specific tools are available as well.

Conclusion

When thinking about a data center redesign and server consolidation, institutions should review their academic and administrative strategic plans to determine future infrastructure needs. Many institutions are examining the option of outsourcing commodity services, such as web, database, and e-mail. The convergence of technologies is also resulting in the consolidation of services within IT areas—for example, ERP-type implementations that are converging the data center, server administration, application development, networking, and security areas. Should all of these areas be making separate decisions for data center usage? A thorough assessment will assist the institution, the facilities office, and the architects in provisioning the most cost-effective solution when redesigning a data center for green enterprise computing.

Location-Aware Computing

Location-aware computing refers to systems that can sense the current location of a user or device and change behavior based on this location. The best-known example is the Global Positioning System (GPS) navigation device. Because a GPS device knows its current location, it can give the user directions to a new destination and can update those directions continuously as the device moves.

The reduction in price of location sensors, combined with the widespread availability of highly sophisticated portable devices (particularly smartphones), has resulted in the increased impact of location-aware systems in recent years. Most commonly, a location-aware system will determine its location through one of the following methods, ordered from least precise to most precise:

  • Mobile-phone triangulation: determines its location by estimating its distance from multiple cell towers using the strength of each signal
  • Wi-Fi triangulation: uses the same principle as mobile-phone triangulation but estimates the distance from Wi-Fi access points
  • GPS: determines a location based on signals from multiple satellites
  • Radio-Frequency Identification (RFID): uses the signal from one or more RFIDs

Other location-aware devices have been based on TV signals, Bluetooth, infrared light, vision, ultra-wideband radio, and ultrasonic signals.2
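To make the triangulation principle concrete, the sketch below estimates a device's position from range estimates to transmitters at known coordinates, using a standard linearized least-squares solution. The access-point coordinates and distances are made-up illustrative values, and real Wi-Fi or cell systems add considerable filtering on top of this core step.

```python
# Minimal trilateration sketch: solve for (x, y) given distances to anchors at
# known positions. Subtracting the last range equation from the others removes
# the quadratic terms and leaves a linear system.

import numpy as np

def trilaterate(anchors, distances):
    """anchors: (n, 2) known x/y positions; distances: (n,) estimated ranges."""
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2 * (anchors[:-1] - anchors[-1])
    b = (d[-1] ** 2 - d[:-1] ** 2
         + np.sum(anchors[:-1] ** 2, axis=1)
         - np.sum(anchors[-1] ** 2))
    position, *_ = np.linalg.lstsq(A, b, rcond=None)
    return position

# Three hypothetical Wi-Fi access points (metres) and noisy range estimates.
aps = [(0, 0), (50, 0), (0, 40)]
ranges = [32.0, 38.5, 27.0]
print(trilaterate(aps, ranges))  # approximate x, y of the device
```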

Once a location-aware device determines its location, it can then take action or update information based on the location. A range of applications—from "friend finders" to surveillance systems—can track the location of individuals with location-aware devices and report those locations (this does, however, raise important privacy issues). Smart-guide systems can give the user information as he/she passes through an environment, such as a museum or a college campus. Rich-media systems can offer sounds, sights, and data associated with specific locations to bring a historic event to life. Simulation games can change behavior based on the choices that users make to move through a space. Location information can be "mashed up" with systems such as Google Earth to create new applications relatively rapidly and inexpensively.

Why Is Location-Aware Computing Important to Higher Education?

Some colleges and universities have begun offering services to students based on location-specific information. And there are many more opportunities to tailor services and minimize cost using the ubiquitous cell phone. Indeed, the predicted demise of land-line phones will hasten the growth of location-aware services. These services will also provide an inexpensive entry into research efforts that have a mapping component.

Students, from their first visit to a campus, can use services built around their cell phones to enrich their knowledge of the campus and gain easier access. Campus visitor centers can provide tours via cell phones to augment existing student-led tours. Some campuses have used cell phones to provide audio tours of their museums and libraries. These services personalize the student experience and can augment, or in some cases replace, existing offerings. These services are very useful for institutions that want to provide anywhere/anytime response but that lack the budget for twenty-four-hour staffing.

A number of campuses have also used cell phones to deliver information, such as campus bus locations. It may make sense for a campus to implement geo-tracking services to help with traffic management—for example, when 20,000 people arrive on campus for a football game. Letting visitors know, via cell phone, which parking lots are open and which streets are less congested could mean that fewer police will be necessary to manage traffic.
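A minimal sketch of that parking scenario appears below: given a visitor's reported coordinates, it suggests the nearest lot that still has space. The lot names, coordinates, and the idea of a live occupancy feed are hypothetical placeholders, not any campus's real data.

```python
# Suggest the nearest open parking lot to a visitor's reported location.
# OPEN_LOTS stands in for a live occupancy feed; all values are made up.

import math

OPEN_LOTS = {
    "West Stadium Lot":   (36.123, -97.070),
    "Library Garage":     (36.127, -97.066),
    "South Commuter Lot": (36.118, -97.072),
}

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance via the haversine formula."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_open_lot(visitor_lat, visitor_lon):
    return min(OPEN_LOTS.items(),
               key=lambda item: distance_km(visitor_lat, visitor_lon, *item[1]))

lot, coords = nearest_open_lot(36.121, -97.069)
print(f"Nearest open lot: {lot} at {coords}")
```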

In addition, GPS-equipped cell phones can be used in tabletop exercises for emergency operations training, offering simulated experiences based on location. Colleges and universities have already begun to use GPS-enabled cell phones to provide enhanced security services to their students.

How Is Location-Aware Computing Evolving?

Location-aware services are beginning to transcend specific devices and proprietary services. Costs are coming down, and many more technologies and applications support these efforts. There is a shift from dedicated GPS devices to the GPS-enabled cell phone for data collection and distribution. With GPS-enabled cell phones, geotagged photos, and Google Earth, anyone can be in the business of providing these services to others. And with the recent availability of a free development environment for the iPhone, interest in building location-aware systems will be on the rise.

Although the potential for location-aware services is great,3 the tracking capabilities have raised concerns that the technology, in particular RFID tags, can be used maliciously to track people without their knowledge. Since RFID tags can be read from a distance, they could be placed surreptitiously in items of clothing, for example, so that people can be covertly tracked wherever they might travel. Some experts discount privacy issues by noting that these issues can be easily addressed by using chips that permanently destroy RFID capability at the point of sale of items. Others believe that these privacy issues are real and will take a combination of technology and policy to safeguard.4

Non-RFID-based systems pose privacy issues as well. If an application is built to track the location of students or others on campus using cell phone location, who has access to this information? Do "friend finder" systems increase the risk of stalking? As these applications become more widely implemented, campuses will have to be alert to the privacy implications and will need to address them through policy and practice.

Conclusion

Location-aware computing allows institutions to offer new and exciting applications on campus. It also provides a rich environment for experimentation and for construction of interesting prototypes. Given the coming near-ubiquity of mobile phones with locative technology pre-installed, and the ready availability of software tools for developing location-aware systems, the quantity and range of campus applications will expand rapidly in the near future.

Virtual Worlds

Virtual worlds are 3-D, computer-based, simulated environments in which individuals create an avatar, a virtual person that can resemble the user or take on almost any shape the user can imagine. Almost anything in real life can be constructed or emulated in a virtual world, without the gravitational properties and physical constraints of real life. Virtual worlds are social and imaginative; they break the "draw within the lines" boundaries of many past philosophies of education.

Virtual worlds such as Second Life, Active Worlds, Project Wonderland, and World of Warcraft are part of the Web 2.0 craze. The socially interactive Web 2.0 tools fulfill the social networking/presence requirements of Net Gen, Generation X and Y, and Millennial students—individuals who will drive the technology needs of the future by demanding innovation as never before. Thus, Web 2.0 and the still-to-come Web 3.0 tools are being developed to address their needs.

Why Are Virtual Worlds Important to Higher Education?

Virtual worlds are excellent social networking tools, allowing hundreds of thousands of users to engage each other in real time. Interaction with other avatars can be between nearby officemates or between people from anywhere in the world. Want to learn a new language? Teleport to France or Spain in Second Life and have a conversation with people in their language, learn their culture, and talk about their work and family lives. The importance of the virtual worlds environment to education will only expand in future years as the technology matures.

In a virtual world, educational tools—such as PowerPoint presentations, images, links to websites, course materials, and also 3-D objects—can be used and shared. Students can walk through a heart valve or around a model of DNA replication. They can visit museums and step into a reproduction of work by Monet or other artists. The ability to teach students topics in ways that were never possible before—and in an environment that is engaging, interactive, entertaining, and challenging—is the essence of virtual worlds in higher education. Even though many of us may hate to admit it, the static chalkboard is fading. Virtual worlds add a new dimension to face-to-face teaching as well as to distance education, bringing the two worlds together.

How Are Virtual Worlds Evolving?

Some open-source virtual worlds are not quite as sophisticated as Second Life, the virtual world created by Linden Lab. Project Wonderland, an open-source toolkit for creating 3-D virtual worlds, is in the early stages of development and holds much promise. Project Wonderland has a roadmap that includes improving the usability of virtual worlds through interaction with objects on different levels, the capability to have more than two hundred simultaneous users in several rooms, additional security, improved live application sharing, the ability to record video in-world, private voice-chat, and the placement of 3-D models in-world.5

Croquet, another popular open-source virtual world, has a myriad of features. Unlike Second Life, Croquet does not run on a centrally based server but instead on each user's local machine. The interactions between players result from commands sent over the network, which are then executed on fast local virtual machines. The roadmap for Croquet includes the following as high-priority items: the development of navigational controls that are more like game navigation conventions; the integration of hybrid 2-D/3-D applications in which 2-D objects can exist and act as full 3-D objects with no additional support; an integrated instant messenger client that can act as an ad-hoc rendezvous service by exchanging "meet me" information with other Croquet users; voice-over IP and video integration; and the ability to support multiple languages.6

The applications for use in virtual worlds such as Second Life are increasing in number and use. Companies are using Second Life for marketing and training purposes. Educational institutions are using the virtual world to deliver online courses and enhance face-to-face courses. Although these virtual environments have in the past been confined to a laptop or a desktop computer, mobile access to virtual worlds is becoming a new reality. For example, in June 2008, Second Life became available on mobile phones with 3G or Wi-Fi through Vollee, an application that can currently be downloaded on more than seventy models of mobile phones. Chris Mahoney, business development manager at Linden Lab, emphasized the importance of this achievement: "For Linden Lab, this represents an intuitive way to extend the reach and accessibility of the Second Life Grid platform. This is a great way for Second Life Residents to stay connected to their friends, business, and experiences in-world, wherever they are."7 Students can use their mobile phone to log in to Second Life and interact with classmates and people throughout the world.

Another mobile application, from Comverse Technology, allows Second Life to run on Java-enabled mobile phones and integrates short message service (SMS) and instant messaging (IM) with streaming video directly in-world. In addition, Google and the Open Handset Alliance developed Android, an open-source mobile phone platform that certain mobile phones will include. This technology also will allow users to run Second Life on their mobile phones.8

What other uses will virtual worlds deliver? The U.S. federal government is studying the use of virtual worlds combined with artificial intelligence (AI).9 This is the perfect environment, since emotions and interactions are already in place. The virtual robots (avatars) are created by users, and AI features can simply interact with these avatars. The environment is controlled and, in the future, may include holographic projections. U.S. military services are also using virtual worlds to train pilots and tank crews.

Future capabilities also include moving within a virtual environment without the use of a mouse. With a camera capturing body movement, avatars can move through virtual worlds quite easily. Mitch Kapor and Philippe Bossut designed a prototype interface that demonstrates the possibilities for operating Second Life "hands free," without a mouse or a keyboard (http://handsfree3d.com). To make this work, they modified the open-source Second Life client to support a 3-D camera as an input device.

Researchers from Georgia Institute of Technology in Atlanta, Georgia, and from Ludwig-Maximilians-Universität in Munich, Germany, are doing interesting work in the field of augmented reality (http://arsecondlife.gvu.gatech.edu/). Avatars and environments created in Second Life can be integrated with real-life images and environments using a 3-D video camera. Integration with real-time video feeds will allow future Second Life applications to include enhanced real-time collaborations and recorded videos.

Conclusion

The vastly popular virtual worlds are becoming a way of life for upcoming generations of college and university students. Those of us in higher education need to capture and hold the attention of these students with more than words. Virtual worlds can do so, while delivering the interactivity and the social bonding that students need to further their learning. Instructors will thus need to learn more about virtual worlds, and their applications, to further the growth of education and learning.

Business Process Management

A business process is a set of business tasks used to accomplish a specific business outcome. A business process typically begins with a specified need and ends with a completed product or service. In many organizations, business processes develop by accident, with little real thought given to how they took shape. But whether processes have been developed intentionally or by accident, almost all will, at some time, need improvement. Unless someone or some organization makes a strategic decision to improve business processes, they will not change or improve on their own.

Business Process Management (BPM) is a systematic, strategic approach to improving an organization's business processes. These processes can include anything from ordering equipment to processing human resources claims to receiving trouble reports or help-desk calls. They often involve different departments, customers, technologies, partner or competing organizations, and other components associated with starting and completing the process. BPM examines those processes and identifies new or alternative processes that produce the desired output faster or at lower cost, or that improve its quality or quantity.

Why Is Business Process Management Important to Higher Education?

Although most of the examples in the marketplace today come from outside higher education, there is an ever-growing trend among colleges and universities to begin improving their operational environments. The demands to attract and retain students and highly qualified faculty, to reduce operational expenses, and to reduce or eliminate errors in operations will continue to push colleges and universities to focus on process improvement initiatives.

An illustration might be the process of entering new employee data at a college or university. First, the human resources department enters benefits and personnel data, using two different enterprise applications. (This is a very common scenario in higher education.) After this information is entered into those applications, another department enters information into an ID card system. Then another department issues a parking permit and enters more information into a different system—and so on. In many instances, the new employee process in higher education happens correctly only by chance.

One example of a BPM solution would be to have all the departments use a tracking system. This system would track the employee's progress through the various stages. Each department would see its specific tasks based on where the employee is in the process. The BPM solution would identify when tasks are completed and by whom. It could notify individuals when their tasks are completed and track the entire process until all components are finished. By implementing such a solution, institutions can identify potential points of confusion, as well as find new and innovative ways to complete the process.
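The sketch below illustrates the kind of tracking such a solution provides: each department owns a task in the new-employee process, and the tracker records who completed what and signals the next department. The departments, task names, and notification step are hypothetical placeholders, not a description of any particular BPM product.

```python
# A toy process tracker for the new-employee scenario described above.

from dataclasses import dataclass, field
from datetime import datetime
from typing import List, Optional

@dataclass
class Task:
    name: str
    department: str
    completed_at: Optional[datetime] = None
    completed_by: Optional[str] = None

@dataclass
class NewEmployeeProcess:
    employee: str
    tasks: List[Task] = field(default_factory=lambda: [
        Task("Enter benefits and personnel data", "Human Resources"),
        Task("Issue ID card", "Card Services"),
        Task("Issue parking permit", "Parking Services"),
    ])

    def complete(self, task_name: str, who: str) -> None:
        task = next(t for t in self.tasks if t.name == task_name)
        task.completed_at, task.completed_by = datetime.now(), who
        remaining = [t for t in self.tasks if t.completed_at is None]
        if remaining:
            # A real BPM tool would send this as an automated notification.
            print(f"Notify {remaining[0].department}: '{remaining[0].name}' is next.")
        else:
            print(f"Onboarding complete for {self.employee}.")

process = NewEmployeeProcess("J. Doe")
process.complete("Enter benefits and personnel data", "HR clerk")
```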

How Is Business Process Management Evolving?

A BPM solution (or, in some instances, a process modeling solution) monitors business processes. It provides insight into the inputs to a process, all of the associated steps during the process, and the outputs of the process. It can calculate key statistics associated with the process, such as time and cost measures. It can help institutions make critical decisions about current or even forecasted staffing needs. Most important, it can provide all of the reports and measures needed for an institution to determine when a process may need to be modified or improved.

BPM software helps institutions improve business processes by aiding them in determining which processes should be automated. It can help identify when business rules are not being followed and when waste is being produced. These solutions often are set up to model business processes, and in many instances they extract data from, or insert data into, enterprise applications. BPM software can thus be thought of as an integration layer for the processes and the associated enterprise applications.

Most large organizations use multiple enterprise applications for a variety of functions. Often, users must work in several applications to complete a single task. Even though much of the data may be shared, processes may require individuals to interface with different applications, or with different components inside a specific application, depending on the circumstances. BPM software can be used as the glue that binds the process together and eliminates the need to switch applications. This allows the individual user to focus on following the process rather than on remembering the steps required to follow it.

Many BPM software solutions are on the market. Since the market is young, buyers should be wary of the almost certain change to come. There will be some consolidation, as well as market entrants and market exits. The following is a list of a few of the BPM applications commonly used today:

  • Pegasystems
  • Savvion
  • Lombardi
  • Oracle
  • IBM
  • Appian
  • Software AG

This is by no means an exhaustive list. Prospective buyers should perform an extensive search to find the solution that best meets the institution's long-term needs and direction.

A few "soft" standards exist for BPM solutions. Some of the more widely used standards are Business Process Execution Language (BPEL), XML Process Definition Language (XPDL), Web Services Description Language (WSDL), and Java Message Service (JMS). Again, this is not an exhaustive list. An institution should be sure that it is prepared to support the standard used by the solution that it chooses.

Conclusion

BPM solutions apply a technical solution to business process improvement activities. They can help identify gaps in existing processes and can help reinforce both new and existing processes. Whether an institution is working on establishing a new or improving an existing employee process, student recruitment process, or procurement process, BPM solutions can serve as a key component in fostering change.

Regulatory Compliance

College and university regulatory compliance and its relationship to evolving technologies can be viewed in much the same way as a cosmologist observes dark matter and its relationship to the universe. According to astrophysicists, dark matter accounts for the vast majority of mass in the universe. The inferred presence of dark matter is based on its gravitational effects on visible matter, with outcomes therefore observed as the effects of unseen actors. Scientists are attempting to discover what dark matter is, how much there is, and what effects it may have on the universe. In much the same way, IT organizations in higher education are seeking to quantify the impact of regulatory evolution and anticipate the proliferation of regulations, laws, and mandates, all while trying to achieve sustained compliance by optimizing the value they receive from good data management, attention to service, and security operations.

To further complicate the organizational challenges surrounding compliance and evolving technologies, colleges and universities are complex environmental systems that continually change "due to organizational actors shifting in and out of the environment, creating difficulties for vertical, horizontal, formal, and informal norm diffusion."10 According to Urs Gasser and Daniel M. Häusermann, compliance generally indicates the observance of norms on the part of an organization.11 Institutions therefore must establish formal norms that comply with federal and state regulatory requirements while also dealing with institutional policies and norms. All of these factors exert influence on the collection and use of data within the institution—including, but not limited to, employee and student information, financial data, information concerning individuals participating in college/university health services, and research operations.

A primary source for formal institutional norm establishment is the U.S. Code of Federal Regulations. The CFR is the codification of the general and permanent rules published in the Federal Register by the executive departments and agencies of the U.S. federal government. Applying these regulations to mature technology is itself a daunting task. Anticipating the regulatory exposure in the context of evolving technologies makes the task even more challenging.

Why Is Regulatory Compliance Important to Higher Education?

Traditionally, college and university compliance objectives have focused on mimicking the rules enforced by governmental agencies and on avoiding severe penalties for noncompliance with those rules. How and where data entered the system and the question of its authenticity were perhaps not of primary concern; the thrust of compliance in terms of data was report writing. However, socially interactive applications are significantly expanding both the possibilities and the risks associated with data management and services today. Is the data authentic? Are those responsible for the data input involved at a proper "role" level? Is data, once it is in the system, being managed effectively?

As technology matures, these questions and others will become critical in order for compliance objectives to keep pace with innovation. Additionally, ongoing change is itself a vital variable in compliant data management and innovation. Preserving good compliance goals will require organizational evolution. But as King Whitney Jr. noted: "Change has a considerable psychological impact on the human mind. To the fearful, it is threatening because it means that things may get worse. To the hopeful, it is encouraging because things may get better. To the confident, it is inspiring because the challenge exists to make things better."12 Organizations therefore must find ways to manage change effectively so as to reduce any negative effects that the variable of change itself can have within the organization. Likewise, compliance initiatives may create the environment by which IT improvements are forged, as a result of adopting best practices and working to achieve transparency in IT developments.

How Is Regulatory Compliance Evolving?

How can colleges and universities achieve actionable intelligence to prepare for regulatory impacts, and how can they embrace change while strengthening the IT organization? Establishing uniform standards and practices for change management across an enterprise and codifying best practices are two important methodologies. Benchmarking against other institutions and/or IT organizations can also help establish measures of success. Business process and key control documentation is also a powerful tool that can be used to model, analyze, and facilitate necessary changes. Standardized business processes and activities improve efficiency, support successful reorganizations when changes are necessary, and create a central source for employee training. However, institutions must also ensure flexibility, since day-to-day operational improvements are critical for supporting a sustainable compliance framework.

Regulatory compliance is best addressed through a three-point approach using an iterative process: (1) define the compliance situation and assess risk; (2) apply proper control measures (fixing the gap); and (3) follow up with governance (eliminate the root cause of the failure). All college and university employees will play an important role in securing the data that resides in the care of the institution. As Web 2.0 applications mature, colleges and universities will be faced with the challenge of certifying the adequacy of systems or the adherence to regulatory goals. This challenge will require a unique expertise, as well as different interactions with regulatory agencies. And these new interactions will require an organizational transformation.

Conclusion

IT professionals are responsible for understanding their critical roles in ensuring compliance through good technology systems and excellent data management. If IT professionals resist their role as facilitators in these processes, the risk of sanctions and monetary fines increases exponentially. IT professionals must find ways to work closely with the regulatory experts within their organizations to ensure robust institutional systems based on a thorough knowledge of regulatory concerns.

2008 White Papers

The full white papers on the 2008 evolving technologies—green enterprise computing, location-aware computing, virtual worlds, business process management, and regulatory compliance—are posted on the Evolving Technologies Committee website:

http://www.educause.edu/EvolvingTechnologiesReports

Also on this website are links to the Evolving Technologies Committee white papers from the years 2000 through 2007.

Notes

1. Mark Raskino and Simon Mingay, "CEOs' Letter Signals Serious Intentions for 'Green IT,' " Gartner, June 24, 2008, http://www.gartner.com/resources/159200/159265/ceos_letter_signals_serious__159265.pdf.

2. Mike Hazas, James Scott, and John Krumm, "Location-Aware Computing Comes of Age," Computer, vol. 37, no. 2 (February 2004), pp. 95–97.

3. Simson L. Garfinkel, Ari Juels, and Ravikanth Pappu, "RFID Privacy: An Overview of Problems and Proposed Solutions," IEEE Security & Privacy, vol. 3, no. 3 (May/June 2005), pp. 34–43.

4. Tom Karygiannis, Bernard Eydt, Greg Barber, Lynn Bunn, and Ted Phillips, Guidelines for Securing Radio Frequency Identification (RFID) Systems, Recommendations of the National Institute of Standards and Technology, April 2007, http://csrc.nist.gov/publications/nistpubs/800-98/SP800-98_RFID-2007.pdf.

5. "Project Wonderland Roadmap and Release Estimates," http://wiki.java.net/bin/view/Javadesktop/WonderlandRoadmap; "Project Wonderland: Toolkit for Building 3D Virtual Worlds," https://lg3d-wonderland.dev.java.net/#goals.

6. The Croquet Consortium, "The Core Model," http://www.opencroquet.org/index.php/The_Core_Model.

7. Mahoney quoted in "Vollee Brings 'Second Life' to Mobile," press release, February 18, 2008, http://www.vollee.com/news/30_vollee_brings_second_life_to_mobile. See also "Vollee Enables 25 More Handsets for Second Life Mobile," press release, June 30, 2008, http://www.vollee.com/news/138_vollee_enables_25_more_handsets_for_second_life_mobile.

8. "Comverse Demos Second Life on Mobile Phones," Wireless Federation, February 12, 2007, http://wirelessfederation.com/news/comverse-demos-second-life-on-mobile-phones/; Dianne See Morrison, "Google Android Phone to Debut as Early as October," washingtonpost.com, August 15, 2008, http://www.washingtonpost.com/wp-dyn/content/article/2008/08/15/AR2008081500801.html.

9. Michael Hill, "Researchers Teach Avatar to Think," nashuatelegraph.com, June 12, 2008, http://www.nashuatelegraph.com/apps/pbcs.dll/article?AID=/20080612/BUSINESS/887932389/-1/business.

10. Timothy N. Atkinson and Diane Suitt Gilleland, "The Scope of Social Responsibility in the University Research Environment," Research Management Review, vol. 15, no. 2 (Fall/Winter 2006), p. 3, http://www.ncura.edu/content/news/rmr/docs/scope_of_social.pdf.

11. Urs Gasser and Daniel M. Häusermann, "E-Compliance: Towards a Roadmap for Effective Risk Management," March 15, 2007, http://papers.ssrn.com/sol3/papers.cfm?abstract_id=971848.

12. Whitney quoted in the Wall Street Journal, June 7, 1967.

© 2008 Beth Forrest Warner and the 2008 EDUCAUSE Evolving Technologies Committee. The text of this article is licensed under the Creative Commons Attribution-NonCommercial-NoDerivs 3.0 License (http://creativecommons.org/licenses/by-nc-nd/3.0/).

EDUCAUSE Review, vol. 43, no. 6 (November/December 2008)