XR for Teaching and Learning: Year 2 of the EDUCAUSE/HP Campus of the Future Project

XR Technologies for Achieving Learning Goals

What factors influence the effectiveness of XR technologies for achieving different learning goals? This is a complicated question because the broader question of what factors influence the effectiveness of any technology for learning is itself complicated: many factors can influence learning, including the instructional technology used, the medium of delivery, the quality of instruction, the temperature in the classroom, and the student's socioeconomic status.

The No Significant Difference Phenomenon is well known. A body of research spanning decades has shown mixed results in investigating whether student outcomes improve when education is delivered face-to-face or at a distance. This has sometimes been used as a blanket argument against the use of technology in education,1 but that misunderstands these findings. What this research actually shows is that the effects of the mode of delivery are drowned out by the effects of other variables. Many of these studies show that one variable in particular—the instructional method employed—is one of the most powerful factors, if not the most powerful, influencing learning. In other words, instructional method is far more important than the medium of delivery: so much more important, in fact, that the effect of instructional method reduces the effect of technology to statistical insignificance.

"We don't know if this is the future, but it sure looks like it."
Brant Steen, Bucks County Community College

This may seem like an unexpected admission in a report about educational technology. But if XR is to be used in education, it is important to be realistic about how to use it effectively. Other research has shown that blended learning—that is, enhancing face-to-face teaching and learning with online components—achieves better student outcomes than either face-to-face or online alone.2 It is this type of blended learning environment in which XR is most useful. If instructional method is one of the most powerful factors influencing learning, it is critical that we understand how XR best fits into those instructional methods. XR holds the potential to be a game changer3 for pedagogy, but it must be deployed thoughtfully in order to fulfill this potential.

Experiential and Competency-Based Learning

In a 2009 interview, Captain Chesley "Sully" Sullenberger, the "Miracle on the Hudson" pilot who famously landed US Airways Flight 1549 on the Hudson River, credited his success with that landing to the frequent and repeated simulations that airline pilots perform: "One way of looking at this might be that, for 42 years, I've been making small regular deposits in this bank of experience, education, and training. And on January 15, the balance was sufficient so that I could make a very large withdrawal."

Landing a commercial airliner on water is not something that one can (or would want to) practice in the physical world even once, let alone repeatedly. In a simulation, however, anything can be practiced again and again. Because of the danger, complexity, and expense of aircraft, flight training was one of the first jobs to make widespread use of simulations,4 and flight simulators are still widely used for aviation training.

Nursing Education Already Uses Simulations

Like aviation, the nursing profession has long understood the value of simulations for training. The Institute of Medicine's 2000 To Err Is Human report5 argued that professional education across healthcare fields should use simulations whenever possible in creating learning environments, enabling students to practice technical skills and thus reduce medical errors.

Training for nurses is extremely specific. State boards of nursing produce scope and standards of practice regulations that inform the creation of checklists of skills by publishers of nursing education resources. These checklists range from basic skills such as taking a patient's temperature and blood pressure to more complex skills such as how to dress different types of wounds and how to interact with a difficult patient. To become proficient in most of these skills requires a nursing student to perform a specific set of steps in a specific order, reliably and without requiring supervision. This is a different model of education from that of many disciplines, different even from many other professional programs. The law, for example, and even other medical training require students to recall a large corpus of knowledge but not usually to execute it in a fixed sequence. What's more, nurses are often working under time pressure and with incomplete information. They must therefore be able to perform specific skills with speed and precision, sometimes in the face of a rapidly changing situation.

Two forms of simulation are widely used in nursing education: manikins and actors. The reader has probably seen and perhaps even used this sort of manikin, for example, in CPR training. These manikins are versatile in that students can practice a wide range of skills on them. But they are not particularly realistic; often they are not even a complete body, just a head and torso, or just an arm, etc. Standardized patient actors, on the other hand, are clearly more realistic. But actors must be trained to participate in a medical simulation and must also be paid. Scheduling an actor's time adds complexity to an already complex scheduling problem, as students' and instructors' time must be scheduled as well.

Nursing education is therefore an ideal venue for deploying XR technology. Simulation-based training is already widespread, but there is a clear need for simulations that offer higher fidelity than manikins and less complexity and expense than working with actors.

XR Maintains Existing Student-Learning Outcomes in Nursing Education

Two nursing programs participated in this study: the Morgan State University (MSU) Nursing Program, part of the School of Community Health and Policy, and the Simulation Center in the Columbia University School of Nursing. XR technology had already been deployed at both of these institutions prior to the start of this study. At MSU, the library recently purchased a small number of VR backpack rigs (figure 1) and headsets for a makerspace currently being designed and built within the library. The Emerging Technologies Consortium at Columbia was formed under the university's IT unit in 2017 to help facilitate exploration and adoption of new technologies at Columbia.

Figure 1. The HP VR backpack
Image courtesy of HP Inc.

Both the MSU Nursing Program and the Columbia Simulation Center were just starting to use XR while this study was ongoing. MSU was launching an initiative to educate novice nurses to recognize and respond to early indicators of "clinical deterioration"6 (the deterioration of a patient's condition just before or just after being admitted to a hospital) using high-fidelity simulations. An extensive and detailed set of criteria for evaluating the specific skills being taught accompanies these simulations—for example, that the novice nurse "verbally identifies the signs of clinical deterioration" and should "evaluate the effect of medications and oxygen administration on patient's clinical deterioration."

Columbia was starting XR implementation with a case study. Multiple students from multiple nursing subdisciplines, wearing AR headsets, meet with a standardized patient–actor. Students take a patient history and perform physical assessments, etc.; some information is provided by the patient–actor and some is included in AR overlays. Afterward, students meet as a team to make their diagnosis and decide on a course of action for the simulated patient.

For both the MSU and Columbia simulations, students are evaluated on a specific set of skills, which, importantly, are the same skills that could be evaluated in any type of simulation (such as VR, a computer-based simulation on a screen, or a simulation with an actor or a manikin). Instructors have a rigorous set of predefined criteria for evaluating students' performance of these skills, and these criteria can be used across teaching environments. It is a critical point that the use of an XR simulation does not require a change to student learning outcomes. Because XR does not require this change—that is, because existing student learning outcomes are maintained—the cost of adopting XR for instruction is dramatically reduced.

XR Maintains Existing Learning Outcomes in Other Disciplines, Too

Another subject for which XR can increase the realism of the learning environment and remain consistent with preexisting learning outcomes is language learning. There are, of course, several excellent language-learning software applications, and in-person language courses are legion. But the one thing even these cannot provide is immersion, which is a particularly effective method for learning a language. Immersion may not be possible for many students, however, as it requires either travel or the presence of a local language community. VR, though, has been found effective in simulating an immersive language environment.7 At Syracuse University, for example, a project is under way to use 360-degree video to develop virtual tours of landmarks in countries around the world, in the native languages of those countries. For another example, the Sound Storytelling project under way at Yale University records the sounds of rural life in Indonesia, as a tool to immerse the learner in the Bahasa Indonesia language environment. By simulating realistic language environments, XR can enhance language learning by providing the student with immersion where it might not otherwise be possible.

Still another subject where XR can provide a realistic simulation that is consistent with established learning outcomes is chemistry. As Lori Silverman, director of the Science Learning Institute at Foothill College, amusingly put it: "One of the big problems with chemistry is chemicals." Teaching and learning in chemistry requires a lab, which an institution may have the resources and infrastructure to set up but which is prohibitively expensive and possibly dangerous for a student to attempt at home. It is perfectly feasible, however, for a student to interact with a simulated chemistry lab at home. Indeed, this goes for any subject that requires a lab. Furthermore, a simulated lab is especially useful for courses offered by institutions with a large percentage of commuter students. Many educational institutions make cloud-based software applications available to the campus community, such as a learning management system (LMS), enterprise licenses for statistical analysis packages, or subscription library databases; similarly, a virtual lab would enable members of the campus community to interact remotely with another important campus resource.

XR technology is particularly well suited for fields such as nursing, language learning, and chemistry—fields that require students to gain direct experience but where gaining that direct experience is a challenge because it is dangerous, expensive, complex, or remote. The more realistic and the higher fidelity these simulations of the physical world are, the more valuable they are as learning environments. Yet there are things far more complex than a chemistry lab and far more remote than a Bahasa Indonesia language community. XR is also useful for simulating things in the physical world that simply cannot be accessed physically.

Making the Abstract Concrete

"We want to drag information literacy out of the 19th century."
Seneca Jackson, Morgan State University

Not all education is competency-based. Some K–12 and higher education courses seek to convey a body of abstract knowledge. Students are expected to come away from certain courses and programs in possession of knowledge but not necessarily a set of skills. It is of course difficult to separate knowledge from skill: How can an instructor assess a learner's knowledge if not by having her demonstrate it by doing something? In many cases the nature of the subject limits what can be demonstrated. Astrophysics and history, for example, are not subjects in which students can easily demonstrate hands-on skills. For many subjects, the object of study is not accessible, for one reason or another. These subjects have therefore traditionally been taught more or less in the abstract, using illustrations, videos, and perhaps models, but without much direct hands-on experience. And ultimately the assessment of a learner's knowledge in such fields has depended on writing, as on an exam or an essay, and not on demonstrating a skill.

XR Expands What Can Be Learned as Skills

One of the most important educational functions of XR is to dramatically expand the range of activities with which a learner can gain hands-on experience. XR can provide hands-on experience of things that are too small to manipulate with hands, such as cells; too large, such as entire physical environments; or not physical at all, such as electromagnetism. In other words, XR dramatically expands the range of topics that can be learned as skills rather than as abstract knowledge.8

Several examples of such topics emerged during this research. Perhaps the most fully developed application for this purpose is Cellverse, under development by the CLEVR project at MIT to teach cell biology. Cellverse is a collaborative educational VR game: the cell in question has a genetic defect, which the VR user must fix from within the simulation. Cellverse requires two users (figure 2), the explorer and the navigator: the explorer wears a VR headset and has a view from inside the simulated cell, while the navigator uses a tablet to access a bird's-eye view from outside the simulated cell. The navigator gathers and organizes data about cells, while the explorer makes observations; the navigator's selection of reference information is informed by the explorer's observations, while the explorer is guided by information provided by the navigator.9 In other words, the navigator works with abstract knowledge, while the explorer converts abstract knowledge into action. The "winning condition" of Cellverse is for the explorer and navigator to work together to select an appropriate therapy to "cure" the cell. Reaching it demonstrates that both users can recall information about the functions of organelles within the cell, can analyze and make inferences about the relationships between organelles, and can generate hypotheses about how to fix the cell and then execute plans to do so. Cellverse has already been used in a few select classrooms to test the effectiveness of the design.
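This division of labor can be pictured as two different views onto one shared simulation state. The following is a minimal Python sketch of that asymmetric-roles design; the class names, fields, and logic here are invented for illustration and are not the CLEVR project's actual code.

```python
from dataclasses import dataclass, field

@dataclass
class CellState:
    organelles: dict = field(default_factory=lambda: {
        "ribosome": {"pos": (0.2, 0.5, 0.1), "status": "normal"},
        "nucleus":  {"pos": (0.0, 0.0, 0.0), "status": "defective"},
    })
    observations: list = field(default_factory=list)  # (name, status) pairs logged by the explorer

class ExplorerView:
    """First-person view from inside the cell; records what the explorer sees."""
    def __init__(self, state: CellState):
        self.state = state

    def observe(self, name: str):
        note = (name, self.state.organelles[name]["status"])
        self.state.observations.append(note)
        return note

class NavigatorView:
    """Bird's-eye view from outside the cell; selects reference information
    based on the explorer's observations."""
    def __init__(self, state: CellState):
        self.state = state

    def suggest_reference(self) -> str:
        defective = [n for n, status in self.state.observations if status == "defective"]
        if defective:
            return f"look up therapies targeting: {defective}"
        return "no defects observed yet; keep exploring"

# Both roles share one state but interact with it differently.
state = CellState()
explorer = ExplorerView(state)
navigator = NavigatorView(state)
explorer.observe("nucleus")
print(navigator.suggest_reference())  # look up therapies targeting: ['nucleus']
```

The design choice worth noting is that neither view is sufficient on its own: the navigator's suggestions depend on observations only the explorer can make.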

Figure 2. Students using Cellverse to learn about cellular biology
Image courtesy of the MIT Education Arcade 2018

Another application that makes effective use of the hands-on nature of the XR experience to teach a subject that has traditionally been abstract is the Electrostatic Playground. Also developed at MIT, this is a VR simulation of charged particles. Electromagnetism and electrical engineering are often taught as separate topics—related, of course, but traditionally a student is expected to possess the more abstract knowledge about electromagnetism prior to learning more hands-on engineering topics. But the Electrostatic Playground teaches both at the same time. The user manipulates simulated charged particles and experiences how they react to one another. Knowledge that previously could be gained and assessed only in the abstract can now be experienced and demonstrated as a skill.
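The physical rule such a simulation must respect is compact enough to state in code. The following minimal Python sketch, written for illustration rather than taken from the Electrostatic Playground itself, computes the pairwise Coulomb forces (F = kq1q2/r²) that govern how simulated charged particles react to one another.

```python
import numpy as np

K = 8.99e9  # Coulomb constant, N*m^2/C^2

def coulomb_forces(positions, charges):
    """Net electrostatic force on each particle.

    positions: (n, 3) array of positions in meters
    charges:   (n,) array of charges in coulombs
    """
    n = len(charges)
    forces = np.zeros_like(positions)
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            r_vec = positions[i] - positions[j]  # vector from j toward i
            r = np.linalg.norm(r_vec)
            # Like charges give a positive product, pushing i away from j.
            forces[i] += K * charges[i] * charges[j] * r_vec / r**3
    return forces

# Two electrons 1 mm apart: equal and opposite repulsion along the x-axis.
positions = np.array([[0.0, 0.0, 0.0], [1e-3, 0.0, 0.0]])
charges = np.array([-1.602e-19, -1.602e-19])
print(coulomb_forces(positions, charges))  # magnitudes ~2.3e-22 N
```

A real-time VR implementation would integrate these forces into particle positions every frame; the point here is only that the "rules" the simulation must obey are well defined.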

Authentic Experiences

The learning outcomes for many STEM disciplines, such as biology and physics, are fairly well defined by grade level, at least within K–12 education. Just as with nursing training, the use of a VR simulation increases the fidelity of the experience but remains consistent with established instructional standards in these fields. Cellverse has been deployed in a few classrooms, and both students and teachers have found that it enables the development of a spatial awareness of the cell environment and the contextualization of the roles of organelles in the cell. This is at least partly due to the fact that Cellverse is designed to be an "authentic" experience—authentic both in the sense that the simulated environment matches current research on cells (and is constantly being updated to remain current) and in the sense that the explorer experiences the physicality of the simulated environment. This physicality is especially noticeable in the Electrostatic Playground, where simulated particles move and interact authentically. What was once possible to present only as drawings on a page or perhaps as a video can now be a hands-on experience.

Biology is a particularly popular subject for XR development, perhaps because many of the objects of study are physical but are too small to see with the naked eye. Beyond those discussed so far, several XR simulations exist, or are under development, for teaching biology-related subjects. The VR-Lab at the Norwegian University of Life Sciences is developing a VR application for a molecular biology course, for example, and Unimersiv, a platform for educational VR content, has a simulation in which the user can explore the protein structures on the surface of cells. There are even some Google Expeditions of cells that are compatible with Google Cardboard, a free smartphone-based VR app designed to work with inexpensive headsets. There is also a long list of VR and AR applications that were not developed specifically to be used as part of a course but nevertheless have some educational potential. One example is the unfortunately named InCell, a racing game for Google Cardboard in which the user is racing against viruses.

One of the most valuable functions of XR is to enable the simulation of aspects of the physical world that are not accessible in any other way. As in anatomy simulations,10 the more authentic these simulations are, the more valuable they are as learning environments. But what does authentic mean when there is no human experience to compare it to? Any simulation of an organelle or an electron is obviously an abstraction. What's important is for such abstractions to be as realistic as possible: simulated organelles and proteins must interact accurately, and simulated electrons should repel each other with an appropriate amount of force, etc. In other words, even though the simulation gives the user an experience that is impossible in the physical world, that simulation must maintain the impression that it is realistic by adhering to the relevant "rules" of the physical world, rules like gravity and other physical forces. By adhering to these rules, a simulation of the physical world can model things that do not—or do not yet—exist.

Experimentation

"This is right on the brink of 'Nobody knows what they’re doing!'"
Amber Bartosh, Syracuse University

All new technology is a learning experience. Even a new tool that performs a familiar task (for example, a new spreadsheet application) requires the user to scale at least a small learning curve. And most new technologies possess at least some new functionalities. The functionality or actions that a technology enables are called affordances.11 Affordances can be obvious—for example, an office chair affords being sat upon. But affordances may not be obvious: the same chair, for example, also affords being used as a racing vehicle.

XR is similar in some ways to existing technologies, such as film. But XR also possesses functionality that is entirely new, enabling users to perform tasks that are not possible with other tools. XR has new affordances: it is not a film, it is not a game, it is something else entirely. Every once in a while an innovation comes along that changes what is understood to be possible in a medium: the development of cubism in painting, for example, or the invention of the electric guitar for music. But this is relatively rare, as most media are well established and their affordances relatively well understood. XR, on the other hand, is changing what is understood to be possible with technology, as its affordances are still being mapped out. Where we are now with XR is perhaps where musicians were with electric guitars in the 1940s: still perfecting the hardware and exploring its possibilities at the same time.

This process of mapping out the affordances of XR—of exploring the functionality and the limits of what is possible with this new technology—is happening across many fields and is one of the most exciting developments in using XR for teaching and learning.

XR and the Possibilities of Physical Space

The Interactive Design and Visualization Lab (IDVL) at Syracuse University has been using XR to render architectural designs since before this study began (figure 3). XR offers a higher-fidelity, more realistic tool for rendering architectural designs than more "traditional" architectural tools, such as computer-aided design and drafting (CADD) software or scale "dollhouse" models. Rendering designs in XR enables the user to walk around inside a space and interact with the objects, materials, and soundscapes within it.12 Building a simulated architectural space makes it possible to collect data about individuals' navigation through and interaction with the space and to rapidly integrate those findings into iterating the design of the space.
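As a sketch of the kind of data collection a simulated space makes easy, the short Python example below bins sampled user positions into a floor-plan occupancy heatmap that a designer could use when iterating on a layout. The sampling rate, room dimensions, and function names are invented for the example and do not describe IDVL's actual tooling.

```python
import numpy as np

def occupancy_heatmap(positions, extent=10.0, bins=20):
    """Bin (x, z) floor positions into a grid of time-share values.

    positions: (n, 2) array of floor coordinates in meters
    """
    grid, _, _ = np.histogram2d(positions[:, 0], positions[:, 1],
                                bins=bins, range=[[0, extent], [0, extent]])
    return grid / grid.sum()  # fraction of samples falling in each cell

# Simulate a user wandering a 10 m x 10 m room, sampled at 90 Hz for 60 s.
rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(0.0, 0.02, size=(90 * 60, 2)), axis=0) + 5.0
heatmap = occupancy_heatmap(np.clip(walk, 0.0, 10.0))
print(heatmap.max())  # the most-occupied cell's share of the user's time
```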

Figure 3. A VR simulation of interactive windows, at Syracuse University
Image courtesy of Amber Bartosh, Interactive Design and Visualization Lab at Syracuse University

Although they are lumped together under the umbrella of XR throughout this report, VR and AR are different technologies and therefore provide different affordances. This is especially clear in how they are used in the context of physical spaces. VR enables the simulation of an entire space that may not yet exist; AR, on the other hand, is closely tied to the existing built space: the IKEA Place app, for example, enables users to virtually place scale-accurate 3D furniture in a room using a smartphone. This is similar to other smartphone applications, such as Warby Parker, which enables users to try on virtual eyeglasses. Several stores' apps enable users to try on virtual clothes. These are not complex functions, but the point is that these apps allow users to see things as they might be, to experiment prior to making a change to one's environment.

Two other examples of projects closely tied to existing physical environments are choreography projects, one at Barnard College and one at Yale University. Barnard College offers several courses that explore the intersection of technology and the performing arts. In one dance course, students develop a choreographic work that is then "placed" on campus so that it can be seen using a smartphone-based AR app. This is a deliberate exploration of two issues that are currently frontiers in dance: site-specific choreography, or developing a piece for a particular site other than a stage, and choreographing a work that will be viewed on a small screen. In the Beyond Imitation project at Yale, a dancer's sequences are captured as motion-capture data, which is then fed to a machine-learning algorithm that generates completely new dance sequences. These sequences may then be viewed as a simulation before being integrated into a new choreographic work by the dancer herself. This sort of computer-generated choreography provides material for the dancer, of course, but in doing so enables exploration of both the nature of artistic collaboration and the "language" of dance.13
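The generative step in such a pipeline can be sketched briefly. The PyTorch example below is illustrative only, not the Beyond Imitation project's actual model: it shows the general shape of an autoregressive pose predictor that, given a window of captured poses, predicts the next pose; feeding predictions back in produces an entirely new sequence. The pose dimensionality and model sizes are invented for the example.

```python
import torch
import torch.nn as nn

POSE_DIM = 51  # e.g., 17 joints x (x, y, z) per motion-capture frame

class PosePredictor(nn.Module):
    """Reads a window of poses and predicts the next pose."""
    def __init__(self, hidden: int = 128):
        super().__init__()
        self.lstm = nn.LSTM(POSE_DIM, hidden, batch_first=True)
        self.head = nn.Linear(hidden, POSE_DIM)

    def forward(self, poses):          # poses: (batch, time, POSE_DIM)
        out, _ = self.lstm(poses)
        return self.head(out[:, -1])   # (batch, POSE_DIM), the next pose

def generate(model, seed, steps):
    """Autoregressive rollout: each prediction is appended and fed back in."""
    seq = seed.clone()
    with torch.no_grad():
        for _ in range(steps):
            next_pose = model(seq).unsqueeze(1)
            seq = torch.cat([seq, next_pose], dim=1)
    return seq

model = PosePredictor()              # untrained here; real use requires training on mocap data
seed = torch.randn(1, 30, POSE_DIM)  # 30 captured frames as a starting phrase
new_sequence = generate(model, seed, steps=10)
print(new_sequence.shape)            # torch.Size([1, 40, 51])
```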

The Journey Is the Destination

The Beyond Imitation project is one of several under the umbrella of Yale University's Blended Reality project, which, as of this writing, is in its third year. The Blended Reality project continues to experiment with XR technology, for example through the ongoing development of the Clamshell Controller, a universal controller that can accept data from any sensor and feed that data into an XR simulation as input. The Clamshell Controller can already use data from existing sensors, such as force pads, light sensors, and accelerometers. But the goal is for the controller to use data from any type of sensor, even one-off custom-built sensors, thereby enabling developers and users to bring entirely new types of data into simulations.14
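One way to picture such a universal controller is as a thin adapter layer that normalizes readings from any sensor into a single event shape a simulation can consume. The Python sketch below illustrates only that design idea; the class names and calibration scheme are hypothetical and do not describe the Clamshell Controller's actual interface.

```python
import random
from dataclasses import dataclass
from typing import Callable

@dataclass
class InputEvent:
    channel: str   # e.g., "force_pad.pressure" or "imu.accel_x"
    value: float   # normalized to [0.0, 1.0]

class SensorAdapter:
    """Wraps any raw sensor read function and maps its output to [0, 1]."""
    def __init__(self, channel: str, read: Callable[[], float],
                 lo: float, hi: float):
        self.channel = channel
        self.read = read
        self.lo, self.hi = lo, hi

    def poll(self) -> InputEvent:
        raw = self.read()
        norm = min(max((raw - self.lo) / (self.hi - self.lo), 0.0), 1.0)
        return InputEvent(self.channel, norm)

def send_to_simulation(event: InputEvent) -> None:
    # Stand-in for handing the event to an XR runtime.
    print(f"{event.channel} -> {event.value:.2f}")

# A one-off custom sensor only needs a read function and a calibration range.
force_pad = SensorAdapter("force_pad.pressure",
                          read=lambda: random.uniform(0.0, 500.0),  # stand-in for hardware
                          lo=0.0, hi=500.0)
send_to_simulation(force_pad.poll())
```

Because every adapter emits the same normalized event, the simulation never needs to know which sensor, stock or custom-built, produced a given input.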

"XR technology is a way to increase the scale of our capabilities."
Justin Berry, Yale University

By experimenting with XR hardware or functionality—or both—the projects discussed in this section are deliberately innovating in their respective arenas. These projects are not only pushing the boundaries of their disciplines; they are also pushing the boundaries of our understanding of the capabilities and affordances of XR itself.

As previously discussed, there are many domains in which XR can fit into existing pedagogy while remaining consistent with established learning outcomes. But sometimes experimentation is the point. In the Syracuse University School of Architecture, for example, students in a special-topics course called Mediated Environments did original development work in Unity, an XR development platform, to create an immersive visualization of environmental data—something none of these students had done before and which is in fact a new method for visualizing environments in the field of architecture at large. In a course at Syracuse called VR Storytelling, in the Newhouse School of Public Communications, students developed a VR experience relevant to their field of study—again, a new experience for these students and new to their field of study as well. To lower the barrier to XR development for others, students in a course at Yale titled 3-D Modeling for Creative Practice developed a small library of program modules (also in Unity) for common actions that an XR developer might want to use.

These projects were course assignments, but because they were pushing at the edges of their respective fields, students' final products could not be assessed according to any traditional rubric. In these courses, what constituted a successful project was its spirit of experimentation: a data visualization might be crudely realized, or a Unity script might be buggy. The important criterion in assessing these assignments was their exploration of creative ideas. One of the learning objectives of the VR Storytelling course is that the student will "identify stories that can be 'told' better through an experience." But what kinds of stories are those? This is a question to which we do not yet know the answer. These students are literally discovering the boundaries of XR capabilities while experimenting with technology on the cutting edge of their field, a valuable and all-too-rare educational experience. Sometimes experimentation is the point.

Notes

  1. Richard E. Clark, "Media Will Never Influence Learning," Educational Technology Research and Development 42, no. 2 (June 1, 1994): 21–29.

  2. Barbara Means, Yukie Toyama, Robert Murphy, Marianne Bakia, and Karla Jones, Evaluation of Evidence-Based Practices in Online Learning: A Meta-Analysis and Review of Online Learning Studies (Washington, DC: US Department of Education, January 2009).

  3. Diana G. Oblinger, Game Changers: Education and Information Technologies (Boulder, CO: EDUCAUSE, 2012).

  4. The first flight simulator, for training pilots to fly biplanes, was created in 1910. The first computerized flight simulator—the precursor to all modern flight simulators—was developed in the early 1950s. For a brief history of flight simulators, see Karolina Prokopovič, "What Do You Know About the Evolution of Full Flight Simulators?" Aviation Voice, June 7, 2017.

  5. Institute of Medicine, To Err Is Human: Building a Safer Health System (Washington, DC: The National Academies Press, 2000).

  6. Daryl Jones, Imogen Mitchell, Ken Hillman, and David Story, "Defining Clinical Deterioration," Resuscitation 84, no. 8 (August 2013): 1,029–34.

  7. Jennifer Legault, Jiayan Zhao, Ying-An Chi, Weitao Chen, Alexander Klippel, and Ping Li, "Immersive Virtual Reality as an Effective Tool for Second Language Vocabulary Learning," Languages 4, no. 1 (March 2019): 13.

  8. By enabling the learning of abstract concepts to at least partly become the gaining of a skill, the use of XR as an educational technology is consistent with the idea from cognitive science of "embodied cognition." This idea is complex, and there are multiple strands of research around embodied cognition, but to dramatically oversimplify: Embodied cognition suggests that "cognition" involves not only the brain but also the body and its movements through the environment. In other words, learning and understanding can be said to be the development of behaviors appropriate to a situation. By expanding the range of activities with which a learner can gain direct, hands-on experience by making "physical" the nonphysical, XR technology increases the scope of the environment that can be integrated into embodied cognition. See, for example: Andrew D. Wilson and Sabrina Golonka, "Embodied Cognition Is Not What You Think It Is," Frontiers in Psychology 4 (2013): 1–13.

  9. Meredith M. Thompson, Annie Wang, Dan Roy, and Eric Klopfer, "Authenticity, Interactivity, and Collaboration in VR Learning Games," Frontiers in Robotics and AI 5 (December 19, 2018).

  10. Sue Workman, "Mixed Reality: A Revolutionary Breakthrough in Teaching and Learning," EDUCAUSE Review, July 30, 2018.

  11. James J. Gibson, The Ecological Approach to Visual Perception (Boston: Houghton Mifflin Harcourt, 1979); Donald Norman, The Design of Everyday Things (New York: Basic Books, 1988).

  12. Amber Bartosh and Bess Krietemeyer, "Virtual Environment for Design and Analysis (VEDA): Interactive and Immersive Energy Data Visualizations for Architectural Design," Technology|Architecture + Design 1, no. 1 (2017): 50–60; Bess Krietemeyer, Brandon C. Andow, and Anna H. Dyson, "Human-Facade-Interaction: Constructing Augmented Reality Simulations for Co-Optimizing Dynamic Building Skin Performance," MRS Proceedings 1800 (2015).

  13. Luka Crnkovic-Friis and Louise Crnkovic-Friis, "Generative Choreography Using Deep Learning," ArXiv:1605.06921 [Cs], May 23, 2016.

  14. Justin Berry, Lance Chantiles-Wertz, and Isaac Shelanski, "The Clamshell: Rethinking the Virtual Reality Interface," in 2019 IEEE Games, Entertainment, Media Conference (GEM), 1–5, 2019.
