XR for Teaching and Learning: Year 2 of the EDUCAUSE/HP Campus of the Future Project

Integrating XR into Curricula

A 2019 report1 from the US Department of Education's What Works Clearinghouse recommends the use of simulations, and XR specifically, to help students engage in complex problem-solving and interact more deeply with learning materials. The report suggests that for XR to be implemented in a pedagogically meaningful way, both the course and the curriculum as a whole must give students sufficient time to engage with complex problems.

Few academic programs have integrated simulations into their curricula as thoroughly as nursing and other medical disciplines have. Indeed, it may be difficult for many programs to make this kind of time available for students to engage with XR technology across the curriculum. It can be done, of course, but as readers of EDUCAUSE reports know perhaps better than anyone, changing curricula in higher education is a lengthy process. In the meantime, there are other ways to create time for students to engage with XR technology.

Time and Skills

Hackathons in particular have been used to great effect at several participating institutions.2 MIT has hosted an XR hackathon, Yale has hosted two, and Hamilton College sponsored a group of students to travel to one at another institution. These events were organized by a combination of campus units, some administrative (such as IT units) and some academic, with the participation and sponsorship of several external organizations. The specific academic programs involved in these hackathons naturally vary across institutions, as do the topics of the hackathons: Yale's hackathons were on global climate change, while the Hamilton students participated in tracks on topics such as smart homes and smart cities. More importantly, though, these hackathons provide students with the time and resources to engage in complex problem-solving and to engage deeply with a topic through the use of XR technology.

Time is just one issue in implementing XR in a way that is pedagogically meaningful; another is the technical ability of students. While XR is useful for providing skills-based education, ironically its use requires some skills in the first place. Anyone participating in a hackathon probably has some programming skills going in. But the same should not be assumed for students more broadly. Even at an institution where technology literacy is integrated into the general education curriculum, it may be too steep a learning curve to ask students to learn to develop an XR application within the span of a single academic term.

Barnard College is one such institution where technology literacy is integrated into the general education curriculum, which is called Foundations and is organized around Modes of Thinking. One of these modes is Thinking Technologically and Digitally, which includes such things as computational thinking, programming, and digital arts and humanities. As is the case at many institutions, courses at Barnard can fulfill one or more requirements of the general education curriculum. A course that addresses Thinking Technologically and Digitally often contains a lab section in which students have access to relevant hardware and software. Instructors at Barnard may request support with various technologies from the Instructional Media and Technology Services (IMATS) unit. This support can take many forms, such as scheduling an IMATS staff member to come to class and provide instruction to students. IMATS also offers workshops to students and faculty on various technologies throughout the academic year, a service model common among campus IT units and centers for teaching and learning.

Another method for integrating XR into the general education curriculum is by including it in a first-year experience course.3 Florida International University (FIU) has offered such courses for several years, and in fact several different first-year experience courses are on offer. Currently, a new one is being developed in which students will work with XR technology to explore issues such as design thinking and ethics in online spaces. A challenge that FIU faces in developing this course is that it will require instructors to teach it. Obvious, yes, but a course in which students use XR needs an instructor who knows how to use XR and is able to support the students. Instructors involved in this course will be supported by both the campus IT unit and FIU's Center for the Advancement of Teaching. Students and instructors will also be supported by the Miami Beach Urban Studios, a building-sized (16,000 square feet) makerspace-like facility that serves the campus community. Integrating XR into a first-year experience course and providing this level of support will allow students the time and resources needed to engage deeply with the technology.

Figure 4. Sample image from a brain AR medical app from the Yale School of Medicine
Image courtesy of Michael Schwartz, Yale School of Medicine

Few institutions have been using XR technology in the classroom long enough to collect experimental data on its learning effects, but Yale is one of them. As of this writing, the Yale School of Medicine has been participating in Yale University's Blended Reality Applied Research Project for approximately two years, and a team from the Department of Neuroscience has developed an AR app to visualize the brain (figure 4). This app will be integrated into some, but not all, lab sections of a neuroscience course in the fall 2019 semester. At the end of the semester, the performance on course assessments of students who used the app will be compared with that of students who did not. More controlled studies of this sort are needed to further explore the learning effects of XR technology in specific fields, for specific use cases, and for specific types of students.
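Such a comparison is, at bottom, a two-group experiment. Purely as an illustration of the analysis it implies (the scores below are invented placeholders, not Yale data, and a real study would also need to account for section sizes, instructor effects, and self-selection into sections), a minimal sketch in Python might look like this:

```python
# A minimal sketch of comparing course-assessment scores between lab
# sections that used the AR app and sections that did not.
# All numbers are invented placeholders, not data from Yale.
from scipy.stats import ttest_ind

scores_with_app = [82, 88, 75, 91, 79, 85]     # hypothetical scores
scores_without_app = [78, 84, 70, 86, 74, 80]  # hypothetical scores

# Welch's t-test, which does not assume equal variance across sections
result = ttest_ind(scores_with_app, scores_without_app, equal_var=False)
print(f"t = {result.statistic:.2f}, p = {result.pvalue:.3f}")
```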

Student Assessment

A recent article in The Chronicle of Higher Education lamented our lack of knowledge about the pedagogical impact of XR.4 This concern was echoed by several interviewees for this project—even some from nursing programs, which have long recognized the value of simulations in teaching and learning. Indeed, the Institute of Medicine's 2000 To Err Is Human report5 recommended that professional education across healthcare fields incorporate simulations into learning environments wherever possible. A large body of research has since emerged about the pedagogical impact of simulations using manikins and standardized patient–actors in medical education. Research on XR simulations in medical education, however, is only beginning to emerge.

Research on XR in other fields lags even further behind. There is a growing body of research about the use of VR and AR for teaching a variety of subjects.6 Much of that literature is highly focused, however, investigating the use of a specific technology for teaching a specific topic or in a specific course. Only a fraction of that literature compares the effectiveness of an XR technology with a non-XR option.

The present study was broader in scope than most of this previous research, examining a range of XR technologies across institutions, though at each institution the focus was on only one or at most a handful of specific fields or courses. Because of this breadth, we can at least start to answer the question about the pedagogical impact of XR.

The previous section discussed the integration of this technology into courses and across curricula. That is of course necessary, but once that integration is under way, the next step in any discussion of the effectiveness of XR (or any) technology for learning is its pedagogical impact on the student. This section briefly discusses methods that instructors at participating institutions used to assess student learning in courses and for assignments in which XR was used. Some of the points made earlier are repeated here, but they are drawn together into a discussion of XR-specific assessment methods.

Simulations using manikins and standardized patient–actors are already widespread in nursing education. Instructors evaluate students' performance in these simulations according to predefined rubrics based on specific learning objectives and criteria. These criteria may also be built into an XR simulation, so that a student's performance is assessed throughout their interaction with it. XR applications that are game-like, such as Cellverse, may use similar mechanisms. Although there is no standard rubric for assessing learning about the central dogma in biology, for example (or indeed about many topics in STEM disciplines), there are inventories and assessment tools for the central dogma and other topics.7 These tools can be built into biology simulations and used for assessment, similar to how VR games are developed with built-in analytics. But instead of capturing key performance indicators (KPIs) about a user's actions within a game, an educational simulation may capture KPIs about the user's actions and behaviors that address the learning objectives. This is in fact how at least some existing XR simulations, such as Shadow Health, work: they are essentially computer-based training modules with an evaluation tool that grades the user's performance at the end of the simulation.
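To make the KPI idea concrete, here is a minimal sketch, in Python, of what rubric-driven KPI capture inside an educational simulation might look like. Every name in it (KPIEvent, SimulationSession, the logged actions) is a hypothetical illustration, not the API of Shadow Health, Cellverse, or any other product discussed in this report:

```python
# A hypothetical sketch: each logged learner action is tagged with the
# learning objective it addresses, so performance can be graded per
# objective at the end of the simulation, the way a rubric is scored.
from dataclasses import dataclass, field


@dataclass(frozen=True)
class KPIEvent:
    objective: str   # e.g., "transcription" in a central-dogma simulation
    action: str      # what the learner did
    correct: bool    # whether the action met the predefined criterion


@dataclass
class SimulationSession:
    events: list[KPIEvent] = field(default_factory=list)

    def log(self, objective: str, action: str, correct: bool) -> None:
        """Record one learner action as it happens in the simulation."""
        self.events.append(KPIEvent(objective, action, correct))

    def score_by_objective(self) -> dict[str, float]:
        """Grade the session per learning objective once it ends."""
        marks: dict[str, list[int]] = {}
        for e in self.events:
            marks.setdefault(e.objective, []).append(int(e.correct))
        return {obj: sum(m) / len(m) for obj, m in marks.items()}


# Example run through a hypothetical central-dogma simulation
session = SimulationSession()
session.log("transcription", "paired DNA template with mRNA", correct=True)
session.log("translation", "matched codon to amino acid", correct=False)
session.log("translation", "matched codon to amino acid (retry)", correct=True)
print(session.score_by_objective())  # {'transcription': 1.0, 'translation': 0.5}
```

The design point is simply that assessment data accumulates throughout the interaction rather than being bolted on afterward.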

XR applications that simulate the physical world but are not designed as games, such as the Electrostatic Playground, might likewise capture KPIs about a user's actions. The Electrostatic Playground is designed as a space for exploration, however; unlike repairing an organelle in Cellverse or giving a patient an injection, there is no "correct" way to interact with subatomic particles in the Electrostatic Playground. In this type of simulation, a student's performance may be assessed through interactive exercises, like a hands-on quiz. These assessments may be built into the simulation, like end-of-chapter questions in a textbook. Just as important, such simulations should provide functionality to enable instructors to create their own assessments.
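In an exploratory simulation, such instructor-created assessments might be expressed as checks against the state of the simulation rather than against a single correct sequence of actions. The following sketch is purely illustrative and assumes a simulation that exposes its state as a dictionary; none of these names come from the Electrostatic Playground itself:

```python
# A hypothetical sketch of instructor-authored exercises for an
# open-ended physics simulation. The state dictionary and its keys
# are invented for illustration.
from dataclasses import dataclass
from typing import Callable


@dataclass(frozen=True)
class Exercise:
    prompt: str
    check: Callable[[dict], bool]  # instructor-supplied check on sim state


exercises = [
    Exercise(
        prompt="Arrange two charges so that they repel.",
        check=lambda state: state["charge_a"] * state["charge_b"] > 0,
    ),
    Exercise(
        prompt="Arrange two charges so that they attract.",
        check=lambda state: state["charge_a"] * state["charge_b"] < 0,
    ),
]

# Hypothetical snapshot of the simulation state after the student experiments
state = {"charge_a": +1, "charge_b": -1}
for ex in exercises:
    print(ex.prompt, "->", "passed" if ex.check(state) else "not yet")
```

Because the checks are ordinary predicates over the simulation state, they reward any arrangement that satisfies the physics, which suits a space designed for exploration rather than a single correct interaction path.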

It can be challenging to assess student learning when experimentation is the point. This experimentation may be with the technology, as at Yale, where one group of students worked on developing a new controller and another developed a library of Unity program modules. Or it may be with the subject matter, as at Syracuse, where students developed new interactive data visualizations. In either case, rubrics for evaluating student learning may not exist, given that these projects are pushing the boundaries of their respective fields. Even here, however, some useful guidelines emerge for how instructors can think about student assessment. Another boundary-pushing project was an assignment for a dance course at Barnard, in which students produced a work of site-specific choreography: here the assessment rubric naturally focused on the choreography rather than the technology. But a central question in evaluating the choreography was whether the student factored the technology into it. The dance piece was intended to be viewed on a small screen, which may have affected the student's choreographic choices. In other words, a student assessment rubric might ask whether XR technology is being used thoughtfully in the context of the task.

Another example is the AR application for visualizing the brain, developed by the Yale Department of Neuroscience. This was not a course assignment, but the programmer for the project was a student. No software requirements specification was created going into the project. Rather, over the course of developing the app, the project PIs and the developer met frequently to brainstorm and identify what was possible, based on what they were learning about the technology as work progressed. In this way, the development process was flexible enough to accommodate new functionality as new versions of the development platform were rolled out, and the application that ultimately emerged was informed by what the team learned throughout the process. In other words, a student assessment rubric might ask whether XR technology is being used flexibly enough to accommodate both changes in the technology itself and growth in the student's knowledge of it.

Notes

  1. Nada Dabbagh et al., "Using Technology to Support Postsecondary Student Learning: A Practice Guide for College and University Administrators, Advisors, and Faculty" (WWC 20090001), Washington, DC: National Center for Education Evaluation and Regional Assistance (NCEE), Institute of Education Sciences, US Department of Education, 2019.

  2. A hackathon is a "design sprint"-like event, often held over a weekend or a few days, in which participants collaboratively develop one or more software programs or other technologies to address a specific problem or project. Hackathons have been used to great effect in open-source communities, such as the Wikimedia Hackathons, at which participants work on the technologies behind Wikipedia, and on large-scale social issues that may be best addressed collaboratively across sectors, such as the future of urban environments.

  3. US Department of Education, Institute of Education Sciences, "First Year Experience Courses," What Works Clearinghouse, July 2016.

  4. Beth McMurtrie, "Virtual Reality Comes to the Classroom," Chronicle of Higher Education, May 27, 2019.

  5. Institute of Medicine, To Err Is Human.

  6. Merchant, Goetz, Cifuentes, Keeney-Kennicutt, and Davis, "Effectiveness of Virtual Reality-Based Instruction"; Radu, "Augmented Reality in Education"; and Marc Ericson C. Santos, Angie Chen, Takafumi Taketomi, Goshiro Yamamoto, Jun Miyazaki, and Hirokazu Kato, "Augmented Reality Learning Experiences: Survey of Prototype Design and Evaluation," IEEE Transactions on Learning Technologies 7, no. 1 (January 2014): 38–56.

  7. The central dogma of molecular biology describes the flow of genetic information from DNA to RNA to protein. Dina L. Newman, Christopher W. Snyder, J. Nick Fisk, and L. Kate Wright, "Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool," CBE—Life Sciences Education 15, no. 2 (October 13, 2017).
