The Future of 3D Technology in Higher Education
For decades, science fiction has envisioned a future for 3D technology. William Gibson's cyberspace and Neal Stephenson's metaverse are entirely immersive, VR-like environments. (Not to mention, of course, The Matrix.) Vernor Vinge's novel Rainbows End explores a future in which AR is ubiquitous through the use of contact lenses that project overlays on top of what the wearer is seeing. A decade after the publication of Rainbows End, such contact lenses are under development.1
There is no necessary distinction between AR and VR; indeed, much research on the subject is based on a conception of a "virtuality continuum" from entirely real to entirely virtual, where AR lies somewhere between those ends of the spectrum.2 One can easily imagine a future headset (a pair of glasses, a pair of contact lenses, a prosthetic eye, etc.) that the user can "dial" back and forth, from entirely transparent to entirely immersive, depending on the use case. Indeed, science fiction has already envisioned this future, from the display that Tony Stark sees in his Iron Man mask in Marvel superhero movies, to the gesture-based computing in the movie Minority Report.3
Zeynep Tufekci, in her book Twitter and Tear Gas, observes that the online and offline worlds are often seen as entirely separate, that "the online world is somehow less real than, and disconnected from, the offline one."4 Tufekci argues that this is no longer the case, if it ever was: the online world is as much a part of the offline world as any other form of human communication over distance, as integrated into the real as email, snail mail, telephone, radio, or carrier pigeons. Indeed, Tufekci never uses the term "real world" except to critique it, and she suggests that the term "virtual" betrays this same falsely dualist mode of thinking.
Tufekci was discussing the use of social media specifically, and in the context of political protest movements, not education. Nevertheless, this view of the online world as being integrated into—indeed, being part of—the offline world is critical to imagining the future of 3D technology—indeed the future of any technology—and not just in higher education.
For the future of 3D technology in higher education to be realized, that technology must become as much a part of higher education as any technology: the learning management system (LMS), the projector, the classroom. New technologies and practices generally enter institutions of higher education as initiatives. Several active learning classroom initiatives are currently under way,5 for example, as well as a multi-institution open educational resources (OER) degree initiative.6 When massive open online courses (MOOCs) were new, many institutions launched MOOC initiatives. Even mobile devices were first introduced into many institutions of higher education as initiatives.7 Now, however, mobile devices are owned by nearly all faculty and students, and both groups want to use more video in their courses.8 These technologies, and the practices around them, have moved beyond the initiative stage and have become relatively standard in higher education.
We are currently still in the initiative stage of 3D technology adoption into higher education. Over time, 3D technologies will inevitably become more common in higher education. Indeed, this is already happening, though, like all technological advances, it is not evenly distributed yet: Some 3D technologies are more integrated into institutions than others. Specifically, 3D scanning and printing are well on their way to being commonplace in the institutions that participated in the Campus of the Future project. In particular, 3D scanning was used at participating institutions mostly as an input to VR, a mechanism for producing 3D models that could be manipulated in VR; 3D printing was used mostly as an output from VR, a mechanism for producing physical objects that were designed in VR. Certainly these are not the only uses of 3D scanning and printing, but they were the predominant uses in projects at participating institutions. 3D scanning and printing have certainly not yet become standard in higher education, but it is worth noting that these technologies are being used as access points, or avenues into the use of even more experimental technologies.
Networked photocopier/printers are commonly made available to students on campus because reading and writing are critical to the work of being a student and part of nearly every course. To enable this critical piece of the student experience, many institutions of higher education provide students with technology guidelines—recommendations for the hardware and software configuration a computer needs in order to operate in the campus computing environment (e.g., the guidelines from Syracuse University and Hamilton College). In addition to recommending software for purchase, some institutions provide software directly to students and other institutional affiliates (e.g., the list of software licensed by Lehigh University for use by affiliates). Even when an institution does not provide software, academic units often recommend specific software to students in their programs (e.g., Photoshop in fine arts departments, CAD software in architecture departments). It is easy to imagine a not-too-distant future in which institutions of higher education or specific departments recommend that students arrive on campus with computers configured to support 3D technology. Institutions might also recommend or even provide CAD/CAM software or game engines for 3D modeling, alongside the currently more common antivirus software, word processors, and statistical analysis packages. For students to purchase, or in some cases for institutions to provide, this type of software might be prohibitively expensive today. But this will likely not always be the case. Ten years ago, mobile devices were seen on campuses mostly as part of technology initiatives; now nearly every student brings their own to campus. Five years ago, 3D printers were rarely available to students at institutions of higher education; now makerspaces are increasingly commonplace.
Personal experience is one of the most effective elements of acquiring an education. Alongside personal experience, however, there must be a component of narrative and storytelling in effective education—to generate interest, to provide structures for remembering, and to help students contextualize what they are learning.9 Some scholars have argued that all human communication is based on storytelling;10 certainly advertisers have long recognized that storytelling makes for effective persuasion,11 and a growing body of research shows that narrative is effective for teaching even topics that are not generally thought of as having a natural story, for example, in the sciences.12
VR's ability to immerse the user in a simulation—in other words, to enable a narrative developed by others to become a personal experience—makes it a particularly powerful vehicle for providing educational experiences. In particular, there is a growing body of research that shows that VR holds a great deal of promise for teaching empathy.13 One participant in the Campus of the Future project mentioned the United Nations project Clouds over Sidra, which is the story of a 12-year-old Syrian girl living in a refugee camp in Jordan. Other project participants mentioned other possible uses of VR to reconstruct historical events and put users in the middle of them—for example, the 1965 Selma to Montgomery march or battles on the Western Front during the First World War. These could be structured like the popular Choose Your Own Adventure children's books, so that one's actions in a simulation change one's future options.
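A branching simulation of this kind is, at bottom, a directed graph of scenes in which the user's choice at each node determines which scenes remain reachable. A minimal sketch of that structure (all scene and choice names here are invented for illustration, not taken from any existing simulation):

```python
# A Choose Your Own Adventure-style simulation as a directed graph:
# each scene maps a user's available choices to the next scene.
# Scene and choice names are hypothetical, for illustration only.
scenes = {
    "bridge_approach": {
        "join_marchers": "front_line",
        "observe_from_sidewalk": "sidewalk_view",
    },
    "front_line": {"continue": "confrontation"},
    "sidewalk_view": {"continue": "aftermath"},
    "confrontation": {},  # terminal scenes offer no further choices
    "aftermath": {},
}

def play(start, choices):
    """Follow a sequence of choices; earlier actions constrain later options."""
    path = [start]
    scene = start
    for choice in choices:
        scene = scenes[scene][choice]  # raises KeyError on an unavailable choice
        path.append(scene)
    return path
```

The key pedagogical property is visible in the data structure itself: once a user is at `front_line`, the `sidewalk_view` branch is no longer reachable, so their earlier action has changed their future options.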
A research team at Syracuse University is currently investigating the use of VR in teaching. Among other research questions, this project is investigating where a user's attention is directed while in a VR simulation and how that affects the user's later recollection of the events in the simulation. Some early work along these lines has been conducted in police simulations, where the stakes for correctly directing one's attention, taking action, and being able to recall events later are quite high.14
The experience of Gallaudet University highlights one of the most important areas for development in 3D technology: accessibility for users with disabilities. Accessibility is often something of an afterthought in the design of new technologies, and 3D technology is no exception. Yet designing for accessibility is critical for some users to even be able to use 3D technology at all.
Gallaudet is a school primarily for deaf and hard-of-hearing students. In the United States, deaf individuals often communicate via American Sign Language (ASL). ASL is a visual communication system, which means that once a deaf user dons a VR helmet, she cuts herself off from all communication from the outside world. The user wearing the VR helmet may sign to others, but there is no simple way for others to communicate back. AR is therefore arguably a more "deaf-friendly" technology, as the user can see through an AR headset.
One possibility for communicating with a deaf user in VR is a popup box appearing in the user's field of vision, like a text message. This form of communication is slow, however (slower than either speech or signing), and it potentially breaks the immersion of the VR simulation. Another possibility is for a VR environment to be designed with "stop points"—locations or activities established in advance as points at which the user takes off the headset and communicates with others. This too, however, breaks the user's immersion.
The Americans with Disabilities Act provides standards for accessible design of physical spaces of various types. The ADA also provides technical assistance and guidance for accessible technology, though this primarily deals with web accessibility. Individuals and companies developing VR environments for deaf and hard-of-hearing users should consult these standards, though neither is exactly on point for VR. It is probably time for a new version of the ADA standards to be developed that addresses VR and AR technologies.
Another issue for VR, one not confined to deaf and hard-of-hearing users, is simulator sickness. Because this is a form of motion sickness, individuals differ in their susceptibility to it; notably, it can occur even when the user is not actually experiencing any motion.15 Many solutions have been proposed to combat simulator sickness, including dimming one's headset, keeping one's time in a simulation short, slowing down the refresh rate of the simulation, and operating a VR simulation on an empty stomach. These are only suggestions, however; little research has been done to establish their efficacy. One method that has been shown to be effective in combating simulator sickness is to insert a static object into the user's field of vision: a frame, like a cockpit, or some other static object, like a virtual nose.
Students want technology integrated into their courses, and they want their instructors to make more use of technology in their teaching.16 This is a persistent finding of much prior EDUCAUSE research, even for those tools that are quite well established, such as the LMS and lecture capture. Certainly 3D technologies have a greater "cool factor" than these comparatively staid tools. Yet, 3D technologies, like any technology, must serve a meaningful pedagogical function.
For that to be the case, 3D technologies must be incorporated into the instructional design process for building and redesigning courses. And for that to be the case, it is necessary for faculty and instructional designers to be familiar with the capabilities of 3D technologies. And for that to be the case, it may not be necessary but would certainly be helpful for instructional designers to collaborate closely with the staff in campus IT units who support and maintain this hardware.
However, staff in IT units and centers for teaching and learning often do not collaborate and may not have much contact at all. Every institution of higher education has a slightly different organizational structure, of course, but these two campus units are often siloed. This siloing may lead to considerable friction in conducting the most basic organizational tasks, such as setting up meetings and apportioning responsibilities for shared tasks. Nevertheless, IT units and centers for teaching and learning are almost compelled to collaborate in order to support faculty who want to integrate 3D technology into their teaching. It is necessary to bring the instructional design expertise of a center for teaching and learning to bear on integrating 3D technology into an instructor's teaching, and it is necessary to bring the technical expertise of the IT unit to bear on the deployment of 3D technology in the classroom.
Even assuming that an institution has a workable mechanism for instructional designers and IT staff to collaborate, some effort is still required to meaningfully integrate any technology into the teaching and learning experience. Therefore, one of the most critical areas in which IT units and centers for teaching and learning can collaborate is in assisting instructors to develop this integration and to develop learning objects that use 3D technology. Instructional designers can help faculty develop pedagogically sound uses for 3D technology in their courses, but they may lack the skills to help faculty deploy this technology. IT staff may have the deployment skills but may lack the skills to develop new tools, such as simulations and models in platforms such as Unreal Engine and Steam. Instructional designers may understand the uses of learning analytics and the gamification of learning, but game designers can bring to the table engagement analytics and an understanding of gamification honed in the game industry. Collaboration among IT staff, instructional designers, software developers, and game designers has the potential to enable extremely creative uses of 3D technology in teaching and learning, and across campus.
The process for developing learning objects that use 3D technology, as described here, is quite labor intensive, involving a team that includes, at a minimum, an instructor, an instructional designer, and an IT staff member, one or more of whom possess the skills of a software developer and a game designer. For 3D technology to really gain traction in higher education, it will need to be easier for instructors to deploy without such a large support team.
A third component that is critical for 3D technology to gain traction in higher education is freely available tools that help instructors develop their own learning objects using 3D technology. Some building blocks already exist: sites such as Thingiverse, Sketchfab, and Google Poly are libraries of freely available, user-created 3D models, and tools such as Google Blocks let users build 3D models at no cost. Some instructional uses that could benefit from this sort of tool have already been discussed: developing simulations of historical events or conducting "popup" events in the local community. Such events could be extremely powerful instructional opportunities if instructors could build custom simulations for a specific user community. Some tools, such as Minecraft: Education Edition, are already available to assist in the development of virtual environments; the world of 3D technology needs a tool that provides instructors with a similar framework for development.
In particular, a tool is needed that allows for the development of instruction for entire classes. Many current educational VR simulations allow for only a single user. Even popular VR games often allow for only a small number of users: Star Trek: Bridge Crew, for example, is designed for four players. What instructors in higher education need (and instructors at other educational levels too, of course) are simulations that can accommodate an entire class of students simultaneously. ClassVR is a tool that enables the simultaneous delivery of a simulation to multiple headsets, though the simulation itself may still be single-user. A combination of an easy-to-use development platform for instructors, the ability to create "multiplayer" educational simulations, and low-cost headsets would be a powerful tool for integrating 3D technology into higher education.
The project team at Syracuse University has an HP Z VR Backpack PC. This rig has enabled researchers at Syracuse to conduct what they call "popup" events, discussed above, where the public can experience VR environments. The "cool factor" of VR makes this a particularly effective form of outreach. But it also has a more pedagogically useful function: It enables an educational simulation to be set up and used anywhere, at any time. And as the development process gets easier, simulations can become more responsive to the local context and to local educational needs.
The VR assignment in a first-year experience course at FIU was described earlier. It requires groups of students to manipulate a model of a hypothetical plot of land under 18 inches of water. Given the rapid rate of sea-level rise in the Southeastern United States,17 this is just barely a hypothetical scenario. Imagine a group of environmental engineers going into the field (or wetland) with AR headsets and collaboratively designing buildings and earthworks virtually. Those buildings could then be 3D printed. Of course, industrial-scale 3D printers for construction are quite a bit different (and more expensive) than desktop 3D printers. But the process is similar and, as with all architecture, benefits from the designers having an understanding of the environment the building will occupy.
Recall the VR walkthrough of the Vishnupada Temple, which enables the user to peel away surfaces and virtually look through walls. Imagine combining this with the type of AR overlay being designed at Harvard for electronic components. Now imagine combining all of that with a radio-frequency sensor, which is already commercially available as a smartphone app. Users would be able to see through walls, a superpower that any archaeologist, architect, or electrician might desire. Indeed, the C-THRU firefighting helmet has already implemented some of this functionality to enable firefighters to see through smoke via the use of thermal sensors and AR displays.
One can imagine an immense range of possible applications of untethered AR and VR in a wide variety of settings and educational disciplines. In a classroom or a lab, of course, any possible simulation can be deployed, particularly if the instructor has some lead time and has collaborated with an instructional designer. However, the real power of mobile VR and AR rigs comes from their flexibility—the ability to deploy a simulation anywhere, anytime, combined with a development process that (one assumes) will only continue to get easier. Thus any location can be a classroom, any time can be a teachable moment.
A 3D scanner operates like a combination of a digital camera and a LIDAR range finder: Lasers are bounced off an object to identify its shape, and digital photos are taken of its surface; photogrammetry software then "wraps" the photos around the 3D model of the shape. Depending on the physical size of an object and the complexity of its surface, a 3D scan may comprise dozens, hundreds, or even thousands of photos. As anyone who has ever looked at the storage use on their smartphone knows, digital photos can be fairly large and take up a lot of space. A single 3D scan might not tax the storage capabilities of a campus IT department, but once a 3D scanning program is in place, for courses or for research, the number of scans that must be stored will increase rapidly.
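The arithmetic behind this storage growth is easy to sketch. The photo counts and file sizes below are assumptions for illustration (a complex object might require a few hundred photos of a few tens of megabytes each), not measurements from any participating institution:

```python
def scan_storage_gb(num_objects, photos_per_scan, mb_per_photo, mesh_mb=150):
    """Rough storage estimate, in gigabytes, for a 3D scanning program.

    All parameter values are illustrative assumptions: photos_per_scan
    and mb_per_photo vary with object size and surface complexity, and
    mesh_mb stands in for the textured 3D model produced per object.
    """
    photo_total_mb = num_objects * photos_per_scan * mb_per_photo
    mesh_total_mb = num_objects * mesh_mb
    return (photo_total_mb + mesh_total_mb) / 1024  # MB -> GB

# A single scan barely registers on institutional storage...
single_scan = scan_storage_gb(1, 300, 25)        # roughly 7.5 GB
# ...but a collection-scale program adds up quickly.
collection = scan_storage_gb(2000, 300, 25)      # well into the terabytes
```

Under these assumed numbers, one scan is on the order of single-digit gigabytes, while a 2,000-object collection runs to roughly 15 terabytes, which is the kind of demand a campus IT unit must plan for rather than absorb incidentally.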
Hamilton College encountered this very problem. The GeoSciences 3D Scanning Project has the goal of scanning the entire collection of mineral samples acquired by faculty, students, and alumni over 200 years. While these mineral samples are not physically large, they have uneven surfaces; therefore the photo set for the 3D scans of the objects is quite large. By the end of this project, Hamilton College will need to devote significant storage space to maintaining this data set. Furthermore, the institution may want to make this data set available to others. The original mineral samples at Hamilton College are used by several neighboring institutions; it seems likely that these same institutions, and perhaps others as well, may want to use the 3D scans of these samples.
Institutional repositories are often the mechanism by which institutions of higher education make such data sets available. An institutional repository is a collection of an institution's intellectual output, often consisting of preprint journal articles and conference papers and the data sets behind them.18 An institutional repository is often maintained by either the library or a partnership between the library and the campus IT unit. An institutional repository therefore has the advantage of the long-term curatorial approach of librarianship combined with the systematic backup management of the IT unit.
On a larger scale, data management for 3D scans, 3D models, VR environments, and the other large data sets associated with 3D technology is a significant issue for institutions generally. Several websites allow users to upload and share 3D models: Thingiverse, Sketchfab, and Google Poly, among many others. Use of such sites takes the storage burden off the institution. There may be an advantage, however, to maintaining local copies of this data instead of, or in addition to, sharing it, even if only as a backup.
Sharing data sets is critical for collaboration and is increasingly the default for scholarship. Data is as much a product of scholarship as publications, and there is a growing sentiment among scholars that it should therefore be made public.19 Institutional repositories are often the vehicle by which data sets are made available. But for data sets to be managed locally, as well as to be found by others, institutions must adopt both policies and metadata standards for data sets. Fortunately, there has been some work to develop both. Data governance policies are commonplace, as are guidelines for creating them.20 The CARARE Metadata Schema Version 2.0 enables the capture of metadata and provenance data describing the creation of 3D models. It would behoove institutions to adopt data governance policies for the maintenance of 3D data sets, as well as appropriate standards for descriptive and provenance metadata21 about those data sets.
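To make the distinction between descriptive and provenance metadata concrete, a repository ingest record for a 3D scan might look something like the following. This is a minimal sketch: the field names are loosely modeled on Dublin Core and on the kinds of provenance information CARARE captures, but they are not the actual element names of either schema, and all values are invented:

```python
# Illustrative metadata record for a single 3D scan. Field names and
# values are hypothetical, not drawn from the CARARE schema itself.
mineral_scan_record = {
    # Descriptive metadata: what the digital object is
    "title": "Garnet sample, college mineral collection",
    "creator": "GeoSciences 3D Scanning Project",
    "date_created": "2018-04-12",
    "format": "textured mesh (photogrammetry output)",
    "rights": "CC BY 4.0",
    # Provenance metadata: how the digital object was produced
    "provenance": {
        "capture_method": "photogrammetry",
        "photo_count": 312,
        "processing_software": "photogrammetry pipeline",  # hypothetical
        "derived_from": "physical specimen in the teaching collection",
    },
}

def is_complete(record, required=("title", "creator", "rights", "provenance")):
    """Minimal completeness check a repository might run at ingest."""
    return all(record.get(field) for field in required)
```

Even a simple ingest check like this illustrates why policy matters: without an agreed-upon set of required fields, each scanning project will record (or omit) provenance differently, and the resulting data sets become much harder for other institutions to discover and reuse.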
Doug Bolton, "Samsung Patents Design for 'Smart' Augmented Reality Contact Lenses," The Independent (April 6, 2016).↩︎
Paul Milgram and Fumio Kishino, "A Taxonomy of Mixed Reality Visual Displays," IEICE Transactions on Information Systems, vol. E77-D, no. 12 (1994); Steve Mann, "Through the Glass, Lightly," IEEE Technology and Society Magazine 31, no. 3 (2012): 10–14.↩︎
Strictly speaking, even what we now consider VR is augmented reality. VR replaces what the user sees with an entirely constructed visual environment, along with perhaps a constructed auditory environment, and the ability to manipulate some of that environment with one's hands. It is a testament to how much humans rely on our sense of vision that we consider a constructed visual-only environment to be virtual reality. One could even argue that there is no such thing as VR, really, and that the only true VR would be a complete replacement of one's full sensorium with a construct, as in The Matrix. Part of the philosophical fun of that movie, of course, is the question of how it would even be possible to know that one is inside a perfect construct. Can The Matrix even be considered VR at all? Is the virtuality continuum not actually a continuum but rather a circle, where VR loops back around to the real?↩︎
Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest (New Haven, CT: Yale University Press, 2017): xxvi.↩︎
Anastasia Morrone, Anna Flaming, Tracey Birdwell, Jae-Eun Russell, Tiffany Roman, and Maggie Jesse, "Creating Active Learning Classrooms Is Not Enough: Lessons from Two Case Studies," EDUCAUSE Review (December 4, 2017).↩︎
Rebecca Griffiths, Jessica Mislevy, Shuai Wang, Linda Shear, Nyema Mitchell, Michelle Bloom, Richard Staisloff, and Donna Desrochers, Launching OER Degree Pathways: An Early Snapshot of Achieving the Dream's OER Degree Initiative and Emerging Lessons (Menlo Park, CA: SRI International, 2017).↩︎
Yvonne Belanger, "Duke University iPod First-Year Experience Final Evaluation Report" (June 2005).↩︎
Brooks and Pomerantz, ECAR Study of Undergraduate Students and Information Technology, 2017; Pomerantz and Brooks, ECAR Study of Faculty and Information Technology, 2017.↩︎
Melanie C. Green, "Storytelling in Teaching," in Lessons Learned, Vol. 2: Practical Advice for the Teaching of Psychology, eds. B. Perlman, L. I. McCann, and S. H. McFadden (Washington, DC: American Psychological Society, 2004), 175–84.↩︎
Walter R. Fisher, Human Communication as Narration: Toward a Philosophy of Reason, Value, and Action (Columbia, SC: University of South Carolina Press, 1989).↩︎
Harrison Monarth, "The Irresistible Power of Storytelling as a Strategic Business Tool," Harvard Business Review (March 11, 2014).↩︎
Michael F. Dahlstrom, "Using Narratives and Storytelling to Communicate Science with Nonexpert Audiences," Proceedings of the National Academy of Sciences 111, no. 4 (2014): 13,614–20.↩︎
Sarah Zhang, "Can VR Really Make You More Empathetic?" Wired (September 1, 2016); Virtual Human Interaction Lab, Empathy at Scale, Stanford University (December 20, 2015).↩︎
See Addy Hatch, "Counter-Bias Police Training Simulator Gets First Large-Scale Test," WSU Insider, October 4, 2017; and Eddie L. Reyes, "How Do Police Use VR? Very Well," Police Foundation.↩︎
David Johnson, "Introduction to and Review of Simulator Sickness Research," research report 1832, U.S. Army Research Institute for the Behavioral and Social Sciences (2005).↩︎
Brooks and Pomerantz, ECAR Study of Undergraduate Students and Information Technology, 2017.↩︎
Arnoldo Valle-Levinson, Andrea Dutton, and Jonathan B. Martin, "Spatial and Temporal Variability of Sea Level Rise Hot Spots over the Eastern United States," Geophysical Research Letters 44, no. 15 (August 16, 2017): 7,876–82.↩︎
Clifford A. Lynch, "Institutional Repositories: Essential Infrastructure for Scholarship in the Digital Age," portal: Libraries and the Academy 3, no. 2 (April 2003): 327–36.↩︎
Christine L. Borgman, Big Data, Little Data, No Data: Scholarship in the Networked World (Cambridge, MA: MIT Press, 2015).↩︎
Jon Bruner, "Data Governance: What You Need to Know," O'Reilly (March 6, 2017); Jill Dyché and Analise Polsky, 5 Models for Data Stewardship: A SAS Best Practices White Paper, SAS Institute, Inc.↩︎
Jeffrey Pomerantz, Metadata (Cambridge, MA: MIT Press, November 2015); Marcia Lei Zeng and Jian Qin, Metadata, 2nd ed. (Chicago: ALA Neal-Schuman, 2016).↩︎