Navigating the XR Educational Landscape: Privacy, Safety, and Ethical Guidelines

Regulatory and Ethical Considerations

While there has been a renewed interest in XR over the past few years, in part accelerated by the COVID-19 pandemic, a good deal of research and information already exists on the topic of XR safety, privacy, and security.

Key Privacy Laws and Regulations Related to XR in Higher Education

In many cases, while seeking to adopt 3D digital learning technologies, higher education institutions should be able to draw from existing procurement infrastructure and institutional policies that may have been created to mitigate risk associated with the delivery of online courses and the integration of third-party learning and communication tools. Our compliance recommendations thus focus primarily on enhancing existing policies and procedures by broadening their scope to explicitly include XR initiatives and the unique privacy and security concerns associated with XR environments.

While few laws and regulations explicitly address XR in higher education, the existing higher education regulatory framework should nevertheless be viewed as having potentially significant policy implications relating to XR implementation. Notably, state and federal laws (as well as laws of foreign jurisdictions, as applicable) relating to privacy, accessibility, and other civil rights can have unique applications in the context of XR. The scope of this section is limited to data privacy and security regulatory considerations for XR, but a discussion of this broader regulatory framework continues in the following section.

Constitutional Right to Privacy

The Fourth Amendment of the U.S. Constitution provides for a privacy right against unreasonable searches and seizures by the federal government, which has been extended to state officials by incorporation through the Fourteenth Amendment. As privacy law involving the Fourth Amendment has evolved over time, a "reasonable expectation of privacy" standard (sometimes called the "right to be let alone") has developed through case law, where courts have generally articulated a higher expectation of privacy—and therefore a greater right to privacy—in private spaces and a relatively lower expectation of privacy in public spaces.

The right to privacy in one's home is particularly well established and includes, for example, prohibitions on law enforcement or other state actors conducting warrantless searches of one's home using wiretaps, infrared, or similar scanning technologies capable of "seeing" inside walls (Kyllo v. United States) or drug-sniffing dogs (Florida v. Jardines).

Similarly, the Fourth Amendment has been held to protect an individual's privacy interest in their location and movement (Carpenter v. United States). The U.S. Supreme Court has held that placing GPS tracking devices on a person or their property is unconstitutional without a warrant (United States v. Jones), as is accessing a person's cell phone or other devices containing this and a variety of other personal information (Riley v. California). These privacy rights are enforceable against civil authorities, including "school officials," as well as criminal authorities.

XR devices use location tracking and a variety of recording and scanning technologies that can capture vast amounts of personal information from one's home. If this data is improperly collected or used by government actors, such as faculty and staff at public institutions of higher education, the result can be a Fourth Amendment violation.

A recent case from the Northern District of Ohio involving a room scan conducted as part of a proctored exam can provide a glimpse into how courts could decide cases presenting similar facts but involving XR devices. In Ogletree v. Cleveland State University, the defendant used online proctoring technologies in part to accommodate students unable to attend campus due to COVID-19 protocols. Based on the settings chosen, these tools enabled live webcam monitoring and room scanning of the testing environment as methods of detecting cheating. The test-taker in this case alleged that the room scanning occurred without sufficient notice and cited in court documents the presence of tax documents and medication in the room, which would possibly have been in view of both the live proctor at the other end of the camera and anyone who might view the vendor-maintained recording thereafter.

While the judge held on summary judgment that a violation of the student's Fourth Amendment rights did occur, it is important to note that the student was not informed that the room scan would take place until approximately two hours before the exam, and the student's reasonable belief that no such scan would occur weighed heavily in this decision. A policy informing students of the use of room scanning during proctored examinations had initially been included in the course syllabus but was removed partway through the term, and the plaintiff argued this signaled that the practice would no longer occur. The plaintiff also had no other option for completing the exam: space at home was limited and shared with others, and COVID-19 policies prevented the student from completing the exam on campus. With sufficient notice and clear and consistent policies in place, the case might have been decided differently.

Family Educational Rights and Privacy Act (FERPA)

FERPA applies to all educational institutions receiving funding from the U.S. Department of Education and creates privacy protections for student education records, which are any records containing personally identifiable information (PII) relating to a student that are maintained by the institution (or by a party acting on its behalf). Institutions will likely have existing FERPA policies and procedures that can be applied in a largely consistent manner across both digital 2D learning environments (e.g., learning management systems and various learning and communication tools) and 3D learning environments (e.g., XR environments).

Institutions may share student PII only with "school officials" with "legitimate educational interests," which can include individuals outside of the institution in certain cases, unless either the category of information has been designated as "directory information" and students have been provided with an opportunity to opt out of such data sharing1 or the student has provided written consent to having the information shared more broadly. Institutions must also employ "reasonable methods" when securing student data. For more than five years, the U.S. Department of Education has been strongly recommending that institutions of higher education adopt the NIST 800-171 framework to protect any FERPA student information and to demonstrate that reasonable cybersecurity measures have been taken. Conformance with this standard is already required for certain data processing activities via executive orders and federal regulations. Additionally, institutions participating in Title IV programs agree as part of their Program Participation Agreements to comply with the Gramm-Leach-Bliley Act (GLBA) Safeguards Rule. In February 2023, the U.S. Department of Education stated that it will soon "issue guidance on NIST 800-171 compliance in a future Electronic Announcement," but it also "encourages institutions to begin incorporating the information security controls required under NIST 800-171 into the written information security program required under GLBA as soon as possible."

Regardless of the learning environment, it may be helpful to think about the creation of education records as deriving from four primary sources:

  1. From students—for example, during account registration, when submitting assignments, or when students participate in synchronous or asynchronous discussions with instructors or classmates, including over third-party communication tools being used to support the learning experience
  2. From active processes initiated by the institution—for example, when creating academic files or issuing grades, and when directly capturing student participation through recording class sessions, whether audio or visual
  3. From passive, automated processes initiated by the institution—for example, when using cookies or similar technologies to collect technical information and usage data
  4. From third parties—for example, when partner organizations that are supporting learning activities provide records to the institution

These four primary sources should be considered throughout institutional FERPA policymaking and implementation to ensure awareness around education record creation. In the context of XR learning, several types of PII can be collected through these same four sources. For instance, student biometric data, which is explicitly considered PII under FERPA, may be automatically collected as students wear VR headsets. Moreover, students may directly provide "selfies" and other PII when creating avatars to use in XR environments.
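To make these sources concrete during policy work, a simple data-mapping exercise can pair each XR data element an institution collects with the record-creation source from which it derives. The sketch below is a hypothetical illustration of such an inventory; the class names, categories, and example entries are assumptions, not an official FERPA requirement or any platform's schema.

```python
# Minimal illustrative sketch (not an official FERPA tool or vendor schema):
# a simple inventory pairing XR data elements with the four record-creation
# sources described above. All class names, categories, and entries are
# hypothetical examples for a data-mapping exercise.
from dataclasses import dataclass
from enum import Enum


class RecordSource(Enum):
    FROM_STUDENT = "provided directly by the student"
    ACTIVE_INSTITUTIONAL = "active process initiated by the institution"
    PASSIVE_AUTOMATED = "passive, automated institutional process"
    THIRD_PARTY = "provided by a third party"


@dataclass
class XRDataElement:
    name: str
    source: RecordSource
    contains_pii: bool  # e.g., biometric identifiers are PII under FERPA
    notes: str = ""


inventory = [
    XRDataElement("avatar selfie upload", RecordSource.FROM_STUDENT, True,
                  "student-provided image used to generate an avatar"),
    XRDataElement("recorded VR class session", RecordSource.ACTIVE_INSTITUTIONAL, True,
                  "audio/visual capture of identifiable student participation"),
    XRDataElement("headset eye-tracking telemetry", RecordSource.PASSIVE_AUTOMATED, True,
                  "biometric data collected automatically while a headset is worn"),
    XRDataElement("platform usage analytics", RecordSource.THIRD_PARTY, True,
                  "shared back to the institution by an XR platform vendor"),
]

for element in inventory:
    flag = "PII" if element.contains_pii else "non-PII"
    print(f"{element.name} [{flag}] | {element.source.value}: {element.notes}")
```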

Regarding third parties, institutions should consider several factors before moving forward with such educational activities hosted on third-party platforms and devices, just as they would with regard to other learning or communication tool providers. In the case of XR, however, the type of data collected may be more personal and sensitive, and the amount of data collected may far exceed what is collected in non-XR environments. As noted above, students may be asked by their institutions to use specific devices where the manufacturer retains biometric data—data that can be akin to health data in some respects and that can receive additional protections under multiple state laws as well as the European Union's General Data Protection Regulation (GDPR). Or students may be asked to upload selfies and volunteer other descriptive information to create avatars in their own likeness (e.g., when creating avatars on XR collaboration platforms such as Spatial.io or AltspaceVR, among others). As a result, careful consideration needs to be given to the privacy practices of the third party, the degree to which that third party can be considered a "school official," and whether opt-out or written consent procedures should be implemented. Where information sharing would not be harmful to students, updating the institution's directory information categories may also be warranted if the institution is unable to identify a legitimate educational interest for the scope of data transfers that may occur as part of offering XR experiences with third-party support.

Additionally, the presence of a student avatar in XR learning activities introduces several unique challenges to FERPA's application to XR. Should records featuring student-created avatar images be considered PII in the same manner as student photos? Does the degree to which the avatar resembles the student's physical likeness or the way the avatar was created (e.g., using photogrammetry as the basis for the avatar) matter? In recorded XR sessions where student avatar participation is present, would the same FERPA considerations apply regardless of whether the real-life student or the avatar is present?

The need to protect student identities as presented through their avatars may not be immediately obvious to faculty and staff, particularly if the avatar does not bear a close resemblance to the student. And yet, where students create or are assigned avatars with unique features such that it would be possible to identify the student through repeated exposure, institutions would be wise to treat such records no differently than those featuring other PII.

The U.S. Department of Education has yet to engage in rulemaking or issue specific guidance on XR and avatars that would account for these novel circumstances and may not for some time. However, it is not too early for individual institutions to start considering how existing FERPA rules and guidance would apply in these unique scenarios and to update training and resources so that faculty and staff have considered the potential application of FERPA in XR.

Distance Education Identity Verification and Privacy

Per federal rules, accrediting agencies must demonstrate that they require institutions that offer distance education or correspondence education to have processes in place to establish that the student who registers for such a course or program is the same student who participates in and completes the course or program when awarding academic credit.2 Prior to 2019 rulemaking updates, the Department of Education had required accreditors to list the following acceptable practices for verification of identity: secure logins, proctoring, or other effective methods. However, the department no longer lists specific practices and instead defers to accreditors.

Regardless of the verification methods employed, institutions must notify students at the time of registration or enrollment of any additional costs they will incur, as applicable, as a result of verification implementation in a course. Institutions must also ensure that the identity verification methods used protect student privacy.3 As is detailed below, state-specific privacy laws will also be a factor when it comes to ensuring that any personal information used to identify students is adequately secured and that data subjects are adequately informed.

Recently, a number of online proctoring companies that have been used to support identity verification compliance for distance education courses and programs have faced backlash after allegations of privacy violations and poor data handling practices. In addition to growing concern among students and legal complaints against these companies, institutions themselves have in some cases been targeted for their use of proctoring software allegedly without adequate privacy protections or appropriate disclosure practices in place. The decision in Ogletree v. Cleveland State University (2022), detailed above, serves as an example of the application of federal privacy law. In addition, individual state privacy laws, such as the Illinois Biometric Information Privacy Act (BIPA), have also served as the basis for complaints involving proctoring software.

These lawsuits and complaints against proctoring companies and the institutions partnering with them should serve as a warning when it comes to adopting XR technologies without proper controls in place, given the similarities in data collection (particularly regarding biometric data). Moreover, institutions must learn to balance these sometimes competing requirements among accreditors and federal and state privacy laws by ensuring both that a student's identity can be verified and that the verification methods used adequately protect the student's privacy.

Children's Online Privacy Protection Act (COPPA)

COPPA applies to website operators, including educational institutions providing services online, and assigns unique protections to children under age 13. COPPA clearly applies to XR service providers so long as the provider is "targeting children." While the FTC, which enforces COPPA, has not released specific guidance on XR data collection, institutions may wish to err on the side of caution by using existing COPPA procedures to obtain consent from parents or guardians in all cases when XR technologies are used with children under 13.

Note, however, that amendments to COPPA recently proposed in the U.S. Congress could soon broaden the entities and categories of data collection covered (including protections around biometric data), as well as expand certain protections to teenagers up to age 18. Institutions that invite minors to participate in XR experiences should monitor legislative updates in this area and consider whether additional privacy measures will need to be implemented for children and teens.

Health Insurance Portability and Accountability Act of 1996 (HIPAA)

HIPAA assigns protections for medical information maintained by covered entities, which often include institutions of higher education to the extent they provide health care services. HIPAA protections generally do not apply, however, where the information is covered under FERPA as an education record. XR devices may collect various biometric information that could conceivably be considered health data depending on the context; however, an institution would likely not need to apply HIPAA protocols to information collected as part of educational activities, as FERPA would instead apply. Nevertheless, institutions should consider scenarios in which FERPA would not provide an exemption and HIPAA protections would attach, such as when users are not "students" or may be engaged in noneducational activities.

U.S. State Laws, Including the California Consumer Privacy Act (CCPA)

Approximately 20 states have data privacy laws that assign protections to biometric data and increasingly apply those protections to out-of-state entities, attaching the requirements to the individual consumer's location. The CCPA, which is perhaps the best known of these laws, does not generally apply directly to nonprofit institutions. However, many nonprofit institutions engage for-profit (and CCPA-covered) service providers to process consumer information and can absorb certain CCPA compliance responsibilities through such partnerships, depending on how data is collected and shared. Like the GDPR, a number of state laws, including the CCPA, grant unique data subject rights and contain certain consent requirements for biometric data collection.

The General Data Protection Regulation (GDPR) and Other International Privacy Frameworks

The General Data Protection Regulation (GDPR) was adopted in the European Union (EU) in 2016 and has served as a useful benchmark for comprehensive data privacy laws across the world. This framework, which took effect in 2018, assigns extensive data security protections and grants numerous rights to data subjects located in the EU (and now, separately, in the United Kingdom), including the rights to access personal information and to have it deleted. It also establishes numerous disclosure and consent requirements as well as enforcement mechanisms that reach beyond the EU's borders so long as the data subject is located within them.

Biometric data, which is commonly and expansively collected by XR devices, as noted above, receives special protection under the GDPR, which may apply when institutions are serving students in the EU and in the United Kingdom. Such processing is prohibited unless the data subject has given explicit consent, and unique security requirements apply when handling such data.

The GDPR is now just one of many international privacy frameworks that institutions of higher education serving an international population of learners should be concerned with, however.4 Perhaps the most recent and expansive GDPR-like privacy framework with extraterritorial implications is the Personal Information Protection Law (PIPL), which was adopted by the National People's Congress of China on August 20, 2021, and went into effect November 1, 2021.

When comparing PIPL with GDPR, the International Association of Privacy Professionals (IAPP) observed that "while the PIPL bears a resemblance to the GDPR, it includes certain substantive obligations that differ from the GDPR, and there are also obligations found in the GDPR that are not included in the PIPL." As with the GDPR, however, biometric information is considered to be sensitive in nature and such collection could trigger consent requirements in addition to certain safe handling requirements.

Institutions that offer educational content and services outside of the United States will want to continue monitoring how the GDPR, PIPL, and other emerging international privacy frameworks evolve, paying particularly close attention to the treatment of biometric data in future guidance and enforcement actions.

Human Subjects Research

Beyond student privacy considerations that may routinely come up in the context of in-person or remote teaching and learning environments, researchers at or affiliated with an institution of higher education may also have an interest in processing data collected through XR technologies. Students may also have a research or publication interest in data collected by XR technologies, whether for dissertations, theses, or less formal projects. Note that IRB protocols would not apply in the context of class-based activities and assessments that would not lead to published results. Additional exemptions will also apply as determined by the institution.

All higher education institutions that receive federal research funding through various agencies are required to have Institutional Review Boards (IRBs) charged with determining and overseeing procedures for human subjects research in a manner consistent with federal regulations under the so-called "Common Rule."5  Among these responsibilities, IRBs evaluate research proposals and set protocols aimed at protecting the privacy of subjects and the confidentiality of their sensitive personal information.

XR technologies are unique both in their potential to collect truly vast amounts of biometric data that is, or has the potential to become, personally identifiable and in their potential to collect other sensitive data, such as location information and visual and audio recordings of the user's surroundings. As a result, research involving human subjects using these technologies will likely be subject to considerable scrutiny by IRBs to ensure data is properly secured and de-identified, as applicable, and that data subjects are actually providing informed consent.

As IRBs begin to review more research requests involving human subjects and XR technologies, it may be appropriate to develop new policies and/or update resources to address common data questions and concerns. These may help guide researchers through the approval process, including whether certain data practices may qualify proposed research for a specific IRB exemption and, specifically, whether existing de-identification standards used in exemption criteria remain appropriate in the context of XR data collection.
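For instance, one commonly discussed safeguard is pseudonymizing identifiers and coarsening high-precision sensor fields before XR telemetry is analyzed or shared. The sketch below illustrates that idea only; the field names, salting approach, and rounding thresholds are assumptions for demonstration, not an IRB-endorsed standard, and rich motion or gaze data may remain re-identifiable even after such steps.

```python
# Minimal illustrative sketch of one possible de-identification step for XR
# telemetry prior to analysis or sharing. Field names, the salting approach,
# and rounding thresholds are assumptions for illustration only; they are not
# an IRB-endorsed or regulatory standard, and re-identification risk from rich
# motion/gaze data may remain even after steps like these.
import hashlib


def pseudonymize(participant_id: str, salt: str) -> str:
    """Replace a direct identifier with a salted one-way hash."""
    return hashlib.sha256((salt + participant_id).encode()).hexdigest()[:16]


def deidentify_record(record: dict, salt: str) -> dict:
    """Strip direct identifiers and coarsen high-precision sensor fields."""
    return {
        "participant": pseudonymize(record["participant_id"], salt),
        # Coarsen timestamps to the minute to reduce linkage risk
        "timestamp": record["timestamp"][:16],
        # Round gaze coordinates; drop room-scan and location fields entirely
        "gaze_x": round(record["gaze_x"], 1),
        "gaze_y": round(record["gaze_y"], 1),
    }


sample = {
    "participant_id": "student-4821",
    "timestamp": "2023-03-02T14:31:27Z",
    "gaze_x": 0.48231,
    "gaze_y": -0.10577,
    "home_location": "41.4993,-81.6944",  # removed during de-identification
}

print(deidentify_record(sample, salt="rotate-this-study-specific-salt"))
```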

Other Compliance Considerations

Discrimination and Harassment

Title VI of the Civil Rights Act states that no person in the United States "shall, on the ground of race, color, or national origin, be excluded from participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving Federal financial assistance." For violations to occur, the conduct must be objectively offensive and either severe or pervasive. An institution can face consequences for failing to intervene after having knowledge that such conduct is occurring.

As is the case with all educational experiences created for students, care and attention should be paid to ensuring all students feel comfortable participating. Teacher-student and peer-to-peer interactions through avatars may invite new opportunities for discrimination and harassment that may feel particularly real given the immersive nature of XR but can be easily overlooked given the virtual (i.e., not physical) reality in which these interactions take place. Institutions should consider the degree to which existing codes of conduct should apply or be modified to appropriately address XR interactions.

Meanwhile, Title IX of the Education Amendments of 1972 prohibits discrimination based on sex at schools that receive federal funding. Courts and federal agencies have, at times, interpreted sex discrimination prohibited under Title IX to include harassment based on gender identity and sexual orientation.

As was noted regarding Title VI above, care and attention should be paid to XR learning experiences, where unique opportunities may exist for various forms of harassment that may not have previously been considered as part of an institution's existing Title IX policies. Institutions should again consider whether existing policies and codes of conduct attach to XR environments. In particular, the question of whether the operator of one avatar can commit sexual misconduct against the operator of another avatar should be addressed.

Accommodations and Accessibility for People with Disabilities

Section 504 of the Rehabilitation Act and the Americans with Disabilities Act (ADA) prohibit discrimination based on disability, construed broadly as having a physical or mental impairment that substantially limits one or more major life activities. Section 508 of the Rehabilitation Act establishes proactive technical standards, such as captioning and audio description requirements, for institutions receiving federal funding. (These standards are updated periodically to conform with the World Wide Web Consortium's Web Content Accessibility Guidelines (WCAG).) Many states have expanded the scope of entities impacted by Section 508 requirements (so-called "little 508" or "mini 508" laws) while adding remedies for those impacted by violations as well as penalties for violators.

In an educational context, enforcement of Section 508 is most common with regard to public websites and open content. Ideally, however, these standards should be considered in all cases as part of XR learning experience design, particularly if effective accommodations cannot easily be provided upon request. It is important both to design with accessibility in mind, to avoid having to make changes retroactively, and to involve disability service units at institutions in conversations around adopting new XR technologies or programs, to better anticipate which accommodation requests may be common and whether appropriate resources are available.

Intellectual Property

Intellectual property (IP) refers to patent, copyright, and trademark rights, all of which protect creations derived from new, unique ideas. As patent rights issues, which would apply primarily in the context of invention disputes, are less likely to arise in the context of everyday teaching and learning, this section focuses only on copyright and trademark.

Creators of original works fixed in a tangible medium of expression hold copyright automatically, possessing a "bundle" of exclusive rights to reproduce, distribute, make derivatives of, and publicly perform and display their works (17 U.S.C. § 106). Facts and raw data are not copyrightable. Rather, works must contain at least a "modicum of creativity" for these protections to attach.

Regarding design work based on raw data sets, original expressions of data may be considered sufficiently creative to be eligible for copyright protections. In the case of 3D scanning and photogrammetry,6 which may feature in the design of unique XR content offered by institutions of higher education, institutions will need to consider how best to navigate ownership rights associated with such projects and update IP assignment forms and policies accordingly. Beyond the design of XR content, institutions may also wish to review policies and procedures relating to unique student creations in XR environments themselves and those impacting third-party tools and experiences where student copyright may be governed by unique terms and conditions, as will be discussed further below.

Meanwhile, a trademark can refer to any word, phrase, symbol, or design (or a combination of these) used to identify goods or services (15 U.S.C. §§ 1051 et seq.). For a mark to be eligible for trademark protections, it must be (1) distinctive and (2) used in commerce. In a trademark dispute, infringement would also be found only where the use of a similar mark would have the likelihood of creating confusion as to the origin or sponsorship of goods or services.

If the intent to use these marks is made clear (e.g., applying TM or SM labels, respectively, in connection with the marks) and such use does not constitute its own infringement, at least limited trademark protections can apply regardless of whether an official registration process has been started. However, having a trademark registered with the U.S. Patent and Trademark Office (USPTO) will create a legal presumption of ownership and offer broader protections in nationwide markets and enable the use of the ® (circle-R) symbol. Trademarks need to be defended for protections to be maintained, which means rights holders need to actively monitor the use of their marks.

In the context of XR, there are several unique trademark considerations. Since a key threshold issue for determining trademark infringement is whether the unauthorized use of trademarks has occurred "in commerce," the use of digital representations of trademarks in virtual environments where no "real world" money is being exchanged or connected to the use of the marks may be viewed differently under the law than when trademarks are used in virtual environments where money can be exchanged. However, if the activity connects back to revenue-generating activities (e.g., generates tuition or other revenue for an institution of higher education), infringement may still be found if the use of a mark could imply endorsement or affiliation even if the mark is not itself being used to market a specific product or service.

In addition to the individual users who create or upload infringing content in virtual environments (e.g., use of brand names on avatar clothing, recreations of real-world stores with logos used in signs), the institution itself could be found at least secondarily liable for publishing infringing content. Ultimately, such an analysis would depend upon the likelihood that such use could mislead consumers and whether the use amounts more to artistic expression or whether some other overriding First Amendment interest could be demonstrated.

Personality Rights (Name, Image, Likeness)

Personality rights consist of both property and privacy rights concerning the use of one's name, image, likeness, or other unique identifiers (NIL) in public or for commercial purposes. Institutions of higher education must first seek permission to use a person's NIL for public or commercial purposes, just as institutions would need to do when sharing education records publicly under FERPA.

Regarding XR activities, institutions may be collecting unique identifiers beyond just name, image, and likeness that still qualify for these legal protections. Additionally, student avatars, particularly those that closely resemble the user, may also be eligible for personality rights protections. If avatars are made to resemble someone else, the use of such avatars in public or commercial spaces could infringe upon the personality rights of others. Just as an institution of higher education would need permission before taking a picture of a celebrity, or anyone else for that matter, and featuring them in an advertisement, permission would likewise need to be obtained before using an avatar that captures someone's likeness and using it in a similar manner.

Beyond seeking necessary permissions before recording and sharing XR experiences in public settings or for any commercial purposes outside of the XR learning environment, institutions should recognize that personality rights can also be implicated within virtual worlds when the environment itself is made public. It may therefore be advisable to establish policies discouraging students and other affiliated users from directly impersonating others when designing avatars that could be used in public spaces.

Negligence

Negligence can be found where a legal duty of care exists, is breached, and causes harm. For liability to attach, the events causing the harm must be reasonably foreseeable. However, defenses such as assumption of risk may also be available to the defendant and can absolve or at least mitigate any potential liability.

Institutions of higher education have a legal as well as ethical duty to provide a safe learning environment.7 As institutions integrate XR technologies into their learning experiences, therefore, institutions must stay informed of any known dangers associated with the use of these technologies and employ reasonable mitigation strategies. In addition to being ethical practices, creating and effectively implementing lab safety protocols and disclosure/consent procedures, which aim to inform students of any known dangers, are among the ways in which institutions can help defend against claims of negligence. Such procedures should account for both potential harm that could occur in a virtual environment (e.g., trauma or other mental harm) as well as physical harm that could occur within the physical, real-world surroundings (e.g., falls or bystander harm occurring while a headset is on).

Speech and Expression Rights

Many of the regulatory and ethical risk areas outlined above can be mitigated through specific actions as well as policy and procedural updates made by institutions, but such measures could also limit faculty curricular and instructional decision-making as well as faculty and student speech and expression. However, the U.S. Supreme Court has also held that teachers and students do not "shed their constitutional rights to freedom of speech or expression at the schoolhouse gate" (Tinker v. Des Moines Independent Community School District, 393 U.S. 503, 506 [1969]). Institutional policy must therefore balance risk mitigation with notions of academic freedom and student expression, particularly at public institutions of higher education. For each recommendation offered in the following sections, any resulting implementation strategy should seek to balance these sometimes competing interests—ideally by still prioritizing student safety above all while avoiding overly restrictive practices, accounting for the unique context of one's own institution and the specific activity being addressed.

Broader stakeholder engagement as part of policymaking and implementation cycles will undoubtedly help institutions arrive at this appropriate balance, assuming minimum legal compliance and defensible procedures remain at the foundation of the efforts. Regarding classroom activities, soliciting the ideas of instructional faculty and students will be especially vital and, at least in the case of faculty, may be explicitly required depending on the system of faculty governance that exists at the institution and the degree to which proposed changes may impact curriculum and instruction.

Safety, Privacy, and Security in XR: A Complex Problem

The exploration of XR in support of instruction tends to involve the use of one specific XR mode (AR or VR). Soon, however, any large-scale adoption of XR for instruction will involve multi-modal (2D, AR, or VR) scenarios accessed by large groups of geographically distributed users. We already see emerging general-purpose collaborative platforms (e.g., EngageVR, Spatial.io) supporting concurrent access from smartphones, AR devices (e.g., Magic Leap and HoloLens 2), and VR devices (e.g., Meta Quest).

It is important to note that with each new generation of XR devices, the level of data collection increases significantly. Increasingly, the outward-facing sensors used for hand tracking, room scanning, and positioning (found today mainly in VR headsets) will be complemented by inward-facing sensors such as the eye-tracking cameras found in AR/MR devices (e.g., the HoloLens 2 or the Magic Leap 1).

In the long run, from a sensor and capabilities perspective, we are likely to see AR/VR device convergence similar to the Varjo XR-3 headset and, more recently, the pass-through options available on the Meta Quest 2 and the MR option on the Meta Quest Pro, but it is very likely that institutions will have to manage multiple classes of devices for the short to medium term. Therefore, security, privacy, and safety impact assessments will have to consider, at a minimum, the use of different classes of XR devices to access the same instructional application or environment by geographically distributed end users, which types of data are collected and processed by each class of device, and whether the data is processed locally or remotely.
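As one way to operationalize that assessment, an institution might maintain a structured inventory of device classes, the sensors and data streams associated with each, and which streams leave the device for remote processing. The sketch below is a hypothetical illustration; the device classes, sensor lists, and field names are assumptions rather than vendor specifications.

```python
# Minimal illustrative sketch of how an institution might structure a privacy/
# security/safety impact-assessment inventory across XR device classes: which
# data each class of device collects and whether it is processed locally or
# remotely. Device classes, sensor lists, and field names are assumptions for
# illustration, not vendor specifications. Requires Python 3.9+.
from dataclasses import dataclass


@dataclass
class DeviceClassProfile:
    device_class: str              # e.g., "VR headset", "AR/MR headset"
    sensors: list[str]             # data-collecting sensors to assess
    processed_remotely: list[str]  # data streams leaving the device or campus
    notes: str = ""


profiles = [
    DeviceClassProfile(
        device_class="VR headset",
        sensors=["outward cameras (room scan)", "hand tracking", "positional tracking", "microphone"],
        processed_remotely=["voice chat", "avatar position", "usage analytics"],
        notes="room-scan imagery may capture the user's home environment",
    ),
    DeviceClassProfile(
        device_class="AR/MR headset",
        sensors=["outward cameras", "eye tracking", "spatial mapping", "microphone"],
        processed_remotely=["eye-tracking-derived metrics", "spatial anchors"],
        notes="inward-facing sensors add biometric data requiring stricter handling",
    ),
    DeviceClassProfile(
        device_class="smartphone (2D/AR)",
        sensors=["camera", "GPS", "microphone"],
        processed_remotely=["location", "session recordings"],
    ),
]

for p in profiles:
    print(f"{p.device_class}: sensors={p.sensors}; remote={p.processed_remotely}; {p.notes}")
```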

Notes

  1. Only data categories that would not generally be considered harmful to the student or an invasion of privacy if disclosed can be classified as directory information. Opt-out requests must continue to be honored even after the student is no longer enrolled at the institution unless the student has rescinded the request. 34 CFR § 99.3.

  2. 34 CFR § 602.17(g).

  3. 34 CFR § 602.17(h).

  4. Other notable privacy laws with implications for cross-border data transfers and extraterritorial jurisdiction include those recently passed in China, Brazil, and New Zealand. IAPP and DLA Piper have created comparison tables and other resources for these and other international privacy frameworks. Another international privacy framework to watch for is the Data Protection Bill being developed by India.

  5. The "Common Rule" refers to the federal regulations on the protection of human subjects—regulations that have been codified by multiple federal agencies, including Health and Human Services, at 45 CFR § 46.

  6. For a more detailed explanation of how copyright issues can arise in the context of 3D data, see Andrea D'Andrea, Michael Conyers, Kyle K. Courtney, Emily Finch, Melissa Levine, et al., "Copyright and Legal Issues Surrounding 3D Data," in Jennifer Moore, Adam Rountrey, and Hannah Scates Kettler, eds., 3D Data Creation to Curation: Community Standards for 3D Data Preservation (Chicago: Association of College and Research Libraries, ALA, 2022).

  7. Specifically, courts have held that institutions have a special relationship with students, creating a duty to protect from or provide warning of foreseeable dangers associated with learning experiences offered by the school. This "special relationship" has evolved over time based on shifts in cultural attitudes—see, e.g., Regents of University of California v. Superior Court (2018)—but would, at a minimum, extend to warning students of foreseeable dangers when engaged in XR learning and to implementing reasonable safety protocols.
