Methodology

The Campus of the Future project was an exploratory evaluation¹ and as such used a mixed-methods approach. The use of multiple methodologies was necessitated by several factors:

  • the duration of the project,
  • the emergent nature of the projects at participating institutions,
  • the existence of multiple forms of documentation for projects at many participating institutions, and
  • the fact that participants were engaged in implementing the technology and learning how to use it while the research was ongoing.

Three primary mechanisms were used for data collection during this project:

  • A start-of-project survey. This was an in-depth survey to collect data about the intended use(s) of the 3D technology at each institution. Respondents were asked to upload syllabi for course-related uses and/or grant proposals (or other write-ups) for research-related uses of the technology. The survey asked respondents to articulate (1) the learning objectives of the course(s) and/or the research objectives of the project for which the equipment would be used, (2) the evaluation criteria for the course or project, and (3) what would constitute success upon completion of the project. Content analysis was conducted on the documents that participants provided as part of their responses to this survey.
  • Biweekly status report surveys. These were lightweight surveys that participants were asked to fill out throughout the course of the project, with a break over the December holidays. These surveys asked participants (1) to estimate approximately how many hours the project team had spent working on the project over the previous two weeks, and (2) to report any progress and successes, delays, or setbacks that the project team had experienced over the previous two weeks.
  • In-depth interviews. These were semistructured interviews with the project leader or project team during which the EDUCAUSE research team asked project participants to provide more detail about their teaching and research using the provided 3D technology: for example, unexpected or unplanned uses or outcomes that emerged, processes developed over the course of the project, and lessons learned.

In addition, several secondary mechanisms were used for data collection during this project:

A LISTSERV was set up to facilitate communication among project participants. At the start of the project, the researchers asked participants to post a brief description of the courses and/or projects for which they would be using 3D technology. Analysis of these brief descriptions informed the development of the start-of-project survey. Throughout the course of the project, the LISTSERV saw light but steady use by participants to ask and answer questions, coordinate across institutions, and plan events. Content analysis was conducted on posts that contained information about individual projects.

An extensive literature review was conducted on 3D technologies for both educational and noneducational uses.

Informal, unstructured interviews were conducted with a small number of project nonparticipants at institutions that had done significant work in developing makerspaces or in deploying AR, VR, and 3D technology similar to that deployed by project participants. While the institutions that participated in the Campus of the Future project were not representative of the state of higher education in the United States or globally, collecting some data from nonparticipants provided a rough benchmark for evaluating just how unrepresentative the participating institutions were.

Finally, the teams at several participating institutions created blogs to document the progress of their projects. These blogs were created primarily to disseminate information about the teams' activities to the local community at each institution, not for HP or EDUCAUSE specifically. Nevertheless, the blogs were extremely useful as a data source, since their presentation was tailored to each institution and to the specific pedagogical needs of its faculty and students. Content analysis was conducted on blog posts that discussed work relevant to the project.

Note

  1. Kathryn E. Newcomer, Harry P. Hatry, and Joseph S. Wholey, eds., Handbook of Practical Program Evaluation, 4th ed. (Hoboken, NJ: Jossey-Bass, 2015).