Methodology and Acknowledgments
Methodology
The qualitative findings in this report were derived from open-ended responses from individuals who participated in the ECAR Study of Undergraduate Students and Information Technology, 2019. ECAR conducts this annual study to shed light on how IT affects the college/university experience. These studies have relied on students recruited from the enrollment of institutions that volunteer to participate in the project. After institutions secured local approval to participate in the 2019 study (e.g., successfully navigating the IRB process) and submitted sampling-plan information, they received a link to the current year's survey. An institutional representative then sent the survey link to students in the institution's sample. Data were collected between January 15, 2019, and April 6, 2019, and 53,475 students from 160 institutional sites responded to the survey. The quantitative findings in the 2019 report were developed using 40,596 survey responses from 118 US institutions.
Students with disabilities were identified through the following question in the 2019 survey: "Do you have physical or learning disabilities that require accessible technologies or accommodations for your coursework?" Response choices included: "no"; "yes, I have one or more physical disabilities"; "yes, I have one or more learning disabilities"; "yes, I have both physical and learning disabilities"; and "prefer not to answer." Among the 40,596 responses, 2,224 students identified as having a physical and/or learning disability that required technology for their coursework. From this subset, 1,819 participants responded to the open-ended question, "What is the ONE thing you would like your instructors to do with technology to enhance your academic success?"
Open-ended responses were uploaded to ATLAS.ti (version 8) and analyzed using thematic content analysis. Both inductive (theory-building) and deductive (theory-testing) approaches were used to analyze the data. Responses were excluded if students did not answer the question or gave nondescript answers such as "n/a." Two independent coders produced an initial codebook that they used to identify and organize patterns in the data. Inter-rater reliability was established through two iterative rounds of independent coding, with agreement rates of 85% and 92%. Based on these results, a second codebook consisting of the final 35 research codes was produced and used to create the themes for this report.
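For readers unfamiliar with agreement rates of the kind reported above, a minimal sketch of a simple percent-agreement calculation between two independent coders is shown below. This is an illustration only, not the authors' actual tooling, and the code labels are hypothetical examples rather than codes from the study's codebook.

```python
def percent_agreement(coder_a, coder_b):
    """Fraction of responses to which two independent coders
    assigned the same code."""
    if len(coder_a) != len(coder_b):
        raise ValueError("Both coders must rate the same set of responses")
    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    return matches / len(coder_a)

# Hypothetical codes assigned by two coders to ten student responses
coder_a = ["LMS", "LMS", "mobile", "recording", "LMS",
           "mobile", "other", "LMS", "recording", "mobile"]
coder_b = ["LMS", "LMS", "mobile", "recording", "LMS",
           "other", "other", "LMS", "LMS", "mobile"]

print(percent_agreement(coder_a, coder_b))  # 0.8, i.e., 80% agreement
```

Note that simple percent agreement does not correct for chance agreement; chance-corrected statistics such as Cohen's kappa are often reported alongside it in qualitative coding work.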
Additionally, our findings showed logical relationships with prior research, a hallmark of credibility. For example, many responses that focused on use of the LMS made sense in the context of previous EDUCAUSE student reports, which recognized that students want faculty to use the LMS for posting course content. We were also able to create conceptual models from our analysis, showing that our themes aligned with previously observed student experiences, such as students' desire to use their mobile devices in class. The dependability of this research was strengthened by subject-matter-expert review of findings, high levels of inter-rater reliability, consistent use of a well-defined codebook during all stages of analysis, and data reduction through a constant comparative method. The authors reviewed coding processes frequently, engaged in further data reduction by merging codes, evaluated the applicability of codes to particular data, and revised and refined the codebook as necessary. Although qualitative research does not lend itself to generalizability, we have developed two models of how students with disabilities want their instructors to use technology, both in and outside the classroom. These models can be tested to determine their applicability to other samples of higher education students.
Acknowledgments
The EDUCAUSE Center for Analysis and Research (ECAR) would like to first thank the students who took time to participate in the 2019 ECAR student survey, from which these data were derived. Thanks are also due to the survey administrators at the participating institutions who planned and deployed the survey to the students on their campuses. We also thank our accessibility subject-matter experts who offered their time and expertise in reviewing this study: Kirsten Behling, Associate Dean of Student Accessibility and Academic Resources at Tufts University, and Angela Jackson, Online Program Manager and Digital Accessibility Coordinator, Center for Teaching and Learning at the University of South Dakota. Their thoughtful feedback and suggestions have greatly improved the quality of the report.
Many thanks go out to the team of EDUCAUSE staff who made significant contributions to this report. Thank you to D. Christopher Brooks for his guidance, leadership, and support of this project from start to finish. A note of appreciation goes to Ben Shulman for his statistical review of the overview section of this report. Thanks also go to Kate Roesch for designing the engaging figures that helped us represent the relationships we identified in these data. We are grateful to Gregory Dobbin and the publications team for their attention to detail and editorial guidance, and to Lisa Gesner for her skilled content management and marketing of this project. Finally, thank you to Mark McCormack for his review of the manuscript and suggestions for making it stronger, as well as his enthusiasm and support for this important topic.