
EDUCAUSE Analytics Sprint, Day 2 Recap: Analytics for Teaching, Learning, and Student Success


The focus of today’s Sprint activities moved from general issues in higher education to teaching and learning, an area that seemed to strike participants as either the most or the least reasonable place to apply analytics. On the one hand, teaching is the core mission of every college and university, and if using data in new ways can benefit an institution’s teaching, who wouldn’t support that? On the other hand, the magic of learning depends on subtleties and dynamics that can’t be reduced to numbers, doesn’t it?

Let’s step back from that cliff for a moment. Consider these points:

  • Institutions of higher education collect vast amounts of data.
  • Correlations—however imperfect—do exist between various data points.
  • No two instructors are the same, nor are any two students.
  • Predictive models that fail to accommodate exceptions, applied without discretion, can do a grave disservice to the minority of students who don’t fit the pattern.

Where does that leave us? Today’s first presenter, Marsha Lovett, director of the Eberly Center for Teaching Excellence and teaching professor of psychology at Carnegie Mellon University, perhaps summed it up best when she said that learning analytics is not enough on its own but can be an important part of the picture. In her framing of the issue, she suggested that the notion of “prediction to action” leaves out two key components:

Prediction + Understanding → Targeted Action

Faculty concerns about analytics programs are valid, and they should be relatively straightforward to identify. A short list includes workload, autonomy, the validity of analytical models, and the ways in which analytics might be limiting. As one participant noted on Twitter, “Instructors may have excellent pedagogical reasons for NOT using the LMS...this is a potential impediment for analytics.” Another commenter raised the question of teaching and learning that happens in “public” social networks and other spaces, and how analytics can be expected to include those data. Any analytics program that is going to get off the ground will need to address these and similar issues.

Ellen Wagner, executive director of the WICHE Cooperative for Educational Technologies (WCET), gave us a sense of just how much data is being produced and collected, and she described a program undertaken to explore the factors that affect retention, progress, and completion. She pointed out that “It’s what we do with the analytical findings that really matter.” Echoing a point made yesterday, today’s discussion circled around, and in different ways landed on, the idea that the goal of analytics should be to move beyond the “what” and get to the “why.” The value of an analytics program lies in the combination of carefully selected data, sophisticated models, and the wisdom of people.

Comments

Thanks!  This is very helpful.  I look forward to your recap of today's Sprint.

Marty

Gregory,

You do an excellent job summarizing the positives and negatives of analytics. As in all things in life, balance is the key. Yes, recognize the great value of analytics and the data it depends upon. But balance that with--as you point out--discretion and the ability to accommodate (and recognize) the exception.

Applying analytics with blind enthusiasm will lose support. Practicing analytics with a balanced approach that recognizes the limitations of analytics while taking full advantage of its benefits will ultimately garner greater support within the broader educational community.

Cindy