Navigating the Waters of Learning Analytics

Malcolm Brown, Director, and Veronica Diaz, Associate Director, EDUCAUSE Learning Initiative

Navigating the waters of learning analytics (LA) requires steering a careful course between a Scylla and a Charybdis. The Scylla is the gushing view that learning analytics is a magic bullet, a cure for many, if not all, of higher education’s ills. The Charybdis is the view that LA, based on usage data and devoted to efficiency, will turn higher education into a soulless pre-professional training business, no longer interested in discovery, knowledge creation, and critical thinking.
The challenge, and the opportunity, is clearly to steer a course between these two extremes.
Some of LA’s detractors think of analytics as robotic, doling out canned answers to complex student issues. But, as recent events such as the ELI 2012 spring focus session and the 2012 Learning Analytics and Knowledge conference have shown, the first generation of LA systems is already providing useful information that can contribute to student success.
And it would be a mistake to underestimate just how far analytics technology could go in providing insight into learning. It goes beyond winning at chess and Jeopardy: computers can now learn to recognize when people are frustrated and to tutor students in math. There has never been a tool with more potential to assist us in the enterprise of educating students. It seems reasonable to expect that, over time, learning analytics will move beyond counting and enable us to provide more sophisticated results.
What's more, it could inform and therefore improve "human" interventions. Successful retention and student support via learning analytics (still) requires a high-touch approach. At some universities, faculty make phone calls to students who are struggling in their courses. At others, a retention specialist designs a plan to support students who are not succeeding. What if we could take interventions one step further and explore a just-in-time course design model? Could we build an analytics tool for course designers, for instance? Could we agree that some portion of the course is the domain of the instructor and some is the domain of the designer? If we could disaggregate the instructional function, a designer could make adjustments to portions of the course that are not functioning properly for students in a particular course, while also freeing up faculty time for other things.
On the other hand, we cannot and should not use LA to abrogate our responsibility for making decisions in our role as educators. At the 2012 ELI spring focus session, George Siemens made an important distinction, remarking that “all the important stuff with analytics happens…after we’ve done the analytics.” To use a medical analogy, LA delivers symptoms; it is still up to us as educators to make the diagnosis and suggest a remedy. The wisdom we seek is knowing when to relinquish control and when not to. Figuring this out will require all of us, working as a community over time.
In the end, we need to be clear about what we want LA to do for us. Does it, or should it:

  • measure learning?
  • give feedback that instructors can use for continual, even near-real-time, improvement?
  • provide a way of detecting students who are in trouble and then assisting them with effective interventions?

While LA can be many of those things, it is not yet clear whether it can be ALL of those things. It could be that supporting the student "now," in the immediate sense, requires many more resources than before, and more than we have today. Are we resourced to provide this kind of support, and which models (intervention by whom) can we afford and make the most sense? Just how far LA can go is something that will require all of us, working as a community, to figure out.