Measuring the success of the campus LMS
Here's a straw poll question, directed primarily at those whose responsibility includes the care, feeding, and promotion of the campus learning management system (LMS): what, currently, is your primary measure of a successful LMS project? Is it the percentage of faculty using it?
I know there are probably a variety of ways you might calculate the success of your LMS. But my question is: what is the **primary** or most valuable way or metric you use? If you could use only one such measure, what would it be?
Thanks
Malcolm
----------
Malcolm Brown
Director, EDUCAUSE Learning Initiative
email: mbrown@educause.edu
IM: fnchron (AIM)
Voice: 575-448-1313
**********
Participation and subscription information for this EDUCAUSE Constituent Group discussion list can be found at http://www.educause.edu/groups/.
Comments
Greetings All!
Indeed, a good discussion!
At UF we have relied heavily on adoption to assert success, i.e. courses and sections. That and how many instructors are howling for my head on a pike ;-)
However, as has already been observed, simple adoption is quite a narrow definition of success, especially when some units may mandate use while others may not. As a result, I am currently playing with the idea of trying to tease out differences between Return on Investment and Return on Value. ROI is, I think, appropriately addressed by adoption stats; but ROV strikes more deeply at less easily measured characteristics such as increased communication and improved student learning (both of which have rightly been mentioned).
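The adoption-stats side of ROI that Doug mentions can be sketched in a few lines. This is a minimal illustration assuming hypothetical course records; the field names and data are invented, not from any real LMS export:

```python
# Hedged sketch: computing a simple adoption rate (the "ROI" side),
# from a hypothetical list of course records.
from dataclasses import dataclass

@dataclass
class Course:
    course_id: str
    instructor: str
    lms_active: bool  # any instructor activity in the LMS shell

def adoption_rate(courses):
    """Fraction of courses with an active LMS presence."""
    if not courses:
        return 0.0
    active = sum(1 for c in courses if c.lms_active)
    return active / len(courses)

# Illustrative records only.
courses = [
    Course("HIS101", "smith", True),
    Course("BIO201", "jones", False),
    Course("ENG150", "lee", True),
    Course("MAT220", "chan", True),
]
print(f"Adoption: {adoption_rate(courses):.0%}")  # prints "Adoption: 75%"
```

As the thread notes, a number like this says nothing about *how* the shells are used; it measures ROI, not ROV.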
Our first pass at measuring ROV is focusing on user satisfaction by means of student and instructor surveys (to launch later this spring). Based on ideas that emerged in this thread, part of these surveys will seek anecdotal evidence of improved communication and improved learning outcomes [e.g., "Do you believe use of the LMS has improved your overall learning?" – or some such question].
I agree with the posts identifying e-portfolios as a more effective tool to gauge learning outcomes, though tying those outcomes to the LMS becomes problematic. But cycling back to anecdotal evidence of LMS “success”:
What other measures or questions can you/we come up with that might add insight into success?
Peace,
Doug
-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*-*
Douglas F. Johnson, Ph.D.
Assistant Director for Learning Services
Office of Information Technology | University of Florida
Hub 132 | 352.392.4357, opt 3.
Between the idea and the reality … Between the conception and the creation … Falls the Shadow.
- T.S. Eliot, The Hollow Men
From: Trent Batson [mailto:trentbatson@ME.COM]
Sent: Friday, February 03, 2012 3:41 PM
Subject: Re: Measuring the success of the campus LMS
Malcolm: you posed such an open-ended question but maybe that's the secret to a good discussion.
LMS's such as Epsilen and Desire2Learn also include an electronic portfolio component. Odd matching in a way: the LMS harkening back to teacher-centered instruction, the eportfolio system anticipating a time of active student learning with evidence as the basis for credentialing.
It is the eportfolio functionality that supports tracking cohort progress toward learning goals; it is what many of these systems are designed for. LMS's are not really designed for that function. There are about 20 vital and viable electronic portfolio systems (open source and proprietary) in the world.
I can see many ways to measure the success of electronic portfolio components of LMS's -- for example, the degree to which they are used on behalf of high-impact practices (Kuh, 2008, AAC&U). But I find it hard to find metrics to measure success in the broader sense for LMS's. With eportfolios, you can use standard measures of student engagement because students own and keep their eportfolios over time, whereas LMS's are "owned" by faculty and are limited to the time of the course.
I would think one measure of a good LMS today would be to what extent the functionality in the LMS can be used also within the eportfolio module.
From a service standpoint, we always thought the best aspects of an LMS were how easy it was for faculty to create and modify their courses, and the extent to which they had easy access to previous course content.
The LMS market is huge and will be around for a long time, but it is certainly not the future, at least as the dominant teaching-learning communication space as the learning paradigm re-structures itself.
Great discussion, Malcolm.
Best
trent
On Feb 3, 2012, at 3:24 PM, Martha Burtis (mburtis) wrote:
This depends entirely on how you define "success." If you think success is about being able to demonstrate adoption (and by doing so, suggest that the investment of money was worth it purely because people are choosing to use the system), then approaching this purely from the standpoint of how many people use the system might make sense.
If you're talking about "success" in terms of successfully demonstrating learning, then relying purely on numbers of adoptees would be inappropriate.
I can't speak to our LMS specifically, but we think about the success of the blogging system we run (UMW Blogs) from a number of perspectives:
--we tout the number of faculty and student users and number of courses, in part because it is a grassroots initiative that is wholly opt-in (courses and accounts are not created for our faculty or students by default). We're proud of the way the system has been adopted by the community and we think it provides *some* evidence of its value and success.
--we work closely with faculty to help them understand the ways in which the system might be used to promote and demonstrate student success with regards to learning outcomes
--we talk a lot about anecdotal evidence of student learning, engagement, and the impact of publication on the ways students write and present themselves. We "watch" the system pretty closely for these anecdotes, and we are able to do so because the vast majority of the courses in the system are taught in the open.
I think having a broader conversation about what constitutes success is vital to understanding the impact any system is having.
Martha Burtis
On Feb 3, 2012, at 3:11 PM, Backon, Joel wrote:
Perhaps. If one identifies improved communication between faculty and students as an improved learning outcome, then yes.
Joel
--
Joel Backon
Director of Academic Technology / History
Choate Rosemary Hall
333 Christian St.
Wallingford, CT 06492
203-697-2514
On Feb 3, 2012, at 1:54 PM, Gregory Ketcham wrote:
Joel:
What about a case where adoption and use of the LMS clearly increases communication between faculty and students? Is increased communication in and of itself an indicator of improved learning outcomes? Probably not; but could it be a related factor?
regards,
Greg
Joel Backon
On Feb 6, 2012, at 12:30 PM, "Nikki Reynolds" <nreynold@HAMILTON.EDU> wrote:
Hi, all -- just a clarification about "learning outcomes." Learning outcomes have been created by most institutions -- a set of 6 or 8 or so statements of what students should have achieved by the time they graduate. Based on those stated outcomes, on some campuses rubrics have been created within colleges at a university, within major programs, within gen ed programs, and on down to the course level. The rubric creates standards for what it means to reach a learning goal. The disciplines decided on those standards.
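A rubric-based outcome measure of the kind described here can be sketched in a few lines. This is a minimal illustration with a hypothetical 4-point rubric; the scale, proficiency threshold, and ratings are assumptions, not any institution's actual standard:

```python
# Hedged sketch: rolling rubric scores up to a program-level outcome,
# assuming hypothetical 4-point rubric ratings per student artifact.
def outcome_attainment(scores, proficient=3):
    """Fraction of artifacts rated at or above the proficiency level."""
    if not scores:
        return 0.0
    return sum(1 for s in scores if s >= proficient) / len(scores)

ratings = [4, 3, 2, 3, 4, 1, 3]  # one course's rubric ratings (invented)
print(f"{outcome_attainment(ratings):.0%} proficient or above")  # prints "71% proficient or above"
```

Unlike raw adoption counts, a measure like this is anchored to the stated outcomes, though (as the thread notes) attributing it to the LMS itself remains the hard part.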
Shahron Williams van Rooij
Susan
Susan M. Zvacek, Ph.D.
Senior Director, Teaching Excellence, Learning Technologies, and Faculty Development
Fort Hays State University
Hays, KS 67601
785-628-4194
smzvacek@fhsu.edu
From: Rob Gibson <rgibson1@EMPORIA.EDU>
To: INSTTECH@LISTSERV.EDUCAUSE.EDU
Date: 02/23/2013 11:19 PM
Subject: Re: [INSTTECH] Measuring the success of the campus LMS
Sent by: The EDUCAUSE Instructional Technologies Constituent Group Listserv <INSTTECH@LISTSERV.EDUCAUSE.EDU>
Malcolm posits a great question - one that I've attempted to reconcile for years. For reasons that have never been entirely clear to me, the LMS always seems to be under 'extra scrutiny' to justify its place, expense, and usage on campus. Never mind that the cumbersome ERPs would often be happily sidestepped were it not for the fact that many campuses 'force' students through the portal in order to access the services they really want. It's akin to a toll booth company telling everyone that usage is very high - when the drivers simply want to get to their destinations.
To help satisfy those questions, we conduct a student tech survey each year (based loosely on the ECAR annual survey) - and are starting a faculty survey this year as well. We've collected very useful qualitative information specific to the LMS (satisfaction, features most utilized, mobile access, etc.). We've been pleasantly surprised by the student responses. The LMS is often the #1 most accessed and utilized instructional tool - even ahead of library databases. (Granted, this may be biased because faculty are requiring students to access resources available only through the LMS. However, most students seem satisfied with the tool nonetheless.)
The quantitative metrics pulled from the LMS itself (I'll not mention the brand) are useless. That a faculty member posts a syllabus should in no way be equated with a faculty member who relies on the tool exclusively. Yet, that's what the metrics often reflect. ("Shells opened and accessed.") A new LMS we're considering should allow us to extract more useful data from the system. (Fingers crossed.)
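The shallow-versus-deep distinction drawn here (a posted syllabus versus exclusive reliance on the tool) can be sketched as a rough classifier. The activity log format, tool names, and thresholds below are all hypothetical; real vendor exports differ:

```python
# Hedged sketch: separating shallow from deep LMS use, assuming a
# hypothetical per-course activity log (tool name -> event count).
def usage_depth(activity: dict) -> str:
    """Classify a course shell by breadth and volume of tool use."""
    tools_used = [t for t, n in activity.items() if n > 0]
    total_events = sum(activity.values())
    if not tools_used:
        return "unused"
    if tools_used == ["syllabus"] or total_events < 20:
        return "shallow"      # e.g. a syllabus posted, little else
    if len(tools_used) >= 3 and total_events >= 100:
        return "deep"         # multiple tools, sustained activity
    return "moderate"

# Illustrative shells only.
shells = {
    "HIS101": {"syllabus": 1, "gradebook": 0, "discussion": 0},
    "BIO201": {"syllabus": 1, "gradebook": 140, "discussion": 65, "quizzes": 30},
}
for cid, log in shells.items():
    print(cid, usage_depth(log))  # HIS101 shallow, BIO201 deep
```

A breakdown like this at least separates "shells opened and accessed" from shells that are actually carrying a course.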
A sidebar note from something I heard at the ELI Annual Meeting that I found intriguing: a panelist from edX indicated that she envisions a not-so-distant future of customizable learning environments based entirely on Big Data. If it comes to fruition, it will turn the LMS world inside out. Imagine a system that automatically adapts to, say, high-risk students or returning adult learners. Talk about a game-changer. When that product comes out, I'm buying lots of stock...
RG
From: The EDUCAUSE Instructional Technologies Constituent Group Listserv [INSTTECH@LISTSERV.EDUCAUSE.EDU] on behalf of Phillip Long [longpd@MIT.EDU]
Sent: Saturday, February 23, 2013 9:47 PM
To: INSTTECH@LISTSERV.EDUCAUSE.EDU
Subject: [INSTTECH] Measuring the success of the campus LMS
Malcolm: to answer your 'success metric' for the LMS, sadly, at my institution the answer is as you posited - faculty adoption rates. It has nothing to do with how they use it. And indeed it's a bogus measure, because it's mandated that all academics must have a Bb course shell. Exceptions are permitted if the academic argues for one and the associate dean of T&L is open-minded, but the cultural norm is that the LMS is the corporate choice and thou shalt use it. Success, circularly, lies in the nearly 100% adoption of this mandate…. Do you smell a problem here? Sigh….
phil
(apologies for the delayed reply - for some reason this list no longer accepts posts from my UQ account, only my MIT account - I don’t have the energy to correct this at the moment)
:: :: :: :: :: :: :: :: :: :: :: ::
Professor Phillip Long :: Ofc of the Deputy Vice Chancellor Academic :: The University of Queensland :: Brisbane, QLD 4072 Australia
ITEE
Director: Centre for Educational Innovation & Technology
http://ceit.uq.edu.au longpd@uq.edu.au
On 04/02/2012, at 4:45 AM, Backon, Joel wrote:
Fascinating that one would select statistics such as percentage of faculty adoption or raw numbers of courses and enrollments to measure the success of the LMS. Shouldn't we be looking for metrics of improvement in teaching and learning? I understand such metrics are more difficult to develop, but the investment in a CMS or any other technology tool can only be justified by improved learning outcomes.
Joel Backon
Director of Academic Technology/History
Choate Rosemary Hall
333 Christian St.
Wallingford, CT 06492
203-697-2514
Sent from iPad2
True, students don’t need the instructors present in the LMS to enjoy its benefits. But do they expect them to be? And if they do, how can those expectations be managed effectively?
(This is an area of fairly intensive debate on our campus).
With kind regards,
Marianne Schroeder | Senior Manager, Teaching & Learning Technologies
The University of British Columbia | Centre for Teaching, Learning & Technology
102 – 1961 East Mall, Vancouver BC V6T 1Z1
Ph: 604.822.0255 |Fax: 604.822.2157 | mailto:marianne.schroeder@ubc.ca