Framing Action Analytics and Putting Them to Work

© 2008 Donald Norris, Joan Leonard, Louis Pugliese, Linda Baer, and Paul Lefrere. The text of this article is licensed under the Creative Commons Attribution-Noncommercial-Share Alike 3.0 License (http://creativecommons.org/licenses/by-nc-sa/3.0/).

EDUCAUSE Review, vol. 43, no. 1 (January/February 2008)

Donald Norris, Joan Leonard, Louis Pugliese, Linda Baer, and Paul Lefrere

Donald Norris is President of Strategic Initiatives, Inc., a management consulting firm in Herndon, Virginia, which has been co-creating action analytics solutions with colleges, universities, and commercial clients since 2007. Joan Leonard is President of 3 Degrees Consulting. Louis Pugliese is President of Learning Diagnostics, Inc. Linda Baer is Senior Vice Chancellor for Academic and Student Affairs at the Minnesota State Colleges and Universities. Paul Lefrere is Associate Professor (Senior Lecturer) with the Institute of Educational Technology at the Open University of the United Kingdom, Professor of eLearning at the University of Tampere in Finland, and a Principal Consultant with Strategic Initiatives, Inc.

Comments on this article can be sent to the authors at [email protected], [email protected], [email protected], [email protected], and [email protected].

This article is a companion piece to the article “Action Analytics: Measuring and Improving Performance That Matters in Higher Education,” which describes the emergence of a new generation of tools, solutions, and behaviors that are giving rise to more powerful and effective utilities through which colleges and universities can measure performance and provoke pervasive actions to improve it.1 We call this new class of tools, solutions, and behaviors action analytics.2 The “Action Analytics” article provides examples of emerging analytics applications, as well as explanations of how open architecture technologies are lending themselves to a “cloud” of fresh applications and solutions that are not held hostage to the existing “stack” of ERP and other enterprise applications. The article also describes the imperative to change organizational capacity, culture, and behavior and to invent new measures and key performance indicators (KPIs) that are more appropriate to the new skills and habits of mind existing in our “flattening,” fast-changing world.

In this companion, online-only article, “Framing Action Analytics and Putting Them to Work,” we take the conversation a step further. We provide a model for framing and understanding the evolving set of functional capabilities that constitute action analytics. This model outlines a new, vibrant marketplace for analytics. Today, much of the current state of analytics is described and understood in terms of existing technologies, software applications, vendor tools, and/or offerings that incrementally extend the ERP stack and that do not really challenge the existing architectural and practice paradigm. As a result, many worthy niche applications and functionalities have never been fully incorporated into the ERP stack or may reside on a vendor’s product road-map for future consideration. Moreover, many integrated analytical capabilities and visualization/presentation functionalities have never before been practical or affordable for many colleges and universities. These utilities are now within the technical and managerial reach and budget of institutions, provided an institution is willing to engage in fresh perspectives and to craft an inclusive and expansive framework of “stack” and “cloud” architecture and solutions.

It is not possible to understand or appreciate the true potential of this comprehensive set of action analytics capabilities by viewing them as extensions of the existing ERP stack. Therefore, our four-element model outlines a new framework that describes what leaders, policy- and decision-makers, faculty, students, parents, and other Pre-K-20 education and workforce stakeholders need from the emergent generation of action analytics.

Growing a Vibrant Analytics Marketplace

Today's marketplace for analytics is fragmented and incomplete, and it is expressed in terms of existing ERP products and their extensions. Market supply and demand are defined in terms of tools, not functionalities. Buying behavior is unsophisticated but evolving, with buyers eagerly seeking deeper insight into migration paths toward advanced analytics capabilities. Specific characteristics of the marketplace include the following:

  • Commercial and homegrown administrative data marts, data warehouses, and business intelligence and analytic tools to support strategic planning, accreditation, assessment, and ERP/administrative and academic analytics
  • Various, often-incompatible commercial and homegrown portfolio solutions to capture, store, and potentially share student and faculty learning artifacts
  • Strategic enrollment management (SEM) and customer relationship management (CRM) software and service offerings that further muddy the “performance improvement” waters with a mix of features and functionalities that at times overlap with one another

Recent acquisitions of business intelligence (BI) and data warehouse (DWH) software providers by major vendor companies (e.g., IBM/Cognos, SAP/Business Objects, and Oracle/Hyperion) are complicating the future of “vendor-neutral” analytics solutions.

No framework or blueprint exists today to help institutions determine how best to move from an ERP-centric infrastructure and a course/program focus to an open architecture infrastructure and a culture of outcomes and aggressive performance improvement, supported by a “cloud” of new software, tools, services, and solutions that sustain this new paradigm. Migration paths and best practices that bring together strategic planning, administrative ERP, academic and assessment analytics, and portfolio solutions in order to build capacity and create an action analytics utility for improving enterprise performance and outcomes are, at worst, nonexistent and, at best, narrowly disseminated.

Higher education is in the nascent stages of yet another significant transformative change, stimulated by global competition and by both internal and external pressures for accountability and performance improvement. Drivers of this transformation include the following:

  • Increased competition for students and faculty/talent at all levels—locally, nationally, and globally
  • Expanding growth of for-profit educational enterprises that are focusing strongly on competency-based, flexible, customizable learning at undergraduate and graduate levels
  • Robust regional accreditation requirements, including the need to develop quality enhancement plans (QEP) with well-defined measurements and assessments that align with institutional strategic directions/goals
  • Performance-based funding efforts in many states and systems of institutions
  • Moving from snapshots of course/grades/graduation rates to broader, longitudinal sets of measures and metrics that
    • focus on improved articulation/throughput from high school to two-year institutions to four-year institutions, particularly within state systems;
    • apply standards and assessment of competencies;
    • actively acknowledge/incorporate industry and workforce education and training needs into courses, programs, degrees and certifications; and
    • ensure that the high school student is prepared for college and that the college graduate is prepared for the world of work.

It’s déjà vu. Again. Today’s applications and analytics environment is reminiscent of the changes in the early days of the course management system (CMS):

  • The proliferation of first-generation homegrown and commercial authoring tools
  • Experimentation by early adopters (faculty and staff) in developing unique, stand-alone, online content/courses that resided anywhere and everywhere (individual hard drives, institutional network drives, and/or in a hosted website)
  • The achievement of a critical mass of online content/courses that then drove the need for content/course management standards and templates as part of an effort to build an institutional “brand” and to enhance the student experience
  • The growth of online delivery as an alternative or supplement to the classroom, which warranted “management” of online activities: tracking/managing student accomplishments (e.g., testing/assessment tools and integration with student systems/grade books); handling scheduling challenges (e.g., integration with the course catalogue); and managing faculty load issues
  • The emergence of student portal solution(s) and student services “self-service,” along with the need for security and user authentication
  • The convergence/integration of tools, software, and services to create an academic delivery platform that supported online learning—hence the interchangeable acronym today, L/CMS (learning/course management system)

As noted, we are today witnessing a similar trajectory in analytics and performance measurement and improvement, evidenced by the following:

  • Institutional performance measurements and academic assessment silos with insufficient congruity between data sets
  • ERP vendors experimenting with applications that extend their stack into non-core functions and aspects of learning and academic data management, including both administrative ERP vendors (Oracle, SunGard Higher Education, Datatel) and academic ERP vendors (Blackboard)
  • Rudimentary systems for cognitive-assessment authoring tools to support online curricula
  • The increasing use of formulaic approaches to determine which applicants to recruit and admit, based on standardized exam scores, high school coursework, extracurricular activities, and other such factors
  • The shallow and imperfect penetration of the higher education market by BI tools
  • The adoption of associated measurement/scorecards/dashboards at an enterprise level with wide variations of successful implementations

A Four-Element Model of Action Analytics Capabilities

To frame the action analytics marketplace, we begin with four elements:

  • Strategic Planning Analytics, which focuses on enterprise performance and on positioning the enterprise for success in the face of competition
  • Administrative Analytics, which focuses on the administrative performance of the full range of administrative and academic support processes
  • Academic/Assessment Analytics, which focuses on the performance of the individual learner and aggregations of learners, on faculty content management, on best practices for instructional design and curriculum development, and on assessment of learners’ learning and developmental outcomes
  • Learning/Career Analytics, which provides the interface between the institution and the world of work, focusing on learner-to-work performance

These elements are portrayed in the four diamonds in Figure 1: Action Analytics Capabilities. The boundaries between these four diamonds (functional areas) are fully permeable: data, information, and knowledge (information in context) must flow freely among and between them. To achieve the promise of action analytics, an institution needs to be able to develop and deliver capacities (functional requirements, systems characteristics, and institutional requirements/considerations) that relate to enterprise performance improvement in these four functional areas.

Figure 1


Strategic Planning Analytics

Strategic Planning Analytics enable institutions to strategically plan for success in competitive environments and deliver on the promised value propositions. In this context, strategic means “dealing with the enterprise’s relationship with its competitive environment.” These analytics are the instrument for articulating the value that institutions provide to learners and other stakeholders. Today’s emerging environment requires institutions to be held accountable, transparently. This includes the extent to which institutional goals, curricula, and practices are aligned to employment and workforce requirements. Strategic Planning Analytics draw from all of the other functional areas in providing summative metrics on the performance of the institution as a whole, with drill-downs to constituent colleges, departments, and programs.
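The idea of summative institutional metrics with drill-downs to colleges, departments, and programs can be sketched as a simple roll-up. The following illustration (not drawn from the article; all names and figures are hypothetical) aggregates a headcount-weighted graduation-rate KPI at each level:

```python
from collections import defaultdict

# Hypothetical program-level records: (college, department, graduation_rate, headcount)
programs = [
    ("Arts & Sciences", "Biology",    0.62, 400),
    ("Arts & Sciences", "History",    0.71, 150),
    ("Engineering",     "Civil",      0.58, 220),
    ("Engineering",     "Electrical", 0.66, 180),
]

def rollup(records):
    """Aggregate a headcount-weighted graduation rate at each drill-down level."""
    totals = defaultdict(lambda: [0.0, 0])  # key -> [weighted sum, headcount]
    for college, dept, rate, n in records:
        # One pass updates the enterprise, college, and department roll-ups.
        for key in ("ALL", college, (college, dept)):
            totals[key][0] += rate * n
            totals[key][1] += n
    return {key: round(wsum / n, 3) for key, (wsum, n) in totals.items()}

kpis = rollup(programs)
# kpis["ALL"] is the enterprise-level KPI; kpis["Engineering"] and
# kpis[("Engineering", "Civil")] are its drill-down views.
```

The same pattern extends to any summative metric that can be weighted and summed across organizational levels.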

Considerations and Requirements for Strategic Planning Analytics

To survive and thrive in the new environment, institutions must demonstrate and exhibit the following:

  • Institutional dashboards of KPIs that provide readily available access to insight on strategic planning, administrative, academic/assessment, and learning/workforce analytics
  • Institutional learning and developmental outcomes that support the academic mission
  • Program objectives that support college, program, division, and course goals
  • Course development and design that both support and reflect program objectives
  • Academic programs that align to real-world workforce supply and demand
  • Adequate and effective resourcing of institution-wide programs at a financially viable cost of instruction
  • KPIs and associated longitudinal data required for accreditation compliance that can be tracked and measured on an ongoing basis and in advance or anticipation of future (re)accreditation and program certifications
  • Easy-to-use, intuitive analytics that are made available on a self-service model to all end-users (faculty, staff, administrators, stakeholders), not just to the “power users” who can figure out the arcane practices of the current generation of BI tools to generate reports

Building Performance Measurement and Improvement Capacity

Future performance measurement and improvement capacity must include the following:

  • Institutional dashboards that portray KPIs reflecting the institution’s strategic and competitive position and that link with other families of measures reflecting academic, administrative, and learning/workforce outcomes
  • Extensive data-modeling and predictive analytics that link KPIs horizontally (across processes) and vertically (across organizational levels, from enterprise to individual and back again) at all levels of the institution
  • Objective mapping that provides linkages and portrays alliances among and between Pre-K-16, business and industry, and communities to build more relevant academic-to-career experiences and outcomes
  • Alignment of institutional strategies, goals, actions, responsibilities, and measurement at the institutional level with the corresponding, cascading set of strategies, goals, actions, responsibilities, and measurement at the college level and the department/program level
  • Alignment of the distinct but related processes of strategic and capital planning, accreditation, continuous improvement, and performance improvement that enable the free flow of information back and forth between these different processes to support strategic planning analytics
  • Capacity to execute strategies, lead and navigate change, take corrective actions, and measure progress
  • Well-developed collaboration capabilities that can be used in formulating, delivering, analyzing, and reporting on KPIs, as well as comprehensive alert features that require predetermined action(s)
  • User-friendly reporting capabilities vis-à-vis balanced scorecards, dashboards, and strategy maps that simplify the collection, review, management, and use of information for data-driven decision-making and as a fundamental driver of strategic planning processes
  • Multiple measures and samples of learning artifacts and outcomes residing in e-portfolios, survey results, traditional and online student assessments, benchmarking results, and other qualitative and quantitative data sources
  • Capacity to scrape data, information, and knowledge from other “diamonds” and portray this in the strategic planning context, and vice versa
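To make the predictive-analytics capacity above concrete, here is a minimal early-warning sketch: individual-level indicators are combined into a logistic risk score, and the individual level is then linked vertically to an enterprise KPI (the share of a cohort flagged for intervention). The indicator names, weights, and threshold are illustrative assumptions, not values from the article.

```python
from math import exp

# Illustrative weights for binary risk indicators (assumed, not empirical).
WEIGHTS = {"low_gpa": 1.2, "missed_logins": 0.8, "late_assignments": 0.6}
BIAS = -2.0

def risk_score(indicators):
    """Logistic score in [0, 1] from a dict of binary risk indicators."""
    z = BIAS + sum(WEIGHTS[k] for k, flagged in indicators.items() if flagged)
    return 1 / (1 + exp(-z))

def at_risk_share(cohort, threshold=0.5):
    """Enterprise-level KPI: fraction of the cohort above the risk threshold."""
    flagged = [s for s in cohort if risk_score(s) >= threshold]
    return len(flagged) / len(cohort)

# A hypothetical four-student cohort.
cohort = [
    {"low_gpa": True,  "missed_logins": True,  "late_assignments": True},
    {"low_gpa": False, "missed_logins": False, "late_assignments": False},
    {"low_gpa": True,  "missed_logins": False, "late_assignments": True},
    {"low_gpa": False, "missed_logins": True,  "late_assignments": False},
]
share = at_risk_share(cohort)
```

In practice the weights would be fitted from longitudinal data rather than assumed, but the vertical linkage (individual score to enterprise KPI) is the point of the sketch.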

Administrative Analytics

Administrative Analytics enable institutions to measure their business/operational performance and continuously improve that performance. Administrative Analytics enable institutions to effectively manage the financial, human resources, development, and administrative health and well-being of the institution. This includes the extent to which the administrative functions are strategically aligned with the academic enterprise and are supported by the technology infrastructure. Administrative Analytics draw data, information, and knowledge from administrative and academic ERP, as well as from other campus-related auxiliary enterprises such as food services, the bookstore, parking, the health center, and facilities management. Administrative Analytics also illuminate how the institution’s administrative processes serve faculty, staff, and other customers.

In the past, many specialized administrative “point” solutions were not incorporated into traditional ERP solutions. Today, academic and/or administrative ERP solutions are being expanded to include CRM, SEM, and student affairs/student life (SA/SL), either as extensions of the ERP stack or as open architecture add-on functionalities. Assessment and assessment management solutions are also being added to the mix. This full combination of functionalities will be included in the “stack” and “cloud” configurations that are emerging across the higher education landscape of the future.

Considerations and Requirements for Administrative Analytics

To survive and thrive in the new environment, institutions must demonstrate and exhibit the following:

  • Summary dashboard snapshots of KPIs for administrative processes that are easily accessible by executive leaders, administrators, and staff
  • Mapping and alignment of administrative functions that support strategic and long-term planning and enable continuous improvement and process reinvention
  • Administrative analytics that provide key information supporting strategic human resource and succession planning
  • Administrative analytics that support assessment of improved efficiency and effectiveness within the operation
  • Innovative and barrier-free operating processes, policies, and procedures to support flexibility, relevance, and innovation
  • Scenario-based functionality and dynamic and predictive modeling to support strategic direction/goals and planning, budgeting, and capital planning
  • Ability to gain a 360-degree view of quantitative data originating from parallel data warehouses (e.g., administrative ERP, academic ERP, assessment) supporting different processes
  • Ability to blend, compare, and contrast predetermined administrative performance indicators
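The “360-degree view” drawn from parallel data warehouses can be pictured as a merge of per-student records keyed on a common identifier. The source names and fields below are hypothetical stand-ins for administrative ERP, academic ERP, and assessment stores:

```python
# Hypothetical parallel sources, each keyed by student ID.
admin_erp  = {"S001": {"tuition_balance": 0},   "S002": {"tuition_balance": 450}}
academic   = {"S001": {"gpa": 3.4},             "S002": {"gpa": 2.1}}
assessment = {"S001": {"writing_score": 82},    "S002": {"writing_score": 64}}

def merged_view(student_id, *sources):
    """Combine one student's records from each source into a single profile."""
    profile = {"student_id": student_id}
    for source in sources:
        # Missing records in a source simply contribute no fields.
        profile.update(source.get(student_id, {}))
    return profile

view = merged_view("S002", admin_erp, academic, assessment)
```

A production implementation would of course involve identity resolution and warehouse queries rather than in-memory dictionaries; the sketch only shows the shape of the blended record.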

Building Administrative Analytics Capacity

Future performance measurement and improvement capacity must include the following:

  • Dashboards that monitor and report variances to KPIs such as ongoing calculations of enrollment headcount and retention trends
  • Analytic reports related to cost accounting by student, class, and program cost centers
  • Demographic analyses by institutional program and course, including gender and ethnicity data
  • Trend analysis in student-to-faculty ratios and other measures of productivity
  • Analytics to track/measure year-over-year retention rates/improvements and document best practices
  • Ability to track, report, and manage alumni and donor contributions and trending
  • Ability to manage endowments in light of an emerging trend wherein donors require specific objectives/benchmarks to be met before funding is released or continued, and to forecast future endowment levels
  • Dashboarding ability to track and report graduate school acceptance and alumni transition data by program
  • Administrative processes for and reports on research studies conducted by institutional research offices
  • Ability to track/report on institution-sponsored/supported student co-curricular and professional organizations
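The year-over-year retention trending called for above reduces to a small calculation once the rates are in hand. This sketch (with illustrative figures, not institutional data) computes the per-year change that a variance-monitoring dashboard would display:

```python
# Illustrative fall-to-fall retention rates by entering cohort year.
retention_by_year = {2004: 0.71, 2005: 0.73, 2006: 0.72, 2007: 0.76}

def yoy_changes(series):
    """Return {year: change in rate versus the prior year} for a trend dashboard."""
    years = sorted(series)
    return {y: round(series[y] - series[prev], 3)
            for prev, y in zip(years, years[1:])}

changes = yoy_changes(retention_by_year)
# A dashboard would flag negative entries (here, 2006) for follow-up.
```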

Academic/Assessment Analytics

Academic/Assessment Analytics focus on learning and developmental issues at the individual, section, course, program, department, college, and institutional levels. These analytics enable institutions to become more learner-focused and to establish internal/external benchmarking against predetermined course, program, and certification objectives.

Considerations and Requirements for Academic/Assessment Analytics

To survive and thrive in the new environment, institutions must demonstrate and exhibit the following:

  • Summary dashboard snapshots of academic/assessment KPIs that are readily available
  • Quantitative and qualitative course, program, and certification data that is collected and readily available
  • Academic program/department annual plans and course competencies that align with institutional strategic directions and goals
  • Well-defined measures and assessments for each course, program, and certification offering
  • Indicators to gauge readiness for academic change

Building Academic/Assessment Analytics Capacity

Future performance measurement and improvement capacity must include the following:

  • Quantitative results that inform departments of the specific, evidence-based refinements needed to improve curriculum development and that monitor the continuous progress of student achievement against defined learning objectives and normative data, both within the institution and nationwide
  • Evaluation of normative data in order for faculty and administrators to assess the level of student achievement within a field of study compared with that of other students in the program and with the national comparative data
  • Micro-level (individual) and macro-level (department, college, enterprise) data analysis for faculty and provost-level use
  • Outcomes/standards-based assessments at course, program, and unit levels as well as for local, state, institutional, or federal compliance
  • Academic program evaluation and analyses
  • Action analytics in the areas of enrollment management, student retention/success programs, predetermined intervention strategies and the services to support them, and collaboration within and across career pathways

Learning/Career Analytics

Learning/Career Analytics enable institutions to track and manage longitudinal collections of learning, co-curricular development, and other artifacts of learner capability to support 360-degree profiles of students/adult learners from college to and through career transitions. This tracking includes engagement of students early on (i.e., high school) to identify college-readiness goals. It also supports more opportunities for college-level work in the high school. Providing career pathway exploration, mapping, and service learning is also important. These arrays of knowledge can be stored in a variety of forms and formats, including student portfolios that contain both public and private information. Ultimately, these learner outcome repositories could be part of portable, individual competency portfolios for life. The data in them could also be scraped to support data mining on best practices and pathways to career success.

Considerations and Requirements for Learning/Career Analytics

To survive and thrive in the new environment, institutions must demonstrate and exhibit the following:

  • Readily available summary dashboard snapshots of learning/workforce KPIs
  • Industry standards, employer competency requirements, and workforce considerations that are included in learning outcomes and are aligned within courses, programs, and certifications as appropriate (e.g., Perkins IV)
  • Input from business/industry advisory boards that are integral to each department and college
  • Alliances among Pre-K-16 education and local, regional, and statewide workforce planning
  • Early interventions with students in middle/intermediate school and high school, concurrent enrollments, and other mechanisms to build readiness
  • Aggregated knowledge on successful career paths, school-to-work transitions, and related measures made available to learners, their families, and the community at large

Building Learning/Career Analytics Capacity

Future performance measurement and improvement capacity must include the following:

  • Dashboards and KPIs focusing on learning-to-work transitions
  • A platform that supports a pervasive, institution-wide culture of evidence and the optimization of student learning and faculty instruction
  • A structure and design that can easily enable faculty and students to aggregate and present evidence of continuous improvement tracked against institutional learning and performance outcomes
  • Multiple portfolios/snapshots that can be dynamically connected, juxtaposed, and seamlessly transported from college to career
  • Specific achievement standards defined and tracked by faculty and students
  • Ability of portfolio owners to control access, direct portfolio viewing, and track portfolio access and delivery
  • Ability to conduct employment searches and connect to employment recruiting sites for submission or linking to personal portfolios
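The owner-controlled access and tracking requirements above can be sketched as a tiny access-control model: the portfolio owner grants or revokes viewers, and every view attempt is logged so access can be audited. Class and method names are hypothetical, not from any particular portfolio product:

```python
class Portfolio:
    """Minimal sketch of an owner-controlled, auditable portfolio."""

    def __init__(self, owner):
        self.owner = owner
        self.viewers = set()     # identities the owner has granted access
        self.access_log = []     # (viewer, allowed) pairs for the owner's audit

    def grant(self, viewer):
        self.viewers.add(viewer)

    def revoke(self, viewer):
        self.viewers.discard(viewer)

    def view(self, viewer):
        # The owner always has access; others only while granted.
        allowed = viewer == self.owner or viewer in self.viewers
        self.access_log.append((viewer, allowed))  # owner can track all attempts
        return allowed

p = Portfolio("alice")
p.grant("recruiter")      # share with an employer, revocably
```

A real system would add authentication and per-artifact permissions (public versus private sections), but the grant/revoke/log triad is the core of the requirement.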

Summary: The Four-Element Action Analytics Capabilities Model

Taken together, these four interconnected elements, or diamonds, of action analytics provide a full spectrum of robust analytics, as portrayed in Figure 2.

  • All four diamonds are permeable and share data, information, and knowledge.
  • The Academic/Assessment Analytics and the Learning/Career Analytics diamonds provide a 360-degree view of individual learner performance—academic, co-curricular, career-building, and developmental—that includes not just current institutional performance records but a cumulative transcript of accomplishments.
  • The Learning/Career Analytics diamond portrays individual competencies/capacities in work contexts and is fully portable, transferable, and “scrape-able” for data mining (but only those data sets approved for “public” use).
  • The Academic Analytics diamond provides highly granular, multidimensional assessments of individual learner performance that can be used as part of dashboard, alert, and intervention systems for improving student retention and success.
  • The Academic Analytics diamond aggregates from the individual learner and course levels to program, department, and institutional levels and can be benchmarked against external peers and comparative standards.
  • The Administrative Analytics diamond focuses on the business processes—administrative and academic support—that are subject to continuous improvement and benchmarking against internal/external standards and norms.
  • The Strategic Planning Analytics diamond focuses on enterprise-level performance by aggregating Administrative, Academic/Assessment, and Learning/Career Analytics to program, department, college, and enterprise levels, enabling comparison with external benchmarking standards and communication of accountability and value with external stakeholders and policy-makers.
  • In the Strategic Planning Analytics diamond, clear alignment of mission, values, strategies, goals, actions, and measures across the enterprise must occur at all levels and for all processes.
  • All four analytics diamonds achieve alignment of measures, dashboards, balanced scorecards, and strategy maps, and also a well-developed capacity to drill down from dashboard measures to individual details that fall within cohorts or groups at risk.

Figure 2


Figure 2 illustrates the flow of comparative and benchmarking data into all four diamonds. It also depicts the outward flow of data regarding learner-to-work performance from the Learning/Career Analytics diamond and the two-way flow of accountability/value benchmarking data from the Strategic Planning Analytics diamond. In this context, the term strategic refers to interactions between the enterprise and its environment. Such communications with all stakeholders (institutional executive leaders, policy- and decision-makers, faculty, students, parents, and other Pre-K-20 education and workforce employers) are critical to the successful twenty-first-century institution.

Putting the Action Analytics Model to Work

This model establishes the expectation that an institution’s leaders will stipulate the systematic introduction and cross-institution deployment of sector-leading action analytics capabilities as a key part of their strategy for competitive success, positioning the institution to maximize its performance on as many measures as possible. This is a necessary precursor to success in the hyper-competitive environment of the twenty-first century. Such are the stretch goals that are within our grasp in the new, open-architecture environments in which action analytics will thrive.

Once these stretch goals have been established, leaders should use them to deconstruct the capabilities and limitations of the existing technology infrastructures, applications, and solutions and to build migration paths shaped by this vision. Doing so requires asking hard questions of vendors and CIOs and demanding straight answers and creative responses:

  • How can we make easy-to-use analytics available to all end-users, not just power users?
  • How can we most easily scrape data, information, and knowledge from all campus systems and data sources?
  • How can we cut the cost and raise the ROI of our business intelligence solutions and make them easier to configure and implement?
  • How can we easily share data, information, and knowledge about “actions that work” in serving underserved students, in furthering workforce development, and in reducing remediation needs across Pre-K-12, postsecondary education, workforce, and adult learner settings?
  • How can we mine data on career and employment success across venues and use it to inform public and institutional policy and individual choices?

Five years ago, in many institutions, the answers to these questions would have been: “We can’t.” “These things are impossible.” Today, the answers have changed. Action analytics are making all of these stretch goals possible and affordable.

Notes

1. Donald Norris, Linda Baer, Joan Leonard, Louis Pugliese, and Paul Lefrere, “Action Analytics: Measuring and Improving Performance That Matters in Higher Education,” EDUCAUSE Review, vol. 43, no. 1 (January/February 2008), pp. 42–67, http://www.educause.edu/er/erm08/erm0813.asp.

2. The term action analytics™ has been trademarked by Strategic Initiatives and cannot be used commercially without express permission.