Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan

The following are some reflections/questions generated while reading

Macfadyen, L. P., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15 (3), 149–163.

The abstract for the paper is

Learning analytics offers higher education valuable insights that can inform strategic decision-making regarding resource allocation for educational excellence. Research demonstrates that learning management systems (LMSs) can increase student sense of community, support learning communities and enhance student engagement and success, and LMSs have therefore become core enterprise component in many universities. We were invited to undertake a current state analysis of enterprise LMS use in a large research-intensive university, to provide data to inform and guide an LMS review and strategic planning process. Using a new e-learning analytics platform, combined with data visualization and participant observation, we prepared a detailed snapshot of current LMS use patterns and trends and their relationship to student learning outcomes. This paper presents selected data from this “current state analysis” and comments on what it reveals about the comparative effectiveness of this institution’s LMS integration in the service of learning and teaching. More critically, it discusses the reality that the institutional planning process was nonetheless dominated by technical concerns, and made little use of the intelligence revealed by the analytics process. To explain this phenomenon we consider theories of change management and resistance to innovation, and argue that to have meaningful impact, learning analytics proponents must also delve into the socio-technical sphere to ensure that learning analytics data are presented to those involved in strategic institutional planning in ways that have the power to motivate organizational adoption and cultural change.

Summary

Tells the story of how a “learning analytics” analysis of existing LMS usage didn’t influence deliberations at a Canadian university (I’m guessing it was UBC) around selecting a new LMS. Argues that such strategic considerations are important and that learning analytics can and should inform them. Draws on the change management literature, Rogers’ diffusion theory, other analytics literature and the nature of university culture/context to explain why this may not have happened and what might be required to change it.

While recognising that the paper assumes the importance of the strategic approach, I tend to think the fault here may lie with the fundamental assumptions of such an approach (especially in educational technology) and their almost complete mismatch with both the university and the broader context.

The authors were operating in the confines of the strategic approach. However, given that they cite McWilliam (2005) as part of the culture they are aiming for, I wonder why they didn’t apply the same thinking from McWilliam to the practice of institutional educational technology. e.g. McWilliam identifies Deadly Habit No. 5 as “Curriculum must be set in advance”, which I see as tightly connected to one of the deadly habits of management, i.e. that the institutional vision must be set in advance.

Introduction

The intro is divided into the following named sections

  • The promise of learning analytics.
    A general summary of the potential of analytics. But ends with a couple of broader points
    • Institutions and senior administrators are “key users and stakeholders” with “enhancement of institutional decision-making processes and resource allocation as core objectives”.
    • And, more interestingly, “the postmodern context of constant and dynamic change in higher education and technological innovation” increases the value of learning analytics as a tool for those folk to figure out actions that are (citing Kavanagh & Ashkanasy, 2006) “achievable within the capacity of the organisation to absorb change and resource constraints”.

    I’ve long argued/thought that if change is so core to the context, then the type of process that involves institutions and senior administrators (and the nature of the people themselves) is inappropriate. Using analytics to improve those existing processes reminds me of the Drucker quote “Management is doing things right; leadership is doing the right things”.

    There’s also the argument about whether or not the analysis of past practice is a useful guide for the future in a rapidly changing context.

  • The importance of strategic investment in learning technologies and e-learning.
    Makes the case for why strategy is important for learning quality e.g.

    In other words, decision-making processes relating to organization of institutional resources – human and material – and planning for more effective use of existing resources are a critical feature of excellent institutions.

    This is then linked to the idea of there being some known principles/practices that are “significant predictors of educational gain” – e.g. Chickering and Gamson’s (1987) 7 principles – and that ICTs and the LMS have been shown to do nice things.

    Then the point is made that the “teaching climate within higher education is becoming increasingly complex”. Student numbers and student diversity are increasing. Hence learning tools, like the LMS, are important and are therefore institutional resources/concerns.

  • The catalyst for change
    Responsible institutions are being strategic, thinking about resource allocation, reviewing technologies etc. but “a further catalyst for a new LMS review was the LMS vendor’s announcement that the current LMS product would not be supported after 2013”.

    I wonder, in the absence of this “further catalyst”, whether the strategic thinking process would have led to the need to change vendors? If not, is it really being strategic?

    Uses Kotter’s (1996) view of change and its first step which includes “careful examination of the current context…” to identify analytics as a way to understand the context and inform decision making.

  • Employing e-learning analytics to undertake a current state analysis
    LMS reporting tools are terrible; better ones are still coming. This institution worked with an analytics software company to get a reporting tool to look at use of the current LMS and hence inform the strategic process, and then

    Through participant observation in the review and planning process we were able to investigate the degree to which the e-learning intelligence revealed influenced institutional decision-making. (p. 151)

    Some of the data is shown, but the real kicker is

    More critically, we discuss the reality that the data developed in this e-learning analytics process did not significantly inform subsequent strategic decision-making and visioning processes, and consider some of the factors that may have limited its impact.

Approach and tools

Starts with explicit mention of ethics, including various Canadian requirements. Data is limited to a single academic year (2009-2010) for credit courses. Describes the tools used in the analysis. Also describes the participant observation of the decision making around the LMS.

Selected findings

A selected summary

  • 18,809 course sections – 14,201 undergraduate.
  • 52,917 students enrolled.
  • 388 distance learning sections, 304 fully online, 84 print-based format.
  • 21% of course sections with an LMS site. 14% of lower level sections. 25% of upper-level sections.
  • 80.3% of students enrolled in at least one LMS supported course.
  • 61% of LMS sections were medium-sized sections (15-79 students). 22% for large (80+).
  • 30% of all teaching staff used the LMS (1,118 out of 3061, including all varieties).
  • User time online varied widely by role, faculty, department and course code.
  • In terms of user time online the order of tool use is
    • LMS Content page
    • Discussion
    • Organiser
    • Assessment

    Beyond these four, all other tools were used minimally. Time on the content page was almost three times that spent on discussion.

  • File types of content were investigated and graphed (majority images 41%, PDF 18%, HTML 16%)
    Wonder what the image files include (and don’t include). Lots of buttons and navigation images? Or actual “learning material”?
  • Draws on Dawson et al (2008) categorisation of tools by purpose: engagement with learning community, working with content, assessment, and administrative tasks.
  • This categorisation is used to explore, and find further support for, the correlation between students’ learning outcomes and their engagement with tools in fully online (emphasis in original) courses. Similar findings are reported for courses with different uses of the LMS. (A rough sketch of this kind of analysis is given just below.)
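As an aside, and purely to make the mechanics concrete, a minimal sketch of this kind of analysis might look something like the following. This is not the authors’ method: the file names, column names and tool-to-category mapping (activity.csv, grades.csv, minutes_online, final_grade, etc.) are all hypothetical, and the mapping onto the Dawson et al. (2008) categories is my rough guess.

```python
# Minimal sketch only: hypothetical data layout, not the authors' analysis
# or the institution's actual data model.
import pandas as pd

# Rough mapping of LMS tools to the Dawson et al. (2008) purpose categories
TOOL_CATEGORY = {
    "discussion": "learning community",
    "mail": "learning community",
    "content_page": "working with content",
    "organiser": "administrative",
    "calendar": "administrative",
    "quiz": "assessment",
    "assignment": "assessment",
}

# activity.csv (hypothetical): student_id, tool, minutes_online
# grades.csv (hypothetical): student_id, final_grade
activity = pd.read_csv("activity.csv")
grades = pd.read_csv("grades.csv")

# Label each activity record with its purpose category, drop unmapped tools
activity["category"] = activity["tool"].map(TOOL_CATEGORY)
activity = activity.dropna(subset=["category"])

# Total minutes per student in each purpose category
per_student = (
    activity.groupby(["student_id", "category"])["minutes_online"]
    .sum()
    .unstack(fill_value=0)
)

# Simple correlation of category time with final grade, the kind of
# association the paper reports for fully online courses
merged = per_student.join(grades.set_index("student_id"))
print(merged.corr()["final_grade"].drop("final_grade"))
```

Even a toy version like this makes the obvious caveat visible: the correlations only mean something if final grades are accepted as a reasonable measure of learning outcomes, which is exactly the assumption queried further below.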

More interestingly

From review of these documents, and from participation in continuing committee discussions, we observed that although completion of the current state analysis was noted, no further references to or interpretations of the findings were made in later meetings or documentation. (p. 157)

Discussion and implications

Benchmarking

This type of analysis feeds benchmarking.

  • The 2010 Campus Computing survey suggests US public universities average around 60% of course sections using the LMS.
    Of course this isn’t based on data, but on the perceptions of the people surveyed.
  • This institution is at 21%. 70% of teaching staff did not use it. But 80% of students take at least one LMS course.

The staff figure is reminiscent of the chasm. I wonder whether the low level of use of the LMS was considered in the LMS evaluation?
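As a back-of-the-envelope check (the figures below are invented for illustration, not taken from the paper), the apparent tension between only 21% of sections having an LMS site and 80% of students being enrolled in at least one LMS-supported course dissolves once you remember that students take several courses and that LMS-supported sections skew larger:

```python
# Back-of-the-envelope illustration only: both numbers below are assumptions,
# not figures reported in the paper.
lms_share_of_enrolments = 0.28  # LMS sections skew larger, so a student's
                                # enrolments hit the LMS more often than the
                                # raw 21% section figure suggests
courses_per_student = 5         # assumed annual course load

# Treat each enrolment as independent (a simplification)
p_none = (1 - lms_share_of_enrolments) ** courses_per_student
print(f"P(at least one LMS-supported course) = {1 - p_none:.2f}")  # ~0.81
```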

Makes the argument that, for online courses, the activity data “begin to provide lead indicators of the appropriateness of the course load as a result of the implemented learning activities”.

Though this seems to rest on some assumptions, e.g. that the current measure of learning outcomes (final grades) is an effective measure, and then…. The point is made that educational theorists “emphasize the importance of peer to peer interaction for facilitating the learning process” but that most of what students are doing in the LMS is content delivery.

This practice is linked to the “old wine in new bottles” use of technology and “It is only at this later innovation stage that learning technologies will be fully utilized to support a pedagogical practice of engagement that will significantly enhance the overall student learning experience”. I wonder what impact institutions changing their LMS every few years has on achieving that “later innovation stage”?

Suggests reaching this stage requires the “kind of cultural changes described by McWilliam (2005)”.

Informing strategic planning?

If the committee is the main group charged with integrating IT and learning and teaching, why didn’t it engage with what the data revealed?

The answer in the paper, not surprisingly given the above, is based on a teleological set of assumptions.

The paper reports “that subsequent deliberations and decision-making focused almost exclusively on technical questions relating to ‘ease of migration’.” and more

While there is an obvious imperative to ensure that any new enterprise technology is functional, scalable and reliable, an exclusive focus on technology integration issues, in the absence of development of a pedagogical vision, quickly neutralizes the likelihood that learning analytics data may catalyze organizational change with a focus on the student experience and learning outcomes. A focus on technological issues merely generates “urgency” around technical systems and integration concerns, and fails to address the complexities and challenges of institutional culture and change. (p. 159)

The suggestion is that

What will determine whether it succeeds or fails in this effort will be its ability to develop a clear vision for learning technologies and lead the cultural change that reaching it requires.

The problem I have with this is that the effort becomes focused on establishing what the clear vision should be. i.e. the stakeholders – who tend to be inherently diverse, to include potential political rivals, and to include those focused on purpose proxies (e.g. the above focus on integrating with existing technical systems rather than on quality L&T) – waste time trying to get agreement on a vision, which they then have to communicate and gain acceptance for from the broader and even more diverse potential user base.

Interestingly, there is an argument that perhaps learning analytics isn’t enough

Interestingly, this mismatch between opportunity and implementation may be more widespread than enthusiastic analytics literature suggests. In their 2005 review of 380 institutions that had successfully implemented analytics, Goldstein & Katz (2005) note that analytics approaches have overwhelmingly been employed thus far “to identify students who are the strongest prospects for admission…[and]…to identify students who may be at risk academically” – that is, to improve enrollment and retention, rather than for institutional strategic planning. Similarly a recent survey of literature on implementation of educational data mining found that only a small minority of these report on the application of EDM to institutional planning processes (Romero & Ventura, 2010).

Why numbers aren’t enough

We suggest here that this may be the result of lack of attention to institutional culture within higher education, lack of understanding of the degree to which individuals and cultures resist innovation and change, and lack of understanding of approaches to motivating social and cultural change.

There is now a move into diffusion theory as a tool to explain this (can I hear @cj13 starting to go off?). It is used to frame a few paragraphs about why analytics reports on LMS usage data are likely to have limited impact on those involved.

Broader discussion about the “realities of university culture”. There is some mention of the disconnect between the business change management literature’s “heavy emphasis on the role of leaders in motivating and managing successful change and innovation” and the reality of university culture/life where “any direct interference in faculty democracy is not welcome”.

Where to from here?

The data isn’t enough. The change literature cited suggests that both the heart and the head need to be engaged.

Still claims that using the data to highlight progress/room for growth against targets and a vision is useful. But makes the important point

Interpretation remains critical. Data capture, collation and analysis mechanisms are becoming increasingly sophisticated, drawing on a diversity of student and faculty systems. Interpretation and meaning-making, however, are contingent upon a sound understanding of the specific institutional context. As the field of learning analytics continues to evolve we must be cognizant of the necessity for ensuring that any data analysis is overlaid with informed and contextualized interpretations.

But I wonder, if given the inherent irrationality of human decision makers, whether “informed and contextualised” is enough/achievable?

An interesting suggestion

In addition, we propose that greater attention is needed to the accessibility and presentation of analytics processes and findings so that learning analytics discoveries also have the capacity to surprise and compel, and thus motivate behavioural change.

That sounds like a good design-based research project.

There is also more here on the difficulty non-experts have in understanding what is being shown.

I like this closing, “research must also delve into the socio-technical sphere to ensure that learning analytics data are…”, but not so much its application to strategic institutional planning.
