Do any universities know what’s happening with their LMS?

Just over two years ago some colleagues and I wrote up some initial exploration of the logs from the different Learning Management Systems (LMS) used by a university. The initial explorations included:

  • An initial examination of feature adoption.
    Between the two LMS there were broadly different adoption patterns which appeared to be tied to the nature of the systems and how they were developed. One of the systems had adoption levels well above levels reported elsewhere.
  • Looking to see if there was a link between the level of student activity on the LMS and final grades.
    We found that for one class of students the link existed. The more they used the LMS, the higher their grade. But for another class of students, this relationship did not hold.
  • A quick look to see if factors such as academic staff participation or curriculum design affected student participation.
    For example, if academic staff participation (number of posts on forum) was high then the link between student activity and final grade existed. However, if academic staff participation was very low, then the top ranked students (based on final grade) did not use the LMS.
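
The second exploration above, linking the level of LMS activity to final grade, amounts to a simple correlation check over per-student counts pulled from the logs. A minimal sketch of that check (the data shape, pairs of hit counts and grades, is my assumption for illustration, not the actual schema of either LMS):

```python
# Illustrative sketch of the activity-grade exploration described above.
# The input shape (per-student pairs of LMS hit counts and final grades)
# is assumed for this example, not taken from any real LMS schema.

def activity_grade_correlation(records):
    """Pearson correlation between LMS activity and final grade.

    records: list of (lms_hits, final_grade) pairs, one per student.
    Returns a value in [-1, 1]; near 1 suggests the more a student
    used the LMS, the higher their grade.
    """
    n = len(records)
    hits = [h for h, _ in records]
    grades = [g for _, g in records]
    mean_h = sum(hits) / n
    mean_g = sum(grades) / n
    cov = sum((h - mean_h) * (g - mean_g) for h, g in records)
    sd_h = sum((h - mean_h) ** 2 for h in hits) ** 0.5
    sd_g = sum((g - mean_g) ** 2 for g in grades) ** 0.5
    return cov / (sd_h * sd_g)
```

In practice this would be run separately per cohort, since (as noted above) the relationship held for one class of students but not another.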

We were simply trying to explore the vast volumes of data generated by LMS usage to see if any useful information could be generated, mainly because it seemed that the institution simply wasn’t using this information in any way. Instead, decisions were being made with little idea of how the LMS was actually being used.

I left the institution not long after that work. My understanding is that it hasn’t improved its use of the LMS data (I guess there remains a question about whether or not the information stored in the LMS can be used effectively). This is in spite of the increased work around academic analytics.

I’m wondering, are there any universities that are actively using LMS data to improve decisions? If so, what information are they using and how?

I’m particularly interested in feature adoption. Are there any universities that are looking at what features of their lovely integrated system are being used?

Why is this important?

Some folk want the LMS to die; I am perhaps amongst that number. However, it appears that almost all university management assume that “e-learning == LMS”. They remain widely used and hence I am interested in seeing how they are used and what that means.

Not to mention that if most university management and IT folk subscribe to the idea of strategic/rational/corporate management – as most do these days – then obviously they are already collecting this information in order to inform their data-informed decision making. How else can they make effective judgements about the value of their selection of the LMS and subsequent strategies to encourage its effective use? (Apart from the standard “we made the decision, the decision is good” approach.)

15 thoughts on “Do any universities know what’s happening with their LMS?”

  1. The problem with the LMS is that, given the stuff attached to them (a lot of stuff), the easiest thing to replace them with is something that looks like another LMS. Which is why the decisions taken around this stuff are so important.

    1. I agree Chris. You can’t get fired for selecting an LMS. Though it looks increasingly like “You can’t get fired for selecting an open source LMS”.

      The lack of analysis seems to me to be a major contributor to this problem. The folk in charge form a particular view of the LMS that isn’t always based on what is actually being done with it.

      A few senior folk at our common previous institution expressed great surprise when it was discovered that large numbers of academic staff hardly spent any time on their LMS course site.

      I’m hoping a bit of analysis might help break up those ideas, or at the very least be interesting. e.g. I’m wondering if there’s been a decline in the use of the LMS discussion forum etc in light of the spread of Facebook.

  2. M-H

    We certainly collect and analyse this information – I’ve just come out of a meeting where we discussed doing it for this year. But I suspect most Unis don’t – they seem barely able to support their LMS users, let alone analyse what’s going on.

    1. M-H

      Nothing’s been published. We do check all sites before they’re released to students, and they are classified on a scale with six ‘modes’, so we have some idea of how complex they are – partly by the tools used, but also by how they’re used (eg if they replace f-t-f; if you need to log in to the site to do the work in the Unit).

      We were trying to track changes in tool use over the years through the reports in the LMS itself, but as we moved from WebCT to Bb9 last year that’s no longer possible, because the reporting is different. We use the reports mostly for our own purposes – planning workshops and other support according to what people are actually doing or not doing in the LMS. It’s two lines in the annual report – usage trends for the past year. Plus we can use it to underpin the budget for the following year: “We need bigger servers because more people are using the LMS to deliver desktop capture.” Or “More people are using the LMS for summative assessment; we need more support and workshops in this area to lower the risk of problems.” Or “65,000 assignments a month are being uploaded through the LMS – are there any risks associated with that?” (True, BTW.)

      I gather that someone in our ICT is using open source software to do some of the tracking, because Bb doesn’t do it all. We’re just beginning to understand what’s possible and what’s important.

  3. M-H

    Sorry, forgot the surprises – they’re mainly related to amount of use. 5 million elearning sessions in 2010, and 1.5 million downloads of lecture capture (sigh). We have about 50,000 students, and about 6000 Units of Study have a website (around 80% of all UoS), plus we now have eportfolios and community sites – not yet counted.

  4. David –

    I sure hope you get some more responses. At my univ, the answer is a big NO. We’re working on providing deans with a “dashboard” of metrics they can track and (hopefully) manage their online courses. We will start with an inventory of all existing courses and assign them to levels (1-4). Nothing else has been decided, so any suggestions are appreciated.

    Las Vegas

    1. G’day Kevin,

      I think we’ve got all the responses we’re likely to get, both in terms of numbers and types of responses. On Twitter, a couple of other folk have mentioned that either nothing, or the type of activities you are doing, covers most institutions.

      Have another post coming wondering about the value of these practices.


  5. M-H

    One thing we have been doing for several years is providing the Deans of Faculties with quite detailed listings of which Units of Study have an LMS site and a rough guide to the complexity of each site (see my comment above). These are the result of a complex arrangement with the Planning Office, which provides enrolment stats, and information from the LMS and our site checks. It doesn’t get down to the tool level, but it does give a picture of the size and shape of elearning in each faculty. (Sydney has a famously strong faculty structure – many pedagogical decisions are based there.)

  6. M-H

    There is a big push at Sydney now for ‘curriculum renewal’, and we have had the portfolio of ‘learning spaces’ added to our responsibilities. (That means that we work with the ICT and building infrastructure units to plan L&T facilities.) So we’re talking to faculties in terms of identifying what they have and what they need in terms of formal teaching spaces, informal learning spaces (computer access labs, wifi, power-points in foyers etc, places for students to work in groups) and virtual learning spaces (LMS websites, eportfolios etc). So the LMS metrics (called the coverage stats) are part of that discussion. Some faculties are more interested than others, but they’re slowly all getting more interested, because the Uni has changed the way that funding works, and now they have to account for their use of space, and how it affects their pedagogy.

    It’s a big change, and it will take years to work through. But some effects are showing: one (small) faculty has made a conscious decision to change their pedagogical approach, and are looking for lots of support with collaborative learning and group work; this will require a big increase in the level of their online support. They already have high eLearning coverage, but it’s mostly information provision. They want to change the way their staff think about eLearning. Nothing small! :)

    1. “High coverage, but mostly information provision” seems to aptly describe the majority of institutions. It’s what the older stats I’ve seen suggest.

      Changing how academics (and students) think about e-learning seems to be the key, not sure anyone’s cracked it yet.

  7. This is such a helpful post, thank you.

    I’ve noticed in my own institution “changing how academics (and students) think about e-learning” seems to come ahead of “finding out what academics and students think about e-learning”. That is, there’s a slightly reformist presumption, but I’m not sure what it’s based on.

    M-H, I’d be interested to know more about the six modes that you mention.

  8. M-H

    The only thing we have found to help is doing specific projects with academics. They approach us with a problem, and we help them to a solution, often (but not always) using the LMS. This has changed the way a few people use it. But despite dissemination sessions this hasn’t really raised the level of student interaction or collaborative learning in general across the Uni. Loosely structured ‘groupwork’ seems to me to exacerbate the problem – it’s badly done and students quite rightly hate it.

  9. M-H

    Sorry, I didn’t see your comment earlier, MfD. Hope this helps. I believe it’s based on some categorisation that was used by DEST in the early 00s.

    Category of Unit of Study website:
    A1 – informational: the unit of study website provides information resources only (eg, unit of study outline, readings, links to other related websites).
    A2 – supplemental: A1 + the website provides activities requiring active student participation, but these are not part of the assessment framework for the unit of study.
    B1 – blended (assessed): A1 + the website provides activities requiring active student participation, and these are assessed as part of the students’ performance.
    B2 – blended (replacement): A1 + the website provides activities requiring active student participation, and these have replaced some of the face-to-face class time.
    B3 – blended (replacement and assessment): A1 + the website provides activities requiring active student participation, and these are assessed and have replaced some of the face-to-face class time.
    C1 – fully flexible: the website supports a unit of study which can be completed almost completely off campus (for example, the unit of study might require a residential weekend).
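
M-H’s six modes above reduce to a small decision rule over a few yes/no checks per site. A hypothetical sketch (the flag names are invented for this illustration, not M-H’s actual checklist):

```python
# Illustrative only: mapping a course site to one of the six 'modes'
# described in the comment above, from invented yes/no flags.

def classify_site(has_activities, assessed, replaces_f2f, fully_off_campus=False):
    """Return the mode label (A1, A2, B1, B2, B3, or C1) for a site."""
    if fully_off_campus:
        return "C1"  # fully flexible: completable almost entirely off campus
    if not has_activities:
        return "A1"  # informational: resources only
    if assessed and replaces_f2f:
        return "B3"  # blended: replacement and assessment
    if replaces_f2f:
        return "B2"  # blended: replacement of face-to-face time
    if assessed:
        return "B1"  # blended: assessed participation
    return "A2"      # supplemental: activities, but not assessed
```

Even a coarse rule like this, applied during the pre-release site checks M-H describes, yields the kind of coverage stats that can go to Deans.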
