Or, an attempt to share some thinking about the idea behind an – almost obligatory – application for external funding.
The car analogy
A few weeks ago one of my neighbours up the road had left the lights of his ageing Mitsubishi Magna on. They were on all night. As an older car – arguably of questionable quality – the Magna allowed him to get out of the car with the lights still on. I believe he got out of the car during daylight and it wasn’t obvious to him that his lights were still on. The car, his tool for driving, didn’t warn him of this problem.
On the other hand, one of the cars my family owns is a Honda Jazz. It’s almost as old as the Magna, but arguably Honda have put in a bit more thought. If you remove the keys from the ignition and the headlights are still on, the car starts an incessant and annoying beeping. The car “has a bit of smarts” built in. It warns the driver that there’s a problem.
The other car we own is a VW Golf GTI (see photo). I love this car for a variety of reasons. One of the very minor reasons is that if you remove the key from the ignition and the headlights are still on, the car turns the lights off. It also has automatic windscreen wipers that do a pretty good job. If it rains, they start.
The “LMS” is like a Magna
The Learning Management Systems used by most Universities remind me a great deal of the Magna (perhaps a Model T Ford is a better example). They don’t contain a lot of smarts. If something is going to happen within the LMS, then either the students or academics using the LMS have to do it. The LMS doesn’t provide much assistance for the people using it when they fail to pick up on a problem, much like the Magna didn’t warn my neighbour that his headlights were still on.
Using analytics to produce a Golf GTI
Some colleagues and I are currently throwing around an idea to use analytics to make the LMS (in our case Moodle) – or perhaps the broader institutional learning environment – more like a Golf GTI than a Magna.
Perhaps the original, or at least the best known, example of this is the Signals work at Purdue University. In part, we’d be looking to replicate something like this, but that is only part of the story. We’d also be aiming to identify how a range of the other patterns identified through analytics can be leveraged to make the LMS a more pro-active member of the distributed cognitive system that is learning and teaching within a university (i.e. make the LMS more like the Globe Theatre). The theory is that if the quality of the cognitive processes within the institutional learning systems is better, then the (student learning) outcomes should also be better.
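To make the idea a little more concrete, here is a minimal sketch of a Signals-style “traffic light” indicator. Everything in it is hypothetical – the function name, the inputs and the thresholds are invented for illustration; a real system would draw on actual Moodle activity logs, assessment results and prior performance, not two hand-picked numbers.

```python
# A hypothetical Signals-style "traffic light" classifier.
# Inputs and thresholds are illustrative only, not from any real system.

def traffic_light(logins_per_week, assessments_submitted, assessments_due):
    """Classify a student's engagement as 'green', 'amber' or 'red'."""
    submission_rate = (
        assessments_submitted / assessments_due if assessments_due else 1.0
    )
    if logins_per_week >= 3 and submission_rate >= 0.8:
        return "green"   # on track: no action needed
    if logins_per_week >= 1 and submission_rate >= 0.5:
        return "amber"   # some risk: nudge the student
    return "red"         # at risk: prompt the teacher to intervene

# e.g. a student logging in twice a week with 1 of 2 assessments in
print(traffic_light(2, 1, 2))  # amber
```

The interesting design question is less the classifier itself than what the LMS does with the result – whether it simply displays a colour (like the Jazz’s beeping) or takes some action on the student’s or teacher’s behalf (like the Golf turning the lights off).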
This is one approach to responding to the learning analytics challenge identified by Dawson et al. (2008, p. 222):

“no longer simply to generate data and make it available, but rather to readily and accurately interpret data and translate such findings to practice”
How might it happen?
Much of what happens around analytics is driven from the top-down, and there’s a place for that. An alternative that I’m keen to explore with this project is what happens when the question of analytics, the LMS and distributed cognition is examined from the perspective of the teaching staff and the students. What different questions and tools might be useful? This perspective drives the following initial suggested process:
- Continue the identification and examination of various patterns in the usage data.
- Identify a small set of courses – the initial project participants – and:
  a. Identify the issues and aims they have for their courses.
  b. Explore whether there are insights from analytics, and potential actions they (and the LMS) can take, to address those issues/aims.
  c. Modify the LMS environment as a result and observe what happens.
  d. Return to (a).
- Broaden the release of these changes to other courses and observe what happens.
Well, that’s an initial stab. More work to do. More reading to do.