Another day and another #ascilite12 paper to think about. This is the one where my periodic engagement is potentially driving my co-author slightly crazy. I’m sure this contribution will add further to that.
The basic idea of the paper is to
- Draw on a few more insights/patterns from the data gathered as part of the Indicators project. This includes usage data from a single Australian university across three different "Learning Management Systems" over a 10+ year period.
- Use the lens of complex adaptive systems to:
  - Identify potential problems with existing trends in the application of learning analytics.
  - Identify some potential alternative applications of learning analytics.
The idea builds on some of Col’s earlier thinking in this area and is likely to inform some of the next steps we take in this area.
The problems that we seem to have identified so far are:
- The hidden complexity behind the simple patterns – abstraction losing detail.
- Decomposition preventing action.
- It's not a causal system:
  - correlation/causation confusion;
  - the "overweening conceit of causality".
There must be more of these problems. I do wonder if a closer reading of some of the CAS literature would provide more insights.
For each of these problems we're hoping to:
- Illustrate the nature of the problem with evidence from the data.
- Offer insight into why it is a problem, drawing on complex adaptive systems theory. Morrison (2006) gives a good overview of CAS, some of its applications to education, and some limitations.
- Suggest a better approach based on the insights from CAS.
Hidden complexity – abstraction losing detail
This in part picks up and illustrates the point Gardner Campbell made in his LAK'12 presentation "Here I stand": that learning analytics, through its reliance on abstracting patterns or relationships from data, has a tendency to hide the complexity of reality. Especially when it is used for decision making by people who are not directly engaged in that reality.
Col has illustrated this in this post using the traditional relationship between LMS use and grades (more use == better grades). The nice trend gets interrupted once you start looking at the complexity behind it. For example, one student who achieved an HD in every course had widely varying numbers of posts/replies across those courses. Similar to Ken's discoveries when looking at his own teaching: the same academic showing widely varying practices across a few different courses.
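The shape of the problem can be sketched in a few lines of Python. The numbers below are entirely invented for illustration (they are not from the Indicators data): the grade-band averages show the tidy "more use == better grades" trend, while the spread inside the HD band shows the variation that the averages hide.

```python
from statistics import mean

# Hypothetical, synthetic data: forum posts per course for students
# grouped by final grade. Invented numbers, not the Indicators data.
posts_by_grade = {
    "F":  [0, 1, 2, 0, 3],
    "P":  [4, 6, 5, 7, 3],
    "C":  [8, 9, 7, 10, 6],
    "D":  [12, 11, 14, 10, 13],
    "HD": [2, 40, 9, 55, 18],   # same grade, wildly different activity
}

# The aggregate pattern looks tidy: the mean rises with the grade...
for grade, posts in posts_by_grade.items():
    print(f"{grade:>2}: mean={mean(posts):5.1f}  spread={max(posts) - min(posts)}")

# ...but the spread within the HD band dwarfs the differences between
# the grade means, which the averages alone never show.
```

The point of the sketch is simply that the same summary statistic is consistent with very different underlying realities.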
A concrete example is management passing a rule that every course must have a course website that includes a discussion forum, even when an entirely on-campus course has entirely appropriate reasons to decide a forum isn't needed.
Decomposition preventing action
The structure and organisation of universities are based on top-down decomposition. All of the required functions are identified and divided among various parts: HR, IT, the teaching academics in faculties, and so on. With each decomposition there is loss of the whole. Each sub-component starts to focus on its own bit. This is where you get IT departments focusing on uptime, security and risk regardless of the effects this has on the quality of learning.
You can see the effect of this in the learning analytics literature. ALTC grants having to take a browser/client-based approach to tool development because the IT department won’t provide access to the database. It’s one of the reasons why the Indicators project is a little further ahead than most, even though we are a very small group. Through a series of accidents we had access to data and the skills necessary to do something with it.
The effect is also visible in the location of data. Student results sit in the student records system, LMS activity in the LMS, and so on. This is one reason "dashboards" have become the organisational solution to learning analytics: they bring the data into a single system maintained by the institution's data mining folk, and it's what those folk already do. The trouble is that the real value of the patterns these systems reveal is within the learning environment (the LMS for most), not in yet another system – a contributing factor to the limitations of the tools and the difficulty staff and students have using them.
The major difficulty for learning analytics is that action in response to learning analytics takes place at the teaching/learning coal-face, not in the dashboard system or the other places inhabited by senior management and support staff.
It’s not a causal system
University senior management assume that they can manipulate the behaviour of people. Take, for example, the lovely quote I often use from an LMS working group, where one of the technical people suggested "that we should change people's behaviour because information technology systems are difficult to change". In a complex system there simply isn't that kind of causality.
For example, when Moodle was introduced at the institution in question there was grave concern about how few Blackboard course sites actually contained discussion forums. The solution was the implementation of "minimum course standards", accompanied by a checklist ticked by the academic and double-checked by the moderator to assure certain standards were implemented, e.g. a discussion forum. Subsequent data reveals that while all courses may have had a discussion forum, a significant proportion had forums with fewer than 5 posts. This is the mistaken "overweening conceit of causality".
Then there is the obvious confusion between correlation and causation: simply because HD students average more use of the LMS doesn't mean that if all students use the LMS more they'll get better marks.
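A toy simulation makes the distinction concrete. In the (entirely invented) model below, a hidden "engagement" level drives both LMS posts and marks, so high-use students do average better marks – but forcing a student to post more changes their posts, not their engagement, so under this model it would leave their expected mark untouched.

```python
import random

random.seed(1)

def student():
    # Hypothetical model: a hidden common cause ("engagement") drives
    # both LMS use and marks. Invented, not the Indicators data.
    engagement = random.random()
    posts = max(int(engagement * 20 + random.gauss(0, 2)), 0)
    mark = 40 + engagement * 50 + random.gauss(0, 5)
    return posts, mark

cohort = [student() for _ in range(500)]

# The correlation is real: high-use students do average better marks...
high = [mark for posts, mark in cohort if posts >= 10]
low = [mark for posts, mark in cohort if posts < 10]
print(f"high-use mean mark: {sum(high) / len(high):.1f}")
print(f"low-use mean mark:  {sum(low) / len(low):.1f}")

# ...but posts never appear in the formula for mark, so intervening
# on posts alone cannot move marks in this model.
```

The correlation emerges purely from the shared cause; mandating more LMS use is an intervention on the symptom, not the cause.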
Okay, so given these problems, what might you do differently? A few initial suggestions:
- Put the focus on technology that aids sense-making and action, especially by academics and students.
The technology can't know enough about the context to make decisions. It can, however, help the people who can.
- Put this into the learning environment currently used by these folk (i.e. the LMS).
It has to be a part of that environment. If it's separate, it won't be used.
- Break down the silos.
Currently, much of learning analytics sits within a course or across an institution, or is perhaps focused on a specific student. Academics within a program need to be sharing their insights and actions. Students need to be able to see how they are going against others…
This is not meant to represent a new direction for the practice of learning analytics; rather, it is one interesting avenue for further research.
Morrison, K. (2006). Complexity theory and education. APERA Conference, Hong Kong (pp. 1-12). Retrieved from http://edisdat.ied.edu.hk/pubarch/b15907314/full_paper/SYMPO-000004_Keith Morrison.pdf