Gardner captures one of my major concerns with how analytics may proceed, especially within institutions that are increasingly driven by accountability, efficiency and similar pressures, and that respond to those pressures with top-down management. Gardner uses the metaphor of the human mind/learning being as complex as M-Theory (actually more complex), with learning analytics as commonly conceived being equivalent to measuring M-Theory with a simple cartesian graph.
The end result is that it simplifies learning, and how we treat it, to an extent that is meaningless.
He connects this view of analytics with the LMS approach to e-learning and the traditional nature of curriculum, all of which sit in the simple domain. Learning analytics just continues this. Lots of imagery of school as a feedlot or a Skinner box.
Four strong cautions

Gardner talks about four strong cautions for analytics:
- “Student success”
Typically defined within analytics as the student passing, which doesn't mean the same as succeeding in life, e.g. the example given of a high-performing high school student with no idea of what to do next.
A lot on this that resonates, more below.
- Points of “Intervention”
Just one idea: an analytics system that, rather than intervening just before the student fails (as most current analytics projects attempt), intervenes just as the student begins to understand.
- The “Third wave”
Draws on John Naughton – From Gutenberg to Zuckerberg: What you really need to know about the Internet (2012) – to illustrate “Complexity is the new Reality”
- Feedback matters – a lot
- Systems demonstrate self-organisation
- EMERGENCE – synergies – new phenomena
Naughton also talks about double loop learning, where success means more than positive outcomes “relative to pre-established targets” – which sounds very much like learning objectives. Instead it means that learners “need to reflect on the appropriateness, in the light of unfolding events, of the assumptions (the mental model) used to set up those actions and targets”.
Gardner’s quote (or close to it somewhere in here) is “A real learning analytics system must be able to learn.”
Also mention of the Paradox of the Active User.
Sheila MacNeill offers another summary of Gardner’s talk and points to other work. It was from Sheila’s post that I came across Exploiting activity data in the academic environment, a somewhat broader example of analytics that includes some useful insights into privacy and other issues around the data.

Sheila identifies some very useful questions:
What are the really useful “analytics” which can help me as a learner, teacher, administrator, etc? Which systems provide that data just now? How can/do these stakeholders access and share the data in meaningful ways? How can we improve and build on these systems in ways which take into account the complexity of learning? Or, as Gardner said, how can we start framing systems and questions around wisdom?
Implications for the indicators project
The types of questions identified are exactly the areas the Indicators Project was (and is about to start again) attempting to explore. The point about complexity is also timely, as that is the perspective that will underpin our work. Consequently I will be reading a bit more of Naughton.
I especially like the point about double loop learning, for three main reasons:
- It captures one important distinction between traditional business intelligence approaches and what we hope to do with the indicators project.
- It highlights how we’d like to use analytics, i.e. to help university academics engage in double loop learning about how and why they teach.
- It frames a concern I have about the outcomes focus of much university education, i.e. we’re measuring students against outcomes we think are important and have established ahead of time, rather than asking them to reflect on their assumptions and mindsets.
In particular, I’m thinking this might be an interesting point of departure for thinking about how courses I’m responsible for might evolve.