As part of the LAK11 course Howard Johnson has commented on an earlier post of mine. This post is a place holder for a really nice quote from Howard’s post, an example from recent media reports, and perhaps a bit of a reflection on responses to analytics.
The quote, some reasons and an example
I like this quote because it summarises what I see as the most common problem with the institutions I’ve been associated with, especially in recent years, as there has been a much stronger move toward the adoption of techno-rational approaches to management.
A utopian leaning vision can only be achieved with hard work and much effort, but a dystopian vision can be achieved with only minimal effort.
Improving learning and teaching within a modern university context is a complex task. There is no one right solution, there is no simple solution, no silver bullet. Improving learning and teaching is really hard work.
The trouble is that short-term contracts for senior management (which at some institutions now reach down to what were essentially head-of-school roles) and other characteristics of the organisational context mean that it is simply not possible for that really hard work to be undertaken. The organisational characteristics of Australian universities are increasingly biased towards the easy route: something that can be implemented quickly, appear to return good results, and enable a senior manager to boast about it when attempting to renew his/her contract and/or apply for a better job at a better institution.
Poor and disadvantaged students were clear winners, with university offers to students from low socio-economic backgrounds increasing by 8 per cent, following the higher participation targets set by the federal government after the 2008 Bradley review of higher education.
I find it very hard to believe that all of these institutions have adopted a utopian vision that has seen their learning and teaching practices, policies, resourcing and systems appropriately updated to respond to the very different needs and backgrounds of these students, including the necessary re-visiting of the curriculum and learning designs used in their large introductory courses. These are the courses such students will face first, and the courses which, at most institutions in most disciplines, already have significant failure rates.
Instead, I see it as much more likely that they’ve simply changed who they’ve accepted. At best, they may have thrown some additional resources (an extra warm body or two) at some central support division that is responsible for helping these students. These folk may even have had a couple of meetings with the staff who teach those first year courses.
This is not to suggest there aren’t some brilliant folk doing fantastic work, both in the central divisions responsible for the bridging and orientation of these students and in the teaching of large first year courses. It is to suggest that this work is usually done in spite of the organisational vision, not because of it. It is also to suggest that such work is almost certainly neither repeatable nor sustainable. My guess is you could go to any institution boasting about how well it serves these students and, by selectively removing a handful of people, cause the edifice of good practice to fall apart. The institutional systems wouldn’t be able to continue the good practice in the absence of those key folk.
The utopian vision professed by these institutions will be the result of the hard work of a few who have generally had to battle against the institutional vision and context.
One utopian vision for learning analytics
As Howard suggests, much of the discussion of analytics has focused on the dystopian vision. It’s the vision I see as the most likely outcome, at least in the current institutional context.
But at the same time, I do believe that some applications of analytics can help improve the learning and teaching experience of students and staff. It’s important to be aware of and keep highlighting the dystopian vision, but it’s also important – and perhaps past time – to develop and move towards a utopian vision, or at least to learn from trying. The following is an early attempt at formulating one such vision, and it connects with some of what I’ve been trying to do. It does assume an institutional context for learning – that’s what I’m familiar with – and I’m not sure how much of it would be useful outside that context.
Having just listened to John Fitz’s presentation via the lak11 podcast, I’d like to pick up on a notion he mentioned: the self-regulated learner, and the idea that analytics can provide useful assistance to that learner. A brief and incomplete summary of John’s point would be that there is value in providing the learner with the information generated by analytics, in order to enable the learner to make their own decisions.
I would like, however, to expand that idea to the notion of the self-regulated teacher and the potential benefits that analytics can provide them. From my perspective there are at least three broad types of learner involved in any institutional learning context. They are:
- The formal student learner enrolled in a course/program.
These folk are primarily interested in learning the “content” associated with the course.
- The formal teacher learner charged with running the course/program.
These folk are/should be primarily interested in learning how they can improve the learning experience of the student.
- The institutional learner within which the course/program is offered.
These “folk” are/should be primarily interested in learning how to improve the learning experience of the students and teachers within the institution. Similar to Biggs’ (2001) quality feasibility ideas. Though they are more often primarily interested in defining the learning experience, rather than engaging with and improving existing practices.
At this stage, I’m interested in how analytics can be used to help learner types 1 and 2. I’m keen on changing the learning/teaching environment for these learners in ways that help them improve their own practice (which I see as the task for learner type 3, and the task they aren’t doing). For right or wrong, at most of the higher education institutions I’m associated with, the learning environment means the LMS – at least in terms of the contributions I might be able to make.
My small-scale utopian vision is the modification of the LMS environment to effectively bake in analytics-informed services and modifications that can help student and teacher learners become more aware of possibly relevant improvements to their practice. Some examples include:
- Some early, unfinished work on an indicators block for Moodle.
- Michael de Raadt’s working progress bar for Moodle.
- Shane Dawson’s SNAPP tool.
However, I don’t think these examples go far enough. There’s something missing. Additional thought needs to be given to the insights from the behaviour change literature which suggests that simply knowing about something isn’t sufficient to encourage change in behaviour.
This leads to the idea of scaffolding conglomerations. One idea for such a conglomeration might be to:
- Embed SNAPP into an LMS (e.g. Moodle).
At the moment, SNAPP is a browser-based tool, so it can only generate visualisations based on data in courses the user has access to. For most people in most LMSs, this means you are limited by the course divisions fundamental to LMS design: you can’t see and act upon the social networks evident in other courses.
- Build around SNAPP some responses based on common patterns.
One example might be a “Prompt all isolated students” feature that would present the academic with a template email (designed from insights from theory or experience) that can be sent automatically to all discussion forum participants who aren’t connected to others. It might automatically include some statistics comparing success rates between students who are isolated and those who are connected.
- Enable user-contribution of common responses.
Enable staff to add their own pattern response sequences.
- Link SNAPP data with other Moodle and institutional data.
Allow staff and students to see additional anonymised information within the SNAPP visualisations, e.g. shade red all those students who exhibit network connections similar to those of students who have failed the course previously.
- Provide links to resources about good practice.
When SNAPP detects a pattern where one person (e.g. the teacher) is the focal point of all interaction within a discussion forum, it could provide a link to the literature and instructional design practice that suggests this is problematic and identifies approaches for modifying practice.
- Make SNAPP data visible to other teachers within a cohort.
All teachers within the psychology courses could see the network visualisations in each other’s courses, thereby making the social norms within those courses visible and open for discussion.
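To make the “Prompt all isolated students” idea above a little more concrete, here is a minimal Python sketch of the underlying pattern detection. It is only an illustration: the reply-pair input format, the `find_isolated` and `draft_prompts` names, and the email wording are my assumptions, not part of SNAPP’s or Moodle’s actual APIs.

```python
def find_isolated(posts):
    """Return forum participants with no reply connections.

    posts: list of (author, replied_to) tuples, where replied_to is
    None for a thread-starting post. A participant is "isolated" if
    they neither replied to anyone nor received a reply.
    """
    participants = set()
    connected = set()
    for author, replied_to in posts:
        participants.add(author)
        if replied_to is not None:
            participants.add(replied_to)
            connected.add(author)
            connected.add(replied_to)
    return participants - connected


# Hypothetical template wording - in practice this would be designed
# from insights from theory or experience, as suggested above.
TEMPLATE = (
    "Hi {name},\n"
    "I noticed you haven't yet joined the conversation in the course "
    "forum. Students who engage with their peers tend to do better in "
    "this course. Is there anything I can help with?"
)


def draft_prompts(posts):
    """Draft one templated email per isolated participant."""
    return {name: TEMPLATE.format(name=name)
            for name in sorted(find_isolated(posts))}
```

For example, given posts `[("alice", None), ("bob", "alice"), ("carol", None)]`, carol has started a thread but neither replied to anyone nor received a reply, so she is the only participant flagged for a prompt.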
Time to stop worrying about the dystopian vision (and writing about a potential utopian one) and start doing something. As per the Alan Kay quote:
Don’t worry about what anybody else is going to do… The best way to predict the future is to invent it.