Category Archives: reflectivealignment

Learning Analytics: engaging with and changing learning and teaching

The following builds a bit more on an earlier idea around the use of learning analytics. It attempts to frame a different approach to the use of learning analytics and to share these ideas in preparation for a potential project.

In part, the project is based on the assumption that the current predominant applications of learning analytics are:

  1. By management, as a tool to enable “data-based” decision making.
  2. By students, via tools that allow them to reflect on their learning.
  3. By researchers.

And that, as identified by Dawson, Heathcote & Poole (2010), there is a

lack of research regarding the application of academic analytics to inform the design, delivery and future evaluations of individual teaching practices

i.e. while the existing applications of learning analytics by/for management, researchers and students are important and should continue, there is a need to explore how learning analytics can be used by teaching staff to inform and improve their practice.

The theoretical bases of the current project idea are, in summary:

  1. Drawing on Seely Brown and Duguid’s (1991) ideas about how “abstractions detached from practice distort or obscure intricacies of that practice”, there is value in examining what learning analytics might do by focusing on an in-depth engagement with actual academic practice, to better enable exploration, understanding and innovation around the application of learning analytics to individual teaching practices.
  2. The quality of student learning outcomes is influenced by the conceptions of learning and teaching, and the perceptions of the teaching environment held by teaching staff (Trigwell, 2001; Prosser et al, 2003; Richardson, 2005; Ramsden et al, 2007).
  3. Learning analytics can be useful in revealing different and additional insights about what is going on within a course (and in other courses).
  4. Transforming the insights from learning analytics into informed pedagogical action is, for the majority of academics, complex and labour intensive (Dawson et al, 2010).
  5. Distributed leadership – built on foundations of distributed cognition and activity theory – seeks to distribute power (the ability to get things done) through a collegial sharing of knowledge, practice, and reflection within a socio-cultural context (Spillane et al, 2004; Parrish et al, 2008).
  6. Encouraging academics to engage in reflection on their teaching is an effective way to enhance teaching practice and eventually student learning (Kreber and Castleden, 2009).

Consequently, the project seeks to engage groups of academics in cycles of participatory action research in which they are encouraged and enabled to explore and reflect on the courses they have taught, using various learning analytics tools and other lenses. In preparation, a range of existing analytics tools and forms of analysis will be applied to the courses. In response to the cycles, the tools/analyses may be modified or new ones created. In particular, the tools will be modified/developed to make it easier for academics to transform the information provided by learning analytics into informed pedagogical action.
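To make that concrete, here is a minimal sketch of one kind of lens that might be applied to a course: per-student weekly activity counts derived from an LMS activity log. This is an illustration only, not one of the project’s actual tools; the file name and the “student”/“timestamp” columns are assumptions, not a real export format.

```python
# A minimal sketch of one possible analytics "lens": per-student weekly
# activity counts from a hypothetical LMS activity log. The file name and
# the "student"/"timestamp" columns are assumptions, not a real export format.
import csv
from collections import Counter
from datetime import datetime

def weekly_activity(log_path: str) -> Counter:
    """Count LMS hits per (student, ISO week) from a CSV activity log."""
    counts = Counter()
    with open(log_path, newline="") as log:
        for row in csv.DictReader(log):
            week = datetime.fromisoformat(row["timestamp"]).isocalendar()[1]
            counts[(row["student"], week)] += 1
    return counts

if __name__ == "__main__":
    for (student, week), hits in sorted(weekly_activity("activity_log.csv").items()):
        print(f"{student}\tweek {week}\t{hits} hits")
```

The point of the action research cycles described above is that even a simple lens like this would be reshaped in response to what academics actually find useful.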

The project will also explore how the tools can be modified to enable the sharing of knowledge, practice and reflection between the participants and, eventually, the broader academic community; that is, to break down course-based silos and make it easier for academics to see what other staff have done and with what impact.

References

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116-128. doi:10.1108/09513541011020936

Kreber, C., & Castleden, H. (2009). Reflection on teaching and epistemological structure: Reflective and critically reflective processes in ‘pure/soft’ and ‘pure/hard’ fields. Higher Education, 57(4), 509-531.

Parrish, D., Lefoe, G., Smigiel, H., & Albury, R. (2008). The GREEN Resource: The development of leadership capacity in higher education. Wollongong: CEDIR, University of Wollongong.

Prosser, M., Ramsden, P., et al. (2003). Dissonance in experience of teaching and its relation to the quality of student learning. Studies in Higher Education, 28(1), 37-48.

Ramsden, P., Prosser, M., et al. (2007). University teachers’ experiences of academic leadership and their approaches to teaching. Learning and Instruction, 17(2), 140-155.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673-680.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

Spillane, J., Halverson, R., & Diamond, J. (2004). Towards a theory of leadership practice: A distributed perspective. Journal of Curriculum Studies, 36(1), 3-34.

Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65-73.

Schemata and the source of dissonance?

The following is intended to be an illustration of one of the potential origins of the gap between learning technologists and educators. It picks up on the idea of schemata from this week’s study in one course and connects to my point about the dissonance between how educational technology is implemented in universities and what we know about how people learn.

I’m sure folk who have been around the education discipline longer than I will have seen this already. But it is a nice little activity and not one that I’d seen previously.

An experiment

Read the following paragraph and fill in the blanks. If you’re really keen add a comment below with what you got. Actually, gathering a collection of responses from a range of people would be really interesting.

The questions that p________ face as they raise ch________ from in_________ to adult are not easy to an _________. Both f______ and m________ can become concerned when health problems such as co_________ arise anytime after the e____ stage to later life. Experts recommend that young ch____ should have plenty of s________ and nutritious food for healthy growth. B___ and g____ should not share the same b______ or even be in the same r______. They may be afraid of the d_____.

Now, take a look at the original version of this paragraph.

Is there any difference between it and what you got? Certainly was for me.

Schemata

This exercise was introduced in a week that was looking at Piaget and other theories about how folk learn. In particular, it was used to illustrate the role schemata play in how people perceive and process the world and what is happening within it.

I am a father of three wonderful kids. So, over the last 10+ years I’ve developed some significant schemata around raising kids. When I read the above paragraph, the words that filled the blanks for me were: parents, children, infant, answer, fathers, mothers… and it was here that I first paused. None of my children really suffered from colic, so that didn’t spring to mind, but I started actively searching for ways I could make this paragraph fit the schemata that I had activated. i.e. I was thinking “parent”, so I was trying to make these things fit.

Schemata are mental representations of an associated set of perceptions and the like. They influence how you see what is going on.

I’m somewhat interested in seeing what words others have gotten from the above exercise, especially those without (recent) experience of parental responsibilities.

A difference of schemata

Learning technologists (or just plain innovative teachers) have significantly different schemata from your average, everyday academic. Especially from those who haven’t had much experience of online learning, constructivist learning, *insert “good” teaching practice of your choice*. Even within the population of learning technologists there is a vast difference in schemata.

Different schemata means that these folk see the world in very different ways.

A triumph of assimilation over accommodation

The on-going tendency of folk to say things like “Online no substitute for face to face teaching” (as in an article from the Australian newspaper’s higher education section) says something about their schemata and (to extend the (naive/simplistic) application of Piaget) the triumph of assimilation over accommodation.

For Piaget, people are driven to maintain an equilibrium between what they know and what they observe in the outside world. When they perceive something new in the world they enter a state of disequilibrium and are driven to return to balance. For Piaget, there are two ways this is done.

  1. Assimilation – where the new insight is fitted into existing schemata.
  2. Accommodation – where schemata are changed (either old are modified or new are created) to account for the new insights.

I’d suggest that for a majority of academic staff (and senior management) when it comes to new approaches to learning and teaching their primary coping mechanism has been assimilation. Forcing those new approaches into the schemata they already have. i.e. the Moodle course site is a great place to upload all my handouts and have online lectures.

As I’ve argued before I believe this is because the approaches used to introduce new learning approaches in universities have had more in common with behaviourism than constructivism. Consequently the approaches have not been all that successful in changing schemata.

30% of information about task performance

Over on the Remote Learner blog, Jason Cole has posted some information about a keynote by Dr Richard Clark at one of the US MoodleMoots. I want to focus on one key quote from that talk and its implications for Australian higher education and current trends to “improve” learning and teaching and adopt open source LMS (like Moodle).

It’s my argument that this quote, and the research behind it, has implications for the way these projects are conceptualised and run. i.e. they are missing out on a huge amount of potential.

Task analysis and the 30%

The quote from the presentation is

In task analysis, top experts only provide 30% of information about how they perform tasks.

It’s claimed that all the points made by Clark in his presentation are supported by research. It appears likely that the support for this claim comes from Sullivan et al (2008). This paper address the problem of trying to develop procedural skills necessary for professions such as surgery.

The above quote arises due to the problems experts have in describing what they do. Sullivan et al (2008) offer various descriptions and references of this problem in the introduction

This is often difficult because as physicians gain expertise their skills become automated and the steps of the skill blend together [2]. Automated knowledge is achieved by years of practice and experience, wherein the basic elements of the task are performed largely without conscious awareness [3]. This causes experts to omit specific steps when trying to describe a procedure because this information is no longer accessible to conscious processes [2]

Then later, when describing the findings of their research they write

The fact that the experts were not able to articulate all of the steps and decisions of the task is consistent with the expertise literature that shows that expertise is highly automated [2,3,5] and that experts make errors when trying to describe how they complete a task [3,6,7]. In essence, as the experts developed expertise, their knowledge of the task changed from declarative to procedural knowledge. Declarative knowledge is knowing facts, events, and objects and is found in our conscious working memory [2]. Procedural knowledge is knowing how to perform a task and includes both motor and cognitive skills [2]. Procedural knowledge is automated and operates outside of conscious awareness [2,3]. Once a skill becomes automated, it is fine-tuned to run on autopilot and executes much faster than conscious processes [2,8]. This causes the expert to omit steps and decision points while teaching a procedure because they have literally lost access to the behaviors and cognitive decisions that are made during skill execution [2,5].

The link to analysis and design

A large number of universities within Australia are either:

  1. Changing their LMS to an open source LMS (e.g. Moodle or Sakai), and using this as an opportunity to “renew” their online learning; and/or
  2. Busy on broader interventions to “renew” their online learning due to changes in government policies such as quality assurance, graduate attributes and a move to demand-driven funding of university places.

The common process being adopted by most of these projects comes from the planning school of process: you undertake analysis to identify all relevant, objective information and then design the solution on that basis. You then employ a project team to ensure that the design gets implemented, and finally you put in place a skeleton team that maintains the design. This applies whether the project is an information system (e.g. the selection, implementation and support of an LMS) or broader organisational change (e.g. strategic plans).

The problem is that the “expert problem” Clark refers to above means that it is difficult to gather all the necessary information. It’s difficult to get the people with the knowledge to tell all that they know.

Here’s a related example.

The Staff MyCQU Example

Some colleagues and I – over a period of almost 10 years – designed, supported, and evolved an information system called Staff MyCQU. An early part of its evolution is described in the “Student Records” section of this paper. It was a fairly simple web application that provided university staff with access to student records and a range of related services. Over its life cycle, a range of new and different features were added and existing features tweaked, all in response to interactions with the system’s users.

Importantly, the system’s developers were also generally the people handling user queries and problems on the “helpdesk”. Quite often, those queries and problems would result in tweaks and changes to the system. Rather than being designed up front, the system grew and changed with the people using it.

The technology used to implement Staff MyCQU is now deemed ancient and, even more importantly, the system and what it represents are now politically tainted within the organisation. Hence, for the last year or so, the information technology folk at the institution have been working on replacement systems. Just recently, there have been some concrete outcomes of that work, which has resulted in systems being shown to folk, including some of the folk who had used Staff MyCQU. On being shown a particular feature of the new system, it soon became obvious that the system didn’t include a fairly common extension of the feature, an extension that had actually been within Staff MyCQU from the start.

The designers of the new system, with little or no direct connection with actual users doing actual work, don’t have the knowledge about user needs to design a system that is equivalent to what already exists. This is a perfect example of why the strict separation of analysis, design, implementation and use/maintenance that is explicit in most IT projects and divisions is a significant problem.

The need for growing knowledge

Sullivan et al (2008) suggest cognitive task analysis as a way of better “getting at” the knowledge held by the expert, and there’s a place for that. However, I also think there is a need to recognise that the engineering/planning method is simply not appropriate for some contexts. In those contexts, you need more of a growing/gardening approach. Or, in some cases, you need to include more of the growing/gardening approach in your engineering method.

Rather than seeking to gather and analyse all knowledge separate from practice and prior to implementation, implementation needs to be designed to pay close attention to the knowledge that is generated during implementation, and to enable action upon that knowledge.

Especially for wicked problems and complex systems

Trying to improve learning and teaching within a university is a wicked problem. There are many different stakeholders, or groups of stakeholders, each with a different frame of reference, which leads to different understandings of how to solve the problem. Simple techno-rational solutions to wicked problems rely on adopting one of those frames of reference and ignoring the remainder.

For example, implementation of a new LMS is seen as an information technology problem and treated as such. Consequently, success is measured by uptime and successful project implementation, not by the quality of learning and teaching that results.

In addition, as you solve wicked problems, you and all of the stakeholders learn more about the problem. The multiple frames of reference change and consequently the appropriate solutions change. This is getting into the area of complex adaptive systems. Dave Snowden has a recent post about why human complex adaptive systems are different.

Prediction

Universities that lean too heavily on engineering/planning approaches to improving learning and teaching will fail. However, they are likely to appear to succeed, due to the types of indicators they choose to adopt as measurements of success and the capability of actors to game those indicators.

Universities that adopt more of a gardening approach will have greater levels of success, but will have a messier time of it during their projects. These universities will be where the really innovative stuff comes from.

References

Sullivan, M., Ortega, A., et al. (2008). Assessing the teaching of procedural skills: Can cognitive task analysis add to our traditional teaching methods? The American Journal of Surgery, 195, 20-23.

The grammar of school, psychological dissonance and all professors are rather ludditical

Yesterday, via a tweet from @marksmithers, I read this post from the author of the DIYU book titled “Vast Majority of Professors Are Rather Ludditical”. This is somewhat typical of the deficit model of academics, which is fairly prevalent and rather pointless. It’s pointless for a number of reasons, but the main one is that it is not a helpful starting point for bringing about change, as it ignores the broader problem; consequently, most solutions that arise from a deficit model won’t work.

One of the major problems this approach tends to ignore is the broader impact of the grammar of school (first from Tyack and Cuban and then Papert). I’m currently reading The nature of technology (more on this later) by W. Brian Arthur. The following is a summary and a little bit of reflection upon a section titled “Lock-in and Adaptive Stretch”, which seems to connect closely with the grammar of school idea.

Psychological dissonance and adaptive stretch

Arthur offers the following quote from the sociologist Diane Vaughan around psychological dissonance

[In the situations we deal with as humans, we use] a frame of reference constructed from integrated sets of assumptions, expectations and experiences. Everything is perceived on the basis of this framework. The framework becomes self-confirming because, whenever we can, we tend to impose it on experiences and events, creating incidents and relationships that conform to it. And we tend to ignore, misperceive, or deny events that do not fit it. As a consequence, it generally leads us to what we are looking for. This frame of reference is not easily altered or dismantled, because the way we tend to see the world is intimately linked to how we see and define ourselves in relation to the world. Thus, we have a vested interest in maintaining consistency because our own identity is at risk.

Arthur goes on to suggest that “the greater the distance between a novel solution and the accepted one, the larger is this lock-in to previous tradition”. He then labels this lock-in to the older approach adaptive stretch: the situation where it is easier to reach for the old approach and adapt it to the new circumstances through stretching.

Hence professors are ludditical

But haven’t I just made the case, this is exactly what happens with the vast majority of academic practice around e-learning. If they are using e-learning at all – and not simply sticking with face-to-face teaching – most teaching academics are still using lectures, printed notes and other relics of the past that they have stretched into the new context.

They don’t have the knowledge to move on, so we have to make them non-ludditical. This is when management and leadership at universities rolls into action and identifies plans and projects that will help generate non-ludditical academics.

The pot calling the kettle black

My argument is that if you step back a bit further, the approaches being recommended and adopted by researchers and senior management, the way those approaches are implemented, and the way they are evaluated for success are themselves suffering from psychological dissonance and adaptive stretch. The approaches almost without exception borrow from a traditional project management approach and go something like:

  • Small group of important people identify the problem and the best solution.
  • Hand it over to a project group to implement.
  • The project group tick the important project boxes:
    • Develop a detailed project plan with specific KPIs and deadlines.
    • Demonstrate importance of project by wheeling out senior managers to say how important the project is.
    • Implement a marketing push involving regular updates, newsletters, posters, coffee mugs and presentations.
    • Develop compulsory training sessions which all must attend.
    • Downplay any negative experiences and explain them away.
    • Ensure correct implementation.
    • Get an evaluation done by people paid for and reporting to the senior managers who have been visibly associated with the project.
    • Explain how successful the project was.
  • Complain about how the ludditical academics have ruined the project through adaptive stretching.

Frames of reference and coffee mugs

One of the fundamental problems with these approaches to projects within higher education is that they effectively ignore the frames of reference that academics bring to the problem. Rather than starting with the existing frames of reference and building on those, this approach to projects is all about moving people straight into a new frame of reference. In doing this, there is always incredible dissonance between how the project people think an action will be interpreted and how it actually is interpreted.

For example, a few years ago the institution I used to work for (at least as of CoB today) adopted Chickering and Gamson’s (1987) 7 principles for good practice in undergraduate education as a foundation for the new learning and teaching management plan. The project around this decision basically followed the above process. As part of the marketing push, all academics (and perhaps all staff) received a coffee mug and a little palm card with the 7 principles in nice text and a link to the project website. The intent was to increase academics’ awareness of the 7 principles and how important they were to the institution.

The problem was that, at around this time, the institution was going through yet more restructures and there were grave misgivings from senior management about how much money the institution didn’t have. The institution was having to save money, and this was being felt by the academics in terms of limits on conference travel, marking support etc. It is with this frame of reference that the academics saw the institution spending a fair amount of money on coffee mugs and palm cards. Just a touch of dissonance.

What’s worse, a number of academics were able to look at the 7 principles and see principle #4 “gives prompt feedback” and relate that to the difficulty of giving prompt feedback because there’s no money for marking support. Not to mention the push from some senior managers about how important research is to future career progression.

So, the solution is?

I return to a quote from Cavallo (2004) that I’ve used before

As we see it, real change is inherently a kind of learning. For people to change the way they think about and practice education, rather than merely being told what to do differently, we believe that practitioners must have experiences that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice.

Rather than telling academics what to do, you need to create contextualised experiences for academics that enable appropriation of new modes of teaching and learning. What most senior managers at universities and many of the commentators don’t see is that the environment at most universities is preventing academics from having these experiences, and then preventing them from appropriating the new modes of teaching.

The policies, processes, systems and expectations senior managers create within universities are preventing academics from becoming “non-ludditical”. You can implement all the “projects” you want, but if you don’t work on the policies, processes, systems and expectations in ways that connect with the frames of reference of the academics within the institution, you won’t get growth.

References

Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96-112.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3-7.

Functional fixedness, analytics, and the LMS

A blog post on the website of the Gilfus Education Group (apparently a “network of independent education experts”) picks up on the Indicators project and its take on academic analytics. The post seems to be largely in agreement with what we’re doing and the reasons behind it.

The following picks up on a point made in the Gilfus post about the problem arising from ownership of the data, and on some of the other barriers that have been proposed. The argument I develop in the following is that functional fixedness is a major barrier to the effective appropriation of academic analytics to help improve learning and teaching.

But first, an experiment

Imagine, if you will, that we’re in a room together. I’m going to set you a task. Here’s some matches, a box of tacks and a candle (see the image below). Your task is to attach the candle to a cork board on the wall in a way that means that wax from the candle does not drip onto the table that is underneath the cork board.

Candle problem set up

How do you do it?

The solution is given in an image at the end of this post.

Apparently, if I rephrase the problem statement a little, to the following, it might improve your chances of success.

Here’s some matches, a box, some tacks and a candle (see the image below). Your task is to attach the candle to a cork board on the wall in a way that means that wax from the candle does not drip onto the table that is underneath the cork board.

Functional fixedness

If you’re anything like my brother-in-law on whom I tested this out in person, you did not arrive at the solution quickly, if at all. This experiment is called the candle problem and has been used to demonstrate the problem of functional fixedness.

Functional fixedness suggests that you have fixated on the design function of the object – i.e. the box of tacks is designed to hold the tacks – so much that you cannot see how it might be put to a different use to solve this problem. To put it in the words of German and Barrett (2005)

Problem solving can be inefficient when the solution requires subjects to generate an atypical function for an object and the object’s typical function has been primed

In other words, the problem description above had the box’s typical function primed as holding the tacks, hindering your ability to see another use for the box.

Academic analytics, the LMS and functional fixedness

For most universities there is an existing set of information systems. There’s the learning management system (LMS) in which learning takes place, and there is the data warehouse and associated business intelligence tools for providing reports and information. The people within these organisations, especially those already supporting (the IT folk) and using (management) the data warehouse, have been primed to see a typical use for these systems. They are fixated on using the LMS and data warehouse in a particular way.

Add into this mix the typical under-resourcing/inefficient management of IT, and the typical top-down, techno-rational approach to management, and it is simply too difficult for organisational members to see the case for moving aspects of academic analytics into the LMS.

It doesn’t help that it’s messy

The matter isn’t helped much by the benefits of moving aspects of academic analytics into the LMS are somewhat uncertain and messy. Being uncertain and messy aren’t characteristics of an approach likely to overcome functional fixedness. Especially in organisational environments where being efficient (defined as doing what we already do or have strategically planned to do) is the main intermediate goal. But then this is why innovation is hard in organisations, innovation is messy.

References

German, T., & Barrett, H. C. (2005). Functional fixedness in a technologically sparse culture. Psychological Science, 16(1), 1-5.

Solution

The solution to the Candle problem is represented in the following image.

Candle problem solution

Course websites and “libertarian paternalism”

Stephen Downes makes a valid point about my recent question about whether or not academics should manually create course websites. I agree with his underlying point that academics should not be forced to use the institutional approach. Given the option, I would not suggest such an approach.

Incompetent paternalism

However, at least within some Australian institutions, academics are being forced to accept an institutional approach. That approach is typically expressed as “minimum service standards”, which are specified by management. The academics are then expected to manually fulfill those standards. I have a problem with this approach, but if it is being adopted, then at the very least it should be implemented in an efficient and effective way.

What is happening in these situations could be described as “incompetent” paternalism. Academics are being treated as children (a theory X perspective of academics underpins this approach). Management, as the parents, have to specify codes of behaviour. But when it comes to implementing this code of behaviour, management are actually using an inefficient and ineffective approach.

I disagree strongly with this approach, but if management believe in it, is it too much to ask them to do it efficiently and effectively? That’s one perspective; my real interest is in a third way that tries to effectively merge features of both incompetent paternalism and the academic free-for-all.

The “libertarian” paternalist alternative

The model that evolved in the early part of this decade could be described as a “libertarian” paternalist approach. It’s a bit of a stretch but I think the metaphor works.

The theory was/is that an appropriately skilled group, taking an adopter-focused and emergent development approach, could develop a default course site that could effectively be used by a group of courses. That default course site could be automatically created (a sketch of the idea follows). But since the group was using an emergent development approach, the default course site would continue evolving.
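As a rough illustration only, and definitely not the actual implementation, automatic creation amounts to walking a feed of course offerings and instantiating a standard template for each. All file, column and section names below are hypothetical.

```python
# A hypothetical sketch of auto-generating default course sites from a CSV
# feed of course offerings. None of these names come from the real system.
import csv
from pathlib import Path

DEFAULT_SECTIONS = ["Course details", "Staff", "Assessment", "Resources"]

def create_default_site(root: Path, code: str, title: str, coordinator: str) -> None:
    """Create one default course site as a directory of stub HTML pages."""
    site = root / code
    site.mkdir(parents=True, exist_ok=True)
    index = [f"<h1>{code}: {title}</h1>", f"<p>Coordinator: {coordinator}</p>", "<ul>"]
    for section in DEFAULT_SECTIONS:
        slug = section.lower().replace(" ", "-")
        (site / f"{slug}.html").write_text(f"<h2>{section}</h2><p>To be completed.</p>")
        index.append(f'<li><a href="{slug}.html">{section}</a></li>')
    index.append("</ul>")
    (site / "index.html").write_text("\n".join(index))

def build_all(feed: Path, root: Path) -> None:
    """Walk the offerings feed (code,title,coordinator columns) and build sites."""
    with feed.open(newline="") as f:
        for row in csv.DictReader(f):
            create_default_site(root, row["code"], row["title"], row["coordinator"])
```

The design point is that the template lives in one place: as the skilled group evolves the default site, every course that hasn’t opted out picks up the change on the next build.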

The default course site did not remove the academics’ freedom of choice. As implemented, academics could modify the default course sites in two ways:

  • Use the “LMS” to modify or add to the default course site; or
    Here’s an example default course site from 2006.
  • Create a real course site.
    Here’s a default course site where the academic has created a real course site.

This wasn’t a perfect solution, it still wasn’t flexible enough. We had plans to enable better merging of the default and real course sites. i.e. if a real course site was created, it could replace the default course site and make better use of the services offered by the “LMS”. But they never got off the drawing board.

In the almost 8 years that this approach was used, the “LMS” averaged around 300 course sites a year. The proportion of real course sites (i.e. academics doing their own thing) never reached 10% in any given year; it averaged around 4% of course sites a year.

Lack of appropriation

Importantly, where possible, the aim was to observe what everyone was doing, especially the <10% creating the real course sites, and use those insights to modify the default course sites.

The current management approach of specifying minimum standards is being driven by external desires, not by the experience of academics and students using the minimum standards.


The confusion of small and big changes

Over the last couple of days I’ve enjoyed a small discussion that has arisen out of some comments Kevin has made on my blog. This post is an attempt to partially engage with the most recent comment. I echo Kevin’s conclusion: I’d love to hear anyone else’s take on this.

The unanswered question

The main point I’d like to discuss is the question of small versus big changes. I have an opinion on this, but there’s not enough evidence to suggest that it’s an answer. The basic question might be phrased as: How do you improve the quality significant improvement in the quality of L&T in universities? You could make this much more general, along the lines of “How do you change organisational practices?”, but I’m going to stick with the specific.

I’m familiar with two broad responses:

  • Revolutionary (usually top-down) change; and
    This is where the necessary change is identified by someone who eventually gains agreement/the ability to implement the change through some sort of change management process. This usually involves some big change, e.g. the adoption of a new LMS for a university, trashing the LMS and adopting WPMU for L&T, adopting university-wide graduate attributes, requiring every academic to have a formal teaching qualification etc. Or, even more radical, the death of universities and their replacement by something else.
  • Evolutionary (usually bottom-up) change.
    Small-scale changes in practice, usually at the local level.

Kevin’s comment gives a good summary of the common problem with the evolutionary change approach

In my experience, especially at a large institution, taking the “small changes” route is the road to perdition. For me, this means I have to engage in a million little negotiations to get the small to accumulate to something significant. At the rate I’m going it will take me two lifetimes to bring about real change in the English Department.

As I mentioned above, and as indicated by the heading for this section, I don’t have what I would call an answer. I have an argument for the approach I would take and some evidence to support it, but I don’t think I can call it “the answer” (yet).

What I think is the answer

Last year I gave a presentation called Herding cats, losing weight and how to improve learning and teaching (slides and video are available). In that presentation, the analogy used is that revolutionary change is like herding cats and that evolutionary change is like losing weight. Using this analogy, I argue that the herding cats approach to improving the quality of teaching at a university has not worked empirically, and that there is significant theory to explain why it will never work. That same theory suggests that an evolutionary approach, informed by lessons learned from weight loss, is much more promising.

The general solution I suggest is on slide 200 or so (it was only a 60-minute presentation), goes under the title “reflective alignment”, and can be summarised as:

All aspects of the learning and teaching environment are aligned to enable and encourage academic staff to reflect on their teaching with the aim of achieving 3rd order change.

Framed another way, the teaching environment at a university should encourage and enable academics to keep changing their thinking and practice of teaching. That is, they essentially do what they do now, making small changes each time they teach a course, but those changes are no longer constrained by the same ways of thinking about teaching.

Having academics continually making these sorts of 3rd order changes (within an institution that encourages and enables them to make those 3rd order changes) will result (I think) in radically different and significantly improved learning and teaching.

When small changes won’t work

Like Kevin, I think that universities relying on small changes to improve learning and teaching will not work. Mostly because the university environment does not encourage nor enable the type of small scale changes that are required.

In the herding cats presentation, a large part of the time was spent listing the parts of the university teaching environment that actively prevent the type of 3rd order change that is necessary. In fact, much of the bleating in posts on this blog is complaining about these limitations. Some examples include:

  • Rewards that favour research, not teaching.
    No matter how many formal teaching qualifications an academic is forced to acquire, if they get promoted (both at their current and other universities) through the quality of their research, then they will focus on research, not teaching.
  • Pressures arising from quality assurance and simplistic KPIs.
    Since the mid-1990s I’ve observed that it is only the courses with large failure rates or student complaints that get any attention from university management. Students, like most people get scared when their expectations aren’t meant. That means if you try something innovative students will complain. In addition, if you try something innovative you might have problems, which management hate. If you try something different, you are more likely to have to waste time responding to “management concerns”. The presentation references research showing that this is preventing academics from trying innovative work.

    With the rise of quality assurance and corporate approaches to management, this trend is getting worse.

  • Removal of autonomy;
    As I’ve argued in a couple of posts top-down management is removing academic autonomy and perhaps purpose and subsequently reducing academic motivation.
  • Constraining systems;
    Increasingly universities are using information systems to perform learning and teaching. Those systems are designed on particular assumptions that limit the ability to change. The most obvious example is the LMS (be it open or closed source). This recent post includes discussion of this point around the LMS.

    The people, processes and policies within universities are being set up to use these systems. If you use something different, you are being inefficient.

  • Simplistic understandings of innovation.
    When it comes to understanding innovations (e.g. something as simple as a new LMS), universities have naive perspectives of the adoption process. As recognised by Bigum and Rowan (2004) this naive perspective assumes that the innovation passes through the adoption process largely unchanged, which means that the social must conform with the innovation.

    i.e. As the institution starts to adopt Moodle across all its courses, Moodle can and should stay exactly the same. You only need to show people how to use Moodle, nothing more. If what they want to do is not supported by Moodle, then they need to conform to what Moodle does, regardless of the ramifications.

My argument is that all of this and other factors within a university environment actively prevent small changes having broad outcomes. The university environment is actively discouraging 3rd order change and isn’t even very good at achieving 2nd order change.

But small change can’t make a big difference

Ignoring all that, people still baulk at the idea of lots of small changes creating really big change. They are wrong.

To justify that, first let me draw on people recognised as being much smarter and more important than I (Weick and Quinn, 1999)

The distinctive quality of continuous change is the idea that small continuous adjustments, created simultaneously across units, can cumulate and create substantial change.

The main reason people have trouble with this idea (I think) is that they believe the world is ordered and predictable, that the world is an ordered system: if you make a small change, you get a small effect. However, when you’re talking about a complex system, small changes can create radical outcomes.
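A standard toy example (mine, not the author’s or Snowden’s, and chaotic rather than complex-adaptive) makes the narrower point that small differences need not produce small effects:

```python
# The logistic map x' = r*x*(1-x) with r=4: two starting points that differ
# by one part in a million end up on completely different trajectories.
def logistic(x: float, r: float = 4.0) -> float:
    return r * x * (1 - x)

a, b = 0.400000, 0.400001  # a "small change" in the starting condition
for _ in range(40):
    a, b = logistic(a), logistic(b)

print(f"after 40 steps: {a:.6f} vs {b:.6f}")  # radically different outcomes
```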

I don’t have time to expand on this here, it’s talked about in the presentation I mentioned above. Anyway, Dave Snowden and any number of other people make this point better than I.

Big and small change in the wrong place

Here’s a new idea. One of the reasons why I think most universities are failing to improve the quality of their teaching is that they are focusing on big and small change in the wrong places.

In my experience, most universities are trying to make big improvements in teaching by introducing big changes in what academics do: use a different system, use a different pedagogy, radically change your teaching so you are constructively aligned, get a teaching qualification etc. But at the same time, there is no radical change in how the teaching environment works. There are no solutions to the above problems with the environment.

What I am suggesting is that there should be big changes in the environment to enable small changes on the part of the academic. In fact, in the presentation I argue that the aim is to help academics do what good teaching academics have always done (Common, 1989):

Master teachers are not born; they become. They become primarily by developing a habit of mind, a way of looking critically at the work they do by developing the courage to recognise faults, and struggling to improve.

References

Bigum, C., & Rowan, L. (2004). Flexible learning in teacher education: Myths, muddles and models. Asia-Pacific Journal of Teacher Education, 32(3), 213-226.

Common, D. (1989). Master teachers in higher education: A matter of settings. The Review of Higher Education, 12(4), 375-387.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361-386.