Implementing an institution-wide learning and teaching strategy: lessons in managing change

The following is a summary of, and some reflection on, Newton (2003). I’m still trying to decide whether, as I read literature associated with the PhD, I should take the time to produce these summaries. I wonder if, instead, I should concentrate on writing the thesis…

Essentially illustrates that academics have different perspectives on strategy than management – the SNAFU principle perhaps. Suggests a need for better understanding of change and policy implementation. Reinforces much of what I think, with some nice references, but doesn’t necessarily indicate an appreciation of whether ateleological approaches might be more appropriate.


Examines an attempt to implement a “learning and teaching strategy” within the UK HE context. Describes the background through the 90s – very corporate, quality based. Is an institutional case study into how strategy and policy are “responded to” by academic managers, academic staff and students. Identifies a range of factors that can undermine successful policy implementation. Offers lessons which can inform management, in particular management of change.

Of course, there appears to be an immediate assumption that top-down/strategic change is appropriate. Given my previous post, I’d suggest the possibility that prescribing something may not be the best way to go.


This is a follow-on to a previous article (Newton, 1999) about the same institution. The earlier article has more detail on setting, approach etc. The method/approach is described as a “systematic experiment in reflective practice taking the form of an extended conversation with a developing organisational situation”. A “range of methods and sources has been used to provide a basis for stabilizing the views of key actors and groups, including the results of a questionnaire survey and data from a series of typed, semi-structured interviews”.

Does mention following the precepts of an ‘appreciative’ approach (Matza, 1969) – is this linked with appreciative inquiry, of which I have a sense of disquiet and of which Dave Snowden is quite scathing?

Pressures on higher education institutions

Launching into the, by now, fairly traditional setting of the higher education sector – massive change, competition for students, impact of ICTs forcing change in the delivery of education, low funding, demands for efficiency gains…

Some nice quotes/references

What hasn’t changed is the perception that higher education is beset by what one Vice-Chancellor described as ‘grotesque turbulence’ (Webb, 1994, p. 43). All who work in higher education today continue to have to deal with the ‘complex interaction between the planned and the serendipitous’ (Webb, 1994, p. 43).


The growth of external and internal regulation and monitoring became associated with academic deprofessionalisation. This increased accountability, expressed in various areas of policy and strategy, has been characterised by the rise of ‘audit culture’ (Power, 1994), and by what Shore and Wright (2000, p. 57) have termed the ‘rapid and relentless spread of coercive technologies into higher education’.

Not to mention the rise of problems associated with this sort of prescription

By the end of the 1990s, many academics had grown resistant to the ‘intrusion’ associated with the growth of the ‘quality industry’ in UK higher education.

Positions the gap for this research: in the post-Dearing era in the UK there is some caution amongst institutions. Suggests that the profound impact on, and transformation of, academics and teaching has gone under-researched and under-theorised.

Development of a L&T strategy – institutional case study

Sets out the context for this case study.

Institution: a “non-elite” institution, teaching-led rather than research-led, expanded during the 90s, underfunded, experienced organisational turbulence and change leading to unresolved tensions that undermine attempts at improvement, and is yet to resolve challenges raised by external changes.

External regulatory context. Calls for L&T strategies only arose in 1997 with the Dearing report, resulting in two agencies – the Institute for Learning and Teaching and the Quality Assurance Agency. References and descriptions for this “new managerialism”. The Welsh funding agency pushed for strategies. Most institutions had underdeveloped strategies – few the product of extensive or open consultation.

Institutional policy context. Management saw the need for an L&T strategy before the external requirement. Implemented version 1.0 by 1997. Implementation issues led to version 2.0 being developed during 1999/2000.

Version 1.0 – perspectives of stakeholders

Version 1.0 included a wide-ranging general policy document with specific recommendations, targets and a costed implementation schedule.

Senior management. Raised the profile of L&T and assessment issues, generated debate and critical comment. Investment in staff development sessions – with high levels of involvement. Production of web-based materials involved only a small number of staff. But overall quality improved, with a measurable extension of teaching packs and directed learning materials.

Staff. Front-line academics covered later. Academic managers thought success was not easily visible. Implementation patchy. Some deadlines/targets not met, with a subsequent decrease in perceived value. Some saw the aims as idealistic, some as too techno-centric, with no definition of “good teaching and learning”. Most innovators were enthusiasts, who may have innovated regardless of the strategy. Insufficient ownership and a lack of bottom-up commitment.

Students. Student focus groups not included. Some conclusions about the “thin veneer of student-centredness”.

Academics and implementation of strategy

Front-line academics and the policy process. The above suggests it is contested. The coal-face adapt and shape policy. Various references about this.

Policy reception – factors influencing implementation. From quantitative data and observation arise five concepts or barriers to implementation:

  1. Loss of ‘front-line’ academics’ autonomy.
    Corporatisation increasing institutional requirements/impingement on teaching. The need to demonstrate compliance taking emphasis away from teaching and innovation.
  2. Policy and strategy overload.
    This one certainly resonates with my local context at the moment. The shifting, growing nature of policy and requirements – “the goalposts keep moving”. Uncertainty over expectation.
  3. Bureaucratisation of teaching.
    The rise of “task corruption”. More important to fill in the forms and plans, than actually be a good teacher. Some good quotes here.
  4. Local practices and local culture.
    Seen both negatively and as a source of information. Negatively through “game playing”. Positively illustrating weaknesses of top down policy.
  5. The ‘shift from teaching to learning’.
    This forms part of the prescription embedded within the strategy. Quotes from staff about students wanting to be taught. Disconnect from reality, limited impact on staff.

Lessons learned

  • Centralised consultation processes lead to a lack of ownership and effort required to support implementation.

    Indeed, as has been argued earlier, strategies do not implement themselves or lead automatically to improvement—even where there may be consensus amongst academic managers and front-line academics regarding the ‘desirability’ of a strategy. Even where general principles are agreed, implementation has to be localised and quality enhancement planned for. As Gibbs argues, ‘implementing learning and teaching strategies requires more than a statement of policy’ (HEFCE, 1999b, p. 4).

  • Implementation must engage with the tensions that arise.
    Implementation reveals tensions as things change, knowledge increases etc. These need to be responded to.
  • There is no blue print for an L&T strategy.
  • There is a need for a greater degree of sophistication in institutional thinking about strategic planning and policy implementation.


Suggests the ethnographic approach is useful in highlighting certain perspectives – agree. But there’s also the issue of a single person doing all the interpretation.

Strategy driven mostly by external needs is less likely to succeed.

The nature of universities – characterised by turbulence and uncertainty – requires a better understanding of change. Wariness of planned change perspectives. A need to be more sensitive to the diverse views and practices of the academic community. Policy needs constant evaluation…


Newton, J. (2003). “Implementing an institution-wide learning and teaching strategy: lessons in managing change.” Studies in Higher Education 28(4): 427-441.

Prescription, adaptation and failure around improving university teaching

The following post and its content has been shaped by (at least) three separate influences:

  1. My on-going attempt to establish some ways of thinking about how you effectively support the improvement of teaching within universities – currently going under the label of “reflective alignment”
  2. A post by Damien Clark that attempts to integrate some of my ramblings into his own thoughts.
  3. The article by Knight and Trowler (2000) that I’m currently reading entitled “Department-level cultures and the improvement of learning and teaching”.

Lightning McQueen

I’ve found the Knight and Trowler (2000) article particularly good because it expresses and explains quite effectively a number of points that I believe currently make most institutional attempts to improve teaching less than successful (yes, there’s a good chance that confirmation bias plays a significant role here, but then I think I’m right ;) ). In this post, I’m hoping/planning to focus on the following points:

  • Prescription – why most institutional approaches to improving teaching generally rely on prescription and why this is always destined to fail.
  • Adaptation – how whatever “innovation” is introduced into a social setting, especially one like a university and the practice of teaching, will be adapted by the participants both negatively and positively. Importantly, a suggestion that institutional leaders need to forget about proscribing the negative effects and instead focus on encouraging the positive. Not to mention the need to more effectively engage with context and ignore “best” practice.
  • Improvement is a journey, not a blueprint – where I’ll try and outline the foundations of an alternative approach to improving teaching.

In the last section, I’ll also explain why I’ve used a photo of a Pixar movie character at the start of this post.


Damien writes in his post

It occurs to me that prescribing any particular learning theory (such as constructive alignment) is not the answer

Absolutely. This is the problem I have with most of what is practiced around improving teaching at universities: it seeks to make prescriptions. This is one example of what I label within the reflective alignment idea as “level 2” knowledge, which is defined as:

  1. What the management does.
    This is the horribly simplistic approach taken by most managers and typically takes the form of fads, i.e. where they think X (where X might be generic skills, quality assurance, problem-based learning or even, if they are really silly, a new bit of technology) will make all the difference and proceed to take on the heroic task of making sure everyone is doing X. The task is heroic because it usually involves a large project and radical change. It requires the leadership to be “leaders”, to wield power, to re-organise – i.e. complex change that is destined to fail.

When applying “level 2” knowledge about improving teaching it is typical for a small group of folk to go away, identify based on their expertise and perspectives what the solution is, and then prescribe it for everyone else. “Everyone” might be the program, the department or the institution. You can see this quite often when there are headlines like “All students will complete at least one online course”, “All courses in our medical program use problem-based learning”, “All courses will have an online presence”, or even worse “All courses will have an online presence that consists of A, B, C and E with an option of F”.

Paul Ramsden – an example of “level 2” knowledge

One of the interesting aspects of the Knight and Trowler (2000) paper is that they offer a criticism of Paul Ramsden’s work. This is the first criticism of that work I’ve heard (which may say more about the breadth and depth of my reading) and one that resonates strongly with the point I’m trying to make here. It also appears to criticise the idea of “transformational leadership”, of which I’m also not a fan – two birds, one stone, perhaps.

Knight and Trowler (2000) argue that Ramsden’s (1998) suggestions for improving teaching illustrate the perspective of a leader that prescribes a solution with little focus on how it will be received by the academics that will be required to adopt it. They give an example to illustrate this

Ramsden suggests that departmental leaders establish a student liaison forum where students can meet staff over lunch to canvass ideas and creative options for better teaching and learning. Such an event would be a desirable effect, rather than an achievable cause, of departmental change. In practice, in the departments most in need of change such a proposal would be met with a mixture of resistance, avoidance, coping or reconstructing strategies related to staff and students’ interpretation and reception of such an idea and its underpinning assumptions. The same is true of most of the rest of Ramsden’s proposals, such as forming groups of staff interested in working through key texts on teaching during their lunchtimes or encouraging peer observation of teaching by being the first to be observed.

This resonates strongly with me and my experience. Just last year I saw an attempt at “forming groups of staff” fail after a couple of meetings. And I see this all the time with the “prescriptions” that are rolled out by institutions.

The prescription approach ignores the findings from work on workarounds (Ferneley and Sobreperez, 2006), shadow systems (Jones et al, 2004) and task corruption. It ignores the nature of academics and the teaching process.

Most importantly and pragmatically, it does NOT work. Knight and Trowler (2000)

Likewise, attempts to improve teaching by coercion run the risk of producing compliance cultures, in which there is ‘change without change’, while simultaneously compounding negative feelings about academic work

Of course, there’s a neat research project in finding empirical evidence to back that claim up. It might go something like this:

  • Take a look at all of the attempts to improve teaching at an institution (or two, or three…) over a certain time period.
  • Categorise those approaches based on the level of prescription.
    e.g. how far removed from the coal face academics was the prescription decision made? What type of participation did coal face academics have in preparing the prescription? e.g. were they “consulted” (and then ignored) about what they thought of the idea? Were they involved heavily from the start? How different is the prescription from current practice?
  • Determine how successful those prescriptions have been.
    The first criterion would be, “is it still being used?”. The second could be, “how is it being used?”, i.e. find out whether or not academics are working around the prescription. Lastly, “what impact has the prescription had?”.
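To make the shape of that hypothetical study concrete, here is a toy sketch in Python. Everything in it – the example attempts, the “distance from the coal face” scale, and the outcome rules – is invented purely for illustration, not data or method from any actual study:

```python
# Purely illustrative sketch of the categorisation step described above.
# Every record, category name and rule here is invented for this example.

from dataclasses import dataclass

@dataclass
class Attempt:
    name: str
    decision_distance: int  # 0 = decided by coal-face academics, 3 = decided centrally
    still_used: bool
    worked_around: bool     # are academics routinely working around it?

def prescription_level(a: Attempt) -> str:
    """Crude proxy: the further from the coal face, the more prescriptive."""
    return ["local", "departmental", "faculty", "institutional"][a.decision_distance]

def outcome(a: Attempt) -> str:
    """Apply the three success criteria in order: used at all, then how used."""
    if not a.still_used:
        return "abandoned"
    return "worked around" if a.worked_around else "in genuine use"

# Hypothetical attempts, for illustration only.
attempts = [
    Attempt("mandated online presence", 3, True, True),
    Attempt("peer observation scheme", 1, True, False),
    Attempt("compulsory PBL rollout", 3, False, False),
]

for a in attempts:
    print(f"{a.name}: {prescription_level(a)} -> {outcome(a)}")
```

The interesting comparison, if the blog’s hunch holds, would be whether “institutional” attempts cluster in the “abandoned” and “worked around” outcomes.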

Adaptation – why prescription fails

Why do I think this approach fails? Well, there are the empirical results arising from my observations: observations of prescription after prescription failing, either through lack of use or task corruption. There are, however, also theoretical reasons and/or beliefs about the nature of teaching, academics, universities and how to effectively enable change. The following covers one particular area: the importance and inevitability of adaptation.

The importance and ignorance of place

The Ps Framework: a messy version

In the Ps Framework I have identified “Place” as the environment in which it all takes place. It is the foundation. The nature of the “Place” (or the context) in which teaching takes place is an essential influence on what is possible and what happens. Importantly, there is also the idea that “Place” is unique. The institution I work for is different from others. The departmental culture you belong to is different from the one I belong to.

Knight and Trowler (2000) suggest

Yet how this is done will vary from context to context. Case studies of actual innovations such as the Rand Change Agent Study (1974-78) have confirmed that the need to achieve mutual adaptation of the innovation and the context is one important component of successful innovations

There are many related perspectives, including Gonzalez (2009)

Factors arising from the context within which the staff member is teaching also proved to influence the approach finally adopted

Not to mention the Trigwell framework (2001) I’ve used repeatedly.

So what has this got to do with the failure of the prescription approach to improving teaching? Knight and Trowler (2000) quote Fullan

… one of the basic reasons why planning fails is that the planners or decision makers of change are unaware of the situations that potential implementers are facing. They introduce changes without providing a means to identify and confront the situational constraints and without attempting to understand the values, ideas and experiences of those who are essential for implementing any changes. (Fullan, 1991, p. 96)

Academics are knowledge workers

How do you think academics react when a prescription is made that illustrates little or no understanding of the constraints within which they operate? Let’s take a little test of interactivity and have a poll. Go on, interact.

View Poll

Perhaps it’s no surprise which of the above options I believe to be somewhat unlikely. One reason is that I believe academics are knowledge workers. As knowledge workers, academics have considerable autonomy over how they perform tasks and often can and do resist the imposition of new technology and changes to routine. This links to and is informed by Drucker’s view of knowledge workers: “Knowledge workers own the means of production. It is the knowledge between their ears. And it is a totally portable and enormous capital asset.”

Knight and Trowler (2000) suggest

Creating an environment in which lecturers feel that they have control over their teaching, that teaching is valued and that they have room to take chances, has been found to assist in the move towards a student-focused approach which leads them towards deep learning and significant conceptual change.

Senge (1999) offers a view on the impacts of a prescriptive approach to change

Top driven change…do(es) not reduce fear and distrust, nor unleash imagination and creativity, nor enhance the quality of thinking in the organization

Inevitability of adaptation

Arising from the view of academics as knowledge workers, the importance of context and, more generally, the social shaping of technology literature, it is inevitable that any innovation or prescription will be adapted as it is adopted. Response to change in academic contexts always produces unintended results (Meister-Scheytt and Scheytt, 2005); outcomes are unpredictable and fuzzy (Knight and Trowler, 2000). In part this is because “human agency means that there is choice and that actions can be taken to maximise work satisfaction in the face of structural changes” (Knight and Trowler, 2000).

People, particularly academics when it comes to teaching, will modify how a prescription operates: partly to “handle” the prescription but also, importantly, because introducing a change into a context will generate new experiences and new insight that will shape the system, its culture and expectations. Perhaps for good and perhaps for bad.

Improvement is a journey, not a blueprint

So what’s the solution? Knight and Trowler (2000)

We suggest that learning organisations require learning managers: managers who are reflective practitioners and who apply their analytical skills to the important activity systems with which they are engaged, and develop with other staff appropriate, contextualised, strategies for change. Fullan (1993) reminds us that change is a journey, not a blueprint. Journeys are usually engaged in with a specific destination in mind, but the one reached may be significantly different from that originally envisaged and there are usually as many reasons for going as there are travellers.

Making great time, rather than having a great time

My eldest son has a growing fascination, along with many kids his age, with animation, and in particular movies from Pixar. Over the weekend, after much badgering, he received a copy of Cars. In that movie one of the characters has the line

Cars didn’t drive on it to make great time. They drove on it to have a great time.

The prescription approach is an example of teleological design. Teleological design places an emphasis on the destination, not the journey. Ateleological design reverses that.

I’m suggesting that improving teaching requires a much more ateleological approach. In attempting to explain the difference my co-authors (Jones, Luck et al, 2005) and I came up with the following

An analogy involving how to plan an overseas trip can provide a more concrete example of the differences between teleological and ateleological design. The extreme teleological approach to such a trip involves taking a package tour. Such a tour has a fixed, upfront plan designed by a group of experts, with little or no knowledge of the individual traveller, to appeal to a broad cross section of people. The extreme ateleological approach involves the traveller not having a fixed plan. Instead the traveller combines deep knowledge of her personal interests with a growing contextual knowledge of the destination to make unique choices that best suit her preferences and quickly modify her journey in response to unexpected events.

Knight and Trowler (2000) combine Weick and Fullan to arrive at

As Weick (1995) has observed in his analysis of organisational sense-making, aims are often elucidated after action, which suggests that the progress of change is more likely to be successful when it follows the path of ‘ready, fire, aim’ rather than the more usual ‘ready, aim, fire’ (Fullan, 1993, p. 31).

It’s more than that

So, do you just let each individual academic embark on their own back-packer journey of teaching, doing what they want, when they want? No, that’s not what I’m arguing. Even with the back-packer analogy above, being an effective back-packer requires, or is improved by, an infrastructure that:

  • Improves/expands the traveller’s knowledge of the potential places to visit.
  • Provides the necessary resources for the traveller to reach those places.

At this stage, I’m going to stop trying to extend this to a description of a solution. I’ve hit writer’s block and this post is already too long. I’ll pick this up later.

Departmental leadership?

Knight and Trowler (2000) argue that

cultural change for the better can occur when the focus of leadership attention is at the level of the natural activity system of universities: the department or a subunit of it. However, cultural change has to be collaborative and is therefore unpredictable. Managers work in rather than on cultural contexts and their most important skills revolve around perceptiveness towards and analysis of these contexts

They build on this to suggest that middle managers – department heads – and how they lead are an important contributor to the quality of teaching; in particular, their use of approaches that “support the backpacker”.

I’m not convinced that the department-based approach is all that effective. I’m not sure there is an appropriate level of diversity within such groups to ensure a broad enough selection of destinations for travel.

I’m also not convinced that Knight and Trowler’s (2000) emphasis on leadership, especially that of middle management, is the full story. It continues the emphasis on Level 2 knowledge about improving learning and teaching (an emphasis on what management does) and also assumes that what a single individual does (the leader) is the complete story.

For me, the entire system, its processes and policies, has to be focused on what the teacher does; on providing the infrastructure that provides the teacher with (at least) the two points introduced in the last section.


Ferneley, E. and P. Sobreperez (2006). “Resist, comply or workaround? An examination of different facets of user engagement with information systems.” European Journal of Information Systems 15(4): 345-356.

Fullan, M. (1991). The New Meaning of Educational Change. London, Cassell.

Gonzalez, C. (2009). “Conceptions of, and approaches to, teaching online: a study of lecturers teaching postgraduate distance courses.” Higher Education 57(3): 299-314

Jones, D., S. Behrens, et al. (2004). The rise and fall of a shadow system: Lessons for enterprise system implementation. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Jones, D., J. Luck, et al. (2005). The teleological brake on ICTs in open and distance learning. Conference of the Open and Distance Learning Association of Australia’2005, Adelaide.

Knight, P. and P. Trowler (2000). “Department-level Cultures and the Improvement of Learning and Teaching.” Studies in Higher Education 25(1): 69-83.

Meister-Scheytt, C. and T. Scheytt (2005). “The complexity of change in universities.” Higher Education Quarterly 59(1): 76-99.

Ramsden, P. (1998). Learning to Lead in Higher Education. London, Routledge.

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

PhD Update #5 – a new low

Well, this week has been the worst yet in terms of progress on the PhD, at least of the last 5 weeks of updates. Most of it has been work related: issues and events that have taken away the time, motivation and peace of mind necessary to effectively engage with PhD work.

On the upside, today’s been pretty effective, perhaps the best for weeks. Hopefully this trend can continue.

What I’ve done

Last week I wanted to try and

  • Complete at least 2 sections of the Ps Framework for Chapter 2 – probably “Past Experience” and “People”. If I’m motivated, perhaps add “Product”.
    At best I’ve made some small movement on “Past Experience” and a fairly big step with part of “Product”.
  • Clean up a lot of the literature I’ve found in the last week.
    Have only done a modicum of this.

In terms of PhD related blog posts, this week has produced:

  • one ring to rule them all;
    Fairly good start on one section of the “Product” part of chapter 2, with some good references and points starting to be developed (at least I’m happy with them).
  • Myth of rationality;
    Some good components of what will go into the “Process” part of the Ps Framework. Including some literature to suggest that the supposedly rational process is far from it.
  • Poor craftsman;
    More related to “Past experience” and “People” to do with the technology not improving L&T.
  • Making the LMS mythic;
    More criticisms of the LMS approach to e-learning, drawing on some literature and Postman’s ideas about 5 things to know about technological change.
  • Postman’s 5 things about technology change; and
    Came across a speech by Postman in which he outlines 5 things to know about technology change. Definite resonances/application in the Ps Framework.
  • Cognition – we’re not rational.
    Early steps, sparked by another post, on developing some ideas for the “People” component of the Ps Framework.

What to do next week?

Essentially finish what I said I would do last week and do more of the Ps Framework. Don’t let current events get me down.

“One ring to rule them all”: Limitations and implications of the LMS/VLE product model

As part of the PhD I’m developing the Ps Framework as a theory for analysing/understanding the factors that impact the organisational implementation of e-learning. Essentially, I argue that the current institutional practice of e-learning within universities demonstrates an orthodoxy. Further, I argue that this orthodoxy has a number of flaws that limit, some significantly, potential outcomes.

In this post, and a few following, I’m going to develop a description of what I see as the orthodoxy associated with the “Product” component of the Ps Framework, what I see as the flaws associated with that orthodoxy, and the impacts it has on the institutional impact of e-learning. The “Product” component of the Ps Framework is concerned with

What system has been chosen or designed to implement e-learning? Where system is used in the broadest possible definition to include the hardware, software and support roles.

One ring to rule them all

The emphasis in this post is on the “one ring to rule them all” approach characteristic of an enterprise system like a learning management system (LMS)/virtual learning environment (VLE).

The product is almost always a LMS

It is broadly accepted that the almost universal response to e-learning within universities has been a selection of a Learning Management System (LMS) aka Virtual Learning Environment (VLE) or Course Management System (CMS). By 2005 there was an almost universal adoption of just two commercial LMSs (Coates, James, & Baldwin, 2005). The 2003 Campus Computing project reports that more than 80% of United States universities and colleges utilize a LMS (Morgan 2003). Elgort (2005) cites work that indicates that 86% of 102 UK universities are using a LMS and all 18 surveyed New Zealand based institutions used a LMS. Smissen and Sims (2002) found that 34 of the 37 Australian universities were using one of two LMS – Blackboard or WebCT. If not already adopted, Salmon (2005) suggests that almost every university is planning to make use of an LMS. Indeed, the speed with which the LMS strategy has spread through universities is surprising (West, Waddoups, & Graham, 2006).

The trend in recent years has been a move away from commercial systems to open or community source systems such as Moodle or Sakai. Whether your LMS is open source or commercial doesn’t change the underlying product model. All LMSes are based on the enterprise or “one ring to rule them all” approach.

In terms of the limitations this brings to e-learning and its implications for practice, there is no significant difference between open source and commercial LMSes.

What is an LMS?

LMSes are software systems that are specifically designed and marketed to educational institutions to support teaching and learning and that typically provide tools for communication, student assessment, presentation of study material and organisation of student activities. A university’s LMS forms the academic system equivalent of enterprise resource planning (ERP) systems in terms of pedagogical impact and institutional resource consumption (Morgan, 2003).

There are more similarities than differences between individual learning management systems. Each LMS consists of a standard set of tools for communication, assessment, information distribution and management. Beyond these standard features, LMSes distinguish themselves through micro-detailed features (Black, Beck, et al. 2007).

An LMS is an integrated system: a unified collection of different services or tools, produced by a single vendor, that can be managed through a single interface. While the interface, abstractions and some of the tools will differ between LMSes, the underlying model of an integrated system remains the same.
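As a rough sketch of this integrated product model (the class and most tool names below are mine, purely illustrative), the point is that the tools exist only inside the one system and are reached through its single interface:

```python
# Illustrative-only model of the "one ring"/integrated-system idea.
# The IntegratedLMS class is invented for this sketch; the tool list
# echoes the common functions mentioned in the surrounding text.

class Tool:
    def __init__(self, name: str):
        self.name = name

    def use(self) -> str:
        return f"using {self.name}"

class IntegratedLMS:
    """One vendor, one interface: tools are bundled, not swappable."""

    def __init__(self, vendor: str):
        self.vendor = vendor
        # The standard set of tools every LMS bundles.
        self._tools = {n: Tool(n) for n in
                       ["quiz", "forum", "calendar", "grade book"]}

    def use(self, tool_name: str) -> str:
        # Everything goes through the single integrated interface;
        # there is no supported way to swap in an outside tool.
        return self._tools[tool_name].use()

lms = IntegratedLMS("VendorX")
print(lms.use("forum"))  # prints "using forum"
```

Swapping one bundled tool for a better external one means stepping outside the product model entirely, which is exactly the constraint the “one ring” label is meant to capture.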

Based on experience (I’ve been trying to explain this since around 2000), the points I’m trying to make are somewhat more easily made if the argument is accompanied by graphical representations. So let’s start with the next two images. These are intended to represent, at a very high level, two different LMSs. Each has a different colour and a slightly different shape, but there is some commonality in the structure. Both are a collection of services (the slightly different sized and shaped squares) and both have an overall shape. The overall shape is meant to represent the functionality perceived by the organisation and its users. There is some commonality in shape between the two systems, but moving from one to the other does involve some negotiation, translation and change.

Abstraction of an LMS

Abstraction of an LMS

LMS design largely focuses on satisfying certain functional requirements, such as the creation and distribution of online learning material and the communication and collaboration between the various actors (Avgeriou, Retalis et al., 2003). There is a list of common functions now expected of an LMS (quiz tool, discussion forum, calendar, collaborative work space, grade book etc.); consequently there are more similarities than differences between different LMSs (Black, Beck et al., 2007). The only real differences between LMSs lie in their marketing approaches (Carriere, Challborn and Moore, 2005).
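The “more similarities than differences” point can be made concrete with a minimal sketch. The two feature sets below are hypothetical (the tool names are illustrative, not drawn from any particular vendor’s actual list), but the idea is that the overlap dwarfs the differences:

```python
# Hypothetical feature sets for two different LMSs (names are illustrative).
lms_a = {"quiz", "discussion forum", "calendar",
         "collaborative workspace", "grade book", "content upload"}
lms_b = {"quiz", "discussion forum", "calendar",
         "collaborative workspace", "grade book", "chat"}

shared = lms_a & lms_b    # the standard toolset common to both systems
distinct = lms_a ^ lms_b  # the "micro-detailed" features that distinguish them

print(f"{len(shared)} shared tools, {len(distinct)} distinguishing tools")
```

The set operations make the argument visible: almost everything sits in the intersection, and the distinguishing features are a small symmetric difference at the margins.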

The following image provides an expanded view of one LMS, where the individual features have been identified. This will be revisited later.

Expanded LMS abstraction

What’s the product model of an LMS? – one ring to rule them all

The following sentence was used above: a university’s LMS forms the academic system equivalent of enterprise resource planning (ERP) systems in terms of pedagogical impact and institutional resource consumption (Morgan, 2003). The LMS product model is essentially equivalent to that of the ERP system. It is an integrated system; that is, it contains lots of different modules which are all tightly integrated, generally because they are provided by a single vendor (or, in the case of open source, by a single community). The ERP model has become “the dominant strategic platform for supporting enterprise-wide business processes” and has generally been implemented to overcome issues arising from custom development (Light, Holland et al., 2001).

What are the limitations of this model?

The focus of this post is not the limitations and implications of the design decisions that have gone into most LMSs – for example, the decision to use the “course” as the major approach for organising content and interactions. This particular topic has received broad coverage within the e-learning literature. For example, and in an obvious case of self-citation, Beer and Jones (2008) provide a brief discussion of these limitations and pointers to relevant literature. I will cover this topic in the thesis, but not in this post.

The focus here is on the limitations associated with the “product model” – the one ring to rule them all approach. The list of limitations of this approach suggested here includes:

  • You can’t change the system.
  • The organisation and its people are forced to adapt to the system.
  • You are limited to a single vendor or community.

These limitations have some significant ramifications for the practice of e-learning within a university. These limitations:

  • increase risk;
  • reduce quality;
  • increase complexity of implementation and support;
  • reduce flexibility and competitiveness; and
  • significantly constrain innovation and differentiation.

Can’t change the system

The nature of enterprise systems, and the main reasons for adopting them, mean that they cannot be changed. Modifying an enterprise system will increase development time, increase the staff resources required during and after implementation, make future upgrades more difficult, and go against the reasons for adopting an enterprise system in the first place (Light, Holland and Wills, 2001). Modifying such a complex system also leads to expensive maintenance requirements (Dodds, 2007).

Hence the phrase “vanilla implementation”. When you implement an enterprise system, you cannot change it. You must/should implement it as is. This is supposed to be a good thing as enterprise systems are expected to implement “best practice”.

This creates a mismatch between the LMS and the nature of the context and activity it is meant to support. E-learning, or learning and teaching more generally, takes place within a context that is rapidly changing in terms of technology, understandings of how to use the technology, and a broad array of other societal trends and influences.

Let’s take the simplest of these, technology. Both the hardware and the software technologies underlying online education are undergoing a continuing process of change and growth (Huynh, Umesh, & Valacich, 2003). Any frozen definition of ‘best’ technology is likely to be temporary (Haywood, 2002). Increasing consumer technological sophistication adds to demand for sustained technological and pedagogical innovations (Huynh et al., 2003).

With an enterprise system the institution can’t change the system; it has to rely on the vendor or community to do so. This might work for broader societal trends, but it certainly doesn’t work for organisational requirements. Commercial vendors aren’t interested in the unique customisation needs of an individual organisational client.

Forced to adapt to the system

Over the last 10 to 15 years many universities have implemented enterprise systems for a range of tasks. All too often these systems make the organisation conform to them: the IT system forces teaching and research to conform to the system (Duderstadt, Atkins et al., 2002). Such systems lack the end-user’s view of business processes and require the institution to modify its practices to accommodate the system (Dodds, 2007). Theoretically, not doing so forgoes an opportunity for positive change, as such systems are meant to embody best practice (Dodds, 2007).

However, the ‘best practice’ view embodied in the LMS may not match the institution’s interests (Jones, 2004). Such systems impose their own logic on a company’s strategy, structure and culture, and push the company towards generic processes even when customised processes may be a source of competitive advantage (Davenport, 1998). Technology is not, of itself, liberating or empowering, but serves the goals of those who guide its design and use (Lian, 2000). The tools themselves are never value-neutral but are replete with values and potentialities which may cause unexpected responses (Westera, 2004).

For an LMS, this implies some level of standardisation of teaching and learning processes towards those supported by the LMS. As two of the most highly personalised sets of processes within institutions of higher education, any attempt at standardising teaching and learning is likely to be radical, painful and problematic (Morgan, 2003). It will increase the difficulty of implementation and most likely cause resentment amongst academics and students due to the imposition of a change of uncertain value.

The standardisation of, and the values embedded in, CMS design can create a number of operational conditions for the client institution that push teaching and learning in a particular direction. For example, most CMS vendors assume a self-paced learner and so these systems are not rich in interaction or collaboration tools (Bonk, 2002) beyond simple chat rooms, email and discussion forums. CMSs are by nature structured and have limited capability for customisation (Morgan, 2003). A choice for enterprise CMS made for administrative reasons can result in students having access to different pools of electronic resources, thus affecting the quality of their educational experiences (Dutton & Loader, 2002).

Let’s illustrate this graphically. I’m going to reuse these graphical representations in later posts as I develop and suggest a much better alternative.

First, let’s look at the organisation in its pre-enterprise system state. For these purposes I’ll suggest that there are two main components:

  • the social system; and
    This is the collection of practices, expectations and beliefs about how learning and teaching and related activities are performed. It can range from how big a course is, what a course is called (my institution used to call a “course” a “unit”), how teaching responsibilities are allocated, what happens at the start of term, teaching preferences, etc.
  • the infrastructure.
    The hardware, software and other support systems and processes that support the social system and how it works.


The above representation is very optimistic. It assumes that the infrastructure and the social system integrate very well, with few gaps between them. The reality, I believe, is much worse: there are usually significant gaps that require people within the social system to take on busy work to overcome the limitations.

If the institution is looking at adopting an LMS, then chances are the gaps – or at least the gaps as perceived by some people – are quite large, and the implementation of an LMS is seen as the way to fill them.

Post LMS implementation the representation looks something like the following.

Post LMS

Note the insertion of one of the LMS representations from above. Also note that the LMS has overridden aspects of the structure of the social system. This represents the necessity for the social system to be changed in order to fit the unchanging nature of the LMS.

I should point out that the similar overriding of infrastructure by the LMS is probably not 100% accurate. There may be a bit of overriding, where the infrastructure is changed to suit the LMS. For example, I know of one institution that had to buy completely new server hardware because the new LMS didn’t like the existing hardware. However, in many cases the gap between existing infrastructure and the LMS will be bridged by “middleware” – an odds-and-sods collection of technical approaches used to manipulate the structure and output of the infrastructure to suit the LMS.

Important: This type of “modification” is deemed to be appropriate and efficient. However, any “modification” that helps bridge the gap between the LMS and the social system, is deemed to be “bad”.

Single vendor or community

Any changes that are made to the enterprise system must be made by the vendor (commercial) or community (open source). As the previous two limitations point out, the institution cannot change the system. They must rely on the vendor/community to do so.

Motivations. In many cases this can cause a problem, because the motivations of the vendor/community (more so the vendor) don’t always match those of the organisation. For example, Avgeriou, Retalis et al. (2003) suggest

…the quality requirements of LMSs are usually overlooked and underestimated. This naturally results in inefficient systems of poor software, pedagogical and business quality. Problems that typically occur in these cases are: bad performance which is usually frustrating for the users; poor usability, that adds a cognitive overload to the user; increased cost for purchasing and maintaining the systems; poor customizability and modifiability; limited portability and reusability of learning resources and components; restricted interoperability between LMSs.

A question of scale. The single vendor or community now becomes the development bottleneck. Very early in my involvement with e-learning I suggested that it would be impossible for a single institution (in this case a university) to provide all of the services and functionality required of e-learning within a university (Jones and Buchanan, 1996). My argument here is that the same applies to a single vendor or open source community.

At its simplest the single vendor or community (no matter how large the community) is always going to be smaller than the broader community. As a simple example I performed a number of Google searches, the following list shows what I searched for and the number of “hits” Google found:

  • moodle discussion forum – 167,000 hits;
  • sakai discussion forum – 234,000 hits;
  • blackboard discussion forum – 238,000 hits; and
  • web discussion forum – 45,100,000 hits.

The following graph of the above figures reinforces that we’re talking about a difference of roughly two orders of magnitude.

Comparison of search results for "discussion forum"
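To make the arithmetic explicit, here is a quick sketch using the figures listed above (the counts are approximate, as reported at the time of searching, and will have changed since):

```python
# Google "hit" counts for the searches listed above (approximate).
hits = {
    "moodle discussion forum": 167_000,
    "sakai discussion forum": 234_000,
    "blackboard discussion forum": 238_000,
    "web discussion forum": 45_100_000,
}

# Separate the general web figure from the LMS-specific figures.
web_hits = hits.pop("web discussion forum")

for query, count in hits.items():
    # Ratio of the general web community's results to the LMS-specific ones.
    print(f"{query}: the general web has {web_hits / count:.0f} times as many results")
```

Each LMS-specific community returns well under 1% of the results of the general web community, which is the basis for the scale argument below.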

Given the relative sizes of the communities, where do you think the best discussion forum is going to come from: one of the minute communities around a specific LMS, or the much larger general web-based community? This is one aspect of the idea of “worldware” developed by Steve Ehrmann, who defines it as follows:

Let’s define worldware to be hardware or software that is used for education but that was not developed or marketed primarily for education.

Is there an alternative?

So, if the enterprise system approach has so many problems, is there an alternative? The simple answer is yes: best of breed. Though, as with any wicked problem, it’s not necessarily the answer. In fact, as I’ll argue in a later post, I don’t believe the traditional best of breed approach is appropriate for e-learning. Until then, I’ll finish with the following table from Light, Holland and Wills (2001), which compares best of breed with the ERP approach.

Best of breed | Enterprise
Organisation requirements and accommodations determine functionality | The vendor of the ERP system determines functionality
A context-sympathetic approach to BPR is taken | A clean slate approach to BPR is taken
Good flexibility in process re-design due to a variety in component availability | Limited flexibility in process re-design, as only one business process map is available as a starting point
Reliance on numerous vendors distributes risk, as provision is made to accommodate change | Reliance on one vendor may increase risk
The IT department may require multiple skill sets due to the presence of applications, and possibly platforms, from different sources | A single skill set is required by the IT department, as applications and platforms are common
The detrimental impact of IT on competitiveness can be dealt with, as individualism is possible through the use of unique combinations of packages and custom components | Single vendor approaches are common and result in common business process maps throughout industries; distinctive capabilities may be impacted
The need for flexibility and competitiveness is acknowledged at the beginning of the implementation; best-in-class applications aim to ensure quality | Flexibility and competitiveness may be constrained due to the absence or tardiness of upgrades and the quality of these when they arrive
Integration of applications is time-consuming and needs to be managed when changes are made to components | Integration of applications is pre-coded into the system and is maintained via upgrades


Avgeriou, P., S. Retalis, et al. (2003). An Architecture for Open Learning Management Systems. Advances in Informatics. Berlin, Springer-Verlag. 2563: 183-200.

Beer, C. and D. Jones (2008). Learning networks: harnessing the power of online communities for discipline and lifelong learning. Lifelong Learning: reflecting on successes and framing futures. Keynote and refereed papers from the 5th International Lifelong Learning Conference, Rockhampton, Central Queensland University Press.

Black, E., D. Beck, et al. (2007). “The other side of the LMS: Considering implementation and use in the adoption of an LMS in online and blended learning environments.” Tech Trends 51(2): 35-39.

Bonk, C. (2002). Collaborative tools for e-learning. Chief Learning Officer: 22-24, 26-27.

Carriere, B., C. Challborn, et al. (2005). “Contrasting LMS Marketing Approaches.” International Review of Research in Open and Distance Learning 6(1): 1492-3831.

Coates, H., R. James, et al. (2005). “A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning.” Tertiary Education and Management 11(1): 19-36.

Davenport, T. (1998). “Putting the Enterprise into the Enterprise System.” Harvard Business Review 76(4): 121-131.

Dodds, T. (2007). “Information Technology: A Contributor to Innovation in Higher Education.” New Directions for Higher Education 2007(137): 85-95.

Duderstadt, J., D. Atkins, et al. (2002). Higher education in the digital age: Technology issues and strategies for American colleges and universities. Westport, Conn, Praeger Publishers.

Dutton, W. and B. Loader (2002). Introduction. Digital Academe: The New Media and Institutions of Higher Education and Learning. W. Dutton and B. Loader. London, Routledge: 1-32.

Elgort, I. (2005). E-learning adoption: Bridging the chasm. Proceedings of ASCILITE’2005, Brisbane, Australia.

Haywood, T. (2002). Defining moments: Tension between richness and reach. Digital Academe: The New Media and Institutions of Higher Education and Learning. W. Dutton and B. Loader. London, Routledge: 39-49.

Huynh, M., U. N. Umesh, et al. (2003). “E-Learning as an emerging entrepreneurial enterprise in universities and firms.” Communications of the AIS 12: 48-68.

Jones, D. (2004). “The conceptualisation of e-learning: Lessons and implications.” Best practice in university learning and teaching: Learning from our Challenges. Theme issue of Studies in Learning, Evaluation, Innovation and Development 1(1): 47-55.

Lian, A. (2000). “Knowledge transfer and technology in education: Toward a complete learning environment.” Educational Technology & Society 3(3): 13-26.

Light, B., C. Holland, et al. (2001). “ERP and best of breed: a comparative analysis.” Business Process Management Journal 7(3): 216-224.

Salmon, G. (2005). “Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions.” ALT-J, Research in Learning Technology 13(3): 201-218.

Smissen, I. and R. Sims (2002). Requirements for online teaching and learning at Deakin University: A case study. Eighth Australian World Wide Web Conference, Noosa, Australia.

West, R., G. Waddoups, et al. (2006). “Understanding the experience of instructors as they adopt a course management system.” Educational Technology Research and Development.

Westera, W. (2004). “On strategies of educational innovation: between substitution and transformation.” Higher Education 47(4): 501-517.

The myth of rationality in the selection of learning management systems/VLEs


Over the last 10 to 15 years I’ve been able to observe, at reasonably close quarters, at least three processes to select a learning management system/virtual learning environment (LMS/VLE) for a university. During the same time I’ve had the opportunity to sit through presentations and read papers by people who had led their organisations through the same process.

One feature that the vast majority of these processes have reportedly shared is objectivity. They were supposedly rational processes in which all available data was closely analysed and a consensus decision was made.

Of course, given what I think about people and rationality it is of little surprise that I very much doubt that any of these processes could ever be rational. I think most of the folk claiming that it was rational are simply trying to dress it up, mainly because society and potentially their “competitors” within the organisation expect them to be, or at least appear to be, rational.

I don’t blame them. The vast majority, if not all, of what is taught in information systems/technology, software development and management automatically assumes that people are rational. It’s much easier to give the appearance of rationality. This really is a form of task corruption, in this case the simulation “type” of task corruption.

The reality?

So, if it isn’t rational and neat, what is it? Well, messy and contingent, and highly dependent on the people involved, their agendas and their relative ability to influence the process. And I’ve just come across probably the first paper (Jones, 2008 – and no, I’m not the author) that attempts to engage with and describe the messiness of the process.

It’s also somewhat appropriate as it provides one description of the process used by the Open University in the UK to adopt Moodle, the same LMS my current institution has selected.

The paper concludes with the following

There is no one authoritative voice in this process and whilst the process of infrastructural development and renewal can seem to be the outcome of a plan the process is one that is negotiated between powerful institutional interests that have their roots in different roles within the university. Negotiation is not only between units and the process of decision making is also affected by the sequence of time in taking decisions, for example by who is in post when key decisions are taken. Decisions taken in terms of the technological solutions for infrastructural development have definite consequences in terms of the affordances and constraints that deployed technologies have in relation to local practices. The strengths and weaknesses of an infrastructure seem to reside in a complex interaction of time, artefacts and practices.


If we know that, even in the best of situations, human beings are not rational, and we know that in situations involving complex problems with multiple perspectives the chances of a rational, objective decision are almost nil, then:

  • Why do we insist on this veneer of rationality?
  • Why do we enter into processes like an LMS evaluation and selection using processes that assume everyone is rational?
  • Are there not processes that we can use that recognise that we’re not rational and that work within those confines?

Comment on Moodle

The paper includes the following quotes from a couple of senior managers at the Open University. When asked about the weaknesses of the approach the OU had taken, one senior manager responded

Weakness ? …the real weakness is probably in the underlying platform that we’ve chosen to use, Moodle. That’s probably the biggest weakness, and I think we made the right decision to adopt Moodle when we did. There wasn’t another way of doing it.

Then a senior manager in learning and teaching had this to say, continuing the trend.

Where Moodle was deficient was in the actual tools within it, as the functionalities of the tools were very basic. It was also very much designed for – in effect – classroom online. It’s a single academic teaching to a cohort of students. Everything’s based around the course rather than the individual student. So it’s teaching to a cohort rather than to an individual, so a lot of the work has gone in developing, for example, a much more sophisticated roles and permissions capability. There really are only 3 roles administrator, instructor, and student, but we have multiple roles…

This is particularly interesting as my current institution has some similarities with the OU in terms of multiple likely roles.

Of course, given that organisations are rational, I’ll be able to point out this flaw to the project team handling the migration to the new LMS. They will investigate the matter (if they don’t already know about it), and if it’s a major problem incorporate a plan to address it before the pilot, or at least the final migration.

Of course, that’s forgetting the SNAFU principle and the tension between innovation and accountability and its effects on rationality.


It has been pointed out to me that the penultimate paragraph in the previous section, while making the point about my theoretical views of organisations and projects, does not necessarily represent a collegial, or at least vaguely positive, engagement with what is a hugely difficult process.

To that end, I have used formal channels to make the LMS implementation team aware of the issue raised in Jones (2008).

I have also thought about whether or not I should delete/modify the offending paragraph and have decided against it. There will always be ways to retrieve the original content and leaving both the paragraph and the addendum seems a more honest approach to dealing with it.

I also believe it can make a point about organisations, information systems projects and the information flows between users, developers and project boards. The SNAFU principle and various other issues such as task corruption do apply in these instances. Participants in such projects always bring very different perspectives and experiences, both historically and of the project and its evolution.

Too often, in the push to appear rational, the concerns and perspectives of some participants will be sidelined. Often this creates a sense of powerlessness and other feelings that don’t necessarily increase the sense of inclusion and ownership of the project that is typically wanted. Often the emphasis becomes “shoot the messenger” rather than dealing with the fundamental issues and limitations of the approaches being used.

The push to be a team player is often code for “toe the company line”, a practice that only further increases task corruption.

I have always taken the approach of being open and transparent in my views. I generally attempt to retain a respectful note when expressing those views, but sometimes, especially in the current context, that level may not meet the requirements of some. For that I apologise.

However, can you also see how, even now, I’m struggling with the same issues as summarised in the SNAFU principle? Should I take more care with what I post, to the extent of avoiding any comments that might be troubling for some? Since, if I’m too troubling, it might come back and bite me.

Or is it simply a case of me being rude and disrespectful and deserving of a bit of “bite me”?

What do you think? Have your say.


Jones, C. (2008). Infrastructures, institutions and networked learning. 6th International Conference on Networked Learning, Halkidiki, Greece.

“Blame the teacher” isn’t new to technology-mediated learning

I’ve been banging on about the tendency for educational technology folk, especially those in the technologists alliance to “blame the teacher” as the reason why technology-mediated learning hasn’t achieved all of its promise.

I came across a paper that illustrates just how long this tendency has been around. Petrina (2004) passes a critical eye over Pressey’s early work on the first teaching machines in the early 1920s, and asks what broader lessons there might be. It also includes a couple of quotes that illustrate Pressey’s tendency to “blame the teacher” for the failure of his utopian dreams.

First, his dream

Within the next twenty years special mechanical aids will make mass psychological experimentation commonplace and bring about in education something analogous to the Industrial Revolution. There must be an ‘industrial revolution’ in education in which educational science and the ingenuity of educational technology combine to modernize the grossly inefficient and clumsy procedures of conventional education.

And of course now, “blame the teacher”

the intellectual inertia and conservatism of educators who regard such ideas as freakish or absurd, or rant about the mechanization of education when the real purpose of such a development is to free teachers from mechanical tasks.

Pressey anticipated that his machine might

provoke some sentimentalists to an outcry against ‘education by machine’

Later in the paper Petrina tells the story of one teacher who was enthusiastic about Pressey’s teaching machine. This teacher was already in the habit of giving a true/false test at the beginning of every class, by reading the true/false statements to the class while his sister did the scoring. Pressey’s machine appeared to match his current practice and at the same time do away with the need for his sister’s help.


Petrina, S. (2004). “Sidney Pressey and the Automation of Education, 1924-1934.” Technology and Culture 45(2): 305-330.

Poor craftsman – or the “blame the teachers” excuse


I strongly believe in the notion that both learning and teaching, and attempting to improve learning and teaching, are wicked design problems to which there is no single answer; there are no right or easy answers. The better answers lie in a broad recognition, understanding and synthesis of the diverse perspectives that exist; an in-depth understanding of the local context in which you are trying to operate; and a broad (usually much broader than people assume) set of knowledge of concepts and fields that might help.

The following is an attempt to describe one perspective and to explain why I think there may be other perspectives that highlight more fruitful ways forward. In an attempt to enliven the discussion, some of the terms I use may be seen as denigrating. That is not my intent. I’m simply trying to keep people awake and encourage them to read and ponder the following.

A poor craftsman

Anyone who has known me for any length of time will have heard me use the phrase “A poor workman blames his tools”. Generally this is in relation to someone blaming vi or the UNIX command line for being difficult, or someone at cricket blaming their bat for getting them out.

A recent post of mine entitled “technology will not change the way we teach” sparked a reaction from Ray Tolley, I think initially because I used eportfolios as the specific case for a broader point, i.e. that the introduction of a new technology will not, by itself, change the way academics within universities teach.

In the subsequent discussion within the comments on the post, Ray used the same quote in different words.

A bad craftsman blames his tools but a good craftsman always uses the right tools for the job.

In addition, his post in response to mine certainly includes some descriptions of some “poor craftsmen”.

I do believe there is something in this quote when applied to learning and teaching in a university context (I’m limiting myself to the context of which I have some experience – learning in other contexts may be another matter, though there might be connections). There are a number of academics who are “poor craftsmen” and should be dealt with as such.

However, it is very common to hear university management, and university staff who are employed to support/enable the work of teaching academics, extend the “poor craftsman” assumption too far. When this happens it becomes what I’ve called the “blame the teacher” approach to university management. (This earlier post explains the origins of the “blame the teacher” idea and how it is borrowed from Biggs’ constructive alignment work.)

The technologists alliance

The “blame the teacher” approach is also used by technology innovators to explain why their brilliant innovation hasn’t been adopted by more than a handful of other folk. I know, I’ve used this line in the past myself. However, for a while now I have believed that this sort of approach is not productive and illustrates a developer focus, rather than an adopter focus (Surry and Farquhar, 1997).

“Blame the teacher” allows the innovator/manager to avoid responsibility, or at least to avoid the more difficult task of understanding what it is about the context within which the learning and teaching is occurring that is allowing and encouraging teaching academics to be “poor teachers”.

It is this avoidance which, I believe, contributes to the problems that Geoghegan (1994) identifies with the “technologists alliance”. I’ve talked about this before, but the point is made in this quote from Geoghegan:

Ironically, while this alliance has fostered development of many instructional applications that clearly illustrate the benefits that technology can bring to teaching and learning, it has also unknowingly worked to prevent the dissemination of these benefits into the much larger mainstream population.

By ignoring the context, the “alliance” is, apparently unknowingly, working towards preventing the adoption of their innovation or the achievement of their stated goal.

Or, put another way, by ignoring the perspectives and the context of the “poor craftsman”, they are turning them off their idea, especially if the local context is particularly troubling.


Surry, D. and J. Farquhar (1997). “Diffusion Theory and Instruction Technology.” Journal of Instructional Science and Technology 2(1): 269-278.

It is true that those who learn under difficult conditions are better students, but are they better because they have surmounted difficulties or do they surmount them because they are better? In the guise of teaching thinking we set difficult and confusing situations and claim credit for the students who deal with them successfully.


Skinner, B. F. (1958). “Teaching Machines.” Science 128: 969-977.

Making the LMS/VLE “mythic”

In my last post I pointed to a talk by Postman that outlined five things we should know about technological change. This list has resonated with me due to my involvement with elearning within universities, and my feeling that it is failing, often due to naive views of how technology can be implemented and what effects it will have on teaching and learning. This post continues/starts an attempt to make connections between Postman’s list and elearning.

Has elearning failed? Zemsky/Massey versus Sloan-C

Back in 2004 a report came out entitled “Thwarted Innovation: What happened to e-learning and why” (Zemsky & Massey, 2004). It caused quite a furore because it basically claimed that elearning had failed. Claiming a major flaw in something that a lot of people hold near and dear is, to the cynical amongst us, a well-known and quite effective publishing strategy for increasing citations (150+ on Google Scholar), an important measure of academic quality. But there can be more to it than that. That’s one of the points this post tries to make in general, without making any final claim about the Zemsky and Massey (2004) report.

Their claim that elearning had “failed” was always going to get a rise out of Geoghegan’s (1994) technologists alliance.

Aside: note the date on the Geoghegan quote: 1994. This is not an idea arising from the last 10/15 years of elearning. It’s from a previous period in the history of technology-mediated learning. But I feel that it still applies to today’s practice of e-learning.

Geoghegan (1994) identifies the technologists alliance as including

faculty innovators and early adopters, campus IT (IT here is instructional technology – US phrase that includes instructional designers and information technology folk) support organizations, and information technology vendors with products for the instructional market.

I’m guessing that the Sloan Consortium (Sloan-C) could quite easily be included as a member of the technologists alliance.

Not surprisingly, Sloan-C and its members formulated a response to Zemsky and Massey. You can find it here. I’m currently working through their response, but I was struck by a particular quote that seems to connect with one of Postman’s five things.

Making elearning mythic

Sloan-C’s response to Zemsky and Massey, includes the following

That is, until technology becomes used without being noticed and, more importantly, without interfering with the mission of online education—i.e., delivering knowledge to anyone anywhere, articles such as TI may continue to be produced, claiming that eLearning has failed. Sloan-C is proud to be a part of the world-wide movement to insure that eLearning does not fail!

The “technology becomes used without being noticed” immediately made me think of Postman’s fifth idea about technological change, i.e. it becomes mythic. Postman describes it as

a common tendency to think of our technological creations as if they were God-given, as if they were a part of the natural order of things.

Now, it appears that the Sloan-C folk were trying to suggest that the difficulty and unreliability of the type of technology mentioned in Zemsky and Massey (2004) is a major problem and explanation for their findings. That is, problems with the implementation meant that it wasn’t transparent, but when those problems are fixed, all will be good.

Postman points out the problem when a technology becomes mythic

When a technology becomes mythic, it is always dangerous because it is then accepted as it is, and is therefore not easily susceptible to modification or control.

Problems with mythic technology

Back in the late 80s and early 90s, my current institution had quite a large and, in some areas of activity, very powerful distance education centre. A centre responsible for helping the university develop and deliver print-based distance education materials. For that centre, print-based technology had become mythic, i.e. if you did distance education, you used print.

The entire centre, its workflows, practices and structures were set up for print-based technology. Such technology required a lot of money and resources, and hence the inertia of that thinking was huge. It is a good 10 years since it became obvious that print-based distance education materials were becoming only a small part of a much broader collection of experiences enabled by e-learning. However, until very recently, print-based materials were still the focus of most of the money and most of the people and processes within that organisation.

Even now, many long-term academics within the institution are still expecting the old ways of print-based education to continue, even though organisational change has made that next to impossible.

Print-based education had become mythic. It became impossible to modify it, even to question it, even though some of us had been questioning it for well over 10 years.

LMS/VLE – the mythic practice of institutional e-learning

When it comes to the current practice of elearning within higher education, I’ve seen this again and again, especially around the question of learning management systems/virtual learning environments. The selection and implementation of an LMS/VLE has become the standard, accepted and often unquestioned approach to elearning within universities. Surprisingly, that unquestioning approach is still strongly held, even though there is a growing body of literature and personal experience questioning it, arising from folk who have had to use and support these systems within universities.

Arguably, what has become mythic is the assumption that it is the responsibility of the institution to provide the infrastructure. So even when institutions get a small idea and think “we’ll have to do something with blogs or wikis”, the immediate assumption is that the institution must provide the blog or wiki. Only the institution can be assumed to reliably provide what is required by students.

Only a small step from there is the idea of out-sourcing, i.e. the institution can save itself some money and resources by paying an external company to provide the infrastructure. The trouble is that this really just inserts a proxy into the equation. It still assumes that it is necessary for the institution to provide the infrastructure, the system. In this case the institution simply pays someone else to do it, but it is still providing it.

It’s re-arranging the deck chairs on the Titanic.

The same thing applies to open source learning management systems/virtual learning environments. It’s still the same approach that has become mythic.


The above has travelled further afield than I had intended, and I have to get onto other things. So a quick summary of what I was thinking I might have covered in the above:

  • The adoption of an LMS within universities is mythic amongst a number of folk, current extensions (open source or out source) retain the old fundamentals, just add a few wrinkles.
  • The type of response provided by Sloan-C might be explained by folk for whom elearning, or some definition of it, has become mythic, and consequently their response might be, at least partly, faulty. (As I read their response more I’ll form an opinion on that one.)
    Actually, I just returned to that document and found I had reached the end. Sorry, but I don’t find it a convincing response. The major aspect of the response is to point to 20 million online learners. Quantity doesn’t tell us anything about the quality of the learning experience; much of the literature suggests it is very poor. There are also other measures such as cost, return on investment etc.

    It then points to the National Center for Academic Transformation projects, which spend a lot of money and resources performing some radical transformations. I don’t see this practice scaling well.

  • Equally, Zemsky and Massey’s view might be flawed, through their lack of knowledge of e-learning, through flawed methods (which appear to be present), or through being too stuck in their own patterns.

Point 1 above, raises the question of what is the alternative? Or perhaps what are the alternatives? Some will say personal learning environments. For me these are only a small aspect of the alternative. As the thesis work gets finished, I’ll share more of what I think the alternative is here.

Extension to the summary

Anyone who knows my work or has skimmed my publications will assume that the alternative answer I will propose will be Webfuse. This is the instantiation of the design theory I’m formulating for my PhD.

Most people will be wrong. This is because most people are still stuck with the mythic nature of the LMS. They think Webfuse is an LMS. This can be seen in the rhetoric being circulated within the institution.

This perception is best illustrated by a comment I’ve been getting for 10 years: “Why don’t you sell Webfuse?”

Comments of this type assume Webfuse is an LMS; that it can be sold just like Blackboard or Desire2Learn, or made open source like Moodle or Sakai. They are wrong.

Webfuse is a different kettle of fish altogether. The “mythic nature” of the LMS is perhaps one of the biggest hurdles to overcome in explaining the difference. Something I still need to work on.


Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conference of the International Business Schools Computing Association, Baltimore, MD, IBM.

Zemsky, R. and W. F. Massey. (2004). “Thwarted innovation: What happened to e-learning and why.” Retrieved 1st July, 2004, from

Postman’s – 5 things to know about technological change and e-learning

In doing a quick search for references to help out in the last post, I came across this page, which appears to be a transcript of a speech given by Neil Postman titled “Five Things We Need to Know About Technological Change”. According to this post (that page has gone away, so here is a new link to a PDF transcript) it “was delivered by Postman in 1998 to a gathering of theologians and religious leaders in Denver, Colorado.”

Given my current and recent fascination with “past experience and e-learning”, I particularly like these two quotes from Postman’s address.

Experiencing technological change as sleep-walkers

In the past, we experienced technological change in the manner of sleep-walkers. Our unspoken slogan has been “technology über alles,” and we have been willing to shape our lives to fit the requirements of technology, not the requirements of culture. This is a form of stupidity, especially in an age of vast technological change. We need to proceed with our eyes wide open so that we may use technology rather than be used by it.

And this one on who should be allowed to talk about new information technologies.

One might say, then, that a sophisticated perspective on technological change includes one’s being skeptical of Utopian and Messianic visions drawn by those who have no sense of history or of the precarious balances on which culture depends. In fact, if it were up to me, I would forbid anyone from talking about the new information technologies unless the person can demonstrate that he or she knows something about the social and psychic effects of the alphabet, the mechanical clock, the printing press, and telegraphy. In other words, knows something about the costs of great technologies.

I do believe that Postman is often thought of as a simple Luddite, as being against technology entirely. There are almost certainly other limitations to his work; however, the following quote suggests he’s not a Luddite:

We must not delude ourselves with preposterous notions such as the straight Luddite position.

The 5 things

You really should read the address, but here’s a summary.

  1. Culture always pays a price for technology.
    e.g. cars and pollution (and many other less obvious examples).
  2. There are always winners and losers in a technological change.
  3. Every technology embodies a philosophy, an epistemological, political or social prejudice.
    The printing press de-values the oral tradition.
  4. Technological change is not additive, it is ecological.
    The invention of the printing press in Europe, did not create “old Europe + the printing press”. It created a new and different Europe.
  5. Technology becomes mythic, it becomes seen as part of the natural order of things.

Application to e-learning

How might this apply to e-learning? I don’t have time right now, but you might wish to take a look at this post, which leverages Postman’s points into a series of questions about the use of ICTs in schools.

One quick example before I go, in terms of technology being mythic, see what happens when you suggest to a university that they get rid of their learning management system. Even more mythic, what do you think would happen if you suggested getting rid of lecture theatres?

Cognition – we’re not rational and how it impacts e-learning

It’s a small world. I work in Rockhampton at a university and last year traveled to Canberra for a Cognitive Edge workshop (which I recommend). One of the other participants was Cory Banks who, a few years ago, was a student at the university I work at. He’s obviously moved onto bigger and better things.

Our joint Cognitive Edge experience indicates some similar interests, which brings me to this post on cognition on Cory’s blog. In the post he suggests a number of aspects of cognition that impact upon problem solving. He’s asking for help in validating and sourcing these aspects.

If you can help, please comment on his post.

My particular interest in cognition is that most information systems processes (e.g. governance, software development) are based on the assumption of rational people making objective decisions drawing on all available evidence. My experience suggests that this is neither possible nor true. For me, this observation explains most of the limitations and failures associated with the design and support of information systems for e-learning (and information systems more generally).

I’ve written about aspects of this before and again.

So, as time progresses I’m hoping to add to this list in terms of references, examples and additional aspects.

Cory’s cognition list

Cory’s cognition list includes the following (a little paraphrasing)

  • We evolved as ‘first fit’ pattern matchers.
    A quote from Snowden (2005)

    This builds on naturalistic decision theory in particular the experimental and observational work of Gary Klein (1944) now validated by neuro-science, that the basis of human decision is a first fit pattern matching with past experience or extrapolated possible experience. Humans see the world both visually and conceptually as a series of spot observations and they fill in the gaps from previous experience, either personal or narrative in nature. Interviewed they will rationalize the decision in whatever is acceptable to the society to which they belong: “a tree spirit spoke to me” and “I made a rational decision having considered all the available facts” have the same relationship to reality

    I’m guessing that Kaplan’s law of the instrument is somewhat related.

  • The fight or flight reaction.
  • We make assumptions.
  • We’re not analytical
    I wonder if this and most of the above points fit under “first fit pattern matchers”?
  • Failure imprints better than success.
  • Serendipitous recall (we only know what we need to know, when we need to know it).
  • We seek symmetry (attractiveness).


Snowden, D. (2005). Multi-ontology sense making: A new simplicity in decision making. Management Today, Yearbook 2005. R. Havenga.