A command for organisations? Program or be programmed

I’ve just finished the Douglas Rushkoff book Program or be Programmed: Ten commands for a digital age. As the title suggests the author provides ten “commands” for living well with digital technologies. This post arises from the titular and last command examined in the book, Program or be programmed.

Douglas Rushkoff

This particular command was of interest to me for two reasons. First, it suggests that learning to program is important and that more people should be doing it. As I’m likely to become an information technology high school teacher, there is some significant self-interest in there being a widely accepted importance to learning to program. Second, and the main connection for this post, is that my experience with and observation of universities is that they are tending “to be programmed”, rather than to program. In particular when it comes to e-learning.

This post is some thinking out loud about that experience and the Rushkoff command. In particular, it’s my argument that universities are being programmed by the technology they are using, and I’m wondering why. I’m hoping this will be my last post on these topics; I think I’ve pushed the barrow for all it’s worth. Onto new things next.

Program or be programmed

Rushkoff’s (p 128) point is that

Digital technology is programmed. This makes it biased toward those with the capacity to write the code.

This also gives a bit of a taste of the other commands, i.e. that there are inherent biases in digital technology that can be good or bad. To get the best out of the technology, there are certain behaviours that seem best suited to encouraging the good, rather than the bad.

One of the negative outcomes of not being able to program, of not being able to take advantage of this bias of digital technology is (p 15)

…instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery.

But is all digital technology programmed?

In terms of software, yes, it is all generally created by people programming. But not all digital technology is programmable. The majority of the time, money and resources being invested by universities (I’ll stick to unis, however much of what I say may be applicable more broadly to organisations) is in “enterprise” systems. Originally this was in the form of Enterprise Resource Planning (ERP) systems like PeopleSoft. It is broadly recognised that modifications to ERPs are not a good idea, and that instead the ERP should be implemented in “vanilla” form (Robey et al, 2002).

That is, rather than modifying the ERP system to respond to the needs of the university, the university should modify its practices to match the operation of the ERP system. This appears to be exactly what Rushkoff warns against: “we are optimizing humans for machinery”.

This is important for e-learning because, I would argue, the Learning Management System (LMS) is essentially an ERP for learning. And I would suggest that much of what goes on around the implementation and support of an LMS within a university is the optimization of humans for machinery. In some specific instances that I’m aware of, it doesn’t matter whether the LMS is open source or not. Why?

Software remains hard to modify

Glass (2001), describing one of the frequently forgotten fundamental facts about software engineering, suggested that maintenance consumes about 40 to 80 percent of software costs, with 60% of the maintenance cost due to enhancement. That is, a significant proportion of the cost of any software system is adding new features to it. You need to remember that this is a general statement. If the software you are talking about is part of a system that operates within a continually changing context, then the figure is going to be much, much higher.
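To put those figures in perspective, a quick back-of-envelope calculation (my arithmetic combining Glass’s two numbers, not a figure from the paper itself) shows what they imply: if maintenance is 40-80% of total cost, and 60% of maintenance is enhancement, then adding new features alone accounts for roughly a quarter to a half of what a system costs over its life.

```python
# Back-of-envelope arithmetic based on Glass's (2001) figures.
# Note: the resulting 24%-48% range is my own combination of his two
# numbers, not a figure reported in the paper itself.

ENHANCEMENT_SHARE_OF_MAINTENANCE = 0.60  # 60% of maintenance is enhancement

# Glass's low and high estimates for maintenance as a share of total cost
for maintenance_share in (0.40, 0.80):
    enhancement_share = maintenance_share * ENHANCEMENT_SHARE_OF_MAINTENANCE
    print(f"maintenance at {maintenance_share:.0%} of total cost -> "
          f"enhancement alone is {enhancement_share:.0%} of total cost")
```

So even at the low end, around a quarter of the total cost of a system goes into adding features after it is “finished”, which is the point the rest of this argument leans on.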

Most software engineering remains focused on creation. On the design and implementation of the software. There hasn’t been enough focus on on-going modification, evolution or co-emergence of the software and local needs.

Take Moodle. It’s an LMS, good and bad like any other. But it’s open source, and it is meant to be easy to modify. That’s one of the arguments wheeled out by proponents when institutions are having to select a new LMS. And Moodle and its development processes are fairly flexible: it’s not that hard to add a new activity module to perform some task you want that isn’t supported by the core.

The trouble is that Moodle is currently entering a phase which suggests it suffers much the same problems as most large enterprise software applications. The transition from Moodle 1.x to Moodle 2.0 is highlighting the problems with modification. Some folk are reporting difficulties with the upgrade process, others are deciding to delay the upgrade as some of the third-party modules they use haven’t been converted to Moodle 2. There are even suggestions from some that mirror the “implement vanilla” advice for ERPs.

It appears that “we are optimizing humans for machinery”.

I’m wondering if there is anyone doing research into how to make systems like Moodle more readily modifiable for local contexts. At the very least, looking at how/if the version upgrade problem can be improved, but also at the ability to modify the core to better suit local requirements. Some of the pieces needed for this are already there. One of the difficulties is that to achieve this you would have to cross boundaries between the original developers, service providers (Moodle partners) and the practices of internal IT divisions.

Not everyone wants to program

One reason this will be hard is that not everyone wants to program. Recently, D’Arcy Norman wrote a post talking about the difference between the geeks and folk like his dad. His dad doesn’t want to bother with this techy stuff, he doesn’t want to “program”.

This sort of problem is made worse if you have an IT division whose senior management have backgrounds in non-IT work. For example, an IT director with a background in facilities management isn’t going to understand that IT is protean, that it can be programmed. Familiar with the relative permanence of physical buildings and infrastructure, such a person isn’t going to understand that IT can be changed, that it should be optimized for the human beings using the system.

Organisational structures and processes prevent programming

One of the key arguments in my EDUCAUSE presentation (and my thesis) is that the structures and processes that universities are using to support e-learning are biased away from modification of the system. They are biased towards vanilla implementation.

First, helpdesk provision is treated as a generic task. The folk on the helpdesk are seen as low-level, interchangeable cogs in a machine that provides support for all an organisation’s applications. The responsibility of the helpdesk is to fix known problems quickly. They don’t/can’t become experts in the needs of the users. The systems within which they work don’t encourage, or possibly even allow, the development of deep understanding.

For the more complex software applications there will be an escalation process. If the front-line helpdesk can’t solve the problem, it gets handed up to application experts. These are experts in using the application: they are trained and required to help the user figure out how to use the application to achieve their aims. In other words, these application experts are expert in optimizing the humans for the machinery. For example, if an academic says they want students to have an individual journal, a Moodle 1.9 application expert will come back with suggestions about how this might be done with the Moodle wiki or some kludge with another Moodle tool. If Moodle 1.9 doesn’t provide a direct match, they figure out how to kludge together the functionality it does have. The application expert usually can’t suggest using something else.

By this stage, an academic has either given up on the idea, accepted the kludge, gone and done it themselves, or (bravely) decided to escalate the problem further by entering into the application governance process. This is the heavyweight, apparently rational process through which requests for additional functionality are weighed against the needs of the organisation and the available resources. If it’s deemed important enough, the new functionality might get scheduled for implementation at some point in the future.

There are many problems with this process:

  • Non-users making the decisions;
    Most of the folk involved in the governance process are not front-line users. They are managers, both IT and organisational. They might include a couple of experts – e-learning and technology. And they might include a couple of token end-users/academics. Though these are typically going to be innovators. They are not going to be representative of the majority of users.

    What these people see as important or necessary is not going to be representative of what the majority of academic staff/users think is important. In fact, these groups can quickly become biased against the users. I attended one such meeting where the first 10-15 minutes was spent complaining about the foibles of academic staff.

  • Chinese whispers;
    The argument/information presented to such a group will have had to go through a game of Chinese whispers. An analyst is sent to talk to a few users asking for a new feature. The analyst talks to the developers and other folk expert in the application. The analyst’s recommendations will be “vetted” by their manager and possibly other interested parties. The analyst’s recommendation is then described at the governance meeting by someone else.

    All along this line, vested interests, cognitive biases, different frames of references, initial confusion, limited expertise and experience, and a variety of other factors contribute to the original need being morphed into something completely different.

  • Up-front decision making; and
    Finally, many of these requests will have to battle against already set priorities. As part of the budgeting process, the organisation will already have decided what projects and changes it will be implementing this year. The decisions have been made. Any new requirements have to compete for whatever is left.
  • Competing priorities.
    Last in this list, but not last overall, are competing priorities. The academic attempting to implement individual student journals has as their priority improving the learning experience of the student. They are trying to get the students to engage in reflection and other good practices. This priority has to battle with other priorities.

    The head of the IT division will have as a priority staying within budget and keeping the other senior managers happy with the performance of the IT division. Most of the IT folk will have a priority, or will be told that their priority is, to make the IT division and the head of IT look good. Similarly, and more broadly, the other senior managers on 5 year contracts will have as a priority making sure that the aims of their immediate supervisor are seen to be achieved.

These and other factors lead me to believe that as currently practiced, the nature of most large organisations is to be programmed. That is, when it comes to using digital technologies they are more likely to optimize the humans within the organisation for the needs of the technology.

Achieving the alternate path, optimizing the machinery for the needs of the humans and the organisation, is not a simple task. It is very difficult. However, by either ignoring or being unaware of the bias of their processes, organisations are sacrificing much of the potential of digital technology. If they can’t figure out how to start programming, such organisations will end up being programmed.

References

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

The nature of user involvement in LMS selection and implementation

Given what we know (see below) about the importance of people to the implementation of information systems and also to learning and teaching, how would you characterise the involvement of users in the selection and implementation of an LMS at most universities? What impact does it have?

The importance of people

There has been significant research within the information systems discipline – a small subset includes user participation and involvement (Ives and Olson 1984); technology acceptance and use (Davis 1989; Venkatesh, Morris et al. 2003); decision-making around system selection and implementation (Bannister and Remenyi 1999; Jamieson, Hyland et al. 2007); system success (DeLone and McLean 1992; Myers 1994); development methods (Mumford 1981); and the social shaping of technology (Kling 2000) – around the importance and impact of people on information systems and their success. In terms of user participation and involvement, Lynch and Gregor (2004) found that previous studies were inconclusive in terms of links with system success; however, they suggest that the level of influence users have on the development process is a better indicator of system outcomes. The perceptions of the people who may potentially use an information and communication technology play a significant role in their adoption and use of that technology (Jones, Cranston et al. 2005). Information systems are designed and used by people operating in complex social contexts; consequently, such a system is understood differently by different people and given meaning by the shared understanding that arises out of social interaction (Doolin 1998).

Similar findings and suggestions are evident in the educational and e-learning literature. John and La Velle (2004) argue that new technologies at most enable rather than dictate change. Dodds (2007) suggests that any excellence demonstrated by a university is not a product of technology; it is a product of the faculty, students and staff who play differing roles in the pursuit of scholarship and learning. For Morgan (2003), teaching and learning are two of the most highly personalised processes. Numerous authors (e.g. Alexander 2001; Oblinger 2003) identify understanding learners, and particularly their learning styles, attitudes, and approaches, as essential to the effective facilitation of learning. For Watson (2006), it is clear that consideration of the human dimension is critical to education. This is because, as Stewart (2008) observes, the beliefs held by those involved in the educational process, regardless of how ill-informed, can have a tremendous impact on the performance of both students and teachers and how effectively technology may be utilised. Personal characteristics have been found to influence e-learning implementation (Siritongthaworn, Krairit et al. 2006) and most universities are still struggling to engage a significant percentage of students and staff in e-learning (Salmon 2005). While technology may be the stimulus, the essential matters are complex and will be the purview of academics (Oblinger, Barone et al. 2001).

References

Alexander, S. (2001). E-learning developments and experiences. Education and Training, 43(4/5), 240-248.

Bannister, F., & Remenyi, D. (1999). Value perception in IT investment decisions. Electronic Journal of Information Systems Evaluation, 2(2).

Davis, F. D. (1989). Perceived usefulness, perceived ease of use and user acceptance of information technology. MIS Quarterly, 13(3), 319.

DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60-95.

Dodds, T. (2007). Information Technology: A Contributor to Innovation in Higher Education. New Directions for Higher Education, 2007(137), 85-95.

Doolin, B. (1998). Information technology as disciplinary technology: being critical in interpretive research on information systems. Journal of Information Technology, 13(4), 301-311.

Ives, B., & Olson, M. (1984). User involvement and MIS success: a review of research. Management Science, 30(5), 586-603.

Jamieson, K., Hyland, P., & Soosay, C. (2007). An exploration of a proposed balanced decision model for the selection of enterprise resource planning systems. International Journal of Integrated Supply Management, 3(4), 345-363.

John, P. D., & La Velle, L. B. (2004). Devices and Desires: subject subcultures, pedagogical identity and the challenge of information and communications technology. Technology, Pedagogy and Education, 13(3), 307-326.

Jones, D., Cranston, M., Behrens, S., & Jamieson, K. (2005). What makes ICT implementation successful: A case study of online assignment submission. Paper presented at the ODLAA’2005, Adelaide.

Kling, R. (2000). Learning about information technologies and social change: The contribution of social informatics. The Information Society, 16(3), 217-232.

Lynch, T., & Gregor, S. (2004). User participation in decision support systems development: Influencing system outcomes. European Journal of Information Systems, 13(4), 286-301.

Morgan, G. (2003). Faculty use of course management systems: Educause Centre for Applied Research.

Mumford, E. (1981). Participative systems design: Structure and method. Syst. Objectives solutions, 1(1), 5-19.

Myers, M. D. (1994). Dialectical hermeneutics: a theoretical framework for the implementation of information systems. Information Systems Journal, 5, 51-70.

Oblinger, D. (2003). Boomers, gen-Xers and millennials: Understanding the new students. EDUCAUSE Review, 37-47.

Oblinger, D., Barone, C., & Hawkins, B. (2001). Distributed education and its challenge: An overview. Washington DC: American Council on Education.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Siritongthaworn, S., Krairit, D., Dimmitt, N., & Paul, H. (2006). The study of e-learning technology implementation: A preliminary investigation of universities in Thailand. Education and Information Technologies, 11(2), 137-160.

Stewart, D. P. (2008). Technology as a management tool in the Community College classroom: Challenges and Benefits. Journal of Online Learning and Teaching, 4(4).

Venkatesh, V., Morris, M., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.

Watson, D. (2006). Understanding the relationship between ICT and education means exploring innovation and change. Education and Information Technologies, 11(3-4), 199-216.

Nobody likes a do-gooder – another reason for e-learning not mainstreaming?

Came across the article, “Nobody likes a do-gooder: Study confirms selfless behaviour is alienating”, from the Daily Mail via Morgaine’s amplify. I’m wondering if there’s a connection between this and the chasm in the adoption of instructional technology identified by Geoghegan (1994).

The chasm

Back in 1994, Geoghegan drew on Moore’s Crossing the Chasm to explain why instructional technology wasn’t being adopted by the majority of university academics. The suggestion is that there is a significant difference between the early adopters of instructional technology and the early majority: what works for one group doesn’t work for the other. There is a chasm. Geoghegan (1994) also suggested that the “technologists alliance” – vendors of instructional technology and the university folk charged with supporting instructional technology – adopts approaches that work for the early adopters, not the early majority.

Nobody likes do-gooders

The Daily Mail article reports on some psychological research that draws some conclusions about how “do-gooders” are seen by the majority

Researchers say do-gooders come to be resented because they ‘raise the bar’ for what is expected of everyone.

This resonates with my experience as an early adopter and, more broadly, with observations of higher education. The early adopters, those really keen on learning and teaching, are seen a bit differently by those that aren’t keen. I wonder if the “raise the bar” issue applies? I’d imagine this could be quite common in a higher education environment where research retains its primacy but universities are under increasing pressure to improve their learning and teaching, and, more importantly, to show everyone that they have improved.

The complete study is outlined in a journal article.

References

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conferences of the International Business Schools Computing Association, Baltimore, MD.

The end of management – lessons for universities?

Yet another “death of X” article is the spark for this post. This one comes from the Wall Street Journal and is titled The end of management. There’s been a wave of these articles recently, but I like this one because it caters to my prejudice that most of the problems in organisations, especially in universities around learning and teaching, arise from an inappropriate management paradigm. The following has some connections to the oil sheiks thread.

Some choice quotes

Corporations are bureaucracies and managers are bureaucrats. Their fundamental tendency is toward self-perpetuation. They are, almost by definition, resistant to change. They were designed and tasked, not with reinforcing market forces, but with supplanting and even resisting the market.

and

The weakness of managed corporations in dealing with accelerating change is only half the double-flanked attack on traditional notions of corporate management. The other half comes from the erosion of the fundamental justification for corporations in the first place.

And a quote from Gary Hamel which summarises much of the problem with real innovation, including innovation around management

That thing that limits us is that we are extraordinarily familiar with the old model, but the new model, we haven’t even seen yet.

Moving onto the question of resources

In corporations, decisions about allocating resources are made by people with a vested interest in the status quo. “The single biggest reason companies fail,” says Mr. Hamel, “is that they overinvest in what is, as opposed to what might be.”

The challenge that strikes at the heart of improving learning and teaching within universities is captured in this quote

there’s the even bigger challenge of creating structures that motivate and inspire workers. There’s plenty of evidence that most workers in today’s complex organizations are simply not engaged in their work.

Does your university have large numbers of academic staff that are actively engaged in teaching? How does it do it?

I’d like to work for a university that gets this, or at least is trying to.

Wicked problems and the need to engage with differing perspectives

In writing the last post, I had the opportunity to re-read the Wikipedia article on wicked problems. This quote struck a chord with me

Rittel and Webber coined the term in the context of problems of social policy, an arena in which a purely scientific-rational approach cannot be applied because of the lack of a clear problem definition and differing perspectives of stakeholders.

My experience with writing BIM from late last year through early this year is a good example of this. BIM is based on an earlier tool called BAM that was written as part of an institution specific system called Webfuse. As of early this year, the institution was dropping Webfuse and moving to Moodle. If BAM were to be continued, it had to be ported to Moodle. And the perspectives begin.

Stakeholders and their perspectives

There are at least three sets of stakeholders in the BIM process:

  • Academic staff wanting BIM written to use in their teaching.
  • Myself, the developer/researcher wanting to write BIM because of an interest in the approach.
  • The IT folk responsible for the Moodle transition project and supporting staff.

The academic staff who wanted BIM created, wanted it because it enabled a pedagogical practice that had been previously successful, at least from their perspective. They didn’t really care a great deal about how, they just wanted to use that pedagogy again.

I wanted to work on BIM because I believed that both the pedagogy it enabled and the model of e-learning systems it embodied were worthwhile and potentially very important for future practice.

The IT folk didn’t want BIM written. They had limited resources to use on the project and anything that was not core Moodle, was not very attractive to them. Consequently, they spent a lot of time and effort proposing methods by which the pedagogy enabled by BAM, could be enabled through various combinations of core Moodle tools. There was also quite a bit of political shenanigans being undertaken to prevent BIM being written.

Effective collaboration to enable an efficient implementation of the required pedagogy was not high on the agenda.

The winner writes the history

Obviously, the above is my perspective of what happened. I’m quite sure others involved might provide different perspectives, especially now that BIM has been somewhat successful, at least in terms of one other institution using it and various people in the broader Moodle community saying nice things. I now begin to wonder what story will be written about the history of BIM.

I no longer work for the original institution, and am fairly confident that if BIM continues to enjoy some success, the IT folk within the institution will take some credit for an environment that enabled the development of BIM. After all, the development of BIM proves the rhetoric about the value of adopting an open source LMS like Moodle: the institution was able to develop a Moodle module that served an effective pedagogical purpose and is being adopted by others.

From my perspective, the writing of BIM has been achieved in spite of the institutional environment. Due to the difficulties of that environment, I had to do most of the work on holidays, had to fight individuals that actively worked against the development of BIM, and a range of other problems not indicative of an environment conducive to innovation.

But, now that I’ve left the organisation, it shall be interesting to hear what stories those that remain tell of BIM, its development, and their role within it.

The main point is that difference exists

Now all of that probably sounds a bit one-sided and biased. Others might suggest a different version of events and suggest that it wasn’t so bad. They are free to contest that. Which version of events is more correct isn’t the point I’m trying to make here.

The point I’m trying to make is that as a wicked problem, improving learning and teaching within a university is going to have a large number of very different perspectives. The attempt to develop “the correct” perspective – which is the aim of engineering or planning approaches to solving these problems – misses the point. To establish an arbitrary and singular “correct” perspective of the problem and its solution, such a process must ignore and continually suppress alternative perspectives. This wastes energy on the suppression, and worse, closes off more fruitful solutions that arise from actively engaging with the diversity.

Two types of process and what university e-learning continues to get wrong

I should be writing other things, but there’s a wave amongst some of the “innovation bloggers” at the moment that I wanted to ride for the purposes of – once again – trying to get those driving university e-learning (and learning and teaching more generally) to realise they have something fundamentally wrong. They are using the wrong type of process.

I level this criticism at most of management, most of the support staff (information technology, instructional designers, staff developers etc) and especially at most of the researchers around e-learning etc.

For those of you at CQU who still don’t get what Webfuse was about: it wasn’t primarily about the technology, it was about this type of process. The technology was only important in so far as it enabled the process to be more responsive.

Empathy-driven innovation and a pull strategy

Over the weekend, Tim Kastelle wrote a post in which he proposes that a pull strategy is a key for empathy-driven innovation.

What is empathy-driven innovation? Tim provides the following points about it:

  • It requires a deep understanding of what the people that will use your innovation need and want.
    Most organisational e-learning assumes that steering committees and reference groups are sufficient and appropriate for understanding what is needed. This is just plain silly. The people who sit on such things are generally very different, in terms of experience and outlook, from the majority of academics involved with learning and teaching. If they aren’t different at the start, the very act of being a member of such groups will, over time, make them very different. These groups are not representative.

    What’s worse is that the support structures, processes, and roles that are created to sit under these committees and implement e-learning are more likely to prevent “deep understanding” than help it. For example, different aspects of e-learning are divided along the lines of institutional structures. Anything technology related is the responsibility of the information technology folk, anything pedagogical is the responsibility of the instructional design folk, and never shall the twain meet. As these folk generally report to different managers within different organisational units, they rarely talk and share insights.

    E-learning is more than the sum of its parts. Currently, there is generally a large gulf between the academics and students “experiencing” e-learning, the technology people keeping the systems going, the instructional design folk helping academics design courses, and the management staff trying to keep the whole thing going. This gulf actively works against developing deep understanding and limits the quality of e-learning within universities.

  • Using empathy for the users of our innovations is the best way to create thick value.
    A deep, contextualised understanding and appreciation for the context of the academic staff and students helps develop truly unique and high quality e-learning applications and environments. Without it you are left with copying what everyone else does, which is typically pretty limited.
  • We are creating ideas that entice people.
    Almost all of university-based e-learning is based on push strategies. That is, some person or group who is “smart” identifies the right solution and then pushes it out onto the academics and students. They have to do this because their understanding of the context and needs of the academics and students is small to non-existent. Their decisions are based more on their own personal observations and preferences, or even worse, on the latest fad (e.g. e-portfolios, open source LMS etc.).

    They aren’t creating ideas that entice people, they are creating ideas that people have to use.

    Researchers are particularly bad at this.

  • Innovations that pull are inherently network-based.
    The idea is that to engage in empathy-driven innovation, you have to have connections to the people using the innovations.

    As argued above, it’s my suggestion that the structures and processes around e-learning within universities are such that they actively work against the formation of these connections. To have empathy-driven innovation you have to connect the folk involved in teaching, learning, technology and instructional design in ways that are meaningful and that actively strengthen the connections.

    At the moment, at least in my institution, there is no easy way for an academic to talk to a technical person that actually knows anything about the system, i.e. someone who can actively modify the system. The technology person can’t easily talk with someone with educational knowledge to better inform technological change. Instead each group retreats to talking amongst themselves. The necessary connections are generally only there in spite of the system, not because of it.

The Webfuse Thesis

I’m somewhat engaged in this discussion because I have seen, for a small period of time, this type of empathy-driven innovation work in the context of e-learning within a University. This is the topic of my PhD Thesis, an attempt to describe an information systems design theory for e-learning that encapsulates this.

At its simplest, the Webfuse thesis embodies two aspects:

  1. Process.
    There are two broad types of process: teleological and ateleological; I describe the two types here. Empathy-driven innovation is an example of an ateleological process. The table in the blog post describing teleological and ateleological processes mentions Seely Brown and Hagel’s push and pull distinction.

    University e-learning is always too teleological, it needs to be more ateleological.

  2. Product.
    Everyone focuses too much on the actual product or system when we talk about e-learning. With Webfuse the product was only important in terms of how flexible it could be. How much diversity could it support and how easy was it to support that diversity. Because the one thing I know about learning and teaching within universities, is that it is diverse. In addition, if you are to engage in ateleological (empathy-driven) design, you need to be able to respond to local needs.

    Most importantly, the question of how flexible the product is, is not limited to just the characteristics of the product. Yes, Moodle, as an open source LMS implemented with technology (PHP and MySQL) that has low entry barriers, can be very flexible. However, if the organisation implements it with technology that has high entry barriers and inflexibility (e.g. Oracle), or if it adopts a process that is more teleological than ateleological, it robs Moodle of its flexibility.

From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques

The following draws on principles/theory from psychology to guide thinking about how to incorporate “data” from “academic analytics” into an LMS in a way that encourages and enables academic staff to improve their learning and teaching. It’s based on some of the ideas underpinning similar approaches that have been used for students, such as this Moodle dashboard and the Signals work at Purdue University.

The following gives some background to this approach, summarises a paper from the psychology literature around behaviour modification and then explains one idea for a “signals” like application for academic staff. Some of this thinking is also informing the “Moodle curriculum mapping” project.

Very interested in pointers to similar work, suggestions for improvement or expressions of interest from folk.

Background

I have a growing interest in how insights from psychology, especially around behaviour change, can inform the design of e-learning and other aspects of the teaching environment at universities in ways that encourage and enable improvement.
Important: I did not say “require”, I said “encourage”. Too much of what passes in universities at the moment takes the “require” approach with obvious negative consequences.

This is where my current interest in “nudging” – the design of good choice architecture – and behaviour modification is coming from. The basic aim is to redesign the environment within which teaching occurs in a way that encourages and enables improvement in teaching practice, rather than discourages it.

To aid in this work, I’ve been lucky enough to become friends with a psychologist who has some similar interests. We’re currently talking about different possibilities, informed by our different backgrounds. As part of that he’s been pointing me to bits of the psychological literature that offer some insight. This is an attempt to summarise/reflect on one such paper (Michie et al, 2008).

Theory to intervention

It appears that the basic aims of the paper are to:

  • Develop methods to clarify the list of behaviour change techniques.
  • Identify links between the behaviour change techniques and behavioural determinants.

First, a comparison of two attempts at simplifying the key behavioural determinants for change – shown in the following table. My understanding is that there are some values of these determinants that would encourage behaviour change, and others that would not.

Key Determinants of Behaviour Change (from Fishbein et al., 2001 and Michie et al., 2004)

Fishbein et al                   Michie et al
-----------------------------    ----------------------------------------
Self-standards                   Social/professional role and identity
(no equivalent)                  Knowledge
Skills                           Skills
Self-efficacy                    Beliefs about capabilities
Anticipated outcomes/attitude    Beliefs about consequences
Intention                        Motivation and goals
(no equivalent)                  Memory, attention and decision processes
Environmental constraints        Environmental context and resources
Norms                            Social influences
(no equivalent)                  Emotion
(no equivalent)                  Action planning

It is interesting to see how well the categories listed in this table resonate with the limits I was planning to talk about in this presentation. i.e. it really seems to me, at the moment, that much of the environment within universities around teaching and learning is designed in ways that reduce the chance of these determinants leaning towards behaviour change.

Mapping techniques to determinants

They use a group of experts in a consensus process for linking behaviour change techniques with determinants of behaviour. The “Their mapping” section below gives a summary of the consensus links. The smaller headings are the determinants of behaviour from the above table, the bullet points are the behaviour change techniques.

Now, I haven’t gone looking for more detail on the techniques. The following is going to be based solely on my assumptions about what those techniques might entail – and hence it will be limited. However, this should be sufficient for the goal of identifying changes in the LMS environment that might encourage change in behaviour around teaching.

First, let’s identify some of the prevalent techniques, i.e. those that are mentioned more than once and which might be useful/easy within teaching.
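To make the notion of “prevalent techniques” concrete, the consensus links summarised in the “Their mapping” section below can be represented as a simple data structure and counted. This is only an illustrative sketch: the technique names have been lightly normalised and a few shortened by me, not by Michie et al.

```python
from collections import Counter

# Determinant -> linked behaviour change techniques, per the
# "Their mapping" section (names lightly normalised/shortened).
mapping = {
    "Social/professional role and identity": [
        "Social processes of encouragement, pressure, support",
    ],
    "Knowledge": ["Information regarding behaviour by others"],
    "Skills": [
        "Goal/target specified", "Monitoring", "Self-monitoring",
        "Rewards; incentives", "Graded task", "Increasing skills",
        "Rehearsal of relevant skills", "Modelling/demonstration",
        "Homework", "Perform behaviour in different settings",
    ],
    "Beliefs about capabilities": [
        "Self-monitoring", "Graded task", "Increasing skills",
        "Coping skills", "Rehearsal of relevant skills",
        "Social processes of encouragement, pressure, support",
        "Feedback", "Self talk", "Motivational interviewing",
    ],
    "Beliefs about consequences": [
        "Self-monitoring", "Persuasive communication",
        "Information regarding behaviour, outcome", "Feedback",
    ],
    "Motivation and goals": [
        "Goal/target specified", "Contract", "Rewards; incentives",
        "Graded task", "Increasing skills",
        "Social processes of encouragement, pressure, support",
        "Persuasive communication",
        "Information regarding behaviour, outcome",
        "Motivational interviewing",
    ],
    "Memory, attention, decision processes": [
        "Self-monitoring", "Planning, implementation",
        "Prompts, triggers, cues",
    ],
    "Environmental context and resources": ["Environmental changes"],
    "Social influences": [
        "Social processes of encouragement, pressure, support",
        "Modelling/demonstration",
    ],
    "Emotion": ["Stress management", "Coping skills"],
    "Action planning": [
        "Goal/target specified", "Contract", "Planning, implementation",
        "Prompts, triggers, cues", "Use of imagery",
    ],
}

# Count how many of the 11 determinants each technique is linked to.
prevalence = Counter(t for techniques in mapping.values() for t in techniques)

for technique, count in prevalence.most_common(5):
    print(f"{count} of {len(mapping)}: {technique}")
```

Running this confirms the counts used below: “Social processes of encouragement, pressure, support” and “Self-monitoring” are each linked to 4 of the 11 determinants.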

Prevalent techniques

Social encouragement, pressure and support

The technique “Social processes of encouragement, pressure, support” is linked to 4 of the 11 determinants: social/professional role and identity, beliefs about capabilities, motivation and goals, and social influences. I find this interesting as it can be suggested that most teaching is a lone and invisible act, especially in an LMS, where what’s going on in other courses is invisible. Making what happens more visible might enable this sort of social process.

There’s also some potential connection with “Information regarding behaviour of others” which is mentioned in 3 of 11.

Monitoring and self-monitoring

These are each linked to 4 of the 11 determinants. Again, most LMSs don’t appear to give good overall information about what a teacher is doing in a way that would enable monitoring/self-monitoring.

Related to this is “goal/target specified”, part of monitoring.

There’s more to do here, but let’s get onto a suggestion.

One suggestion

There’s a basic model process embedded here, something along the lines of:

  • Take a knowledge of what is “good” teaching and learning
    For example, Fresen (2007) argues that the level of interaction, facilitation or simply participation by academic staff is a critical success factor for e-learning. There’s a bunch more literature that backs this up. And our own research/analysis has backed this up. Courses with high staff participation show much higher student participation and a clearer correlation between student participation and grade (i.e. more student participation, the higher the grade).
  • Identify a negative/insight into the behavioural determinants that affect academic staff around this issue.
    There are a couple. First, it’s not uncommon for staff to have an information distribution conception of teaching, i.e. they see their job as being to disseminate information, not to talk, communicate or participate. Associated with this is that most staff have no idea what other staff are doing within their course sites. They don’t know how often other staff are contributing to the discussion forums or visiting their course sites.
  • Draw on a behavioural technique or two to design an intervention in the LMS that can encourage a behaviour change. i.e. that addresses the negative in the determinants.
    In terms of increasing staff participation you might embed into the LMS a graph like the following. Embed it in such a way that the graph is the first thing an academic sees when they log in, perhaps on part of the screen.

    [Image: example staff posts feedback graph]

    What this graph shows, for a single (hypothetical) staff member, is the number of replies they have made in the discussion forums of the three courses they have taught. The number of replies is shown per term; in reality it might be shown by week of term, as the term progresses.

    This part can hit the “monitoring”, “self-monitoring” and “feedback” techniques.

    The extra, straight line represents the average number of replies made by staff in all courses in the LMS. Or alternatively, all courses in a program/degree into which the staff member teaches. (Realistically, the average would probably change from term to term).

    This aspect hits the “social processes of encouragement, pressure, support”, “modelling/demonstration behaviour of others”. By showing what other people are doing it is starting to create a social norm. One that might perhaps encourage the academic, if they are below the average, to increase their level of replies.

    But the point is not to stop here. Showing a graph like this is simple using business intelligence tools and is only a small part of the change necessary.

    It’s now necessary to hit techniques such as “graded task, starting with easy tasks”, “Increasing skills: problem-solving, decision-making, goal-setting”, “Planning, implementation”, “Prompts, triggers, cues”. It’s not enough to show that there is a problem, you have to help the academic with how to address the problem.

    In this case, there might be links associated with this graph that show advice on how to increase replies or staff participation (e.g. advice to post a summary of the week’s happenings in a course each week, or some other specific, context appropriate advice). Or it might also provide links to further, more detailed information to shed more light on this problem. For example, it might link to SNAPP to show disconnections.

    But it’s even more than this. If you wanted to hit the “Environmental changes (e.g. objects to facilitate behaviour)” technique you may want to go further than simply showing techniques. You may want this “showing of techniques” to happen within a broader community where people could comment on whether or not a technique worked. It would be useful if the tool helped automate/scaffold the performance of the task, i.e. moved up the abstraction layer from the basic LMS functionality. Or perhaps the tool and associated process could track and create “before and afters”, i.e. when someone tries a technique, store the graph before it is applied and then capture it again at some time after.
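As a concrete sketch of the data behind such a graph: given per-term reply counts, the two series the graph needs are the academic’s own totals and the per-term average across all staff in the LMS. All names and numbers below are invented for illustration; the actual plotting is left to whatever charting/business intelligence tool is at hand.

```python
# Hypothetical per-term forum reply counts for one academic's courses.
staff_replies = {"Course A": [12, 15, 9], "Course B": [4, 6, 3]}

# Hypothetical replies per term for every staff member in the LMS,
# used to compute the "social norm" line on the graph.
all_staff_replies = [
    [20, 18, 22], [5, 7, 4], [16, 14, 15], [30, 28, 33],
]

terms = ["T1", "T2", "T3"]

# Per-term LMS average: the straight(ish) comparison line.
lms_average = [
    sum(staff[i] for staff in all_staff_replies) / len(all_staff_replies)
    for i in range(len(terms))
]

# Per-term totals for this academic: the series plotted per course/term.
own_totals = [sum(counts[i] for counts in staff_replies.values())
              for i in range(len(terms))]

# The "feedback" part of the intervention: a simple below/above nudge.
for term, own, avg in zip(terms, own_totals, lms_average):
    status = "below" if own < avg else "at or above"
    print(f"{term}: {own} replies ({status} the LMS average of {avg:.1f})")
```

The nudge message at the end is where the “social processes of encouragement, pressure, support” technique bites: the comparison against the average is what creates the social norm.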

It’s fairly easy to see how the waterfall visualisation (shown below) developed by David Wiley and his group could be used this way.


Their mapping

Social/professional role and identity

  • Social processes of encouragement, pressure, support

Knowledge

  • Information regarding behaviour by others

Skills

  • Goal/target specified: behaviour or outcome
  • Monitoring
  • Self-monitoring
  • Rewards; incentives (inc. self-evaluation)
  • Graded task, starting with easy tasks
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Rehearsal of relevant skills
  • Modelling/demonstration of behaviour by others
  • Homework
  • Perform behaviour in different settings

Beliefs about capabilities

  • Self-monitoring
  • Graded task, starting with easy tasks
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Coping skills
  • Rehearsal of relevant skills
  • Social processes of encouragement, pressure and support
  • Feedback
  • Self talk
  • Motivational interviewing

Beliefs about consequences

  • Self-monitoring
  • Persuasive communication
  • Information regarding behaviour, outcome
  • Feedback

Motivation and goals

  • Goal/target specified: behaviour or outcome
  • Contract
  • Rewards; incentives (inc. self-evaluation )
  • Graded task, starting with easy tasks.
  • Increasing skills: problem-solving, decision-making, goal-setting
  • Social processes of encouragement, pressure, support
  • Persuasive communication
  • Information regarding behaviour, outcome
  • Motivational interviewing

Memory, attention, decision processes

  • Self-monitoring
  • Planning, implementation
  • Prompts, triggers, cues

Environmental context and resources

  • Environmental changes (e.g. objects to facilitate behaviour)

Social influences

  • Social processes of encouragement, pressure, support
  • Modelling/demonstration of behaviour by others

Emotion

  • Stress management
  • Coping skills

Action planning

  • Goal/target specified: behaviour or outcome
  • Contract
  • Planning, implementation
  • Prompts, triggers, cues
  • Use of imagery

References

Fresen, J. (2007). “A taxonomy of factors to promote quality web-supported learning.” International Journal on E-Learning 6(3): 351-362.

Michie, S., M. Johnston, et al. (2008). “From theory to intervention: Mapping theoretically derived behavioural determinants to behaviour change techniques.” Applied Psychology: An International Review 57(4): 660-680.

Why is University/LMS e-learning so ugly?

Yesterday, Derek Moore tweeted

University webs require eye candy + brain fare. Puget Sound’s site does both with colour palate & info architecture http://bit.ly/9U1kBN

In a later tweet he also pointed out that even Puget Sound’s instance of Moodle looked good. I agreed.

This resonated strongly with me because I and a few colleagues have recently been talking about how most e-learning within universities and LMSs is ugly: depressing corporate undesign seeking to achieve quality through consistency and instead sinking to the lowest common denominator. Sorry, I’m starting to mix two of my bêtes noires:

  1. Most LMS/University e-learning is ugly.
  2. Most of it is based on the assumption that everything must be the same.

Let’s just focus on #1.

I’m using ugly/pretty in the following in the broadest possible sense. Pretty, at its extreme end, is something that resonates positively in the soul as you’re using it effectively to achieve something useful. It helps you achieve the goal, but you feel good while you’re doing it, even when you fail and even without knowing why. There’s a thesis or three in this particular topic alone – so I won’t have captured it.

Why might it be ugly? An absence of skill?

Let me be the first to admit that the majority of e-learning that I’ve been responsible for is ugly. This design (used in 100s of course sites) is mostly mine, though it has thankfully been improved (as much as possible) by other folk. At best you might call it functional. But it doesn’t excite the eyes or resonate. And sadly, it’s probably all downhill from there as you go further back in history.

Even my most recent contribution – BIM – is ugly. If you wish to inflict more pain on your aesthetic sensibility look at this video. BIM rears its ugly head from about 1 minute 22 seconds in.

In my case, these are ugly because of an absence of skill. I’m not a graphic designer, I don’t have training in visual principles. At best I pick up a bit, mostly from what I steal, and then proceed to destroy those principles through my own ineptitude.

But what about organisations? What about the LMS projects like Moodle?

Why might it be ugly? Trying to be too many things to too many?

An LMS is traditionally intended to be a single, integrated system that provides all the functionality required for institutional e-learning. It is trying to be a jack of all trades. To make something so all encompassing look good in its entirety is very difficult. For me, part of looking good is responding to the specifics of a situation in an appropriate way.

It’s also not much use being pretty if you don’t do anything. At some level the developers of an LMS have to focus on making it easy to get the LMS to do things, and the complexity of that development places limits on making it look pretty.

At some level, the complexity required to implement a system as complex as an LMS also reduces the field of designers who can effectively work with it to improve its design.

But what about organisations adopting the LMS, why don’t they have the people to make it look good?

Why might it be ugly? Politics?

The rise of marketing and the “importance of brand” brings with it the idea of everything looking the same. It brings out the “look and feel” police, those folk responsible for ensuring that all visual representations of the organisation capture the brand in accepted ways.

In many ways this is an even worse example of “trying to be too many things”, as the “brand” must cover a full range of print, online and other media. That can be a bridge too far for many. The complexity kills the ability of the brand to make complete use of the specific media. Worse, often the “brand police” don’t really understand the media and thus can’t see the features of the media that could be used to improve the brand.

The brand and the brand police create inertia around the appearance of e-learning. They help enshrine the ugliness.

Then we get into the realm of politics and irrationality. It is no longer about aesthetic arguments (difficult at the best of times); it becomes about who plays the game best, who has the best connection to leadership, who has the established inertia, who can spin the best line.

The call to arms

I think there is some significant value in making e-learning look “pretty”. I think there’s some interesting work to be done in testing that claim and finding out how you make LMS and university e-learning “pretty”.

Some questions for you:

  • Is there already, or can we set up, a gallery of “pretty” LMS/institutional e-learning?
    Perhaps something for Moodle (my immediate interest) but other examples would be fun.
  • What bodies of literature can inform this aim?
    Surely some folk have already done stuff in this area.
  • What might be some interesting ways forward i.e. specific projects to get started?

Embedding behaviour modification – paper summary

A growing interest of mine is investigating how the design of the environment and information systems that support university learning and teaching can be improved by giving greater consideration to factors that help encourage improvement and change. i.e. not just building systems that do a task (e.g. manage a discussion forum) but designing a discussion forum that encourages and enables an academic to adopt strategies and tactics that are known to be good. If they choose to.

One aspect of the thinking around this is the idea of behaviour modification. The assumption is that to some extent improving the teaching of academics is about changing their behaviour. The following is a summary of a paper (Nawyn et al, 2006) available here.

The abstract

Ubiquitous computing technologies create new opportunities for preventive healthcare researchers to deploy behavior modification strategies outside of clinical settings. In this paper, we describe how strategies for motivating behavior change might be embedded within usage patterns of a typical electronic device. This interaction model differs substantially from prior approaches to behavioral modification such as CD-ROMs: sensor-enabled technology can drive interventions that are timelier, tailored, subtle, and even fun. To explore these ideas, we developed a prototype system named ViTo. On one level, ViTo functions as a universal remote control for a home entertainment system. The interface of this device, however, is designed in such a way that it may unobtrusively promote a reduction in the user’s television viewing while encouraging an increase in the frequency and quantity of non-sedentary activities. The design of ViTo demonstrates how a variety of behavioral science strategies for motivating behavior change can be carefully woven into the operation of a common consumer electronic device. Results of an exploratory evaluation of a single participant using the system in an instrumented home facility are presented

Summary

Tells how a PDA plus additional technology was used to embed behaviour modification strategies aimed at decreasing the amount of television watching. Describes a successful test with a single person.

Has some links/references to strategies and research giving principles for how to guide this type of design.

Introduction

Sets the scene. Too many Americans watch too much TV, are overweight and don’t get enough exercise. Reducing TV watching should improve health, if it is replaced with activities that aren’t sedentary. But this is difficult because TV watching is addictive, while exercise is seen to have high costs and the initial experience is not so good.

The idea is that “successful behavior modification depends on delivery of motivational strategies at the precise place and time the behavior occurs”. The idea is that “sense-enabled mobile computing technologies” can help achieve this. This work aims to:

  • use technology to disrupt the stimulus-reward cycle of TV watching;
  • decrease the costs of physical activity.

Technology-enabled behavioral modification strategies

Prior work has included knowledge campaigns and clinical interventions – the two most common approaches. Technology used to reduce television watching has usually meant gatekeeper devices that limit access – something not likely to be used by adults. There are also exercise-contingent TV activation systems.

More work has aimed at increasing physical activity independent of television. Approaches used include measuring activity and providing open-loop feedback, i.e. simple, non-intrusive aids to increase activity. More interactive, just-in-time feedback may help short-term motivation – e.g. video games. There are also technology interventions that mimic a human trainer.

For those not already exercising, small increases in physical activity may be better than intense regimens.

The opportunity: just-in-time interactions

The technological intervention is based on the premise that people respond best to information that is timely, tailored to their situation, often subtle, and easy to process. This intervention uses a PDA intended to replace the television remote control and adds a graphical interface, built-in program listings, access to a media library, integrated activity management, and interactive games.

It tries to determine the goals of the user and suggest alternatives to watching TV in a timely manner. With the addition of wearable acceleration sensors it can also function as a personal trainer.

Challenges

The challenge is to provide a user experience rewarding enough to be used over time.

Grabbing attention without grabbing time

Prior work on behavior change interventions reveals them to be:

  • resource-intensive, requiring extensive support staff;
  • time-intensive, requiring the user to stop everyday activity to focus on relevant tasks.

This is why the remote is seen as a perfect device. It’s part of the normal experience. Doesn’t need separate time to use.

Sustaining the interaction over time

Behavior change needs to be sustained over years to have a meaningful impact.
Extended use of a device runs the risk of annoyance, so the authors avoided paternalistic or authoritarian strategies, focusing instead on strategies that promote intrinsic motivation and self-reflection. Elements of fun, reward and novelty are used to induce positive affect rather than feelings of guilt.

Avoiding the pitfall of coercion

There is a temptation to use coercion for motivation, but the likelihood that users will tolerate coercive devices for long is questionable.

Avoiding reliance on extrinsic justification

The optimal outcome of any behavioural intervention is change that persists. Heavy reliance on extrinsic justification – rewards or incentives – may result in a dependency that can hurt persistence if the rewards are removed. There are also problems if the undesirable behaviour – watching TV – is the reward for exercise.

Case study

A low-cost remote was produced from consumer hardware, with a laptop provided to manage the media library and a GUI driven by finger input.

Provides puzzles that use the TV for display and physical activity for input.

Behavior modification strategies

Most are derived from basic research on learning and decision-making (suggestibility, goal-setting and operant conditioning). Examples include:

  • value integration – having persuasive strategies embedded within an application that otherwise provides value to the user increases the likelihood of adoption.
  • reduction – reducing the complexity of a task increases the likelihood that it will be performed.
  • convenience – embedding within something used regularly, increases opportunities for delivery of behaviour change strategies.
  • ease of use – easier to use = more likely to be adopted over long term.
  • intrinsic motivation – incorporating elements of challenge, curiosity and control into an activity can sustain interest.
  • suggestion – you can bias people toward a course of action through even very subtle prompts and cues.
  • encouraging incompatible behaviour – encouraging behaviour that is incompatible with the undesirable behaviour can be effective.
  • disrupting habitual behaviour – bad habits can be eliminated when the conditions that create them are removed or avoided.
  • goal setting – concrete, achievable goals promote behaviour change by orienting the individual toward a definable outcome.
  • self-monitoring – motivated people can be more effective when able to evaluate progress toward outcome goals.
  • proximal feedback – feedback that occurs during or immediately after an activity has the greatest impact on behaviour change.
  • operant conditioning – increase frequency of desirable behaviour by pairing with rewarding stimuli.
  • shaping – transform an existing behaviour into more desirable one by rewarding successive approximations of the end goal.
  • consistency – draw on the desire of people to have a consistency between what they say and do to help them adhere to stated goals.

Exploratory evaluation

Use it with a single (real life) person to find out what happens.

Done in a specially instrumented apartment, including 3 phases: a baseline with the normal remote, 12 days at home, and 7 days in the lab with the special remote. The participant was not told that the study was aimed at changing behaviour around TV watching and physical activity.

Results

Television watching reduced from 133 minutes a day during baseline to 41 minutes during intervention.

Evaluation against the adopted strategies was positive.

Conclusions

Substantial improvement is important. Strategies should be phased in over time. They are initially seen as novel, and this curiosity can be used. Not all users will react well.

References

Nawyn, J., S. Intille, et al. (2006). Embedding behavior modification strategies into a consumer electronic device: A case study. 8th International Conference on Ubiquitous Computing: 297-314.

Different perspectives on the purpose of the LMS

Antonio Vantaggiato gives one response to a post from Donald Clark titled “Moodle: e-learning’s frankenstein”. Clark’s post is getting a bit of traction because it is being seen as a negative critique of Moodle.

I think part of the problem is a failure to recognise the importance of the perceived purpose that Moodle (or any LMS) is meant to serve. Just in my local institution, I can see a number of very different perceptions of the purpose behind the adoption of Moodle.

In the following I’m stealing bits of writing I’ve done for the thesis, some of which has appeared in previous posts. This probably makes the following sound somewhat pretentious, but I’ve gotta get some use out of the &%*#$ thesis.

The importance of purpose

Historically and increasingly, at least in my experience, the implementation of e-learning within universities has been done somewhat uncritically, with the information technology taken for granted and assumed to be unproblematic. This is somewhat surprising given the nature of universities and the role academics are meant to take. However, in my experience the selection of institutional LMSs is driven by IT and management with little room for critical thought or theory-informed decision making.

Instead they rely on a very techno-rational approach that takes a very narrow perspective of what technology is, how it has effects and how and why it is implicated in social change (Orlikowski and Iacono 2001). A different perspective is that technology serves the goals of those who guide its design and use (Lian 2000).

This is important because many, if not most, universities follow, or at least profess to follow, a purpose driven approach to setting strategic directions (McConachie, Danaher et al. 2005). The implementation of an LMS is being done to achieve specific institutional purposes. The very definition of a teleological design process is to set and achieve objectives, to be purpose driven (Introna, 1996). When an institution engages in selecting an LMS, the purpose is typically set by a small group, usually organisational leaders, who draw on expert knowledge to perform a diagnosis of the current situation in order to identify some ideal future state and how to get there.

Once that purpose is decided, everything the organisation does from then on is about achieving that purpose with maximum efficiency. By definition, any activity or idea that does not move the organisation closer to achieving its stated purpose is seen as inefficient (Jones and Muldoon, 2007).

Differences of purpose

Many of the folk responding to Clark’s post in defence of Moodle have their own notion of the purpose of Moodle, usually based on how they have used it. Others draw on the purposes espoused by the designer(s) of Moodle. There is little recognition that a diversity of opinions exists about the purposes of Moodle.

A little of this diversity is represented in discussions about how Moodle is used in individual courses. For example, this comment mentions that Moodle does teacher centered very well. i.e. if a teacher sees the purpose of a course site to distribute information, Moodle can do that. This comment makes the point that Moodle is a tool, the pedagogy is not about the tool, it is about the approach.

Now, while to some extent that is true, I also agree with Kallinikos (2004) that systems can have profound effects on the structuring of work and the forms of human action they enable or constrain.

While Moodle’s designers may have all sorts of wonderful intents for Moodle’s purpose, within a university the purpose assigned to Moodle by the people implementing and supporting it plays a significant part. The processes, structures etc. that they put around Moodle within an institutional setting can enable or constrain both the purpose seen by Moodle’s designers and the purpose seen by the staff and students who will use it.

Moodle/LMS as an integrated enterprise system

Due to the complexity of implementing Moodle for a largish organisation, the people driving Moodle implementations within universities are usually IT folk. It is my suggestion that the purpose they perceive for Moodle is that of an integrated, enterprise system. A university’s LMS forms the academic system equivalent of enterprise resource planning (ERP) systems in terms of pedagogical impact and institutional resource consumption (Morgan 2003).

To slightly paraphrase Siemens (2006), the purpose of an LMS for the institution is to provide the organisation with the ability to produce and disseminate information by centralising and controlling services. The LMS model with its nature as an integrated, enterprise system fits the long-term culture of institutional information technology and its primary concern with centralizing and controlling information technology services with a view to reducing costs (Beer and Jones 2008).

An LMS is designed to provide an organisation with all the tools it will need for e-learning. Weller, Pegler and Mason (2005) identify two approaches to the design of an LMS: monolithic or integrated approach, and the best-of-breed approach. The monolithic approach is the predominant approach and seeks to provide all common online learning tools in a single off-the-shelf package (Weller, Pegler et al. 2005).

The evidence of this purpose can be seen when you go to your LMS folk and say “I’d like to do X”. The response will generally not be a consideration of the best marriage of pedagogy and technology (the best blended learning) to achieve your goal. The response will generally be “how to do X in the LMS”, regardless of how much extra work, complexity and plain inappropriateness doing X in the LMS entails.

If all you have is an LMS, every pedagogical problem is solved by the application of the LMS.

What should the purpose of an LMS be?

BIM is a representation of what I think the purpose of an LMS should be: the LMS should provide the services that are necessary/fundamental to the university/institution, and only those. Increasingly, most services should be fulfilled by services and resources that students and staff already use and control.

BIM provides academics teaching a course with a way to aggregate blog posts from students and, if they want to, mark them. Those marks are integrated into the Moodle gradebook. The assumption is that marking/accreditation is one of the main tasks a university performs and that no external services currently provide it.

There are, however, a great many very good and free blog services. So students use their choice of blog provider (or something else that generates RSS/Atom) to create and manage their contributions.
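To make the aggregation idea concrete, here is a minimal sketch of the feed-gathering step: each student registers a feed, and the system pulls their posts into one place per student. This is an illustration only, not BIM’s actual code or API; the names (`Student`, `aggregate_posts`) are invented, and a real implementation would fetch each feed over HTTP and handle Atom as well as RSS.

```python
# Toy sketch of aggregating student blog posts from RSS feeds.
# In practice the XML would be fetched from each student's feed URL;
# here it is supplied as a string so the example is self-contained.
import xml.etree.ElementTree as ET
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    feed_xml: str  # stand-in for the content fetched from their feed URL

def parse_rss_titles(feed_xml: str) -> list[str]:
    """Extract the <title> of each <item> in a simple RSS 2.0 feed."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title", default="") for item in root.iter("item")]

def aggregate_posts(students: list[Student]) -> dict[str, list[str]]:
    """Map each student's name to the titles of their blog posts."""
    return {s.name: parse_rss_titles(s.feed_xml) for s in students}

FEED = """<rss version="2.0"><channel><title>My blog</title>
<item><title>Week 1 reflection</title></item>
<item><title>Week 2 reflection</title></item>
</channel></rss>"""

posts = aggregate_posts([Student("Alice", FEED)])
print(posts)
```

The point of the design is in the data flow: the student controls the blog, and the institution only holds the aggregated copy it needs for marking.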

The purpose of the LMS isn’t to provide all services, just those that are required for the institution’s tasks.

Eventually, the term LMS becomes a misnomer. The system isn’t about managing learning; it’s about providing the glue between what the institution has to provide and what the learners are already using. The purpose becomes achieving the best mix of pedagogy and technology, rather than simply working out how to use the LMS.

This perspective obviously has connections with Jon Mott’s (2010) article and the various folk who have written about this previously.

References

Beer, C. and D. Jones (2008). Learning networks: harnessing the power of online communities for discipline and lifelong learning. Lifelong Learning: reflecting on successes and framing futures. Keynote and refereed papers from the 5th International Lifelong Learning Conference, Rockhampton, Central Queensland University Press.

Introna, L. (1996). “Notes on ateleological information systems development.” Information Technology & People 9(4): 20-39.

Jones, D. and N. Muldoon (2007). The teleological reason why ICTs limit choice for university learners and learning. ICT: Providing choices for learners and learning. Proceedings ASCILITE Singapore 2007, Singapore.

Kallinikos, J. (2004). “Deconstructing information packages: Organizational and behavioural implications of ERP systems.” Information Technology & People 17(1): 8-30.

Lian, A. (2000). “Knowledge transfer and technology in education: Toward a complete learning environment.” Educational Technology & Society 3(3): 13-26.

McConachie, J., P. Danaher, et al. (2005). “Central Queensland University’s Course Management Systems: Accelerator or brake in engaging change?” International Review of Research in Open and Distance Learning 6(1).

Morgan, G. (2003). Faculty use of course management systems, Educause Centre for Applied Research: 97.

Mott, J. (2010). “Envisioning the Post-LMS Era: The Open Learning Network.” EDUCAUSE Quarterly 33(1).

Orlikowski, W. and C. S. Iacono (2001). “Research commentary: desperately seeking the IT in IT research a call to theorizing the IT artifact.” Information Systems Research 12(2): 121-134.

Siemens, G. (2006). “Learning or Management System? A Review of Learning Management System Reviews.” from http://ltc.umanitoba.ca/wordpress/wp-content/uploads/2006/10/learning-or-management-system-with-reference-list.doc.

Weller, M., C. Pegler, et al. (2005). “Students’ experience of component versus integrated virtual learning environments.” Journal of Computer Assisted Learning 21(4): 253-259.