The grammar of school, psychological dissonance and all professors are rather ludditical

Yesterday, via a tweet from @marksmithers, I read this post from the author of the DIYU book titled “Vast Majority of Professors Are Rather Ludditical”. This is somewhat typical of the deficit model of academics, which is fairly prevalent and rather pointless. It’s pointless for a number of reasons, but the main one is that it is not a helpful starting point for bringing about change: it ignores the broader problem, and consequently most solutions that arise from a deficit model won’t work.

One of the major problems this approach tends to ignore is the broader impact of the grammar of school (first from Tyack and Cuban and then Papert). I’m currently reading The nature of technology (more on this later) by W. Brian Arthur. The following is a summary and a little bit of reflection upon a section titled “Lock-in and Adaptive Stretch”, which seems to connect closely with the grammar of school idea.

Psychological dissonance and adaptive stretch

Arthur offers the following quote from the sociologist Diane Vaughan on psychological dissonance:

[In the situations we deal with as humans, we use] a frame of reference constructed from integrated sets of assumptions, expectations and experiences. Everything is perceived on the basis of this framework. The framework becomes self-confirming because, whenever we can, we tend to impose it on experiences and events, creating incidents and relationships that conform to it. And we tend to ignore, misperceive, or deny events that do not fit it. As a consequence, it generally leads us to what we are looking for. This frame of reference is not easily altered or dismantled, because the way we tend to see the world is intimately linked to how we see and define ourselves in relation to the world. Thus, we have a vested interest in maintaining consistency because our own identity is at risk.

Arthur goes on to suggest that “the greater the distances between a novel solution and the accepted one, the larger is this lock-in to previous tradition”. He then labels this lock-in to the older approach adaptive stretch: the situation where it is easier to reach for the old approach and adapt it to the new circumstances by stretching it.

Hence professors are ludditical

But haven’t I just made the case? This is exactly what happens with the vast majority of academic practice around e-learning. If they are using e-learning at all – and not simply sticking with face-to-face teaching – most teaching academics are still using lectures, printed notes and other relics of the past that they have stretched into the new context.

They don’t have the knowledge to move on, so we have to make them non-ludditical. This is when management and leadership at universities roll into action and identify plans and projects that will help generate non-ludditical academics.

The pot calling the kettle black

My argument is that, if you step back a bit further, the approaches being recommended and adopted by researchers and senior management; the way those approaches are implemented; and the way they are evaluated for success are themselves suffering from psychological dissonance and adaptive stretch. The approaches almost without exception borrow from a traditional project management approach and go something like:

  • Small group of important people identify the problem and the best solution.
  • Hand it over to a project group to implement.
  • The project group tick the important project boxes:
    • Develop a detailed project plan with specific KPIs and deadlines.
    • Demonstrate importance of project by wheeling out senior managers to say how important the project is.
    • Implement a marketing push involving regular updates, newsletters, posters, coffee mugs and presentations.
    • Develop compulsory training sessions which all must attend.
    • Downplay any negative experiences and explain them away.
    • Ensure correct implementation.
    • Get an evaluation done by people paid for and reporting to the senior managers who have been visibly associated with the project.
    • Explain how successful the project was.
  • Complain about how the ludditical academics have ruined the project through adaptive stretching.

Frames of reference and coffee mugs

One of the fundamental problems with these approaches to projects within higher education is that they effectively ignore the frames of reference that academics bring to the problem. Rather than start with the existing frames of reference and build on those, this approach to projects is all about moving people straight into a new frame of reference. In doing this, there is always incredible dissonance between how the project people think an action will be interpreted and how it actually is interpreted.

For example, a few years ago the institution I used to work for (at least as of CoB today) adopted Chickering and Gamson’s (1987) 7 principles for good practice in undergraduate education as a foundation for the new learning and teaching management plan. The project around this decision basically followed the above process. As part of the marketing push, all academics (and perhaps all staff) received a coffee mug and a little palm card with the 7 principles in nice text and a link to the project website. The intent of the project was to increase academics’ awareness of the 7 principles and of how important they were to the institution.

The problem was that, at around this time, the institution was going through yet more restructures and there were grave misgivings from senior management about how much money the institution didn’t have. The institution was having to save money and this was being felt by the academics in terms of limits on conference travel, marking support etc. It is with this frame of reference that the academics saw the institution spending a fair amount of money on coffee mugs and palm cards. Just a touch of dissonance.

What’s worse, a number of academics were able to look at the 7 principles and see principle #4 “gives prompt feedback” and relate that to the difficulty of giving prompt feedback because there’s no money for marking support. Not to mention the push from some senior managers about how important research is to future career progression.

So, the solution is?

I return to a quote from Cavallo (2004) that I’ve used before:

As we see it, real change is inherently a kind of learning. For people to change the way they think about and practice education, rather than merely being told what to do differently, we believe that practitioners must have experiences that enable appropriation of new modes of teaching and learning that enable them to reconsider and restructure their thinking and practice.

Rather than tell academics what to do, you need to create contextualised experiences for academics that enable appropriation of new modes of teaching and learning. What most senior managers at universities and many of the commentators don’t see is that the environment at most universities is preventing academics from having these experiences, and then preventing them from appropriating the new modes of teaching.

The policies, processes, systems and expectations senior managers create within universities are preventing academics from becoming “non-ludditical”. You can implement all the “projects” you want, but if you don’t work on the policies, processes, systems and expectations in ways that connect with the frames of reference of the academics within the institution, you won’t get growth.


Cavallo, D. (2004). Models of growth – Towards fundamental change in learning environments. BT Technology Journal, 22(4), 96-112.

Chickering, A. W., & Gamson, Z. F. (1987). Seven principles for good practice in undergraduate education. AAHE Bulletin, 39(7), 3-7.

First the fridge dies, and then…

For the last couple of days our LG side-by-side fridge has been dying. Not a great situation, especially given the problems we’ve had with it. And then today I find out that my employment at my current institution is about to cease after 20 years. The following is a bit of reflection about what might happen now.

The last couple of years at the institution have been a boring cycle: org restructure, imminent redundancy, last minute “saving”, period of uncertainty, org restructure, imminent redundancy, glimmer of last minute “saving”…redundancy. So, while there is a tinge of sadness (mostly for the folk “left behind”), and a touch of worry (disruption to the family), this is actually a great relief. We’re in a position where this causes no great financial pressure, so not only is there relief, there is some wonder at the possibilities that have opened up.

The PhD

I’ve been working part time on the PhD for almost ten years. The first possibility is to finish the beast and get it off my back. The end is near; the time is now to put my head down and get it done.

What do I want to do then?

Then what? Some of the possibilities include:

  • More L&T support/instructional design/educational development within higher education;
    This is the field I’ve been in for a while, and there are some interesting possibilities within it. It wouldn’t be too hard to do a lot of stuff much better than how it is being done at the moment. However, there’s also a lot of inertia that makes it hard. Not amongst the academics. The inertia I’ve struggled with is the short-term perspectives of senior institutional management and the limited depth and diversity of their insights. Most academics want to engage in context-appropriate innovation in their teaching. Most senior leaders want to tick the AUQA boxes and satisfy techno-rational notions of management/leadership – don’t rock the boat. (Note: given my current circumstances, I am probably overstating this case just a bit due to a somewhat lessened sense of objectivity, but the case is there.)

    There are some positives here, but there are some negatives.

  • e-learning;
    Where my skills and experience are best used, I think, is in harnessing information technologies to support learning and teaching. It’s where the PhD is located and where most of my experience is. The dividing line between this possibility and the previous one is vague to non-existent, especially in the similarity that it is fairly simple to do something significantly better than the status quo. However, the problem with senior management continues to exist, and for good measure you get some potentially/typically very limited thinking from IT departments. This is the problem with e-learning: it’s currently seen as an IT problem, not a learning and teaching problem.
  • A return to information systems;
    For most of my 20 years I’ve been a faculty academic teaching and doing research. The PhD is within the information systems field, I have some contacts and publications in the area. I could return there. Especially given that my real interest is in developing and understanding new ways of helping organisations harness information technologies – for me e-learning is an application of that. The positioning of IS in relation to IT and other business disciplines is a bit troubling. Also there are a lot of IS PhD graduates, and not many positions.
  • technical development.
    Software development is one of the activities I like a lot. Software like BIM is an example of what I can do, though it’s also an example of the above. There’s a big move towards Moodle, BIM is Moodle…so perhaps a software development role. Perhaps not as intellectually rewarding, but more practically fulfilling.

So, no clear cut choices. What’s the dream job? At the moment, a research and development job focused on helping a university harness information technology to effectively and innovatively improve the quality of its learning and teaching. Something that straddles information systems and e-learning and is focused on innovation. Especially one that involves being part of a team of talented folk – I’ve had enough of the lone-ranger stuff.

Actually, for some time the institutional inertia around learning and teaching has been getting me down. Perhaps it is time to look for an L&T/e-learning role outside of formal institutional settings – does such a role exist?

Actually, perhaps it’s time to really open up to the possibilities? What comes, comes. Perhaps taking me to a place I could never have imagined.


This is the question that is currently causing the most heartache. We have almost the perfect home for a young family. It would be hard to better and even harder to leave. However, it’s located in a regional area, and the only place I’m likely to get the type of work I’ve described above is the institution that is letting me go. Which suggests three options:

  • tele-commute;
    Not a lot of jobs around that do this, especially of the type I want. It doesn’t make a lot of sense. Though a straight software development role would fit well here.
  • contractor;
    i.e. short periods away and then back. Again not an ideal approach for the type of work I’d like. But limits family disruption.
  • moving.
    Pack up the family and move to where the work is. A disruption to be sure, but it opens up possibilities.


So what have I forgotten or not even thought of? Anyone got an opportunity? Anyone interested in a software developer/information systems/e-learning teaching academic?

Off to buy a fridge. And then see what they have in the way of interesting and productive careers.

PLEs and the institution: the wrong problem

Yesterday, I rehashed/summarised some earlier thoughts about “handling the marriage of PLEs and institutions”. Since then, I’ve been reflecting on that post. I’m coming to the belief that this is just the wrong problem, or perhaps just a symptom of a deeper problem. (Signal the start of the broken record.)

All of the students and academic staff (the learners) of a university have always had their own PLEs. It’s only with the rise of Web 2.0, e-learning 2.0 and related movements/fads that the PLE (and/or PLN) label has become a concern. And only because of this has the question of how, or indeed if, an institution should provide a PLE arisen. In much the same way that universities – at least within Australia – have had to deal with distance education, flexible learning, lifelong learning, open learning, blended learning and a whole range of similar labels and fads.

The product focus

The problem that I am seeing is that university teaching and learning – and the systems that underpin and support that teaching and learning – are “product” or fad focused. That is, folk within the institution note that “X” (i.e. open learning, blended learning, PLEs, e-portfolios etc) is currently the buzz word within the sector around learning and teaching, and hence the organisation and its practices are re-organised – or at least seen to be re-organised – to better implement “X”. From this you get a whole bunch of folk within institutions (from senior management down) whose professional identity becomes inextricably linked with “X”. Their experiences and knowledge grow around “X”. Any subsequent criticism of “X” is a criticism of their identity and thus can’t be allowed. It has to be rejected. Worse still, “X” becomes the grammar of the institution: everything must be considered as part of “X” (thus good) or not part of “X” (thus bad).

Various factors such as short term contracts for senior managers; top-down management; certain research strategies that generate outputs through investigating “learning and teaching with “X””; the increasing prevalence of “project managers” within universities and the simplistic notions many of them have; deficit models of academics; and the wicked nature of the learning and teaching problem all contribute to the prevalence of this mistake.

The process focus

What I described as a way to handle the marriage of PLEs and institutions is no different from the approach we used to implement Webfuse and no different from the process I would use to attempt to support and improve learning and teaching within any university. It’s an approach that doesn’t focus on a particular “X”, but instead on adopting a process that enables the institution to learn and respond to what happens within its own context and outside.

Some broad steps:

  • Ensure that there’s an L&T support team full of the best people you can get, with a breadth and depth of experience in learning, teaching, technology and contextual knowledge.
    This is not a one off, it’s an on-going process of bringing new people in and helping the people within grow to exceed their potential.
  • Implement a process where the L&T support team is working closely and directly with the academics teaching within the context during the actual teaching.
    i.e. not just on design of courses before delivery, but during teaching in a way that enables the support team to help the teaching academics in ways that are meaningful, contextual and build trust and connections between the teaching academics and the support staff.
  • Adopt approaches that encourage greater connections between the L&T support team, the teaching academics, students and the outside world.
  • Support and empower the support team and teaching academics to experiment with interventions at a local level, with minimal management intervention and minimal institutional barriers.
  • Observe what happens in the local interventions and cherry pick the good ideas for broader incorporation into the institution’s L&T environment in a way that encourages and enables adoption by others.
  • Implement mechanisms where senior management are actively encouraged to understand the reality of teaching within the institutional context and actively charged with identifying and removing those barriers standing in the way of teaching and learning.
    The job of the leaders is not to choose the direction, but to help the staff doing the work get to where they want to go.

What’s important

The identity of “X” is not important, be it graduate attributes, constructive alignment, PLEs, Web 2.0, social media, problem-based learning, blended learning etc, all these things are transitory. What’s important is that the university has the capability and the on-going drive to focus on a process through which it is reflecting on what it does, what works, what doesn’t and what it could do better, and subsequently testing those thoughts.

How to handle the marriage of PLEs and institutions

The following is my attempt to think about how the “marriage” of the PLE concept and educational institutions can be handled. It arises from reading some of the material that has arisen out of the PLE conference held in Barcelona a few weeks ago and some subsequent posts, most notably this one on the anatomy of a PLE from Steve Wheeler.

The following is informed by a paper some colleagues and I wrote back in 2009 around this topic. That paper aimed to map the landscape for a project we were involved with that was attempting to implement/experiment with just such a marriage. By the time the paper was presented (end of 2009) the project was essentially dead in the water – at least in terms of its original conceptualisation – due to organisational restructures.

The paper and this post attempt to use the Ps framework as one way to map this landscape.

In summary, people (students and staff) already have PLEs; the question is how to create a marriage between each person’s PLE and the institution that is effective, open, and responsive to contextual and personal needs.

Product – what is a PLE

The assumption is that the definition of a PLE is both uncertain and likely to change and emerge as the “marriage” is consummated (taking the metaphor too far?). I like the following quote as a summary of the emergence aspect:

Broader conceptualisations see technology as one of a number of components of an emergent process of change where the outcomes are indeterminate because they are situationally and dynamically contingent (Markus & Robey, 1988). Ongoing change is not solely “technology led” or solely “organisational/agency driven”, instead change arises from a complex interaction among technology, people and the organization (Marshall & Gregor, 2002)

But we found some value in defining what a PLE is not. A PLE is not:

  • a single tool;
  • specified, owned or hosted by the university;
  • common across all students;
  • necessarily reliant on the use of information and communication technologies;
  • a replacement or duplication of the institutional learning management system.

Picking up on the last point, we positioned the PLE as a counterpoint to the LMS:

The PLEs@CQUni project emphasises the role of PLEs as a counterpoint (in the musical sense where two or more very different sounding tunes harmonise when played together) to the institutional LMS

The design guidelines we generated from this were

  • The "PLE product" is not owned, specified or provided by the university.
  • Each learner makes their own decisions about the collection of services and tools that will form their "PLE Product".
  • The University needs to focus on enabling learners to make informed choices between services and tools and on allowing for integration of institutional services with learners’ chosen services and tools.
  • The PLE work will act as a counterpoint to existing and new investments in enterprise systems, by combining them with the students’ customised environment in order to provide previously unavailable services.
  • The final nature of the PLE product and its relationship with the institution will emerge from the complex interaction between technology, people and the organization.


People

When looking at the people involved, we developed these guidelines:

  • The PLE project will fail if learners (both staff and students) do not engage with this concept.
  • People are not rational decision makers. They make decisions based on pattern matching of their personal or collective experiences.
  • There is little value in asking people who have limited experience with a new paradigm or technology what they would like to see or do with the technology.
  • The project focus should be on understanding, working with and extending the expectations of the participants within the specific conditions of the local context.
  • A particular emphasis must be on providing the scaffolding necessary to prepare learners for the significant changes that may arise from the PLE concept.


Process

It’s long been a bugbear of mine that universities are so project centric that they believe big up-front design/traditional IT development processes actually work for projects involving innovation and change. This is evident in the guidelines around process we developed:

  • Classic, structured project management practices are completely inappropriate for the PLEs@CQUni project.
  • An approach based on ateleological or naturalistic design is likely to be more appropriate.
  • Project aims should be based on broad strategic aims and place emphasis on organisational learning.


Purpose

A project has to have a purpose, doesn’t it? At the very least, for political reasons, the project has to be seen to have a purpose. The guidelines for purpose were:

  • The project will cultivate an emergent methodology.
  • The project will focus on responding to local contextual needs.
  • The overall purpose of the project is to support the institution’s new brand.

The last point is likely to bring shudders to most folk. Branding! Are you owned by “the man”? This was partly political, however, the “branding” really does gel with the concept of the PLE. The tag line is “Be what you want to be” and one of the “messages” on the corporate website was

CQUniversity interacts in a customized way to your individual requirements. Not all universities can say that and few can say it with confidence. We can.

For me, there is a connection with PLEs.


Place

To some extent, the discussions from the PLE conference that I have seen seem to assume that all universities are the same. I disagree. I think there are differences between institutions that can and should be harnessed. What works at the OU will not work at my current institution. So the guidelines for place we developed are:

  • The project must engage with broader societal issues without sacrificing local contextual issues.
  • It must aim to engage and work with the different cultures that make up the institution.
  • It should use a number of safe-fail projects, reinforcing those with positive outcomes and eliminating others.

What’s missing?

There are other aspects of the Ps framework not considered in the paper or above – Pedagogy and Past Experience. However, the above suggests how these would be handled: connecting with current practice within the specific place and trying to extend it to better fit the ideas underpinning a PLE. Such extension would be done in diverse ways, with different disciplines and different individuals within those disciplines trying different things, talking to each other and working out new stuff.

What would it look like?

There were two concrete changes the project implemented before it was canned:

  1. BAM/BIM;
    Provide LMS-based method for staff to manage/aggregate and use individual student blogs (PLE).
  2. Generating RSS feeds from Blackboard 6.3 discussion forums.
    The institutional LMS at the time was the ancient Blackboard 6.3. We implemented an intermediary system that generated an RSS feed of posts: a way for students/staff using newsreaders (PLE) to track what was happening within the LMS while saving them time. They didn’t need to log in to the LMS, go to each course, and check each discussion forum for new posts….
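The idea behind that intermediary can be sketched in a few lines. To be clear, this is not the actual Webfuse/Blackboard code – the function name, URLs and post data below are all invented for illustration; the real system had to extract posts from Blackboard 6.3 itself:

```python
# Minimal sketch of a forum-posts-to-RSS intermediary.
# NOT the actual Webfuse code; posts/URLs below are invented placeholders.
import xml.etree.ElementTree as ET

def posts_to_rss(course, posts):
    """Convert a list of forum posts into an RSS 2.0 feed (as a string)."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"{course} discussion forums"
    ET.SubElement(channel, "link").text = "https://example.edu/"  # placeholder
    ET.SubElement(channel, "description").text = "New forum posts"
    for post in posts:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = post["subject"]
        ET.SubElement(item, "link").text = post["url"]
        ET.SubElement(item, "pubDate").text = post["date"]
    return ET.tostring(rss, encoding="unicode")

# A newsreader subscribed to this feed sees new posts without an LMS login.
feed = posts_to_rss("EXAMPLE101", [
    {"subject": "Assignment 1 questions",
     "url": "https://example.edu/post/1",
     "date": "Mon, 12 Jul 2010 09:00:00 GMT"},
])
```

The real work in the intermediary was scraping the posts out of Blackboard in the first place; once you have them, the feed itself is this simple.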

These two bits only really touched the surface. In fact, these interventions were intended as easy ways to scaffold and encourage the greater use and integration of “PLE”-like concepts into the daily practice of learning and teaching within a university. The start of a journey, and valuable more for the journey to come than for the destination they represented. A journey we were never able to carry through for any interesting distance. Here’s hoping that someone can start it again.

Features used in Webfuse course sites

Time to get back into the thesis. The following is the next completed section from the evaluation part of chapter 5 of my thesis. A result of much data munging and some writing; it still needs a bit more reflection and thought, but it’s getting close.

Features used in course sites

The previous sub-section examined the number of pages used in default course sites from 2000 through 2004. This sub-section examines in more detail the question of feature adoption within Webfuse course sites. In particular it seeks to describe the impact of the introduction of the default course sites approach and to compare its results with feature adoption in course websites from other systems at other institutions. This is done using the Malikowski et al (2007) model introduced in Chapter 4, which abstracts LMS features into five system-independent categories (see Figure 4.8). This sub-section first describes the changes in the available Webfuse features – both new Webfuse page types and Wf applications – from 2000 onwards in terms of the Malikowski et al (2007) model. It then outlines how Webfuse feature adoption within course sites changed over the period from 2000 through 2004 and compares that with other systems at other institutions. Finally, it compares and contrasts feature adoption during 2005 through 2009 at CQU between Webfuse and Blackboard.

As described in Chapter 4, the fifth Malikowski et al (2007) category – computer-based instruction – is not included in the following discussion because Webfuse never provided features that fit within this category. In addition, it is a category of feature rarely present or used in other LMS, especially from 2000 through 2004. Table 5.12 lists the remaining four Malikowski et al (2007) categories and the Webfuse features within those categories from 1997-1999 and from 2000 onwards. The 2000 onward features include features provided by both page types and Wf applications.

Table 5.12 – Allocation of Webfuse page types (1997-1999) and features (2000-) to Malikowski et al (2007) categories

  • Transmitting content: various content and index page types; lecture and study guide page types; file upload and search page types; timetable generator (Jones 2003)
  • Creating class interactions: Email2WWW; WWWBoard and WebBBS; CourseGroup and CourseGroups; Email Merge
  • Evaluating students: AssignmentSubmission; Quiz; assignment extension management; academic misconduct database; OASIS (Jones and Behrens 2003); BAM (Jones and Luck 2009); plagiarism detection; IROG (Jones 2003); Peer Review; Topic Allocation
  • Evaluating courses and instructors: Barometer

Table 5.13 shows the percentage of Webfuse courses that adopted features in each of the categories proposed by Malikowski et al (2007) from 1997 through 2009. The “Malikowski %” row represents the level of feature adoption found by Malikowski et al (2007) in the LMS literature for usage reported before 2004. The “Blackboard %” row represents feature adoption within Blackboard by CQU courses during 2005. Blackboard was adopted as the official institutional LMS by CQU in 2004. The subsequent rows show the level of feature adoption within Webfuse courses from 1997 through 2009. The following describes some limitations and context for the data in Table 5.13, after which some additional visualisations of this data are shown and some conclusions are drawn.

Table 5.13 – Feature adoption in Webfuse course sites (1997-2009)

Usage        | Transmitting content | Class interactions | Evaluating students | Evaluating courses and instructors
Malikowski % | >50%                 | 20-50%             | 20-50%              | <20%
Blackboard % | 94%                  | 28%                | 17%                 | 2%
1997         | 34.9%                | 1.8%               | 0.9%                | 9.2%
1998         | 38.4%                | 48.6%              | 1.4%                | 0.7%
1999         | 46.0%                | 9.0%               | 2.1%                | 9.5%
2000         | 46.6%                | 43.7%              | 24.7%               | 6.9%
2001         | 51.6%                | 32.4%              | 47.1%               | 28.3%
2002         | 69.6%                | 63.8%              | 57.7%               | 44.2%
2003         | 69.2%                | 68.5%              | 93.7%               | 37.7%
2004         | 61.3%                | 61.9%              | 91.8%               | 35.7%
2005         | 64.2%                | 69.2%              | 93.6%               | 39.8%
2006         | 70.0%                | 68.7%              | 105.1%              | 31.6%
2007         | 68.5%                | 102.0%             | 168.1%              | 33.1%
2008         | 72.9%                | 110.7%             | 192.0%              | 51.6%
2009         | 69.2%                | 105.7%             | 211.4%              | 42.7%

A variety of contextual factors and limitations are necessary to understand the data presented in Table 5.13. These include:

  • Missing course sites;
    As mentioned with previous tables, the course website archives for 1998 and 2000 are each missing course sites for a single term. The percentages shown in Table 5.13 represent the percentage of courses offered in the terms for which archival information is available.
  • Missing mailing lists;
    For most of the period shown in Table 5.13 a significant proportion of courses made use of electronic mailing lists for course communication. These lists, while supported by the Webfuse team, did not have an automated web interface until after the introduction of the default course sites. Information about the use of mailing lists before the default course sites is somewhat patchy, with none available before 2000 and only partial information for 2000 and the first half of 2001.
  • Optional versus compulsory content transmission;
    All Webfuse course sites, including both manually produced sites (pre second half of 2001) and default course sites (post second half of 2001), included content. Rather than simply show 100%, Table 5.13 shows the percentage of courses where additional content was transmitted through the course site by teaching staff. This was an optional practice.
  • The definition of adoption and the course barometer;
    From 2001 through 2005 the presence of a course barometer was part of the Infocom default course site. This means that 100% of all Webfuse course sites had a course barometer. However, this is not represented in the figures for “evaluating courses and instructors” in Table 5.13. Instead, Table 5.13 includes the percentage of courses where the course barometer was actually used within the course.
  • Greater than 100% adoption.
    From 2006 onwards, both the class interactions and evaluating students columns suggest that greater than 100% of Webfuse course sites had adopted features in these categories. This arises from the ability of courses to use a number of the Webfuse-provided features (e.g. email merge and results upload) without using Webfuse for their course sites.
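The arithmetic behind the greater-than-100% figures can be made concrete with a small sketch. All of the numbers below are invented for illustration only; the point is simply that the numerator (courses using a feature) can include courses that are not in the denominator (courses with a Webfuse course site).

```python
# Hypothetical illustration of how adoption can exceed 100%: the
# numerator counts all courses using a feature (e.g. email merge),
# including courses that did not have a Webfuse course site, while
# the denominator counts only courses with a Webfuse course site.

def adoption_rate(courses_using_feature, courses_with_site):
    """Adoption expressed as a percentage of courses with a site."""
    return 100.0 * courses_using_feature / courses_with_site

# Invented figures for illustration only.
sites = 200   # courses with a Webfuse course site
using = 210   # courses using email merge, some without Webfuse sites
print(f"{adoption_rate(using, sites):.1f}%")  # exceeds 100%
```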

The following graphs enable a visual comparison of the level of feature adoption within Webfuse and are also used to draw some conclusions about that adoption. These graphs use almost the same data as shown in Table 5.13, separated into the four Malikowski et al (2007) categories. The only difference is that the graphs also show how feature adoption for Blackboard changed over the period 2005 through 2009, rather than showing only the 2005 level of adoption as in Table 5.13. The Blackboard figures for 2009 include data from the first CQU term only, not the entire year.
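The comparison underlying each of the following graphs can be sketched as a simple check of an adoption rate against the expected Malikowski et al (2007) range for its category. This is a minimal sketch, assuming the category names and the range boundaries described in the surrounding text; the 2005 figures are taken from Table 5.13.

```python
# Expected adoption ranges (as percentages) per Malikowski et al
# (2007) category, as described in the discussion of Figures 5.9-5.12.
MALIKOWSKI_RANGES = {
    "content_transmission": (50, 100),
    "class_interaction": (20, 50),
    "evaluate_students": (20, 50),
    "evaluate_courses": (0, 20),
}

def within_expected(category, rate):
    """True if an adoption rate falls inside the expected range."""
    low, high = MALIKOWSKI_RANGES[category]
    return low <= rate <= high

# Webfuse 2005 adoption rates from Table 5.13 (content transmission,
# class interaction, student assessment, course evaluation).
webfuse_2005 = {
    "content_transmission": 64.2,
    "class_interaction": 69.2,
    "evaluate_students": 93.6,
    "evaluate_courses": 39.8,
}

for category, rate in webfuse_2005.items():
    print(category, rate, within_expected(category, rate))
```

Run over the 2005 row, only content transmission falls inside its expected range; the other three Webfuse figures exceed their ranges, which is the pattern the graphs below highlight.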

Figure 5.9 provides a visualisation of the percentage of courses using features associated with content transmission. The Malikowski et al (2007) range is identified by the dotted lines, which indicate that, as of around 2004, it was common to find between 50% and 100% of course sites using content transmission features. The dashed line in Figure 5.9 shows that from 2005 through 2009 between 80% and 100% of CQU Blackboard course sites were using content transmission features. The thicker black line with data labels represents the percentage of Webfuse course sites using the option of adding content transmission features to the default course sites.

Content Transmission

Figure 5.9 – Percentage course sites adopting content transmission: Webfuse, Blackboard and Malikowski et al (2007) (click image to enlarge)

From Figure 5.9 it is possible to see that there was an increase in the optional use of content transmission features when the default course site approach was introduced during the second half of 2001. In 2002, the first full year of operation for the default course site approach, use of content transmission features was more than 20 percentage points higher than in 2000, the last full year without the default course site approach. From 2002 the adoption rate stayed above 60%.
Figure 5.10 shows the percentage of course websites adopting class interaction features such as discussion forums, chat rooms etc. As of 2004, Malikowski et al (2007) found that it was typical for between 20% and 50% of course sites to adopt these features. From 2005 through 2009, the percentage of Blackboard courses adopting class interaction features increased from 28% to 61%. The data series with the data labels represents the adoption of class interactions within Webfuse course sites and highlights some of the limitations and contextual issues discussed above in connection with Table 5.13.


Figure 5.10 – Percentage course sites adopting class interactions: Webfuse, Blackboard and Malikowski et al (2007)(click image to enlarge)

As mentioned in the previous chapter, the Department of Mathematics and Computing (M&C) – in which the Webfuse work originated – had started using email lists in 1992 as a way of interacting with distance education students. These lists arose from the same place as Webfuse. As outlined above, prior to 2001 the archives of these mailing lists were kept separate from the Webfuse course sites and records are somewhat patchy. For example, archives of the mailing lists exist for 1998, hence the peak of 48.6% in that year, while the 1.8% and 9% adoption figures for 1997 and 1999 represent years for which mailing list data is missing. In addition, the greater than 100% adoption rates in 2007-2009 arise from increased use of the email merge facility by courses that did not have Webfuse course sites. These courses accessed the email merge facility through Staff MyCQU.

Figure 5.10 shows that adoption of class interaction features was significantly higher within Webfuse than both the Malikowski averages and the levels found in Blackboard. Given that, once adopted, a course mailing list was rarely dropped unless replaced by a web-based discussion forum, it is likely that complete archives of the pre-2001 mailing lists would indicate that, as early as 1997, almost 50% of Webfuse course sites had adopted some form of class interaction. Most of this adoption arose from M&C courses continuing to use mailing lists. The increased adoption of class interaction features post 2002 arises from the increased prevalence of web-based discussion, especially amongst non-M&C courses.

Figure 5.11 shows the percentage adoption of features related to student assessment – typically quizzes and online assignment submission. The typical Malikowski et al (2007) adoption rate is expected to be between 20% and 50%, while CQU Blackboard adoption from 2005 through 2009 ranged between 17% and 30%. Webfuse adoption, on the other hand, after minimal adoption from 1997 through 1999, increased to levels of over 90% from 2003 through 2005 before exceeding 100% from 2006 onwards.

Evaluate Students

Figure 5.11 – Percentage course sites adopting student assessment: Webfuse, Blackboard and Malikowski et al (2007)(click image to enlarge)

The almost non-existent adoption of student assessment features within Webfuse from 1997 through 1999 reflects the almost non-existent provision of these features. A primitive online assignment submission system was used in a small number of courses during these years, mostly those taught by the Webfuse designer. From 2000 onwards an online quiz system became available and a new assignment submission system began to be developed. From this point adoption grew, reaching over 90% in 2003. The use of Webfuse student assessment features far outstrips both the Malikowski ranges and those of CQU Blackboard courses.

Figure 5.12 shows the adoption of course evaluation features. It shows the expected Malikowski et al (2007) range to be between 0% and 20%. The adoption of course evaluation features by CQU Blackboard courses ranges from 2% in 2005 through to 5% in 2009. Prior to 2001, the Webfuse adoption rate is less than 10%, but it then increases to range between 28% and 52% from 2001 on. This increase is generally due to the increased availability of the Webfuse course barometer feature (see Section 5.3.6).

Evaluate Courses

Figure 5.12 – Percentage course sites adopting course evaluation: Webfuse, Blackboard and Malikowski et al (2007)(click image to enlarge)

Two of the peaks in the Webfuse adoption of course evaluation features from Figure 5.12 coincide with concerted efforts to encourage broader use of the course barometer. The 2002 peak at 44.2% coincides with the work of a barometer booster within Infocom during 2001 and early 2002 as described in Jones (2002). The 2008 peak of 51.6% coincides with a broader whole of CQU push to use the barometer for student evaluation purposes.

The above suggests that, in terms of feature adoption by courses, Webfuse and the default course site approach have been somewhat successful. They ensured that 100% of all courses offered by the organisational unit using Webfuse had a course site with some level of content transmission, with a significant amount of additional content added to the course sites. Overall, there was broader adoption of content transmission with less effort required of academics. In terms of class interactions, student assessment and course evaluation features, the services provided by Webfuse after 2001 resulted in levels of adoption greater than those broadly expected (as indicated by the Malikowski model) and those found in the use of the Blackboard system at the same institution.


Jones, D. (2003). How to live with ERP systems and thrive. Paper presented at the 2003 Tertiary Education Management Conference, Adelaide.

Jones, D., & Behrens, S. (2003). Online Assignment Management: An Evolutionary Tale. Paper presented at the 36th Annual Hawaii International Conference on System Sciences, Hawaii.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. Paper presented at the World Conference on Education Multimedia, Hypermedia and Telecommunications 2009.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.

The ethics of learning analytics: initial steps

Col’s recent blog post has just started the necessary process of the Indicators project paying closer attention to the question of ethics as applied to learning analytics. The following are some of my initial responses to Col’s post and an attempt to invite some additional suggestions from other folk around the question:

What are the ethical problems and considerations that should form part of work around learning analytics?

Feel free to comment.

Pointers to literature

I’ve tried a quick search for literature around ethics and analytics, but have not been able to find anything specific. I’ll need to search further and would welcome any pointers to relevant literature.

Web data mining and learning analytics

Col’s post seems to depend mostly on a paper that examines ethical issues in web data mining (Wel and Royakkers, 2004). While learning analytics could certainly be seen as a subset of web data mining, I’m not convinced that it is without its differences.

This is especially so given that the Indicators project is currently focused on using usage data from institutional learning management systems. For example, Col uses the following quote from Wel and Royakkers (2004):

Web mining does, however, pose a threat to some important ethical values like privacy and individuality. Web mining makes it difficult for an individual to autonomously control the unveiling and dissemination of data about his/her private life

When a student is using the institutional LMS, is this really a part of his/her private life? Like it or not, the institutional LMS is owned by the institution, it is being used by the student for learning, and the purpose of learning analytics is to help improve that learning.

In addition, Col repeats the point made by Wel and Royakkers (2004) that there are issues when data is analysed without user knowledge. Well, all LMS contain a certain level of “analytics” functionality; it’s built into the systems. I’m not sure that students are made explicitly aware of this functionality or how it is used. Is this a problem?

Internet research

The CMC/Internet research community is one of many fields/sub-groupings dealing with these sorts of issues. Herring (2002) offers an overview of this field, including its ethical issues.

In summary,

  • ease of data collection creates ethical concerns;
  • participants may not be aware their actions are being collected and studied;
  • while identities may be masked, some systems/archives and ways of expressing material may still make identification possible;
  • there has been some debate (e.g. Mann and Stewart 2000);
  • some researchers advocate obtaining informed consent;
  • others suggest asking permission when quoting comments and/or masking identifying information;
  • informed consent can cause problems, especially with critical research;
  • a balance needs to be struck between quality research and protecting users from harm;
  • there are debates around the definition of harm.

This is a fairly old reference; more up-to-date information is likely to be found in more recent research publications and in research methodology texts. I need to look at those.

This chapter from the SAGE handbook of online research methods seems a good place to start.


Herring, S. (2002). Computer-mediated communication on the Internet. Annual Review of Information Science and Technology, 36(1), 109-168.

Wel, L. v., & Royakkers, L. (2004). Ethical Issues in web data mining. Ethics and Information Technology, 6, 11.