Category Archives: thesis

The life and death of Webfuse: What’s wrong with industrial e-learning and how to fix it

The following is a collection of presentation resources (i.e. the slides) for an ASCILITE’2012 presentation of this paper. The paper and presentation are a summary of the outcomes of my PhD work. The thesis goes into much more detail.

Abstract

Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning can limit the outcomes of tertiary e-learning and the ability of universities to respond to uncertainty and effectively explore the future of learning. It limits their ability to learn. The paper proposes one alternative set of successfully implemented principles and practices as being more appropriate for institutions seeking to learn for the future and lead in a climate of change.

Slides

The slides are available on Slideshare and should show up below. These slides are the extended version, prior to the cutting required to fit within the 20 minute time limit.

References

Arnott, D. (2006). Cognitive biases and decision support systems development: a design science approach. Information Systems Journal, 16, 55–78.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney.

Brews, P., & Hunt, M. (1999). Learning to plan and planning to learn: Resolving the planning school/learning school debate. Strategic Management Journal, 20(10), 889–913.

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Central Queensland University. (2004). Faculty teaching and learning report. Rockhampton, Australia.

Davenport, T. (1998). Putting the Enterprise into the Enterprise System. Harvard Business Review, 76(4), 121–131.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Dillard, J., & Yuthas, K. (2006). Enterprise resource planning systems and communicative action. Critical Perspectives on Accounting, 17(2-3), 202–223.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Haywood, T. (2002). Defining moments: Tension between richness and reach. In W. Dutton & B. Loader (Eds.), (pp. 39–49). London: Routledge.

Hutchins, E. (1991). Organizing work by adaptation. Organization Science, 2(1), 14–39.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Jones, D. (1996). Solving Some Problems of University Education: A Case Study. In R. Debreceny & A. Ellis (Eds.), Proceedings of AusWeb’96 (pp. 243–252). Gold Coast, QLD: Southern Cross University Press.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 884–889). Denver, Colorado: AACE.

Jones, D. (2003). Course Barometers: Lessons gained from the widespread use of anonymous online formative evaluation. QUT, Brisbane.

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, B. Vaughan, & P. James (Eds.), Making New Connections, ascilite’1996 (pp. 331–345). Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398–406). Chesapeake, VA: AACE.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Katz, R. (2003). Balancing Technology and Tradition: The Example of Course Management Systems. EDUCAUSE Review, 38(4), 48–59.

Kurtz, C., & Snowden, D. (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. In M. Gibbert & T. Durand (Eds.), . Blackwell.

Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. London: Routledge.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216–224.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Morgan, Glenda. (2003). Faculty use of course management systems. Educause Centre for Applied Research.

Morgan, Glenn. (1992). Marketing discourse and practice: Towards a critical analysis. In M. Alvesson & H. Willmott (Eds.), (pp. 136–158). London: SAGE.

Pozzebon, M., Titah, R., & Pinsonneault, A. (2006). Combining social shaping of technology and communicative action theory for understanding rhetorical closure in IT. Information Technology & People, 19(3), 244–271.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60–75.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Seely-Brown, J., & Hagel, J. (2005). From push to pull: The next frontier of innovation. The McKinsey Quarterly. McKinsey & Company.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting, Management and Information Technologies, 10, 53–79.

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117–123.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

Wagner, E., Scott, S., & Galliers, R. (2006). The creation of “best practice” software: Myth, reality and ethics. Information and Organization, 16(3), 251–275.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361–386.

The Texas sharpshooter fallacy and other issues for learning analytics

Becoming somewhat cynical about the headlong rush toward learning analytics, I’m commencing an exploration of the problems associated with big data, data science and some of the other areas that form the foundation for learning analytics. The following is an ad hoc collection of some initial resources I’ve found and need to engage with.

Feel free to suggest some more.

The Texas sharpshooter fallacy

This particular fallacy gets a guernsey mainly because of the impact of its metaphoric title. From the Wikipedia page:

The Texas sharpshooter fallacy often arises when a person has a large amount of data at their disposal, but only focuses on a small subset of that data. Random chance may give all the elements in that subset some kind of common property (or pair of common properties, when arguing for correlation). If the person fails to account for the likelihood of finding some subset in the large data with some common property strictly by chance alone, that person is likely committing a Texas Sharpshooter fallacy.
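To make the fallacy concrete, here’s a minimal, hypothetical simulation (plain Python, standard library only; the “engagement” and “grade” figures are invented noise, not real student data) showing how scanning enough subsets of a large dataset will reliably produce an impressive-looking but entirely spurious correlation:

```python
# A minimal simulation of the Texas sharpshooter fallacy. The two values per
# student below are pure, independent noise, yet scanning enough subgroups
# will almost always turn up one with a striking correlation.
# (Hypothetical example; requires Python 3.10+ for statistics.correlation.)
import random
import statistics

random.seed(1)

# 1000 hypothetical students with two independent random attributes.
students = [(random.random(), random.random()) for _ in range(1000)]

def correlation(pairs):
    xs, ys = zip(*pairs)
    return statistics.correlation(xs, ys)  # Pearson's r

# Whole population: r is close to 0, as expected for noise.
print(f"population r = {correlation(students):+.3f}")

# Now "paint the target after shooting": test 200 arbitrary subgroups of
# 30 students each and report only the strongest correlation found.
best_r = 0.0
for _ in range(200):
    r = correlation(random.sample(students, 30))
    if abs(r) > abs(best_r):
        best_r = r

print(f"best subgroup r = {best_r:+.3f}")
# Typical result: population r near 0, best subgroup |r| around 0.4-0.5 --
# a "finding" produced entirely by chance plus selective reporting.
```

The defence is to nominate the subgroup and the hypothesis before looking at the data, rather than drawing the target around wherever the bullets happened to land.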

Critical questions for big data

Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679.

Abstract

The era of Big Data has begun. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and other scholars are clamoring for access to the massive quantities of information produced by and about people, things, and their interactions. Diverse groups argue about the potential benefits and costs of analyzing genetic sequences, social media interactions, health records, phone logs, government records, and other digital traces left by people. Significant questions emerge. Will large-scale search data help us create better tools, services, and public goods? Or will it usher in a new wave of privacy incursions and invasive marketing? Will data analytics help us understand online communities and political movements? Or will it be used to track protesters and suppress speech? Will it transform how we study human communication and culture, or narrow the palette of research options and alter what ‘research’ means? Given the rise of Big Data as a socio-technical phenomenon, we argue that it is necessary to critically interrogate its assumptions and biases. In this article, we offer six provocations to spark conversations about the issues of Big Data: a cultural, technological, and scholarly phenomenon that rests on the interplay of technology, analysis, and mythology that provokes extensive utopian and dystopian rhetoric.

The headings give a good idea of the provocations:

  • Big data changes the definition of knowledge.
  • Claims to objectivity and accuracy are misleading.
  • Bigger data are not always better data.
  • Taken out of context, Big data loses its meaning.
  • Just because it is accessible does not make it ethical.
  • Limited access to big data creates new digital divides.

Effects of big data analytics on organisations’ value creation

Mouthaan, N. (2012). Effects of big data analytics on organizations’ value creation. University of Amsterdam.

A Master’s thesis that, amongst other things, is

arguing that big data analytics might create value in two ways: by improving transaction efficiency and by supporting innovation, leading to new or improved products and services

and

this study also shows that big data analytics is indeed a hype created by both potential users and suppliers and that many organizations are still experimenting with its implications as it is a new and relatively unexplored topic, both in scientific and organizational fields.

The promise and peril of big data

Bollier, D., & Firestone, C. (2010). The promise and peril of big data. Washington DC: The Aspen Institute.

Some good discussion of issues reported by a rapporteur. Issues included:

  • How to make sense of big data?

    • Data correlation or scientific methods – Chris Anderson’s “Data deluge makes the scientific method obsolete” and responses, e.g. “My TiVo thinks I’m gay”, gaming, the advantage of theory/deduction etc.
    • How should theories be crafted in an age of big data?
    • Visualisation as a sense-making tool.
    • Bias-free interpretation of big data.

      Cleaning data requires decisions about what to ignore. Problem increased when data comes from different sources. Quote “One man’s noise is another man’s data”

    • Is more actually less?
      Does it yield new insights or create confusion and false confidence? “Big data is driven more by storage capabilities than by superior ways to ascertain useful knowledge”.
    • Correlations, causality and strategic decision making.
  • Business and social implications of big data
    • Social perils posed by big data
  • How should big data abuses be addressed?
Research ethics in emerging forms of online learning

Esposito, A. (2012). Research ethics in emerging forms of online learning: issues arising from a hypothetical study on a MOOC. Electronic Journal of e-Learning, 10(3), 315–325.

Will hopefully give some initial insights into the thorny issue of ethics.

Data science and prediction

Dhar, V. (2012). Data Science and Prediction. Available at SSRN. New York City.

Appears to be slightly more “boosterish” than some of the other papers.

Abstract

The world’s data is growing more than 40% annually. Coupled with exponentially growing computing horsepower, this provides us with unprecedented basis for ‘learning’ useful things from the data through statistical induction without material human intervention and acting on them. Philosophers have long debated the merits and demerits of induction as a scientific method, the latter being that conclusions are not guaranteed to be certain and that multiple and numerous models can be conjured to explain the observed data. I propose that ‘big data’ brings a new and important perspective to these problems in that it greatly ameliorates historical concerns about induction, especially if our primary objective is prediction as opposed to causal model identification. Equally significantly, it propels us into an era of automated decision making, where computers will make the bulk of decisions because it is infeasible or more costly for humans to do so. In this paper, I describe how scale, integration and most importantly, prediction will be distinguishing hallmarks in this coming era of ‘Data Science’. In this brief monograph, I define this newly emerging field from business and research perspectives.

Codes and codings in crisis: Signification, performativity and excess
    Mackenzie, A., & Vurdubakis, T. (2011). Codes and Codings in Crisis: Signification, Performativity and Excess. Theory, Culture & Society, 28(6), 3–23.

Three likely paths for learning analytics and academics

The following is an early attempt to write and share some thoughts on what, why and with what impacts Australian universities are going to engage with learning analytics over the next couple of years. Currently it’s fairly generic and the same structure could be used with any fad or change process.

You could read the next section, but it’s basically an argument as to why it’s important to consider how learning analytics will impact academics. The three likely paths section describes the paths.

Context and rationale

By all indications learning analytics is one of the next big things in university learning and teaching. Ferguson (2012) identifies learning analytics as one of the fastest-growing areas of research within technology-enhanced learning with interest being driven by a combination of technological, pedagogical, political and economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within the higher education sector in the next 2-3 years. The recent Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics – for the first time anywhere in the world – into the “one year or less” time-frame for adoption. Given what I hear around the traps, it would appear that every single Australian university is doing something (or thinking about it) around learning analytics.

My interest is in how these plans are going to impact upon academics and their pedagogical practice. It’s a fairly narrow view, but an interesting, self-serving and possibly important one. Johnson & Cummins (2012) suggest that the larger promise of learning analytics comes when it is used “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (p. 23). I don’t think automated-tutoring information systems are going to be up to that task anytime soon, at least not across a broad cross-section of what is taught at universities. So academics/teachers/instructors will be involved in some way.

But we don’t know much about this, and it appears to be difficult. Dawson et al. (2011) observe “a dearth of studies that have investigated the relationship between learning analytics and data requirements that would better assist teachers in the design and evaluation of learning and teaching practice” (p. 4). Not only that, it has been found that being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming and requires additional support (Dawson et al., 2011; Dawson & McWilliam, 2008). So I wonder how, and with what impacts, the almost inevitable strategies adopted by Australian universities will help with this.

Not surprisingly, I am not optimistic. So I’m trying to develop some sort of framework to help think about the different likely paths they might adopt, the perspectives which underpin these paths, and what the likely positives, problems and outcomes might be of those paths.

The three likely paths

For the moment, I’ve identified three likely paths which I’ve labelled as

  1. Do it to the academics.
  2. Do it for the academics.
  3. Do it with the academics.

There are probably other paths (e.g. do nothing, ignore the academics) that might be adopted, but I feel these three are the most likely.

These are listed in the order in which I think they are most likely to happen. There may be examples of path #3 spread throughout institutions, but I fear they will be few and far between.

Currently, it’s my theory that organisations probably need to travel all three paths. The trouble is that the 3rd path will probably be ignored and this will reduce the impact and adoption of learning analytics.

The eventual plan is to compare and contrast these different paths by the different assumptions or perspectives on which they are based. The following gives a quick explanation of each of the paths and an initial start on this analysis.

Those of you who know me can probably see some correspondence between these three paths and the 3 levels of improving teaching. There is a definite preference in the following for the 3rd path, but this is not to suggest that it should (or can) be the only path explored, or that the other paths have no value. All three have their part to play, but I think it would be wrong if the 3rd path were ignored.

Perhaps that’s the point here, to highlight the need for the 3rd path to complement the limitations of the other two. Not to mention helping surface some of the limitations of the other two so they can be appropriately addressed.

Some questions for you

  • Is there any value in this analysis?
  • What perspectives/insights/theories do I need to look at to better inform this?
  • What might be some useful analytical lenses for the three paths?
  • Are there other paths?
  • What am I missing?

Do it to the academics

It seems a fair bit of interest in learning analytics is being driven by non-teaching folk: student administration and IT folk are amongst the foremost, with senior management in there somewhere as well. Long and Siemens (2011) define this level as academic analytics rather than learning analytics. But I believe it belongs here because of the likelihood that if senior managers use academic analytics to make decisions, some of the decisions they make will have an impact on academics (i.e. do it to them).

I can see this path leading to outcomes like

  • Implementation of a data warehouse, various dashboards and reports.
  • Some of these may be used to make data-driven decisions.
  • The implementation of various strategies such as “at-risk” processes that are done independently of academics.
  • At its worst, the creation of various policies or processes that require courses to meet certain targets or adopt certain practices (e.g. the worst type of “common course site” policy), i.e. performativity.

In terms of analysing/characterising this type of approach, you might suggest

  • Will tend to actually be academic analytics rather than learning analytics (as defined by Long and Siemens, 2011), but may get down to learning analytics at the departmental level.
  • It’s based on an “If I tell them to do it, they will…” assumption,
    i.e. what is written in the policy is what the academics will actually do.
  • A tendency to result in task corruption and apparent compliance.
  • It assumes academics will change their teaching practice based on what you told them to do.
  • Is based on the assumptions of teleological processes,
    i.e. the system is stable and predictable, the designers can manipulate the system’s behaviour, and the designers can determine the goals/criteria for success.
  • It is located a long way from the actual context of learning and teaching and assumes that big data sets and data mining algorithms will enable the identification of useful information that can guide decision making.
  • It does not recognise the diversity inherent in teaching/teachers and learning/learners.
    Assumes learning is like sleeping
  • It is based on the assumption of senior managers (or people in general) as rational decision makers, if only they had the right data.
  • What is actually done will rely heavily on which vendor gets chosen to implement.
  • Will be largely limited to the data that is already in the system(s).

Do it for the academics

There are possibly two sub-paths within this path

  1. The researcher path.
    Interested researchers develop theoretically-informed, research-based approaches to how learning analytics can be used by academics to improve what they do. They are developing methods for the use of academics.
  2. The support division path.
    This is where the Information Technology or Learning and Teaching support division of the university notes the current buzz-word (learning analytics) and implements a tool, some staff development, etc. to enable academics to harness the buzz-word.

In terms of analysing/characterising this approach, I might identify

  • It’s based on the “If I build it, they will come” assumption.
  • It assumes you can improve/change teaching by providing new tools and awareness.
  • Which generally hasn’t worked, for a range of reasons, including perhaps the chasm,
    i.e. the small number of early adopter academics engage; the vast majority don’t.
  • It assumes some level of commonality in teaching/teachers and learning/learners.
    At least at some level, as it assumes implementing a particular tool or approach may be applicable across the organisation. Assumes learning is perhaps more like eating?
  • It assumes that the researchers or the support division have sufficient insight to develop something appropriate.
  • It assumes we know enough about learning analytics and helping academics use learning analytics to inform pedagogical practice to enshrine practice around a particular set of tools.
  • Is based on the assumptions of teleological processes,
    i.e. the system is stable and predictable, the designers can manipulate the system’s behaviour, and the designers can determine the goals/criteria for success.
  • It will be constrained by the institution’s existing systems and the support division’s people and their connections.
  • The support division path can be heavily influenced by the perspective of the academic (or others) as a client/customer, which assumes that the client/customer knows what they want; because they generally don’t, it often sinks to a process of “managing the customer” rather than helping.

Do it with the academics

In this path the application of learning analytics is treated as something that needs to be learned about. Folk work with the academics to explore how learning analytics can be best used to inform individual pedagogical practice. Perhaps drawing on insights from the other paths, but also modifying the other paths based on what is learned.

In terms of analysing/characterising this approach, I might identify

  • It assumes that if you want to change/improve teaching, then the academics need to learn and be helped to learn.
    (That probably sounds more condescending than I would like).
  • Based on a “If they learn it, they will do it” premise.
    Which doesn’t have to be true.
  • It assumes learning/learners and teaching/teachers are incredibly diverse.
  • It assumes we don’t know enough about what might be found with learning analytics and how we might learn how to use it.
  • Assumption that the system(s) in place will change in response to this learning, which in turn means more learning… and the cycle continues.

References

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final Report 2011. Canberra.

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council.

Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Milton Keynes, UK.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas.

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition (p. 42). Austin, Texas.

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5).

Compliance cultures and transforming the quality of e-learning

In putting the finishing touches on this ASCILITE paper I discovered that Tuesday will be the two-year anniversary of when I first put together much of the following on attempts by universities to improve/transform the quality of e-learning through checklists and other “quality assurance” methods. Given that I still see this tendency from central L&T folk in universities – especially those in management – and that the original checklist that sparked the following has largely been gotten rid of, I thought I’d share this.

The anecdotal spark will be briefly touched upon in the ASCILITE paper; the quick summary of some literature won’t be, due to space constraints. But I do find it increasingly interesting/frightening/sad that these approaches are still being adopted, even with widespread knowledge of what actually happens.

The anecdotal spark

The spark for this was a chat with a friend who was, and is, a Senior Lecturer within a faculty at an Australian university. I was in a central L&T support role. My friend is one of the few academics who was widely respected and made significant contributions to the institution. He/she, however, was being increasingly frustrated by the “quality assurance” of L&T, especially the recent introduction of a checklist for the minimum service standard for course websites. The nature of the checklist and the technology used to implement and manage it were so pointless that the widespread academic way of dealing with the checklist was captured by this quote:

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

This was always a bit sad because the intent – at least the published, espoused intent – of the minimum service standards was to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). But the outcome was no great surprise given what is said in the literature.

Some of the literature

Knight and Trowler (2000)

Likewise, attempts to improve teaching by coercion run the risk of producing compliance cultures, in which there is ‘change without change’, while simultaneously compounding negative feelings about academic work

Harvey and Newton (2004, p. 149)

These studies reinforce the view that quality is about compliance and accountability and has, in itself, contributed little to any effective transformation of the student learning experience.

Radloff (2008, n.p.)

Staff may question the institutional approach to quality which they perceive as compliance driven creating ‘busy work’ (Anderson 2006; Harvey & Newton 2004; Laughton 2003) with little positive impact on teaching practice and student learning experiences (Harvey 2006). They may therefore try to avoid, subvert or actively reject attempts to implement quality systems and processes. As Jones and de Saram (2005, p. 48) note, “It is relatively easy to develop a system and sets of procedures for quality assurance and improvement on paper. To produce a situation where staff on campus ‘buy into’ this in an authentic and energetic manner is much more difficult”.

What’s really surprising is that the last author quoted here was the Pro-Vice Chancellor responsible for learning and teaching just before the checklist approach was introduced.

References

Knight, P., & Trowler, P. (2000). Department-level Cultures and the Improvement of Learning and Teaching. Studies in Higher Education, 25(1), 69–83.

Harvey, L., & Newton, J. (2004). Transforming quality evaluation. Quality in Higher Education, 10(2), 149–165.

Radloff, A. (2008). Engaging staff in quality learning and teaching: What’s a Pro Vice Chancellor to do? In Engaging Communities, Proceedings of the 31st HERDSA Annual Conference (pp. 285–296). Rotorua.

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Proceedings ascilite Auckland 2009 (pp. 1038–1047). Auckland, NZ.

The illusion we understand the past fosters overconfidence in our ability to predict the future

As mentioned in the last post I’m currently reading Thinking, Fast and Slow by Daniel Kahneman. The title of this post comes from this quote from that book

The illusion that we understand the past fosters overconfidence in our ability to predict the future

Earlier in the same paragraph Kahneman writes

As Nassim Taleb pointed out in The Black Swan, our tendency to construct and believe coherent narratives of the past makes it difficult for us to accept the limits of our forecasting ability.

Later in the same chapter, Kahneman writes (my emphasis)

The main point of this chapter is not that people who attempt to predict the future make many errors; that goes without saying. The first lesson is that errors of prediction are inevitable because the world is unpredictable. The second is that high subjective confidence is not to be trusted as an indicator of accuracy (low confidence could be more informative).

The connection to e-learning and the LMS

I read this section of Kahneman’s book while at lunch. On returning I found that @sthcrft had written about “The post-LMS non-apocalypse” in part influenced by @timklapdor’s post from earlier this week Sit down we need to talk about the LMS.

In @sthcrft’s post she tries (and by her own admission somewhat fails) to describe what a “post-LMS” world might look like. She’s being asked to predict the future, which, given the above (and a range of other perspectives), is a silly thing to try to do. And this is my main problem with the current top-down, “management science” driven approach being adopted by universities: an approach that is predicated on the assumption that you can predict the future. But, before moving onto management, let’s just focus on the management of IT systems and the LMS.

(About to paraphrase some of my own comments on @sthcrft’s post).

I have a problem with the LMS as a product model. It has serious flaws. But in seeking to replace the LMS, most universities are continuing to use the same process model. The plan-driven process model that underpins all enterprise information systems procurement/development assumes you can predict the future – in this case, that you can predict all of the features that are ever going to be required by all of the potential users of the system.
Not going to happen.

Even though I like @timklapdor’s idea of the environment as a much better product model, it will suffer from exactly the same problems if it is developed/implemented without changing the process model and all that it brings with it. The focus on the plan-driven process model ends up with hierarchical organisations with the wrong types of people/roles and the wrong types of inter-connections between them to deal with the “post-LMS” world.

This is one of the reasons why I don’t think the adoption of open source LMSes (e.g. Moodle) is going to produce any significant changes in the practice of e-learning.

This is the point I will try to make in a 2012 ASCILITE paper. In that same paper, I’ll briefly touch on an alternative. For the longer version of that story – made significantly inaccessible through the requirements of academic writing – see my thesis.

Management and narratives

On a related note, a conversation with a colleague today reinforced the idea that one of the primary tasks taken on by senior managers (e.g. Vice-Chancellors) of a certain type is the creation of coherent narratives. Creating a positive narrative of the institution and its direction and accomplishments seems to have become a necessary tool to demonstrate that the senior manager has made a positive contribution to the institution. It’s a narrative destined to please all stakeholders, perhaps especially the senior manager’s set of potential employers.

I wonder about the cause/impact that this increasing emphasis on a coherent institutional narrative has on the belief of those within organisations that you can and should predict the future. I wonder if this type of narrative prevents organisations from preparing to fulfil Alan Kay’s quotation

The best way to predict the future is to make it

Perhaps organisations with certain types of leaders are so busy focusing on predicting the future that they can’t actually make it?
Management is all about constructing coherent narratives.

On a tension with teaching designs heavy on constructive alignment

Constructive alignment is an approach to designing courses where there is – not surprisingly – alignment between what the students do, what is assessed and what it is intended that they will learn. It’s gotten a lot of play in the higher education sector over recent years. It has some value, but I’ve always had some qualms about constructive alignment, and I’d like to add another observation about an apparent tension within constructively aligned courses.

Beyond my prior experience, I’m currently teaching a course designed by another academic that has been explicitly informed by constructive alignment. It’s a masters course, the design overall seems quite fitting, and it is certainly aligned. I quite like the design and think it has the potential – all other things being equal – to engage the students in some quality learning. However, this alignment is also the apparent source of some tension.

The course is really very hard to get your head around. Trying to understand what a student has to do to complete the course is actually quite complicated. Part of this is the intricate interconnection between everything. It’s not just a lecture and some assignments: everything contributes to the end goal. This both reduces the freedom and flexibility of the students and means that, to feel comfortable in the course, they have to understand everything.

The difficulty of intricately interconnecting all of this has also led to the design of some activities, or names for activities, that don’t exactly match the common definition of that name. In this case, what is called an online symposium is probably more a writers’ workshop or peer review session. This leads to students’ existing understandings creating dissonance with what is actually meant in the course.

Is a course that really tries to follow constructive alignment destined to have to deal with a tension between being difficult for students to understand and generating quality learning outcomes?

People and e-learning – limitations and an alternative

So the last of three sections examining the limitations of industrial e-learning and suggesting an alternative. Time to write the conclusion, read the paper over again and cut it down to size.

People

The characteristics of the product and process of industrial e-learning (e.g. the focus on long periods of stable use and the importance of efficient use of the chosen LMS) are directly reinforced by, and directly impact, the people and roles involved with tertiary e-learning. This section briefly examines just four examples of this impact:

  1. The negative impact of organizational hierarchies on communication and knowledge sharing.
    The logical decomposition inherent in teleological design creates numerous, often significant, organizational boundaries between the people involved with e-learning. Such boundaries are seen as inhibiting the ability to integrate knowledge across the organization. The following comments from Rossi and Luck (2011, p. 68) partially illustrate this problem:
    During training sessions … several people made suggestions and raised issues with the structure and use of Moodle. As these suggestions and issues were not recorded and the trainers did not feed them back to the programmers … This resulted in frustration for academic staff when teaching with Moodle for the first time as the problems were not fixed before teaching started.

  2. Chinese whispers.
    Within an appropriate governance structure, the need for changes to an LMS would typically have to flow up from the users to a central committee, typically made up of senior leaders from the faculties, Information Technology and central learning and teaching. There would normally be some representation from teaching staff and students. The communication chain for the original need becomes like a game of Chinese whispers as the need is interpreted through the experiences and biases of those involved, leading to this impression reported by Rossi and Luck (2011, p. 69):
    The longer the communication chain, the less likely it was that academic users’ concerns would be communicated correctly to the people who could fix the problems.

    The cost of traversing this chain of communication means it is typically not worth the effort of raising small-scale changes.

    Not to mention killing creativity which just came through my Twitter feed thanks to @kyliebudge.

  3. Mixed purposes.
    Logical decomposition also encourages different organizational units to focus on their part of the problem and lose sight of the whole picture. An IT division evaluated on its ability to minimize cost and maximize availability is not likely to want to support technologies in which it has limited expertise. This is one explanation for why the leader of an IT division would direct the division’s representatives on an LMS selection panel to ensure that the panel selected an LMS implemented in Java. Or why the latest version of the Oracle DBMS – the DBMS supported by the IT division – would be used to support a new Moodle installation, even though it had not been tested with Moodle and best practice advice was to avoid Oracle: a decision that led to weeks at the start of the “go live” term in which Moodle was largely unavailable.
  4. The perils of senior leadership.
    Having the support and engagement of a senior leader at an institution is often seen as a critical success factor for an LMS implementation. But when the successful completion of the project is tied to the leader’s progression within the leadership hierarchy it can create the situation where the project will be deemed a success, regardless of the outcome.

As an alternative, the Webfuse system relied on a multi-skilled, integrated development and support team. This meant that the small team was responsible for training, helpdesk support, and systems development. The helpdesk person handling a user’s problem was typically also a Webfuse developer who was empowered to make small changes without formal governance approval. Behrens (2009, p. 127) quotes a manager in CQU’s IT division describing the types of changes made to Webfuse as “not even on the priority radar” of traditional IT management techniques. The developers were also located within the faculty, so they interacted with academic staff in the corridors and the staff room. This context created an approach to the support of an e-learning system with all the hallmarks of social constructivism, situated cognition, or a community of practice: the type of collaborative and supportive environment identified by Tickle et al (2009), in which academics learn through attempts to solve genuine educational problems, rather than being shown how to adapt their needs to the constraints of the LMS.

References

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from http://www.sleid.cqu.edu.au/include/getdoc.php?id=1122&article=391&mode=pdf

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

Introducing the alternative

The last couple of posts have attempted (within the confines of an #ascilite12 paper) to summarise some constraints of the dominant product and process models used in industrial e-learning and to suggest an alternative. The following – which probably should have been posted first – describes how and where this alternative comes from.

As all this is meant to go into an academic paper, the following starts with a discussion about “research methods” before moving onto describing some of the reasons why this alternative approach might have some merit.

As with the prior posts, this is all still first draft stuff.

Research methods and limitations

From the initial stages of its design the Webfuse system was intended to be a vehicle for both practice (it hosted over 3000 course sites from 1997-2009) and research. Underpinning the evolution of Webfuse was an on-going, cyclical action research process that sought to continually improve the system through insights from theory and observation of use. This commenced in 1996 and continued, at varying levels of intensity, through to 2009 when the system ceased directly supporting e-learning. This work has contributed in varying ways to over 25 peer-reviewed publications. Webfuse has also been studied by other researchers investigating institutional adoption of e-learning systems (Danaher, Luck, & McConachie, 2005) and shadow systems in the context of ERP implementation (Behrens, 2009; Behrens & Sedera, 2004).

Starting in 2001 the design of Webfuse became the focus of a PhD thesis (Jones, 2011) that made two contributions towards understanding e-learning implementation within universities: the Ps Framework and an Information Systems Design Theory (ISDT). The Ps Framework arose out of an analysis of existing e-learning implementation practices and as a tool to enable the comparison of alternate approaches (Jones, Vallack, & Fitzgerald-Hood, 2008). The formulated ISDT – An ISDT for emergent university e-learning systems – offers guidance for e-learning implementation that brings a number of proposed advantages over industrial e-learning. These contributions to knowledge arose from an action research process that combined broad theoretical knowledge – the principles of the ISDT are supported by insights from a range of kernel theories – with empirical evidence arising from the design and support of a successful e-learning system. Rather than present the complete ISDT – due primarily to space constraints – this paper focuses on how three important components of e-learning can be re-conceptualised through the principles of the ISDT.

The ISDT – and the sub-set of principles presented in this paper – seeks to provide theoretical guidance about how to develop and support information systems for university e-learning that are capable of responding to the dominant characteristics (diversity, uncertainty and rapid change) of university e-learning. This is achieved through a combination of product (principles of form and function) and process (principles of implementation) that focuses on developing a deep and evolving understanding of the context and use of e-learning. It is the ability to use that understanding to make rapid changes to the system that ultimately encourages and enables adoption and on-going adaptation. The ISDT suggests that any instantiation built following it will support e-learning in a way that: is specific to the institutional context; results in greater quality, quantity and variety of adoption; and improves the differentiation and competitive advantage of the host institution.

As with all research, the work described here has a number of limitations that should be kept in mind when considering its findings. Through its use of action research, this work suffers, to varying degrees, the same limitations as all action research. Baskerville and Wood-Harper (1996) identify these limitations as: (1) lack of impartiality of the researcher; (2) lack of discipline; (3) being mistaken for consulting; and (4) context-dependency leading to difficulty in generalizing findings. These limitations have been addressed within this study through a variety of means, including: a history of peer-reviewed publications throughout the process; use of objective data sources; the generation of theory; and an on-going process of testing. Consequently the resulting ISDT and the principles described here have not been “proven”. That was not the aim of this work. Instead, the intent was to gather sufficient empirical and theoretical support to build and propose a coherent and useful alternative to industrial e-learning. The question of proof and further testing of the ISDT in similar and different contexts provides – as in all research aiming to generate theory – an avenue for future research.

On the value of Webfuse

This section aims to show that there is some value in considering Webfuse. It summarises the empirical support for the ISDT and the principles described here by presenting evidence that the development of Webfuse led to a range of features specific to the institution and to greater levels of adoption. It is important to note that from 1997 through 2005 Webfuse was funded and controlled by one of five faculties at CQUniversity; Webfuse did not become a system controlled by a central IT division until 2005/2006, as a result of organizational restructures. During the life-span of Webfuse, CQU adopted three different official institutional LMSes: WebCT (1999), Blackboard (2004), and Moodle (2010).

Specific to the context

During the period from 1999 through 2002 the “Webfuse faculty” saw a significant increase in the complexity of its teaching model, including the addition of numerous international campuses situated within capital cities and a doubling in student numbers, primarily through full-fee paying overseas students. By 2002, the “Webfuse faculty” was teaching 30% of all students at the university. Due to this significantly increased complexity of teaching, a range of teaching management and support services were integrated into Webfuse, including: staff and student “portals”, an online assignment submission and management system, a results upload application, an informal review of grade system, a timetable generator, a student photo gallery, an academic misconduct database, an email merge facility, and assignment extension systems.

The value of these systems to the faculty is illustrated by this quote from the Faculty annual report for 2003 cited by Danaher, Luck & McConachie (2005, p. 39)

[t]he best thing about teaching and learning in this faculty in 2003 would be the development of technologically progressive academic information systems that provide better service to our students and staff and make our teaching more effective. Webfuse and MyInfocom development has greatly assisted staff to cope with the complexities of delivering courses across a large multi-site operation.

By 2003 the faculties not using Webfuse were actively negotiating to enable their staff to have access to these services. In 2009 alone, over 12,000 students and 1,100 staff made use of them. Even though no longer officially supported, a few of these services were still being used by the university in the middle of 2012.

Quotes from staff using the Webfuse systems reported in various publications (Behrens, 2009; Behrens, Jamieson, Jones, & Cranston, 2005; Jones, Cranston, Behrens, & Jamieson, 2005) also provide some insights into how well Webfuse supported the specific context at CQUni.

my positive experience with other Infocom systems gives me confidence that OASIS would be no different. The systems team have a very good track record that inspires confidence

The key to easy use of OASIS is that it is not a off the shelf product that is sooooo generic that it has lost its way as a course delivery tool.

I remember talking to [a Webfuse developer] and saying how I was having these problems with uploading our final results into [the Enterprise Resource Planning (ERP) system] for the faculty. He basically said, “No problem, we can get our system to handle that”…and ‘Hey presto!’ there was this new piece of functionality added to the system … You felt really involved … You didn’t feel as though you had to jump through hoops to get something done.

Beyond context specific systems supporting the management of learning and teaching, Webfuse also included a number of context specific learning and teaching innovations. A short list of examples includes:

  • the course barometer;
    Based on an innovation (Svensson, Andersson, Gadd, & Johnsson, 1999) seen at a conference, the barometer was designed to provide students with a simple, anonymous method for providing informal, formative feedback about a course (Jones, 2002). Initially intended only for the author’s courses, the barometer became a required part of all Webfuse course sites from 2001 through 2005. In 2007/2008 the barometers were used as part of a whole-of-institution attempt to encourage formative feedback in both Webfuse and Blackboard.
  • Blog Aggregation Management (BAM); and
    BAM allowed students to create individual, externally hosted web-logs (blogs) and use them as reflective journals. Students registered their external blog with BAM, which then mirrored all of the students’ blog posts on an institutional server and provided a management and marking interface for teaching staff (a sketch of this aggregation pattern follows this list). Created by the author for use in his own teaching in 2006, BAM was subsequently used in 26 course offerings by 2050+ students and ported to Moodle as BIM (Jones & Luck, 2009). In reviewing BAM, the ELI guide to blogging (Coghlan et al., 2007) observed:
    One of the most compelling aspects of the project was the simple way it married Web 2.0 applications with institutional systems. This approach has the potential to give institutional teaching and learning systems greater efficacy and agility by making use of the many free or inexpensive—but useful—tools like blogs proliferating on the Internet and to liberate institutional computing staff and resources for other efforts.
  • A Web 2.0 course site.
    While it looked like a normal course website, none of the functionality – including discussion, wiki, blog, portfolio and resource sharing – was implemented by Webfuse. Instead, freely available and externally hosted Web 2.0 tools and services provided all of the functionality. For example, each student had a portfolio and a weblog provided by the site http://redbubble.com. The content of the default course site was populated by using BAM to aggregate RSS feeds (generated by the external tools) which were then parsed and displayed by Javascript functions within the course site pages. Typically students and staff did not visit the default course site, as they could access all content by using a course OPML file and an appropriate reader application.
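As a rough illustration of the aggregation pattern behind both BAM and the Web 2.0 course site, the following sketch mirrors posts from externally hosted, student-registered feeds onto a local store. It is not the actual Webfuse implementation: it assumes the third-party feedparser library, and the registration structure and feed URL are invented for illustration.

```python
# An illustrative sketch (not the actual Webfuse/BAM code) of the core BAM
# idea: students register externally hosted blog feeds and the institution
# mirrors their posts locally for management and marking.
# Assumes the third-party feedparser library (pip install feedparser).
import feedparser

# Hypothetical registrations: student id -> externally hosted blog feed URL.
registrations = {
    "s0123456": "https://student-blog.example.com/rss",
}

def mirror_posts(registrations):
    """Fetch each registered feed and return local copies of its posts."""
    mirrored = {}
    for student_id, feed_url in registrations.items():
        feed = feedparser.parse(feed_url)
        mirrored[student_id] = [
            {
                "title": entry.get("title", ""),
                "link": entry.get("link", ""),
                "published": entry.get("published", ""),
                "summary": entry.get("summary", ""),
            }
            for entry in feed.entries
        ]
    return mirrored

if __name__ == "__main__":
    for student, posts in mirror_posts(registrations).items():
        print(f"{student}: {len(posts)} post(s) mirrored")
```

The same pull-based pattern drives the Web 2.0 course site described above: the institutional side holds only feed registrations and mirrored copies, while the authoring tools stay outside the institution.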

Even within the constraints placed on the development of Webfuse it was able to develop an array of e-learning applications that are either not present in industrial LMSes, were added much later than the Webfuse services, or had significantly reduced functionality.

Greater levels of adoption

Encouraging staff adoption of the Webfuse system was one of the main issues raised in the original Webfuse paper (Jones & Buchanan, 1996). Difficulty in encouraging high levels of quality use of e-learning within universities has remained a theme throughout the literature. Initial use of Webfuse in 1997 and 1998 was not all that successful in achieving that goal: by early 1999 only five of 60 academic staff – including the designer of Webfuse, who made 50% of all edits using the system – were making any significant use of Webfuse (Jones & Lynch, 1999). These limitations were addressed from 1999 onwards by a range of changes to the system, how it was supported and the organizational context. The following illustrates the success of these changes by comparing Webfuse adoption with that of the official LMS (WebCT 1999-2003/4; Blackboard 2004-2009) used primarily by the non-Webfuse faculties. It first examines the number of course sites and then examines feature adoption.

From 1997 Webfuse automatically created a default course site for all faculty courses by drawing on a range of existing course-related information. For the official institutional LMS, course sites were typically created on request and had to be populated by the academics. By the end of 2003 – four years after the initial introduction of WebCT as the official institutional LMS – only 15% (141) of courses from the non-Webfuse faculties had WebCT course sites. At the same time, 100% (302) of the courses from the Webfuse faculty had course sites. Due to the need for academics to populate WebCT and Blackboard course sites, the presence of a course website doesn’t necessarily imply use. For example, Tickle et al (2009) report that 21% of the 417 Blackboard courses being migrated to Moodle in 2010 contained no documents.
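To make the default course site idea concrete, here is a hypothetical sketch of generating a minimal, pre-populated site for every course from existing institutional records. The course data and page structure are invented for illustration; the actual Webfuse default sites drew on far richer information.

```python
# A hypothetical sketch of the "default course site" idea: every course gets
# a site generated automatically from existing institutional records, rather
# than an empty shell created on request. All data below is invented.
import pathlib

course_records = [
    {"code": "COURSE101", "title": "A Course", "coordinator": "A. Lecturer"},
    {"code": "COURSE202", "title": "Another Course", "coordinator": "B. Prof"},
]

def create_default_site(course, root="sites"):
    """Generate a minimal static home page for a course from its record."""
    site_dir = pathlib.Path(root) / course["code"]
    site_dir.mkdir(parents=True, exist_ok=True)
    (site_dir / "index.html").write_text(
        f"<h1>{course['code']}: {course['title']}</h1>\n"
        f"<p>Coordinator: {course['coordinator']}</p>\n"
        "<p>Study schedule, assessment and resources drawn from "
        "existing institutional data would be added here.</p>\n"
    )
    return site_dir

for record in course_records:
    print("created", create_default_site(record))
```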

Research examining the adoption of specific categories of LMS features provides a more useful insight into LMS usage. Figures 1 through 4 use the research model proposed by Malikowski, Thompson, & Thies (2007) to compare the adoption of LMS features between Webfuse (the thick continuous lines in each figure), CQUni’s version of Blackboard (the dashed lines), and the range of adoption rates found in the literature by Malikowski et al (2007) (the two dotted lines in each figure). This is done for four of the five LMS feature categories identified by Malikowski et al (2007): content transmission (Figure 1), class interaction (Figure 2), student assessment (Figure 3), and course evaluation (Figure 4).
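The adoption rates plotted in these figures reduce to a simple calculation: for each year and feature category, the percentage of course sites in which that category was actually used. A small, hypothetical sketch of that calculation (the usage records and totals are invented; this is not the original analysis script):

```python
# A hypothetical sketch of the adoption-rate calculation behind Figures 1-4:
# for each year, the percentage of course sites in which a feature category
# (per Malikowski et al., 2007) was actually used. All data is invented.
from collections import defaultdict

# (year, course_id, category) records of actual feature use.
usage = [
    (2003, "courseA", "content transmission"),
    (2003, "courseA", "class interaction"),
    (2003, "courseB", "content transmission"),
    (2004, "courseA", "student assessment"),
]
courses_per_year = {2003: 2, 2004: 1}  # total course sites each year

def adoption_rates(usage, courses_per_year):
    used = defaultdict(set)  # (year, category) -> courses using the category
    for year, course, category in usage:
        used[(year, category)].add(course)
    return {
        (year, category): 100 * len(courses) / courses_per_year[year]
        for (year, category), courses in used.items()
    }

for (year, category), rate in sorted(adoption_rates(usage, courses_per_year).items()):
    print(f"{year}  {category:<22} {rate:5.1f}%")
```

Note that the denominator is the institution’s own course sites, which is how (as discussed below for Figures 2 and 3) a numerator that picks up Blackboard-hosted courses using Webfuse features can push a rate past 100%.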


Figure 1: Adoption of content transmission features: Webfuse, Blackboard and Malikowski
Figure 2: Adoption of class interaction features: Webfuse, Blackboard and Malikowski (missing archives of most pre-2002 course mailing lists)
Figure 3: Adoption of student assessment features: Webfuse, Blackboard and Malikowski
Figure 4: Adoption of course evaluation features: Webfuse, Blackboard and Malikowski

The Webfuse usage data included in Figures 1 through 4 only include actual feature use by academics or students. For example, although from 2001 through 2005 100% of Webfuse courses contained a course evaluation feature called a course barometer, only courses where the barometer was actually used by students are included in Figure 4. Similarly, all Webfuse default course sites contained content (either automatically added from existing data repositories or copied across from a previous term); Figure 1 only includes data for those Webfuse course sites where teaching staff modified or added content.

Figures 2 and 3 indicate Webfuse adoption rates of greater than 100%. This is possible because a number of Webfuse features – including the EmailMerge and online assignment submission and management applications – were being used in course sites hosted on Blackboard. Webfuse was seen as providing services that Blackboard either did not provide or provided significantly less well. Similarly, the spike in Webfuse course evaluation feature adoption to 51.6% in 2008 is due to a CQU-wide push to improve formative feedback across all courses, a push that relied on the Webfuse course barometer feature.
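
The arithmetic behind rates above 100% is straightforward: the numerator counts every course using a Webfuse feature, wherever that course is hosted, while the denominator counts only Webfuse-hosted courses. The figures below are invented purely for illustration:

```python
# Invented figures: adoption exceeds 100% because the numerator includes
# Blackboard-hosted courses using Webfuse features (e.g. EmailMerge),
# while the denominator counts only Webfuse-hosted courses.

webfuse_hosted = 300
webfuse_courses_using = 280      # Webfuse-hosted courses using the feature
blackboard_courses_using = 50    # Blackboard-hosted courses using it via Webfuse

rate = 100.0 * (webfuse_courses_using + blackboard_courses_using) / webfuse_hosted
print(f"{rate:.1f}%")  # 110.0%
```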

Excluding use by non-Webfuse courses and focusing on the period 2003-2006, Figures 2 and 3 show that adoption of Webfuse class interaction and student assessment features was significantly higher than that of the equivalent Blackboard features at CQU. It is also significantly higher than the adoption rates found by Malikowski et al (2007) in the broader literature, and somewhat higher than that found amongst Semester 1, 2008 courses at the University of Western Sydney and Griffith University by Rankine et al (2009), though it should be noted that Rankine et al (2009) used different sampling and feature categorization strategies that make this last comparison tentative.

References

Behrens, S. (2009). Shadow systems: the good, the bad and the ugly. Communications of the ACM, 52(2), 124-129.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney. Retrieved from http://cgit.nutn.edu.tw:8080/cgit/PaperDL/tkw_090717140108.pdf

Behrens, S., & Sedera, W. (2004). Why do shadow systems exist after an ERP implementation? Lessons from a case study. In C.-P. Wei (Ed.), (pp. 1713-1726). Shanghai, China.

Coghlan, E., Crawford, J., Little, J., Lomas, C., Lombardi, M., Oblinger, D., & Windham, C. (2007). ELI Discovery Tool: Guide to Blogging. EDUCAUSE. Retrieved from http://www-cdn.educause.edu/eli/GuideToBlogging/13552

Danaher, P. A., Luck, J., & McConachie, J. (2005). The stories that documents tell: Changing technology options from Blackboard, Webfuse and the Content Management System at Central Queensland University. Studies in Learning, Evaluation, Innovation and Development, 2(1), 34-43.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), (pp. 884-889). Denver, Colorado: AACE.

Jones, D. (2011). An Information Systems Design Theory for E-learning. PhD thesis, Australian National University. Retrieved from http://davidtjones.wordpress.com/research/phd-thesis/

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, P. James, & B. Vaughan (Eds.), (pp. 331-345). Adelaide.

Jones, D., Cranston, M., Behrens, S., & Jamieson, K. (2005). What makes ICT implementation successful: A case study of online assignment submission. Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398-406). Chesapeake, VA: AACE. Retrieved from http://www.editlib.org/p/31530

Jones, D., & Lynch, T. (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation and Evolution. In S. Murugesan & Y. Deshpande (Eds.), (pp. 47-56). Los Angeles.

Jones, D., Vallack, J., & Fitzgerald-Hood, N. (2008). The Ps Framework: Mapping the landscape for the PLEs@CQUni project. Hello! Where are you in the landscape of educational technology? ASCILITE’2008. Melbourne.

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.

Rankine, L., Stevenson, L., Malfroy, J., & Ashford-Rowe, K. (2009). Benchmarking across universities: A framework for LMS analysis. Ascilite 2009. Same places, different spaces (pp. 815-819). Auckland. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/rankine.pdf

Svensson, L., Andersson, R., Gadd, M., & Johnsson, A. (1999). Course-Barometer: Compensating for the loss of informal feedback in distance education (pp. 1612-1613). Seattle, Washington: AACE.

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

The e-learning process – limitations and an alternative

And here’s the follow-up to the well-received “LMS Product” post. This is the second section looking at the limitations of how industrial e-learning is implemented, this time focusing on the process used. I’m not really happy with this one; space limitations are making it difficult to do a good job of the description.

Process

It has become a maxim of modern society that without objectives, without purpose, there can be no success; the setting and achieving of goals has become the essence of “success” (Introna, 1996). Many, if not most, universities follow, or at least profess to follow, a purpose-driven approach to setting strategic directions (Jones, Luck, McConachie, & Danaher, 2005). This is how institutional leaders demonstrate their strategic insight, their rationality and their leadership. This is no great surprise, since such purpose-driven processes – labeled teleological processes by Introna (1996) – have dominated theory and practice to such an extent that they have become ingrained, even though the debate between the “planning school” and the “learning school” of process thought has been one of the most pervasive debates in management (Clegg, 2002).

Prior papers (Jones et al., 2005; Jones & Muldoon, 2007) have used the nine attributes of a design process formulated by Introna (1996) to argue that purpose-driven processes are particularly inappropriate for the practice of tertiary e-learning. The same papers have presented and illustrated the alternative: ateleological processes. The limitations of teleological processes can be illustrated by examining Introna’s (1996) three necessary requirements for teleological design processes:

  1. The system’s behaviour must be relatively stable and predictable.
    As mentioned in the previous section, stability and predictability do not sound like appropriate adjectives for e-learning, especially into the future. This is particularly so given the popular rhetoric that organizations in the present era are no longer stable, but are instead continuously adapting to shifting environments, leaving them constantly seeking stability while never achieving it (Truex, Baskerville, & Klein, 1999).
  2. The designers must be able to manipulate the system’s behaviour directly.
    Social systems cannot be “designed” in the same way as technical systems; at best they can be indirectly influenced (Introna, 1996). Technology development and diffusion need cooperation; however, they take place in a competitive and conflictual atmosphere where different social groups – each with their own interpretation of the technology and the problem to be solved – are inevitably involved and seek to shape outcomes (Allen, 2000). Academics are trained not to accept propositions uncritically and subsequently cannot be expected to adopt strategies without question or adaptation (Gibbs, Habeshaw, & Yorke, 2000).
  3. The designers must be able to determine accurately the goals or criteria for success.
    The uncertain and confused arena of social behaviour and autonomous human action makes predetermination impossible (Truex, Baskerville, et al., 2000). Allen (2000) argues that change in organizational and social settings involving technology is by nature undetermined.

For example, Tickle et al (2009) offer one description of the teleological process used to transition CQUni to the Moodle LMS in 2009. One of the institutional policies introduced as part of this process was the adoption of Minimum Service Standards for course delivery (Tickle et al., 2009, p. 1047). These standards were intended to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). To assure the quality of this process, a web-based checklist was implemented in another institutional system, with the expectation that the course coordinator and moderator would actively check that each course site met the minimum standards. A senior lecturer widely recognized as a quality teacher described the process for dealing with the minimum standards checklist as follows:

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

The minimum standards checklist was removed in 2011.

A teleological process is not interested in learning and changing, only in achieving the established purpose. The philosophical assumptions of teleological processes – modernism and rationality – are in direct contradiction with the views of learning meant to underpin the best learning and teaching. Rossi and Luck (2011, p. 62) observe that “[c]onstructivist views of learning pervade contemporary educational literature, represent the dominant learning theory and are frequently associated with online learning”. Wise and Quealy (2006, p. 899) argue, however, that

while a social constructivist framework may be ideal for understanding the way people learn, it is at odds not only with the implicit instructional design agenda, but also with current university elearning governance and infrastructure.

Staff development sessions become focused on helping the institution achieve efficient and effective use of the LMS, rather than quality learning and teaching. This leads to staff developers being “seen as the university’s ‘agent’” (Pettit, 2005, p. 253). There is a reason why Clegg (2002) refers to teleological approaches as the “planning school” of process thought and to the alternative, ateleological approach as the “learning school”.

The ISDT abstracted from the Webfuse work includes 11 principles of implementation (i.e. process) divided into 3 groups. The first and third groupings refer more to people and will be covered in the next section. The second grouping focused explicitly on the process and was titled “An adopter-focused, emergent development process”. Webfuse achieved this by using an information systems development process based on principles of emergent development (Truex et al., 1999) and ateleological design (Introna, 1996). The Webfuse development team was employed and located within the faculty, allowing a much more in-depth knowledge of individual and organizational needs and an explicit focus on responding to those needs. The quote earlier in this paper about the origins of the results uploading system is indicative of this. Lastly, at its best, Webfuse was able to strike a balance between teleological and ateleological processes thanks to a Faculty Dean who recognized the significant limitations of a top-down approach.

This process, when combined with a flexible and responsive product, better enabled the Webfuse team to work with the academics and students using the system to actively modify and construct it in response to what was learned through use. It was an approach much more in line with a social constructivist philosophy.

References

Allen, J. (2000). Information systems as technological innovation. Information Technology & People, 13(3), 210-221.

Clegg, S. (2002). Management and organization paradoxes. Philadelphia, PA: John Benjamins Publishing.

Gibbs, G., Habeshaw, T., & Yorke, M. (2000). Institutional learning and teaching strategies in English higher education. Higher Education, 40(3), 351-372.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. Adelaide.

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), (pp. 450-459). Singapore. Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/jones-d.pdf

Pettit, J. (2005). Conferencing and Workshops: a blend for staff development. Education, Communication & Information, 5(3), 251-263. doi:10.1080/14636310500350505

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from http://www.sleid.cqu.edu.au/include/getdoc.php?id=1122&article=391&mode=pdf

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Wise, L., & Quealy, J. (2006). LMS Governance Project Report. Melbourne, Australia: Melbourne-Monash Collaboration in Education Technologies. Retrieved from http://www.infodiv.unimelb.edu.au/telars/talmet/melbmonash/media/LMSGovernanceFinalReport.pdf

The LMS Product – limitations and an alternative

What follows is the first draft of the “Product” section for an ASCILITE paper (the overview for the paper) I hope to finish by tomorrow… just a bit of wishful thinking. Much of it has appeared on this blog previously; I’m now trying to wrangle it into a formal publication, with all the limitations (e.g. space) that brings.

It’s a first draft, so comments and suggestions more than welcome.

Product

One of the defining characteristics of the industrial e-learning paradigm is the reliance on the Learning Management System (LMS) as the product for organizational e-learning. Despite the associated complexities and risks, almost every university seems compelled to have an LMS (Coates, James, & Baldwin, 2005). The LMS is an example of an integrated or monolithic information system, a type of system that brings with it a particular set of advantages and disadvantages. On the plus side, an integrated system offers cost efficiencies and other benefits through standardization; at the same time, such systems constrain flexibility, competitiveness and autonomy, and increase rigidity (Light, Holland, & Wills, 2001; Lowe & Locke, 2008). Such systems are best suited to circumstances where there is commonality between organizations and requirements are stable with low uncertainty. This does not seem a good description of tertiary e-learning, either over the last 10 years or the next 10. This section looks at two repercussions of this mismatch – 1) organizations and people must adapt to the system; and 2) the single-vendor limitation – before describing the alternate principles from the ISDT.

The first repercussion of an integrated system is captured by this comment (Sturgess & Nouwens, 2004, n.p.):

we should seek to change people’s behaviour because information technology systems are difficult to change.

The comment came from a technical staff member participating in CQUni’s 2003 LMS selection process. Rather than being isolated, it captures the accepted industry best practice recommendation to implement integrated systems in their “vanilla” form because local changes are too expensive (Robey, Ross, & Boudreau, 2002). Maintaining a vanilla implementation constrains what is possible with the system, limiting change, innovation and differentiation, and is perhaps a contributing factor in the poor pedagogical outcomes observed in industrial e-learning.

For example, in 2007 an instructional designer working on a redesign of a CQUni Nutrition course informed by constructive alignment was stymied by the limitations of the Blackboard LMS: Blackboard could not support the number of group-based discussion forums the new course design required. Normally, with an integrated system, the pedagogical approach would have to be changed to fit the confines of the system. Instead, the course site was supplemented with one of the Webfuse discussion forums, allowing the original educational design to be fulfilled. Academic staff teaching large first-year courses with the Webfuse BAM functionality faced a similar situation when CQUni adopted Moodle. Since Moodle did not provide similar functionality, these staff would be forced to change their pedagogical approach to fit the capabilities of the integrated system.

The regular forced migration to another version of an LMS is the extreme example of the organization being forced to change in response to the technology, rather than the technology being fitted to the organization’s needs. It is not uncommon to hear of universities being forced to adopt a new LMS because the vendor has ceased supporting their current system. The cost, complexity and disruption caused by an LMS migration contribute to “stable systems drag” (Truex, Baskerville, & Klein, 1999) as the institution seeks a long period of “vanilla” use to recoup the cost.

Another characteristic of an integrated system is that the tools available are limited to those provided by a single vendor or community. For example, a key component of the recent disquiet about the Curt Bonk MOOC hosted within a Blackboard LMS was the poor quality of the Blackboard discussion forum (see Lane, 2012). Reservations about the quality and functionality of the wiki and blog tools within Moodle are also fairly common. LMS-based tools also tend not to fare well in comparisons with specialist tools, for example, when LMS-based blog tools are compared with WordPress. In addition, integrated systems tend to support only one version of any given tool, leading to situations where users pine for the previous version because it suited their needs better.

The ISDT formulated from the experience of developing Webfuse proposes 13 principles for the form and function of the product for emergent e-learning. These principles were divided into 3 groups:

  1. Integrated and independent services.
    Rather than a system or platform, Webfuse was positioned as glue. It was used to “fuse” together widely different services and tools into an integrated whole. Webfuse was an example of a best-of-breed system, a type of system that provides more flexibility and responsiveness to contextual needs (Ben Light, Holland, & Wills, 2001). For example, when the existing discussion forum tool was seen as limited, a new discussion forum tool was selected and integrated into Webfuse. At the same time the old discussion forum tool was retained and could be used by those for whom it was an appropriate fit. While new tools could be added as required, the interface used by staff and students remained essentially the same. There was no need for expensive system migrations.
  2. Adaptive and inclusive architecture.
    Almost all LMS support some form of plugin architecture where external users can develop new tools and services for the LMS. This architecture, however, is generally limited to tools specifically written for the LMS and its architecture and thereby limiting what tools can be integrated. The Webuse “architecture” was designed to support the idea of software wrappers (Sneed, 2000) enabling the inclusion of a much broader array of applications.
  3. Scaffolding, context-sensitive conglomerations.
    Most e-learning tools provide a collection of configuration options that can be used in a variety of ways. Effective use of these tools requires a combination of skills from a broad array of disciplines and significant contextual knowledge that the majority of academic staff do not possess. The most obvious example is in the overall design of a course website. Webfuse had a default course site conglomeration that combined a range of institutional data sources and Webfuse tools to automatically create a course site. A key aspect of the Webfuse wrappers placed around integrated tools was the addition of institutional specific information and services. There are significant, unexplored opportunities in adding scaffolding to e-learning tools that enable distributed cognition.
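
A minimal sketch of the wrapper idea underpinning the first two groups follows. All class names are hypothetical and Webfuse’s actual implementation differed; the point is that old and new tools coexist behind a common interface, with the wrapper scaffolding an external tool with institutional context the tool itself knows nothing about:

```python
# Hypothetical sketch of the software-wrapper idea (cf. Sneed, 2000): each
# forum tool sits behind a common interface, and a wrapper adds
# institution-specific data, so new tools can be integrated (and old ones
# retained) without changing the interface staff and students use.

class Forum:
    """The common interface the glue code programs against."""
    def render(self, course: str) -> str:
        raise NotImplementedError

class LegacyForum(Forum):
    def render(self, course: str) -> str:
        return f"[old forum for {course}]"

class ThirdPartyForum:
    """An external tool with its own, different API."""
    def page(self, topic: str, members: list) -> str:
        return f"[new forum: {topic}, {len(members)} members]"

class ThirdPartyForumWrapper(Forum):
    """Wraps the external tool and scaffolds it with institutional data."""
    def __init__(self, tool: ThirdPartyForum, enrolments: dict):
        self.tool, self.enrolments = tool, enrolments

    def render(self, course: str) -> str:
        return self.tool.page(course, self.enrolments.get(course, []))

# Old and new tools coexist; each course keeps whichever tool fits it.
forums = {
    "CRS101": LegacyForum(),
    "CRS202": ThirdPartyForumWrapper(ThirdPartyForum(), {"CRS202": ["s1", "s2"]}),
}
for code, forum in forums.items():
    print(forum.render(code))
```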

Writing about the need for universities to embrace diversity, Thomas (2012) tells of Procrustes, who

would stretch and sever the limbs of his guests to fit the size of his bed. We, too, are continuing to stretch and shape our higher education to a particular standard to the detriment of students and society alike.

In terms of e-learning, that “particular standard” is defined by the products we are using to implement industrial e-learning.

References

Coates, H., James, R., & Baldwin, G. (2005). A Critical Examination of the Effects of Learning Management Systems on University Teaching and Learning. Tertiary Education and Management, 11(1), 19-36. Retrieved from http://www.springerlink.com/content/r21987609l3g1h58/

Lane, L. M. (2012). Leaving an open online course. Retrieved from http://lisahistory.net/wordpress/2012/04/leaving-an-open-online-class/

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216-224.

Lowe, A., & Locke, J. (2008). Enterprise resource planning and the post bureaucratic organization. Information Technology & People, 21(4), 375-400.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Sneed, H. (2000). Encapsulation of legacy software: A technique for reusing legacy software components. Annals of Software Engineering, 9(1-4), 293-313.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.