Strategic plans, theoretical models and just doing it

Suffering a minor malaise brought on by the strategic/operational planning process currently underway at my place of work. As a process it always seems an exercise in futility and frustration, but at least the current process is significantly better than some I’ve observed.

The problem is that it’s all based on a faulty assumption: that an institution can respond to an incredibly complex and rapidly changing context by having some smart people go away for a few months and create a theoretical conception of the way forward for the institution. A theoretical conception informed by their own existing schemata, which in an incredibly complex and rapidly changing context are always going to be insufficient. Especially when those smart people are the ones who have been successful in the current system, which is based on old ideals. Some (many?) of which are unlikely to be relevant in the future.

As it happens one of those smart people came across the following tweet/image which essentially summarises what is wrong with this approach.

I’d suggest some additional related statements

  • A prototype is worth a thousand strategic plans.
  • A prototype is worth a thousand theoretical models.

and also the definition that a prototype is NOT some toy system that no-one uses. It’s a system that has been used in anger and used to learn real, practical lessons.

The illusion of the “one university”

Much of this practice seems to emerge from the belief that it’s important that the institution take centre stage. The institution has to have a plan, a set of graduate attributes, a set of systems for doing X etc. Perhaps an artefact of the rise of the Vice-Chancellor as CEO approach to leading universities.

I find this increasing importance of the “one university” way of doing things interesting when talking about personal/personalised learning and the diversity and flexibility inherent in such a concept.

Leadership as defining what’s successful

After spending a few days visiting friends and family in Central Queensland – not to mention enjoying the beach – a long 7+ hour drive home provided an opportunity for some thinking. I’ve long had significant qualms about the notion of leadership, especially as it is increasingly being understood and defined by the current corporatisation of universities and schools. The rhetoric is increasingly strong amongst schools with the current fashion for assuming that Principals can be the saviour of schools that have broken free from the evils of bureaucracy. I even work within an institution where a leadership research group is quite active amongst the education faculty.

On the whole, my experience of leadership in organisations has been negative. At best, the institution bumbles along in spite of bad leadership. I’m wondering whether or not questioning this notion of leadership might form an interesting future research agenda. The following is an attempt to make concrete some thinking from the drive home, spark some comments, and set me up for some more (re-)reading. It’s an ill-informed mind dump sparked somewhat by some early experiences on return from leave.

Fisherman’s beach by David T Jones, on Flickr

In the current complex organisational environment, I’m thinking that “leadership” is essentially the power to define what success is, both prior to and after the fact. I wonder whether any apparent success attributed to the “great leader” is solely down to how they have defined success? I’m also wondering how much of that success is due to less than ethical or logical definitions of success?

The definition of success prior to the fact is embodied in the model of process currently assumed by leaders, i.e. teleological processes, where the great leader must define some ideal future state (e.g. adoption of Moodle, Peoplesoft, or some other system; an organisational restructure that creates “one university”; or, perhaps even worse, a new 5 year strategic plan etc.) behind which the weight of the institution will then be thrown. All roads and work must lead to the defined point of success.

This is the Dave Snowden idea of giving up the evolutionary potential of the present for the promise of some ideal future state. A point he’ll often illustrate with this quote from Seneca

The greatest loss of time is delay and expectation, which depend upon the future. We let go the present, which we have in our power, and look forward to that which depends upon chance, and so relinquish a certainty for an uncertainty.

Snowden’s use of this quote comes from the observation that some systems/situations are examples of Complex Adaptive Systems (CAS). These are systems where traditional expectations of cause and effect don’t hold. When you intervene in such systems you cannot predict what will happen, only observe it in retrospect. In such systems the idea that you can specify up front where you want to go is little more than wishful thinking. So defining success – in these systems – prior to the fact is a little silly. It questions the assumptions of such leadership, including the assumption that leaders can make a difference.

So when the Executive Dean of a Faculty – that includes programs in information technology and information systems – is awarded “ICT Educator of the Year” for the state because of the huge growth in student numbers, is it because of the changes he’s made? Or is it because he was lucky enough to be in power at (or just after) the peak of the IT boom? The assumption is that this leader (or perhaps his predecessor) made logical contributions and changes to the organisation to achieve this boom in student numbers. Or perhaps they made changes simply to enable the organisation to be better placed to handle and respond to the explosion in demand created by external changes.

But perhaps, rather than this single reason for success (great leadership), it was instead that there were simply a large number of small factors – with no central driving intelligence or purpose – that enabled this particular institution to achieve what it achieved. Similarly, when a few years later the same group of IT related programs had few if any students, it wasn’t because this “ICT Educator of the Year” had failed. Nor was it because of any other single factor, but instead because of hundreds or thousands of small factors, both internal and external (some larger than others).

The idea that there can be a single cause (or a single leader) for anything in a complex organisational environment seems faulty. But because it is demanded of them, leaders must spend more time attempting to define and convince people of their success. In essence then, successful leadership becomes more about your ability to define success and promulgate wide acceptance of that definition.

KPIs and accountability galloping to help

This need to define and promulgate success is aided considerably by simple numeric measures. The number of student applications; DFW rates; numeric responses on student evaluation of courses – did you get 4.3?; journal impact factors and article citation metrics; and many, many more. These simple figures make it easy for leaders to define specific perspectives on success. This is problematic, and its many problems are well known. For example,

  • Goodhart’s law – “When a measure becomes a target, it ceases to be a good measure.”
  • Campbell’s law – “The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”
  • the Lucas critique.
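
These laws are abstract, but a toy simulation can make the mechanism concrete. The following sketch is entirely my own illustration (not from any of the cited work): an actor with a fixed effort budget can spend it on genuine teaching or on gaming the proxy metric, and the optimal strategy flips as soon as the proxy becomes the target.

```python
def true_quality(teaching, gaming):
    # Actual learning depends only on genuine teaching effort.
    return teaching

def measured_score(teaching, gaming):
    # The proxy metric can be inflated more cheaply by gaming it
    # (e.g. teaching to the survey) than by genuine improvement.
    return teaching + 2 * gaming

def optimise(budget, target):
    # Split a fixed effort budget (teaching, gaming) to maximise
    # whichever function is currently treated as "success".
    return max(
        ((t, budget - t) for t in range(budget + 1)),
        key=lambda split: target(*split),
    )

# Rewarding the true goal: all effort goes into teaching.
print(optimise(10, true_quality))    # (10, 0)

# Rewarding the proxy: effort shifts entirely to gaming, so the
# measured score rises while true quality collapses to zero.
print(optimise(10, measured_score))  # (0, 10)
```

Goodhart’s and Campbell’s laws are, in this framing, the observation that real actors perform the second optimisation once the metric carries consequences.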

For example, you have the problem identified by Tutty et al (2008) where, rather than improve teaching, institutional quality measures “actually encourage inferior teaching approaches” (p. 182). It’s why you have the LMS migration project receiving an institutional award for quality etc, even though for the first few weeks of the first semester it was largely unavailable to students due to dumb technical decisions by the project team and required a large additional investment in consultants to fix.

Would this project have received the award if a senior leader in the institution (and the institution itself) weren’t heavily reliant upon the project being seen as a success?

Would the people involved in giving the project the award have reasonable reasons for thinking it award winning? Is success of the project and of leadership all about who defines what perspective is important?

Some other quick questions

Some questions for me to consider.

  • Where does this perspective sit within the plethora of literature on leadership and organisational studies? Especially within the education literature? How much of this is influenced by my earlier reading of “Managing without Leadership: Towards a Theory of Organizational Functioning”?
  • Given the limited likelihood of changing how leadership is practiced within the current organisational and societal context, how do you act upon any insights this perspective might provide? i.e. how the hell do I live (and heaven forbid thrive) in such a context?

References

Tutty, J., Sheard, J., & Avram, C. (2008). Teaching in the current higher education environment: perceptions of IT academics. Computer Science Education, 18(3), 171–185.

Does institutional e-learning have a TPACK problem?

The following is the first attempt to expand upon an idea that’s been bubbling along for the last few weeks. It arises from a combination of recent experiences, including

  • Working through the institutional processes to get BIM installed on the institutional Moodle.
  • Using BIM in my own teaching and the resulting changes (and maybe something along these lines) that will be made.
  • Talking about TPACK to students in the ICTs and Pedagogy course.
  • On-going observations of what passes for institutional e-learning within some Australian Universities (and which is likely fairly common across the sector).

Note: the focus here is on the practice of e-learning within Universities and the institutionally provided systems and processes.

The problem(s)

A couple of problems that spark this thinking

  1. How people and institutions identify the tools available/required.
  2. How the tools provide appropriate support, especially pedagogical, to the people using them.

Which tools?

One of the questions I was asked to address in my presentation requesting that BIM be installed on the institutional LMS was something along the lines of “Why would other people want to use this tool? We can’t install a tool just for one person.”

Well, one answer was that a quick Google search of the institution’s course specifications revealed 30+ courses in 2012 using reflective journals of varying types. BIM is a tool designed primarily to support the use of reflective learning journals by students via individual blogs.
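
Purely as an illustration, that check could also be scripted. A minimal sketch, assuming (my assumption, not the institution’s actual setup, which was simply a Google search of the public site) that each course specification is available as a plain-text file named after the course code:

```python
import pathlib
import re

# Matches "reflective journal" and "reflective learning journal".
PATTERN = re.compile(r"reflective\s+(?:learning\s+)?journal", re.IGNORECASE)

def courses_using_reflective_journals(spec_dir):
    """Return the course codes (taken from file names) whose
    specification mentions a reflective journal."""
    return sorted(
        path.stem
        for path in pathlib.Path(spec_dir).glob("*.txt")
        if PATTERN.search(path.read_text(encoding="utf-8"))
    )
```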

I was quite surprised to find 30+ courses already doing this. This generated some questions

  • How are they managing the workload and the limitations of traditional approaches?
    The origins of BIM go back to when I took over a course that was using a reflective journal assessment task, implemented by students keeping journals as Word documents and submitting them at the end of semester. There were problems.
  • I wonder how many of the IT and central L&T people knew that there were 30+ courses already using this approach?
    In this context, it would be quite easy to draw the conclusion that the IT and central L&T folk are there to help people with the existing tools and keep their own workload to a minimum by controlling what new tools are added to the mix, rather than to look for opportunities for innovation within the institution. Which leads to…
  • I wonder why the institution wasn’t already actively looking for tools to help these folk?
    Especially given that reflective learning journals (diaries etc) are “recognised as a significant tool in promoting active learning” (Thorpe, 2004, p. 327), but at the same time they are also “demanding and time-consuming for both students and educators” (Thorpe, 2004, p. 339).

A combination of those questions/factors seems to contribute to recent findings about the workloads faced by academics in terms of e-learning (Tynan et al, 2012)

have increased both the number and type of teaching tasks undertaken by staff, with a consequent increase in their work hours

and (Bright, 2012, n.p)

Lecturers who move into the online learning environment often discover that the workload involved not only changes, but can be overwhelming as they cope with using digital technologies. Questions arise, given the dissatisfaction of lecturers with lowering morale and increasing workload, whether future expansion of this teaching component in tertiary institutions is sustainable.

How do the tools provide support?

One of the problems I’m facing with BIM is that the pedagogical approach I originally used, and which drove the design of BIM, is not the pedagogical approach I’m using now. The features and functions currently in BIM don’t match what I want to do pedagogically. I’m lucky, I can change the system. But not many folk are in this boat.

And this isn’t the first time we’ve faced this problem. Reaburn et al (2009) used BIM’s predecessor in a “work integrated learning” course where the students were working in a professional context. They got by, but this pedagogical approach had yet again different requirements.

TPACK

“Technological Pedagogical Content Knowledge (TPACK) is a framework that identifies the knowledge teachers need to teach effectively with technology” (Koehler, n.d.). i.e. it identifies a range of different types of knowledge that are useful, perhaps required, for the effective use of technology in teaching and learning. While it has its detractors, I believe that TPACK can provide a useful lens for examining the problems with institutional e-learning and perhaps identify some suggestions for how institutional e-learning (and e-learning tools) can be better designed.

To start, TPACK proposes that successful e-learning (I’m going to use that as short-hand for the use of technology in learning and teaching) requires the following types of knowledge (with my very brief descriptions)

  • Technological knowledge (TK) – how to use technologies.
  • Pedagogical knowledge (PK) – how to teach.
  • Content knowledge (CK) – knowledge of what the students are meant to be learning.

Within institutional e-learning you can see this separation in organisational structures and also the assumptions of some of the folk involved. i.e.

  • Technological knowledge – is housed in the institutional IT division.
  • Pedagogical knowledge – is housed in the central L&T division.
  • Content knowledge – academics and faculties are the silos of content knowledge.

Obviously there is overlap. Most academics have some form of TK, PK and CK. But when it comes to the source of expertise around TK, it’s the IT division, etc.

TPACK proposes that there are combinations of these three types of knowledge that offer important insights

  • Pedagogical Content Knowledge (PCK) – the idea that certain types of content are best taught using certain types of pedagogy.
  • Technological Pedagogical Knowledge (TPK) – the knowledge that certain types of technologies work well with certain types of pedagogy (e.g. teaching critical analysis using a calculator probably isn’t a good combination)
  • Technological Content Knowledge (TCK) – that content areas draw on technologies in unique ways (e.g. mathematicians use certain types of technologies that aren’t used by historians)

Lastly, TPACK suggests that there is a type of knowledge in which all of the above is combined – Technological, Pedagogical and Content Knowledge (TPACK) – and that when this is used effectively the best examples of e-learning arise.

The problem I see is that institutional e-learning, its tools, its processes and its organisational structures are getting in the way of allowing the generation and application of effective TPACK.

Some Implications

Running out of time, so some quick implications that I take from the above and want to explore some more. These are going to be framed mostly around my work with BIM, but there are potentially some implications for broader institutional e-learning systems which I’ll briefly touch on.

BIM’s evolution is best when I’m teaching with it

Assuming that I have the time, the best insights for the future development of BIM have arisen when I’m using BIM in my teaching, when I’m able to apply the TPACK that I have to identify ways the tool can help me. When I’m not using BIM in my teaching I don’t have the same experience.

At this very moment, however, I’m only really able to apply this TPACK because I’m running BIM on my laptop (and using a bit of data munging to bridge the gap between it and the institutional systems). This means I am able to modify BIM in response to a need, test it out and use it almost immediately. When/if I begin using BIM on the institutional version of Moodle, I won’t have this ability. At best, I might hope for the opportunity for a new version of BIM to be installed at the end of the semester.
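
The “bit of data munging” mentioned above is deliberately vague in the post. Purely as a hypothetical sketch of the kind of bridging involved, the following reshapes an institutional class-list export into the column layout a locally run tool might import. Every column name here is my invention, not the actual format of the institutional systems or of BIM.

```python
import csv
import io

def convert_class_list(institutional_csv):
    """Reshape a hypothetical institutional export with columns
    StudentID,Surname,GivenName,Email into the (equally hypothetical)
    username,firstname,lastname,email layout a local tool imports."""
    reader = csv.DictReader(io.StringIO(institutional_csv))
    out = io.StringIO()
    writer = csv.DictWriter(
        out, fieldnames=["username", "firstname", "lastname", "email"]
    )
    writer.writeheader()
    for row in reader:
        writer.writerow({
            "username": row["StudentID"].lower(),
            "firstname": row["GivenName"],
            "lastname": row["Surname"],
            "email": row["Email"],
        })
    return out.getvalue()
```

The point is less the specific columns than the workflow: because the tool runs locally, this kind of glue can be written, tested and changed the same day a need is identified.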

There are reasons why institutional systems have these constraints. The problem is that these constraints get in the way of generating and applying TPACK and thus limit the quality of the institutional e-learning.

I also wonder if there’s a connection between this and the adoption of Web 2.0 and other non-institutional tools by academics. i.e. do they find it easier to generate and apply TPACK to these external tools because they don’t have the same problems and constraints as the institutional e-learning tools?

BIM and multiple pedagogies

Arising from the above point is the recognition that BIM needs to be able to support multiple pedagogical approaches. i.e. the PK around reflective learning journals reveals many different pedagogical approaches. If BIM as an e-learning tool is going to effectively support these pedagogies then new forms of TPK need to be produced. i.e. BIM itself needs to know about and support the different reflective journal pedagogies.

There’s a lot of talk about how various systems are designed to support a particular pedagogical approach. However, I wonder just how many of these systems actually provide real TPK assistance? For example, the design of Moodle “is guided by a ‘social constructionist pedagogy'” but it’s pretty easy to see examples of how it’s not used that way when course sites are designed.

There are a range of reasons for this. Not the least of which is that teachers and academics creating course sites are often focused on more pragmatic tasks. But part of the problem is also, I propose, the level of TPK provided by Moodle. The level of technological support it provides for people to recognise, understand and apply that pedagogical approach.

There’s a two-edged sword here. Providing more TPK may help people adopt this approach, but it can also close off opportunities for different approaches. Scaffolding can quickly become a cage. Too much focus on a particular approach also closes off opportunities for adoption.

But on the other hand, the limited amount of specific TPK provided by the e-learning tools is, I propose, a major contributing factor to the workload issues around institutional e-learning. The tools aren’t providing enough direct support for what teachers want to achieve. So the people have to bridge the gap. They have to do more work.

BIM and distributed cognition – generating TPACK

One of the concerns raised in the committee that had to approve the adoption of BIM was about the level of support. How is the institution going to support academics who want to use BIM? The assumption being that we can’t provide the tool without some level of support and training.

This is a valid concern. But I believe there are two assumptions underpinning it which I’d like to question, and for which I’d like to explore alternatives. The assumptions are

  1. You can’t learn how to use the tool simply by using the tool.
    If you buy a good computer/console game, you don’t need to read the instructions. Stick it in and play. The games are designed to scaffold your entry into the game. I haven’t yet met an institutional e-learning tool that can claim the same. Some of this arises, I believe, from the limited amount of TPK most tools provide. But it’s also how the tool is designed. How can BIM be designed to support this?
  2. The introduction of anything new has to be accompanied by professional development and other forms of formal support.
    This arises from the previous point but it also connected to a previous post titled “Professional development is created, not provided”. In part, this is because the IT folk and the central L&T folk see their job as (and some have their effectiveness measured by) providing professional development sessions or the number of helpdesk calls they process.

It’s difficult to generate TPACK

I believe that the current practices, processes and tools used by institutional e-learning systems make it difficult for the individuals and organisations involved to develop TPACK. This contributes to the poor quality of most institutional e-learning, the limited adoption of features beyond content distribution and forums, and is part of the reason behind the perceptions of increasing workload around e-learning.

If this is the case, then can it be addressed? How?

References

Bright, S. (2012). eLearning lecturer workload: working smarter or working harder? In M. Brown, M. Hartnett, & T. Stewart (Eds.), ASCILITE’2012. Wellington, NZ.

Reaburn, P., Muldoon, N., & Bookallil, C. (2009). Blended spaces, work based learning and constructive alignment: Impacts on student engagement. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 820–831). Auckland, NZ.

Thorpe, K. (2004). Reflective learning journals : From concept to practice. Reflective practice: International and Multidisciplinary Perspectives, 5(3), 327–343.

Tynan, B., Ryan, Y., Hinton, L., & Mills, L. (2012). Out of hours Final Report of the project e-Teaching leadership: planning and implementing a benefits-oriented costs model for technology-enhanced learning. Strawberry Hills, Australia.

The life and death of Webfuse: What’s wrong with industrial e-learning and how to fix it

The following is a collection of presentation resources (i.e. the slides) for an ASCILITE’2012 presentation of this paper. The paper and presentation are a summary of the outcomes of my PhD work. The thesis goes into much more detail.

Abstract

Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning can limit the outcomes of tertiary e-learning and the ability of universities to respond to uncertainty and effectively explore the future of learning. It limits their ability to learn. The paper proposes one alternate set of successfully implemented principles and practices as being more appropriate for institutions seeking to learn for the future and lead in a climate of change.

Slides

The slides are available on Slideshare and should show up below. These slides are the extended version, prior to the cutting required to fit within the 20 minute time limit.

References

Arnott, D. (2006). Cognitive biases and decision support systems development: a design science approach. Information Systems Journal, 16, 55–78.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney.

Brews, P., & Hunt, M. (1999). Learning to plan and planning to learn: Resolving the planning school/learning school debate. Strategic Management, 20(10), 889–913.

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Central Queensland University. (2004). Faculty teaching and learning report. Rockhampton, Australia.

Davenport, T. (1998). Putting the Enterprise into the Enterprise System. Harvard Business Review, 76(4), 121–131.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Dillard, J., & Yuthas, K. (2006). Enterprise resource planning systems and communicative action. Critical Perspectives on Accounting, 17(2-3), 202–223.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Haywood, T. (2002). Defining moments: Tension between richness and reach. In W. Dutton & B. Loader (Eds.), (pp. 39–49). London: Routledge.

Hutchins, E. (1991). Organizing work by adaptation. Organization Science, 2(1), 14–39.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Jones, D. (1996). Solving Some Problems of University Education: A Case Study. In R. Debreceny & A. Ellis (Eds.), Proceedings of AusWeb’96 (pp. 243–252). Gold Coast, QLD: Southern Cross University Press.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 884–889). Denver, Colorado: AACE.

Jones, D. (2003). Course Barometers: Lessons gained from the widespread use of anonymous online formative evaluation. QUT, Brisbane.

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, B. Vaughan, & P. James (Eds.), Making New Connections, ascilite’1996 (pp. 331–345). Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemens & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398–406). Chesapeake, VA: AACE.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Katz, R. (2003). Balancing Technology and Tradition: The Example of Course Management Systems. EDUCAUSE Review, 38(4), 48–59.

Kurtz, C., & Snowden, D. (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. In M. Gibbert & T. Durand (Eds.), . Blackwell.

Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. London: Routledge.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216–224.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Morgan, Glenda. (2003). Faculty use of course management systems. Educause Centre for Applied Research.

Morgan, Glenn. (1992). Marketing discourse and practice: Towards a critical analysis. In M. Alvesson & H. Willmott (Eds.), (pp. 136–158). London: SAGE.

Pozzebon, M., Titah, R., & Pinsonneault, A. (2006). Combining social shaping of technology and communicative action theory for understanding rhetorical closure in IT. Information Technology & People, 19(3), 244–271.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60–75.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Seely-Brown, J., & Hagel, J. (2005). From push to pull: The next frontier of innovation. The McKinsey Quarterly. McKinsey & Company.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, Duane, Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53–79.

Truex, Duane, Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117–123.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

Wagner, E., Scott, S., & Galliers, R. (2006). The creation of “best practice” software: Myth, reality and ethics. Information and Organization, 16(3), 251–275.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361–386.

The e-learning process – limitations and an alternative

And here’s the followup to the well received “LMS Product” post. This is the second section looking at the limitations of how industrial e-learning is implemented, this time focusing on the process used. Not really happy with this one; space limitations are making it difficult to do a good job of the description.

Process

It has become a maxim of modern society that without objectives, without purpose, there can be no success; the setting of goals and the achieving of them has become the essence of “success” (Introna, 1996). Many, if not most, universities follow, or at least profess to follow, a purpose driven approach to setting strategic directions (Jones, Luck, McConachie, & Danaher, 2005). This is how institutional leaders demonstrate their strategic insight, their rationality and their leadership. This is not a great surprise, since such purpose driven processes – labeled teleological processes by Introna (1996) – have dominated theory and practice to such an extent that they have become ingrained. This is despite the debate between the “planning school” and the “learning school” of process thought being one of the most pervasive debates in management (Clegg, 2002).

Prior papers (Jones et al., 2005; Jones & Muldoon, 2007) have used the nine attributes of a design process formulated by Introna (1996) to argue that purpose driven processes are particularly inappropriate to the practice of tertiary e-learning. The same papers have presented and illustrated the alternative, ateleological processes. The limitations of teleological processes can be illustrated by examining Introna’s (1996) three necessary requirements for teleological design processes

  1. The system’s behaviour must be relatively stable and predictable.
    As mentioned in the previous section, stability and predictability do not sound like appropriate adjectives for e-learning, especially into the future, given the popular rhetoric about organizations in the present era no longer being stable and instead continuously adapting to shifting environments that place them in a state of constantly seeking stability without ever achieving it (Truex, Baskerville, & Klein, 1999).
  2. The designers must be able to manipulate the system’s behaviour directly.
    Social systems cannot be “designed” in the same way as technical systems; at best they can be indirectly influenced (Introna, 1996). Technology development and diffusion need cooperation; however, they take place in a competitive and conflictual atmosphere where different social groups – each with its own interpretation of the technology and of the problem to be solved – are inevitably involved and seek to shape outcomes (Allen, 2000). Academics are trained not to accept propositions uncritically and consequently cannot be expected to adopt strategies without question or adaptation (Gibbs, Habeshaw, & Yorke, 2000).
  3. The designers must be able to determine accurately the goals or criteria for success.
    The uncertain and confused arena of social behaviour and autonomous human action makes predetermination impossible (Truex, Baskerville et al., 2000). Allen (2000) argues that change in organizational and social settings involving technology is by nature undetermined.

For example, Tickle et al. (2009) offer one description of the teleological process used to transition CQUni to the Moodle LMS in 2009. One of the institutional policies introduced as part of this process was the adoption of Minimum Service Standards for course delivery (Tickle et al., 2009, p. 1047). These standards were intended to act as a starting point for “integrating learning and teaching strategies that could influence students study habits” and to “encourage academic staff to look beyond existing practices and consider the useful features of the new LMS” (Tickle et al., 2009, p. 1042). To assure the quality of this process, a web-based checklist was implemented in another institutional system, with the expectation that the course coordinator and moderator would actively check that the course site met the minimum standards. A senior lecturer widely recognized as a quality teacher described the process for dealing with the minimum standards checklist as follows:

I go in and tick all the boxes, the moderator goes in and ticks all the boxes and the school secretary does the same thing. It’s just like the exam check list.

The minimum standards checklist was removed in 2011.

A teleological process is not interested in learning and changing, only in achieving the established purpose. The philosophical assumptions of teleological processes – modernism and rationality – are in direct contradiction to the views of learning meant to underpin the best learning and teaching. Rossi and Luck (2011, p. 62) note how “[c]onstructivist views of learning pervade contemporary educational literature, represent the dominant learning theory and are frequently associated with online learning”. Wise and Quealy (2006, p. 899) argue, however, that

while a social constructivist framework may be ideal for understanding the way people learn, it is at odds not only with the implicit instructional design agenda, but also with current university elearning governance and infrastructure.

Staff development sessions become focused on helping the institution achieve the efficient and effective use of the LMS, rather than on quality learning and teaching. This leads to staff developers being “seen as the university’s ‘agent’” (Pettit, 2005, p. 253). There is a reason why Clegg (2002) refers to teleological approaches as the “planning school” of process thought and the alternative, ateleological approach as the “learning school”.

The ISDT abstracted from the Webfuse work includes 11 principles of implementation (i.e. process) divided into 3 groups. The first and third groupings refer more to people and will be covered in the next section. The second grouping focuses explicitly on the process and was titled “An adopter-focused, emergent development process”. Webfuse achieved this by using an information systems development process based on principles of emergent development (Truex et al., 1999) and ateleological design (Introna, 1996). The Webfuse development team was employed and located within the faculty. This allowed a much more in-depth knowledge of individual and organizational needs and an explicit focus on responding to those needs. The quote early in this paper about the origins of the results uploading system is indicative of this. Lastly, at its best Webfuse was able to seek a balance between teleological and ateleological processes due to a Faculty Dean who recognized the significant limitations of a top-down approach.

This process, when combined with a flexible and responsive product, better enabled the Webfuse team to work with the academics and students using the system to actively modify and construct the system in response to what was learned while using it. It was an approach much more in line with a social constructivist philosophy.

References

Allen, J. (2000). Information systems as technological innovation. Information Technology & People, 13(3), 210-221.

Clegg, S. (2002). Management and organization paradoxes. Philadelphia, PA: John Benjamins Publishing.

Gibbs, G., Habeshaw, T., & Yorke, M. (2000). Institutional learning and teaching strategies in English higher education. Higher Education, 40(3), 351-372.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20-39.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. Adelaide.

Jones, D., & Muldoon, N. (2007). The teleological reason why ICTs limit choice for university learners and learning. In R. J. Atkinson, C. McBeath, S. K. A. Soong, & C. Cheers (Eds.), (pp. 450-459). Singapore. Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/jones-d.pdf

Pettit, J. (2005). Conferencing and Workshops: a blend for staff development. Education, Communication & Information, 5(3), 251-263. doi:10.1080/14636310500350505

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60-75. Retrieved from http://www.sleid.cqu.edu.au/include/getdoc.php?id=1122&article=391&mode=pdf

Tickle, K., Muldoon, N., & Tennent, B. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. Auckland, NZ. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/tickle.pdf

Truex, D., Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117-123.

Wise, L., & Quealy, J. (2006). LMS Governance Project Report. Melbourne, Australia: Melbourne-Monash Collaboration in Education Technologies. Retrieved from http://www.infodiv.unimelb.edu.au/telars/talmet/melbmonash/media/LMSGovernanceFinalReport.pdf

A command for organisations? Program or be programmed

I’ve just finished the Douglas Rushkoff book Program or be Programmed: Ten commands for a digital age. As the title suggests the author provides ten “commands” for living well with digital technologies. This post arises from the titular and last command examined in the book, Program or be programmed.

Douglas Rushkoff

This particular command was of interest to me for two reasons. First, it suggests that learning to program is important and that more people should be doing it. As I’m likely to become an information technology high school teacher, there is some significant self-interest in there being a widely accepted importance to learning to program. Second, and the main connection for this post, my experience with and observation of universities is that they tend “to be programmed”, rather than to program. In particular when it comes to e-learning.

This post is some thinking out loud about that experience and the Rushkoff command. In particular, it’s my argument that universities are being programmed by the technology they are using, and I’m wondering why. I’m hoping this will be my last post on these topics; I think I’ve pushed the barrow for all it’s worth. On to new things next.

Program or be programmed

Rushkoff’s (p. 128) point is that

Digital technology is programmed. This makes it biased toward those with the capacity to write the code.

This also gives a bit of a taste of the other commands, i.e. that there are inherent biases in digital technology that can be good or bad. To get the best out of the technology, there are certain behaviours that seem best suited to encouraging the good, rather than the bad.

One of the negative outcomes of not being able to program, of not being able to take advantage of this bias of digital technology, is (p. 15)

…instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery.

But is all digital technology programmed?

In terms of software, yes, it is generally all created by people programming. But not all digital technology is programmable. The majority of the time, money and resources being invested by universities (I’ll stick to unis, though much of what I say may apply more broadly to organisations) is in “enterprise” systems. Originally this was in the form of Enterprise Resource Planning systems (ERPs) like PeopleSoft. It is broadly recognised that modifications to ERPs are not a good idea, and that instead the ERP should be implemented in “vanilla” form (Robey et al., 2002).

That is, rather than modifying the ERP system to respond to the needs of the university, the university should modify its practices to match the operation of the ERP system. This appears to be exactly what Rushkoff warns against: “we are optimizing humans for machinery”.

This is important for e-learning because, I would argue, the Learning Management System (LMS) is essentially an ERP for learning. And I would suggest that much of what goes on around the implementation and support of an LMS within a university is the optimization of humans for machinery. In some specific instances that I’m aware of, it doesn’t matter whether the LMS is open source or not. Why?

Software remains hard to modify

Glass (2001), describing one of the frequently forgotten fundamental facts about software engineering, suggested that maintenance consumes about 40 to 80 percent of software costs, with 60% of the maintenance cost due to enhancement. That is, a significant proportion of the cost of any software system comes from adding new features to it. Remember that this is a general statement; if the software is part of a system that operates within a continually changing context, then the figure is going to be much, much higher.
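A rough back-of-the-envelope calculation shows what these figures imply (the 60% maintenance share below is an assumed mid-range value from Glass’s 40-80% range, not a figure he states):

```python
# Rough illustration of Glass's (2001) figures. The 40-80% maintenance
# range is from Glass; the 60% mid-range figure used here is an assumption.
maintenance_share = 0.60   # maintenance as a share of total software cost
enhancement_share = 0.60   # enhancement as a share of maintenance cost

# Enhancement alone - adding new features - as a share of total cost.
enhancement_of_total = maintenance_share * enhancement_share
print(f"{enhancement_of_total:.0%}")  # 36%
```

In other words, on these assumptions, adding new features alone accounts for roughly a third of the lifetime cost of a system – before any continually changing context pushes it higher.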

Most software engineering remains focused on creation. On the design and implementation of the software. There hasn’t been enough focus on on-going modification, evolution or co-emergence of the software and local needs.

Take Moodle. It’s an LMS. Good and bad like any other LMS. But it’s open source. It is meant to be easy to modify. That’s one of the arguments wheeled out by proponents when institutions are selecting a new LMS. And Moodle and its development processes are fairly flexible. It’s not that hard to add a new activity module to perform some task you want that isn’t supported by the core.

The trouble is that Moodle is currently entering a phase which suggests it suffers much the same problems as most large enterprise software applications. The transition from Moodle 1.x to Moodle 2.0 is highlighting the problems with modification. Some folk are reporting difficulties with the upgrade process; others are deciding to delay the upgrade because some of the third-party modules they use haven’t been converted to Moodle 2. There are even suggestions from some that mirror the “implement vanilla” advice for ERPs.

It appears that “we are optimizing humans for machinery”.

I’m wondering if anyone is doing research into how to make systems like Moodle more readily modifiable for local contexts. At the very least, looking at how (and if) the version upgrade problem can be improved, but also at the ability to modify the core to better suit local requirements. Some of the pieces are already there. One of the difficulties is that to achieve this you would have to cross boundaries between the original developers, service providers (Moodle partners) and the practices of internal IT divisions.

Not everyone wants to program

One reason this will be hard is that not everyone wants to program. Recently, D’Arcy Norman wrote a post talking about the difference between the geeks and folk like his dad. His dad doesn’t want to bother with this techy stuff, he doesn’t want to “program”.

This sort of problem is made worse if you have an IT division whose senior management have backgrounds in non-IT work. For example, an IT director with a background in facilities management isn’t going to understand that IT is protean, that it can be programmed. Familiar with the relative permanence of physical buildings and infrastructure, such a person isn’t going to understand that IT can be changed, that it should be optimized for the human beings using the system.

Organisational structures and processes prevent programming

One of the key arguments in my EDUCAUSE presentation (and my thesis) is that the structures and processes that universities are using to support e-learning are biased away from modification of the system. They are biased towards vanilla implementation.

First, helpdesk provision is treated as a generic task. The folk on the helpdesk are seen as low-level, interchangeable cogs in a machine that provides support for all an organisation’s applications. The responsibility of the helpdesk is to fix known problems quickly. They don’t/can’t become experts in the needs of the users. The systems within which they work don’t encourage, or possibly even allow, the development of deep understanding.

For the more complex software applications there will be an escalation process. If the front-line helpdesk can’t solve the problem, it gets handed up to application experts. These are experts in using the application; they are trained and required to help the user figure out how to use the application to achieve their aims. In other words, these application experts are expert in optimizing the humans for the machinery. For example, if an academic says they want students to have individual journals, a Moodle 1.9 application expert will come back with suggestions about how this might be kludged together with the Moodle wiki or some other Moodle tool. If Moodle 1.9 doesn’t provide a direct match, they figure out how to approximate it with the functionality it does have. The application expert usually can’t suggest using something else.

By this stage, an academic has either given up on the idea, accepted the kludge, gone and done it themselves, or (bravely) decided to escalate the problem further by entering the application governance process. This is the heavyweight, apparently rational process through which requests for additional functionality are weighed against the needs of the organisation and the available resources. If it’s deemed important enough, the new functionality might get scheduled for implementation at some point in the future.

There are many problems with this process:

  • Non-users making the decisions;
    Most of the folk involved in the governance process are not front-line users. They are managers, both IT and organisational. They might include a couple of experts – e-learning and technology – and perhaps a couple of token end-users/academics. These, though, are typically going to be innovators, and not representative of the majority of users.

    What these people see as important or necessary is not going to be representative of what the majority of academic staff/users think is important. In fact, these groups can quickly become biased against the users. I attended one such meeting where the first 10-15 minutes were spent complaining about the foibles of academic staff.

  • Chinese whispers;
    The argument/information presented to such a group will have had to pass through a game of Chinese whispers. An analyst is sent to talk to a few of the users asking for a new feature. The analyst talks to the developers and other folk expert in the application. The analyst’s recommendations will be “vetted” by their manager and possibly other interested parties. The analyst’s recommendation is then described at the governance meeting by someone else.

    All along this line, vested interests, cognitive biases, different frames of references, initial confusion, limited expertise and experience, and a variety of other factors contribute to the original need being morphed into something completely different.

  • Up-front decision making; and
    Finally, many of these requests will have to battle against already-set priorities. As part of the budgeting process, the organisation will already have decided what projects and changes it will be implementing this year. The decisions have been made. Any new requirements have to compete for whatever is left.
  • Competing priorities.
    Last in this list, but not last overall, are competing priorities. The academic attempting to implement individual student journals has as their priority improving the learning experience of the student. They are trying to get the students to engage in reflection and other good practices. This priority has to battle with other priorities.

    The head of the IT division will have as a priority staying within budget and keeping the other senior managers happy with the performance of the IT division. Most of the IT folk will have a priority, or will be told that their priority is, to make the IT division and the head of IT look good. Similarly, and more broadly, the other senior managers on 5-year contracts will have as a priority making sure that the aims of their immediate supervisor are seen to be achieved.

These and other factors lead me to believe that, as currently practiced, the nature of most large organisations is to be programmed. That is, when it comes to using digital technologies they are more likely to optimize the humans within the organisation for the needs of the technology.

Achieving the alternate path – optimizing the machinery for the needs of the humans and the organisation – is not a simple task. It is very difficult. However, by either ignoring or being unaware of the bias of their processes, organisations are sacrificing much of the potential of digital technology. If they can’t figure out how to start programming, such organisations will end up being programmed.

References

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

University e-learning systems: the need for new product and process models and some examples

I’m in the midst of the horrible task of trying to abstract what I think I know about implementing e-learning information systems within universities into the formal “language” required of an information systems design theory and a PhD thesis. This post is a welcome break from that, but is still connected in that it builds on what is perhaps fundamentally different between what most universities are currently doing, and what I think is a more effective approach. In particular, it highlights some more recent developments which are arguably a step towards what I’m thinking.

As it turns out, this post is also an attempt to crystallise some early thinking about what goes into the ISDT, so some of the following is a bit rough. Actually, writing this has identified one perspective I hadn’t thought of, which is potentially important.

Edu 2.0

The post arises from having listened to this interview with Graham Glass, the guy behind Edu 2.0, which is essentially a cloud-based LMS. It’s probably one of a growing number out there. What I found interesting was his description of the product and the process behind Edu 2.0.

In terms of product (i.e. the technology used to provide the e-learning services), the suggestion was that because Edu 2.0 is based in the cloud – in this case on Amazon’s S3 service – it can be updated much more quickly than more traditional institutionally hosted LMSs. There’s some connection here with Google’s approach of making on-going modifications to live software.

Coupled with this product flexibility was a process (i.e. the process through which users were supported and the system evolved) that very much focused on the Edu 2.0 developers interacting with the users of the product. For example, releasing proposals and screenshots of new features within discussion forums populated with users and getting feedback, and responding quickly to requests for fixes or extensions from users – to such an extent that Glass reports users of Edu 2.0 feeling like it is “their Edu 2.0” because it responds so quickly to them and their needs.

The traditional Uni/LMS approach is broken

In the thesis I argue that when you look at how universities are currently implementing e-learning information systems (i.e. selecting and implementing an LMS), the product (the enterprise LMS, the one ring to rule them all) and the process they use are not a very good match for the requirements of effectively supporting learning and teaching. In a nutshell, the product and the process are aimed at reducing diversity and the ability to learn, while diversity is a key characteristic of learning and teaching at a university. Not to mention that when it comes to e-learning within universities, it’s still very early days, and it is essential that any systemic approach to e-learning have the ability to learn from its implementation and make changes.

I attempted to expand on this argument in the presentation I gave at the EDUCAUSE’2009 conference in Denver last year.

What is needed

The alternative I’m trying to propose within the formal language of the ISDT is that e-learning within universities should seek to use a product (i.e. a specific collection of technologies) that is incredibly flexible. The product must, as much as possible, enable rapid, on-going, and sometimes quite significant changes.

To harness this flexibility, the support and development process for e-learning should, rather than be focused on top-down, quality assurance type processes, be focused on closely observing what is being done with the system and using those lessons to modify the product to better suit the diversity of local needs. In particular, the process needs to be adopter focused, which is described by Surry and Farquhar (1997) as seeing the individual choosing to adopt the innovation as the primary force for change.

To some extent, this ability to respond to the local social context can be hard with a software product that has to be used in multiple different contexts, e.g. an LMS used at different institutions.

Slow evolution but not there yet

Not all university e-learning implementations are the same. There has been a gentle evolution away from less flexible products towards more flexible ones, e.g.:

  1. Commercial LMS, hosted on institutional servers.
    Incredibly inflexible. You have to wait for the commercial vendor to see the cost/benefit argument to implement a change in the code base, and then you have to wait until your local IT department can schedule the upgrade to the product.
  2. Open source LMS, hosted on institutional servers.
    Less inflexible. You still have to wait for a developer to see your change as an interesting itch to scratch. This can be quite quick, but it can also be slow. It can be especially quick if your institution has good developers, but good developers cost big money. Even if a developer scratches your itch, the change has to be accepted into the open source code base, which can take some time if it’s a major change. Then, finally, after the code base is changed, you have to wait for your local IT shop to schedule the upgrade.
  3. Open source LMS, with hosting outsourced.
    This can be a bit quicker than the institutional hosted version. Mainly because the hosting company may well have some decent developers and significant knowledge of upgrading the LMS. However, it’s still going to cost a bit, and it’s not going to be real quick.

The cloud-based approach used by Edu 2.0 does offer a product that is potentially more flexible than existing LMS models. However, beyond the general slowness of updating, a change that is very specific to an individual institution is going to cause significant problems, regardless of the product model.

Some alternative product models

The Edu 2.0 model doesn’t help the customisation problem. In fact, it probably makes it a bit worse, as the same code base is being used by hundreds of institutions from across the globe. The model being adopted by Moodle (and probably others) – having plugins you can add – is a step in the right direction, in that institutions can choose to have different plugins installed. However, this model typically assumes that all the plugins use the same API, language, or framework. If they don’t, they can’t be installed on the local server and integrated into the LMS.

This requirement arises because there is an assumption, for many (but not all) plugins, that the plugin provides the entire functionality and must run on the local server. So there is a need for a tighter coupling between the plugin and the LMS, and consequently less local flexibility.

A plugin like BIM is a little different. There is a wrapper that is tightly integrated into Moodle to provide some features. However, the majority of the functionality is provided by software – in this case blogging engines – chosen by the individual students. Here the flexibility is provided by the loose coupling between the blog engines and Moodle.
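The loose-coupling idea can be sketched in a few lines. This is an illustrative sketch in Python rather than BIM’s actual Moodle/PHP code, and the function and sample feed are invented for illustration. The point is that because every mainstream blogging engine can emit a standard RSS feed, the LMS-side wrapper only needs to understand that one format, regardless of which engine each student chose:

```python
# Sketch of loose coupling: the LMS-side wrapper depends only on the
# standard RSS format, not on any particular blogging engine.
import xml.etree.ElementTree as ET

def extract_posts(rss_xml):
    """Return the title and publication date of each post in an RSS feed."""
    root = ET.fromstring(rss_xml)
    return [
        {"title": item.findtext("title"),
         "published": item.findtext("pubDate")}
        for item in root.iter("item")
    ]

# Any engine (WordPress, Blogger, ...) can produce this same format.
SAMPLE_FEED = """<rss version="2.0"><channel>
  <title>A student's reflective journal</title>
  <item><title>Week 1 reflection</title>
    <pubDate>Mon, 01 Mar 2010 10:00:00 GMT</pubDate></item>
  <item><title>Week 2 reflection</title>
    <pubDate>Mon, 08 Mar 2010 10:00:00 GMT</pubDate></item>
</channel></rss>"""

posts = extract_posts(SAMPLE_FEED)
print(len(posts))         # 2
print(posts[0]["title"])  # Week 1 reflection
```

Swapping a student’s WordPress blog for a Blogger blog changes nothing on the LMS side; only the feed URL differs. That is the flexibility that tight, same-API plugins give up.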

Mm, still need some more work on this.

References

Surry, D., & Farquhar, J. (1997). Diffusion Theory and Instructional Technology. e-Journal of Instructional Science and Technology, 2(1), 269-278.