On the silliness of “best practice” – or why you shouldn’t (just) copy successful organisations

The very idea of “best practice” is silly. In any meaningful complex activity, simply copying what someone else did is destined to fail, because it doesn’t seek to understand why that practice worked for them or what the differences are between “them” and “us”.

This post over at 37 Signals expounds a bit more on this and references an article titled Why your startup shouldn’t copy 37signals or Fog Creek. The article gives one explanation of why best practices are silly.

Dave Snowden has an article called Managing for Serendipity: why we should lay off “best practice” in Knowledge Management that takes the discussion even further. Some of the reasons he gives include:

  • Human beings naturally learn more effectively from failure than success.
  • There is only a very limited set of circumstances in which you are able to identify some “best way” of doing something (see wicked problems).
  • It’s very unlikely that we can codify this “best way” in a way that makes it possible for others to fully understand and adopt the practice.
  • People are unlikely to actually follow the best practice.

My favourite one, from a number of sources, is that “best” practice, “good” practice and even “bad” practice from somewhere else tends to be adopted because it is easier than attempting to really understand the local context and draw on expertise and knowledge to develop solutions appropriate to that context.

Doing that is hard. Much easier to see what “important organisation X” has done and copy them. This is where fads come from.

This is a small part of the argument made in the book Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail by Robert Birnbaum that I’m currently reading. More on this soon.

Seven principles of knowledge management and applications to e-learning, curriculum design and L&T in universities

I’ve been a fan of Dave Snowden and his work for a couple of years. In this blog post from last year Dave shares 7 principles for “rendering knowledge”. For me, these 7 principles have a direct connection with the tasks I’m currently involved with: e-learning, curriculum design and helping improve the quality of learning and teaching.

If I had the time and weren’t concentrating on another task I’d expound upon the connections that I see between Snowden’s principles and the tasks I’m currently involved with. I don’t, so I will leave it as an exercise for you. Perhaps I’ll get a chance at some stage.

Your considerations would be greatly improved by taking a look at the keynote presentation on social computing, based on these 7 principles, that Dave gave at the Knowledge Management Asia conference. I listened to the podcast yesterday; slides are also available.

I strongly recommend that anyone working in fields around e-learning, curriculum design etc. listen to this podcast.

For example

Let’s take #2

  • We only know what we know when we need to know it.
    Human knowledge is deeply contextual and requires stimulus for recall. Unlike computers we do not have a list-all function. Small verbal or nonverbal clues can provide those ah-ha moments when a memory or series of memories are suddenly recalled, in context to enable us to act. When we sleep on things we are engaged in a complex organic form of knowledge recall and creation; in contrast a computer would need to be rebooted.

The design of both e-learning software and learning and teaching currently relies a great deal on traditional design processes built around analysis, design, implementation and evaluation. For example, at the start of the process people are asked to reflect and share insights and requirements about the software/learning design, divorced from the reality of actually using the software or learning design. Based on the knowledge generated by that reflection, decisions are made about change.

The trouble is that asking people these questions divorced from the context is never going to get to the real story.

The Dreyfus Model – From Novice to Expert

This presentation by Dave Thomas talks about the Dreyfus Model of Skill Acquisition and how it applies to software development. However, the ideas and insights seem to apply to a number of other contexts, particularly learning and teaching at universities. I certainly found a lot of points that resonated.

The content in this presentation is expanded upon in this book, which is also available here.

Choosing your indicators – why, how and what

The unit I work with is undertaking a project called Blackboard Indicators – essentially the development of a tool that will perform some automated checks on our institution’s Blackboard course sites and show some indicators which might identify potential problems or areas for improvement.

The current status is that we’re starting to develop a slightly better idea of what people are currently doing through use of the literature and also some professional networks (e.g. the Australasian Council on Open, Distance and E-learning) and have an initial prototype running.

Our current problem is how do you choose what the indicators should be? What are the types of problems you might see? What is a “good” course website?

Where are we up to?

Our initial development work has focused on three categories of indicator: course content, coordinator presence and all interactions. There is some more detail in this previous post.

Colin Beer has contributed some additional thinking about some potential indicators in a recent post on his blog.

Col and I have talked about using our blogs and other locations to talk through what we’re thinking to develop a concrete record of our thoughts and hopefully generate some interest from other folk.

Col’s list includes

  • Learner.
  • Instructor.
  • Content.
  • Interactions: learner/learner, learner/instructor, learner/content, instructor/content
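To make the kind of automated checks being discussed concrete, here is a minimal sketch of indicators along the lines of the categories above. The record layout, field names and thresholds are hypothetical illustrations of the idea, not the actual Blackboard Indicators code:

```python
# Hypothetical sketch of simple course site indicators. The record
# layout, thresholds and indicator names are illustrative only.

def course_indicators(site):
    """Return a dict of True/False indicator flags for one course site."""
    return {
        # Course content: a site with almost no content items is suspect
        "sparse_content": site["content_items"] < 5,
        # Coordinator presence: no coordinator activity for a fortnight
        "absent_coordinator": site["days_since_coordinator_visit"] > 14,
        # Interactions: no student posts suggests a dead discussion area
        "no_interaction": site["student_posts"] == 0,
        # The most basic check of all: is the site visible to students?
        "unavailable": not site["available_to_students"],
    }

site = {
    "content_items": 2,
    "days_since_coordinator_visit": 30,
    "student_posts": 0,
    "available_to_students": True,
}
problems = [name for name, flagged in course_indicators(site).items() if flagged]
# problems → ["sparse_content", "absent_coordinator", "no_interaction"]
```

The point of such a sketch is that each indicator is cheap, automated and openly debatable – exactly the sort of thing that can be put up early and then refined through the kind of discussion Col and I are hoping to generate.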

Why and what?

In identifying a list of indicators, as when trying to evaluate anything, it’s probably a good idea to start with a clear definition of why you are doing this and what you are trying to achieve.

The stated purpose of this project is to help us develop a better understanding of how and how well staff are using the Blackboard course sites. In particular, we want to know about any potential problems (e.g. a course site not being available to students) that might cause a large amount of “helpdesk activity”. We would also like to know about trends across the board which might indicate the need for some staff development, improvements in the tools or some support resources to improve the experience of both staff and students.

There are many other aims which might apply, but this is the one I feel most comfortable with, at the moment.

Some of the other aims include

  • Providing academic staff with a tool that can aid them during course site creation by checking their work and offering guidance on what might be missing.
  • Providing management with a tool to “check on” course sites they are responsible for.
  • Identifying correlations between characteristics of a course website and success.

The constraints we need to work within include

  • Little or no resources – implication being that manual, human checking of course sites is not currently a possibility.
  • Difficult organisational context due to on-going restructure – which makes it hard to get engagement from staff in a task that is seen as additional to existing practice and also suggests a need to be helping staff deal with existing problems more so than creating more work. A need to be seen to be working with staff to improve and change, rather than being seen as inflicting change upon them.
  • LMS will be changing – come 2010 we’ll be using a new LMS, whatever we’re doing has to be transportable.


From one perspective there are two types of process which can be used in a project like this

  1. Teleological or idealist.
    A group of experts get together, decide and design what is going to happen and then explain to everyone else why they should use it and seek to maintain obedience to that original design.
  2. Ateleological or naturalist.
    A group of folk, including significant numbers of people doing the real work, collaborate to examine the current state of the local context and undertake a lot of small-scale experiments to figure out what makes sense. They examine and reflect on those experiments, chuck out the ones that didn’t work and build on the ones that did.

(For more on this check out: this presentation video or this presentation video or this paper or this one.)

From the biased way I explained the choices I think it’s fairly obvious which approach I prefer. A preference for the ateleological approach also means that I’m not likely to want to spend vast amounts of time evaluating and designing criteria based on my perspectives. It’s more important to get a set of useful indicators up and going, in a form that can be accessed by folk, and to have a range of processes by which discussion and debate are encouraged and then fed back into the improvement of the design.

The on-going discussion about the project is more likely to generate something more useful and contextually important than large up-front analysis.

What next then?

As a first step, we have to get something useful (for both us and others) up and going in a form that is usable and meaningful. We then have to engage with the folk using it and find out what they think and where they’d like to take it next. In parallel with this is the idea of finding out, in more detail, what other institutions are doing and seeing what we can learn.

The engagement is likely going to need to be aimed at a number of different communities including

  • Quality assurance folk: most Australian universities have quality assurance folk charged with helping the university be seen by AUQA as being good.
    This will almost certainly, eventually, require identifying what are effective/good outcomes for a course website as outcomes are a main aim for the next AUQA round.
  • Management folk: the managers/supervisors at CQU who are responsible for the quality of learning and teaching at CQU.
  • Teaching staff: the people responsible for creating these artifacts.
  • Students: for their insights.

Initially, the indicators we develop should match our stated aim – to identify problems with course sites and become more aware of how they are being used. To a large extent this means not worrying about potential indicators of good outcomes and whether or not there is a causal link.

I think we’ll start discussing/describing the indicators we’re using and thinking about on a project page and we’ll see where we go from there.

Alternate foundations – the presentation

A previous post outlined the abstract for a presentation I gave last Monday on some alternate foundations for leadership of learning and teaching at CQUniversity. Well, I’ve finally got the video and slides online, so this post reflects on the presentation and gives access to the multimedia resources.


It seemed to go over well but there’s significant room for improvement.

The basketball video worked well this time, mainly because the introduction was much better handled.

What was missing

  • Didn’t make the distinction between safe-fail and fail-safe projects.
  • Not enough time on implications, strategies and approaches to work with this alternate foundation.
  • The description of the different parts of the Cynefin Framework was not good.

The second point about strategies of working within this area is important as the thinking outlined in the presentation is hopefully going to inform the PLEs@CQUni project.

The resources

The video of the presentation is on Google Video

The slides are on Slideshare

Some alternate foundations for leadership in L&T at CQUniversity

On Monday the 25th of August I am meant to be giving a talk that attempts to link complexity theory (and related topics) to the practice of leadership of learning and teaching within a university setting. The talk is part of a broader seminar series occurring this year at CQUniversity as part of the institution’s learning and teaching seminars. The leadership in L&T series is being pushed/encouraged by Dr Peter Reaburn.

This, and perhaps a couple of other blog posts, is meant to be part of a small experiment in the use of social software. The abstract of the talk that goes out to CQUniversity staff will mention this blog post and some related del.icio.us bookmarks. I actually don’t expect it to work all that well as I don’t have the energy to do the necessary preparations.

Enough guff, what follows is the current abstract that will get sent out.


Some alternate foundations for leadership in L&T at CQUniversity


Over recent years an increasing interest in improving the quality of university learning and teaching has driven a number of projects such as the ALTC, LTPF and AUQA. One of the more recent areas of interest has been the question of learning and teaching leaders. In 2006 and 2007 ALTC funded 20 projects worth about $3.4M around leadership in learning and teaching. Locally, there has been a series of CQUniversity L&T seminars focusing on the question of leadership in L&T.

This presentation arises from a long-term sense of disquiet about the foundations of much of this work, an on-going attempt to identify the source of this disquiet and find alternate, hopefully better, foundations. The presentation will attempt to illustrate the disquiet and explain how insights from a number of sources (see some references below) might help provide alternate foundations. It will briefly discuss the implications these alternate foundations may have for the practice of L&T at CQUniversity.

This presentation is very much a work in progress and is aimed at generating an on-going discussion about this topic and its application at CQUniversity. Some parts of that discussion and the gathering of related resources are already occurring online – feel free to join in.

References and Resources

Snowden, D. and M. Boone (2007). A leader’s framework for decision making. Harvard Business Review, 85(11): 68-76.

Lakomski, G. (2005). Managing without Leadership: Towards a Theory of Organizational Functioning. Elsevier Science.

Davis, B. and D. Sumara (2006). Complexity and education: Inquiries into learning, teaching, and research. Mahwah, New Jersey: Lawrence Erlbaum Associates.

Initial thoughts from CogEdge accreditation course

As I’ve mentioned before, Myers-Briggs puts me into the INTP box – a Keirsey Architect-Rational – which, amongst many other things, means I have an interest in figuring out the structure of things.

As part of that interest in “figuring out the structure” I spent three days last week in Canberra at a Cognitive Edge accreditation course. It was primarily run by Dave Snowden (you know that a man with his own Wikipedia page must be important) who, along with others, has significant criticisms of the Myers-Briggs stuff. The course aims to bring people up to speed with Cognitive Edge’s approach, methods and tools for management and the social sciences.

Since this paper in 2000, like many software people who found a resonance with agile software development, I’ve been struggling to incorporate ideas connected with complex adaptive systems into my practice. Through that interest I’ve been reading Dave’s blog and publications and listening to his presentations for some time. When the opportunity to attend one of his courses arose, I jumped at the chance.

This post serves two main roles:

  1. The trip report I need to generate to explain my absence from CQU for a week.
  2. Forcing me to write down some immediate thoughts about how it might be applied at CQU before I forget.

Over the coming weeks on this blog I will attempt to engage with, reflect on and integrate into my context the huge amount of information that was funneled my way during the week. Some of that starts here, but I’m likely to be spending years engaging with some of these ideas.

What’s the summary

In essence the Cognitive Edge approach is to take insights from science, in particular complex adaptive systems theory, cognitive science and techniques from other disciplines and apply them to social science, in particular management.

That’s not particularly insightful or original – it’s essentially a rephrasing of the session blurb. In my defence, I don’t think I can come up with a better description, and it is important to state because the Cognitive Edge approach seriously questions many of the fundamental assumptions of current practices in management and the social sciences.

It’s also important to note that the CogEdge approach only questions these assumptions in certain contexts. The approach does not claim universality, nor does it accept claims of universality from other approaches.

That said, the CogEdge approach does provide a number of theoretical foundations upon which to question much of what passes for practice within the Australian higher education sector and within organisations more broadly. I’ll attempt to give some examples in a later section. The next few sub-sections provide a brief overview of some of these theoretical foundations. I’ll try and pick up these foundations and their implications for practice at CQU and within higher education at a later date.

The Cynefin Framework

At the centre of the CogEdge approach is the Cynefin framework.

The Wikipedia page describes it as a decision making framework. Throughout the course we were shown a range of contexts in which it can be used to guide people in making decisions. The Wikipedia page lists knowledge management, conflict resolution and leadership. During the course there were others mentioned including software development.

My summary (see the Wikipedia page for a better one) is that the framework is based on the idea that there are five different types of system (the fifth – disorder – is when you don’t know which of the other four systems you’re dealing with). Most existing principles are based on the idea of there being just one type of system: an ordered system. The type of system where causality is straightforward, and one that the right leader(ship group) can fully understand, designing (or, more likely, adopting from elsewhere) interventions that will achieve some desired outcome.

If the intervention happens to fail, then it is a problem with the implementation of the intervention. Someone failed, there wasn’t enough communication, not enough attention paid to the appropriate culture and values etc.

The Cynefin Framework suggests that there are five different contexts. This suggests an alternate perspective on failure: that the nature of the approach was not appropriate for the type of system.

A good example of this mismatch is the story which Dave regularly tells about the children’s birthday party. Some examples of this include: an mp3 audio description (taken from this presentation) or a blog post that points to a video offering a much more detailed description.

The kid’s birthday party is an example of what the Cynefin framework calls a complex system. The traditional management-by-objectives approach originally suggested for use is appropriate for the complicated and simple sectors of the Cynefin framework, but not the complex.

Everything is fragmented

“Everything is fragmented” was a common refrain during the course. It draws on what cognitive science has found out about human cognition. The ideal is that human beings are rational decision makers. We gather all the data, consider the problem from all angles, perhaps consult some experts and then make the best decision (we optimize).

In reality, the human brain only gets access to small fragments of the information that is presented. We compare those small fragments against the known patterns we have in our brain (our past experience) and then choose the first match (we satisfice). The argument is that we take fragments of information and assemble them into something, somewhat meaningful.

The CogEdge approach recognises this and its methods and software are designed to build on this strength.

Approach, methods and software

The CogEdge approach is called “naturalising sensemaking”. Dave offers a simple definition of sensemaking here

the way in which we make sense of the world so that we can act in it

Kurtz and Snowden provide a comparison between what passes for the traditional approach within organisations (idealistic) and their approach (naturalistic). I’ve tried to summarise the comparison below.

  • Idealistic: identify the future state and implement approaches to achieve that state. Naturalistic: gain sufficient understanding of the present context, choose projects to stimulate the evolution of the system, monitor that evolution and intervene as necessary.
  • Idealistic: emphasis is on expert knowledge and its analysis and interpretation. Naturalistic: emphasis is on the inherent un-knowability of a complex system, which means affording no privilege to expert interpretation and instead favouring emergent meaning at the coal-face.
  • Idealistic: diagnosis precedes and is separate from intervention; diagnosis/research identifies best practice and informs interventions to close the gap between now and the identified future state. Naturalistic: all diagnoses are also interventions, and all interventions provide an opportunity for diagnosis.

As well as providing the theoretical basis for these views the CogEdge approach also provides a collection of methods that help management actually act within a naturalistic, sense-making approach. It isn’t an approach that says step back and let it all happen.

There is also the SenseMaker Suite – software that supports (and is supported by) the methods and is informed by the same theoretical insights.

Things to question

Based on the theoretical perspective taken by CogEdge it is possible to raise a range of questions (many of a very serious nature) against a range of current practices within the Australian higher education sector. The following list is a collection of suggestions; I need to work more on these.

The content of this list is based on my assumption that learning and teaching within a current Australian university is a complex system and fits into the complex sector of the Cynefin framework. I believe all of the following practices only work within the simple or complicated sectors of the Cynefin framework.

My initial list includes the following. Where possible I’ve attempted to list some of the flaws of each approach within the complex sector of the Cynefin framework:

  • Quality assurance.
    QA assumes you can document all your processes and that the written-down processes are reasonably complete. It assumes you can predict the future. As practiced by AUQA it assumes that a small collection of auditors from outside the organisational context can come in, look around for a few days and make informed comments on the validity of what is being done. It assumes that these auditors are experts making rational decisions, not pattern-matchers fitting what they see against their past experience.
  • Carrick grants emphasising cross institutional projects to encourage adoption.
    Still thinking about this one, but my current unease is based on the belief of the uniqueness of each context and the difficulty of moving the same innovation across different institutional contexts as is.
  • Requiring teaching qualifications from new academic staff.
    There is an assumption that the quality of university learning and teaching can be increased by requiring all new academic staff to complete a graduate certificate in learning and teaching. This assumes that folk won’t game the requirement, i.e. complete the grad. cert. and then ignore the majority of what they “learnt” when they return to a context which does not value or reward good teaching. It also assumes that academics will gain access to the knowledge they need to improve in such a grad. cert. – a situation in which they are normally not going to be developing a great deal of TPCK, i.e. the knowledge they get won’t be contextualised to their unique situation.
  • The application of traditional, plan-driven technology governance and management models to the practice of e-learning.
    Such models are inherently idealistic and simply do not work well for a practice that is inherently complex.
  • Current evaluation of learning and teaching.
    The current surveys given to students at the end of term are generally out of context (i.e. applied after the student has had the positive/negative experience). The use of surveys also limits the breadth of the information students can provide to whatever is enshrined in the questions. The course barometer idea we’ve been playing with for a long time is a small step in the right direction.

There are many more, but it’s getting past time to post this.

Possible projects

Throughout the course there were all sorts of ideas about how aspects of the CogEdge approach could be applied to improve learning and teaching at CQU. Of course, many of these have been lost or are still in my notebooks waiting to be saved.

A first step would be to fix the practices outlined in the previous section, which I believe are now highly questionable. Some others include

  • Implement a learning and teaching innovation scheme based on some of the ideas of the Grameen bank.
    e.g. if at least 3 academics from different disciplines can develop an idea for a particular L&T innovation and agree to help each other implement it in each of their courses, then it gets supported immediately. No evaluation by an “expert panel”.
  • Expand/integrate the course barometer idea to collect stories from students (and staff?) during the term and have those stories placed into the SenseMaker software.
    This could significantly increase CQU’s ability to pick up weak signals about trouble (but also about things that are working) and be able to intervene. Not to mention generating a strong collection of evidence to use with AUQA etc.
  • Use a number of the different CogEdge methods to help create a context in which quality learning and teaching arise more naturally.

There are many others, but it’s time to get this post, posted.


I’ve been a believer in complexity-informed, bottom-up approaches for a long time. My mind has a collection of patterns about this stuff to which I am positively inclined. Hence it is no great surprise that the CogEdge approach resonates very strongly with me.

Your mileage may vary.

In fact, I’d imagine that most hard-core, plan-driven IT folk, those in the business process re-engineering and quality assurance worlds, and others from a traditional top-down management school probably disagree strongly with all of the above.

If so, please feel free to comment. Let’s get a dialectic going.

I’m also still processing all of the material covered in the three day course and in the additional readings. This post was done over a few days in different locations, so there are certain to be inconsistencies, typos, poor grammar and basic mistakes.

If so, please feel free to correct.

From scarcity to over abundance – paradigm change for IT departments (and others)

Nothing all that new in this post, at least not that others haven’t talked about previously. But writing this helps me think about a few things.

Paradigms, good and bad

A paradigm can be/has been defined as a particular collection of beliefs and ways of seeing the world – perhaps as the series of high-level abstractions which a particular community creates to enable very quick communication. For this purpose a common paradigm/collection of abstractions is incredibly useful, especially within a discipline. It provides members of a community from throughout a wide geographic area with a shared language which they can use.

It also has a down side, paradigm paralysis. The high level abstractions, the ways of seeing the world, become so ingrained that members of that community are unable to see outside of that paradigm. A good example is the longitude problem where established experts ignored an innovation from a non-expert because it fell outside of their paradigm, their way of looking at the world.

Based on my previous posts it is no great surprise to find out that I think that there is currently a similar problem going on with the practice of IT provision within organisations.

What’s changed

The paradigm around organisational IT provision arose within a context that was very different. A context that existed for quite some time, but is now undergoing a significant shift caused by (at least) three factors

  1. The rise of really cheap, almost ubiquitous computer hardware.
  2. The rise of cheap (sometimes free), easy to use software.
  3. The spread of computer literacy beyond the high priests of ITD.

The major change is that what was once scarce and had to be managed as a scarce resource (hardware, software and expertise) is now available in abundance.


Hardware

From the 50s until recently, hardware was really, really expensive, generally under-powered and consequently had to be protected and managed. For example, in the late 1960s in the USA there weren’t too many human endeavours that would have had more available computing power than the Apollo 11 moon landing. And yet, in modern terms, it was a pitifully under-resourced enterprise.

Mission control, the folk on earth responsible for controlling/supporting the flight, had access to computing power equivalent to (probably less than) the MacBook Pro I’m writing this blog entry with. The lunar module, the bit that took the astronauts from moon orbit down to the surface and then back again, is said to have had less power than the digital watch I am currently wearing.

Moore’s law means that available computing power increases exponentially, with a corresponding fall in price.
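As a rough back-of-the-envelope illustration (assuming the commonly quoted doubling period of about two years, and ignoring all the real-world caveats), the compounding from the Apollo era to around the time of writing looks like this:

```python
# Back-of-the-envelope Moore's law illustration: assume computing power
# doubles roughly every two years (a common simplification).

def growth_factor(years, doubling_period=2):
    """Approximate growth in computing power over a span of years."""
    return 2 ** (years // doubling_period)

# From the 1969 moon landing to 2008: about 19 doublings.
factor = growth_factor(2008 - 1969)
# factor → 524288, i.e. roughly half a million times more power
```

Even if the real figure is off by an order of magnitude, the point stands: what was once scarce is now abundant.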


Software

Software has traditionally been something you had to purchase – originally, only from the manufacturer of the hardware you used. Then software vendors arose as hardware became more prevalent. Then there was public domain software, open source software and, recently, Web 2.0 software.

Not only was more software available through these alternate approaches, but this software also became easier to use. There are at least half a dozen free blog services and a similar number of email services available on the Web, all offering a better user experience than similar services provided by organisations.

Knowledge and literacy

The primitive nature of the “old” computers meant that they were very difficult to program and support. Since their introduction, however, the ability to maintain and manipulate computers in order to achieve something useful has become increasingly easy. Originally, it was only the academics, scientists and engineers who were designing computers who could maintain and manipulate them. Eventually a profession arose around the maintenance and manipulation of computers. As the evolution continued, teenage boys of a certain social grouping became extremely proficient, through to today when increasing numbers (but still not the majority) are able to maintain and manipulate computers to achieve their ends.

At the same time the spread of computers meant that more and more children grew up with computers. A number of the “uber-nerds” of the 60s and 70s had parents who worked in industries that enabled them to access computers and grow up with them. Today it is increasingly rare for anyone not to grow up with some familiarity with technology.

For example, Africa has the fastest growing adoption rate of mobile phones in the world. I recently read that the diffusion of mobile phones in South Africa is put at 98%.

Yes, there is still a place for professionals. But the increasing power and ease of use of computers means that their place is increasingly not about providing specialised services for a particular organisation, but instead providing generalised platforms which the increasingly informed general public can manipulate and use without the need for IT professionals.

For example, there’s an increasingly limited need (not quite no need) for an organisation to provide an email service when there are numerous free email services that are generally more reliable, more accessible and provide greater functionality than internal organisational services.

From scarcity to abundance

The paradigm of traditional IT governance etc. is based around the idea that hardware, software and literacy are scarce. This is no longer the case. All are abundant. This implies that new approaches are possible, perhaps even desirable and necessary.

This isn’t something that just applies to IT departments. The line of work I’m in, broadly speaking “e-learning”, is also influenced by this idea. The requirement for universities to provide learning management systems is becoming increasingly questionable, especially if you believe this change from scarcity to abundance suggests the need for a paradigm change.

The question for me is what will the new paradigm be? What problems will it create that need to be addressed? Not just the problems caused by an old paradigm battling a new paradigm, the problems that the new paradigm will have. What shape will the new paradigm take? How can organisations make use of this change?

Some initial thoughts from others – better than free.

A related question is what impact will this have on the design of learning and teaching?