
Situated shared practice, curriculum design and academic development

Am currently reading Faegri et al (2010) as part of developing the justificatory knowledge for the final ISDT for e-learning that is meant to be the contribution of the thesis. The principle from the ISDT that this paper connects with is the idea of a “Multi-skilled, integrated development and support team” (the name is a work in progress). The following is simply a placeholder for a quote from the paper and a brief connection with the ISDT and what I think it means for curriculum design and academic development.

The quote

The paper itself is talking about an action research project where job rotation was introduced into a software development firm with the aim of increasing the quality of the knowledge held by software developers. The basic finding was that, in this case, there were some benefits; however, the problems outweighed them. I haven't read all the way through; I'm currently working through the literature review. The following quote is from the review.

Key enabling factors for knowledge creation is knowledge sharing and integration [36,54]. Research in organizational learning has emphasized the value of practice; people acquire and share knowledge in socially situated work. Learning in the organization occurs in the interplay between tacit and explicit knowledge while it crosses boundaries of groups, departments, and organizations as people participate in work [17,54]. The process should be situated in shared practice with a joint, collective purpose [12,14,15].

Another related quote

The following is from a bit more related reading, in particular Seely Brown & Duguid (1991) – emphasis added

The source of the oppositions perceived between working, learning, and innovating lies primarily in the gulf between precepts and practice. Formal descriptions of work (e.g., “office procedures”) and of learning (e.g., “subject matter”) are abstracted from actual practice. They inevitably and intentionally omit the details. In a society that attaches particular value to “abstract knowledge,” the details of practice have come to be seen as nonessential, unimportant, and easily developed once the relevant abstractions have been grasped. Thus education, training, and technology design generally focus on abstract representations to the detriment, if not exclusion of actual practice. We, by contrast, suggest that practice is central to understanding work. Abstractions detached from practice distort or obscure intricacies of that practice. Without a clear understanding of those intricacies and the role they play, the practice itself cannot be well understood, engendered (through training), or enhanced (through innovation).

Relevance?

I see this as highly relevant to the question of how to improve learning and teaching in universities, especially in terms of the practice of e-learning, curriculum design and academic development. It’s my suggestion that the common approaches to these tasks in most universities ignore the key enabling factors mentioned in the above quote.

For example, the e-learning designers/developers, curriculum designers and academic developers are generally not directly involved with the everyday practice of learning and teaching within the institution. As a result, the teaching academics and these other support staff don't get the benefit of shared practice.

A further impediment to shared practice is the divisions between e-learning support staff, curriculum designers and academic developers that are introduced by organisational hierarchies. At one stage, I worked at a university where the e-learning support people reported to the IT division, the academic staff developers reported to the HR division, the curriculum designers reported to the library, and teaching academics were organised into faculties. There wasn’t a common shared practice amongst these folk.

Instead, any sharing that did occur happened either in high-level project or management boards and committees, or in design projects prior to implementation. The separation reduced the ability to combine, share and create new knowledge about what was possible.

The resulting problem

The following quote is from Seely Brown and Duguid (1991)

Because this corporation’s training programs follow a similar downskilling approach, the reps regard them as generally unhelpful. As a result, a wedge is driven between the corporation and its reps: the corporation assumes the reps are untrainable, uncooperative, and unskilled; whereas the reps view the overly simplistic training programs as a reflection of the corporation’s low estimation of their worth and skills. In fact, their valuation is a testament to the depth of the rep’s insight. They recognize the superficiality of the training because they are conscious of the full complexity of the technology and what it takes to keep it running. The corporation, on the other hand, blinkered by its implicit faith in formal training and canonical practice and its misinterpretation of the rep’s behavior, is unable to appreciate either aspect of their insight.

It resonates strongly with some recent experience of mine at an institution rolling out a new LMS. The training programs around the new LMS, the view of management, and the subsequent response from the academics showed some very strong resemblances to the situation described above.

An alternative

One alternative is what I'm proposing in the ISDT for e-learning. The following is an initial description of the roles/purpose of the "Multi-skilled, integrated development and support team". Without too much effort you could probably translate this into broader learning and teaching, not just e-learning. Heaven forbid, you could even use it for "blended learning".

An emergent university e-learning information system should have a team of people that:

  • is responsible for performing the necessary training, development, helpdesk, and other support tasks required by system use within the institution;
  • contains an appropriate combination of technical, training, media design and production, institutional, and learning and teaching skills and knowledge;
  • is integrated, through the performance of its allocated tasks, into the everyday practice of learning and teaching within the institution, and cultivates relationships with system users, especially teaching staff;
  • is integrated into the one organisational unit, and as much as possible, co-located;
  • can rapidly make small-scale changes to the system – in response to problems, observations, and lessons learned during system support and training tasks – without needing formal governance approval;
  • actively examines and reflects on system use and non-use – with a particular emphasis on identifying and examining what early innovators are doing – to identify areas for system improvement and extension;
  • is able to identify and to raise the need for large scale changes to the system with an appropriate governance process; and
  • is trusted by organisational leadership to translate organisational goals into changes within the system, its support and use.

References

Faegri, T. E., Dyba, T., & Dingsoyr, T. (2010). Introducing knowledge redundancy practice in software development: Experiences with job rotation in support work. Information and Software Technology, 52(10), 1118-1132.

Seely Brown, J., & Duguid, P. (1991). Organizational learning and communities-of-practice: Toward a unified view of working, learning, and innovation. Organization Science, 2(1), 40-57.

The rider, elephant, and shaping the path

Listened to this interview with Chip Heath, a Stanford professor in organizational behaviour, about his co-authored book Switch: How to change things when change is hard. My particular interest in this arises from figuring out how to improve learning and teaching in universities. From the interview and the podcast, this seems to be another in a line of "popular science" books aimed at making clear what science/research knows about the topic.

The basic summary of the findings seems to be this: if you wish to make change more likely, then your approach has to (metaphorically):

  • direct the rider;
    The rider represents the rational/analytical decision making capability of an individual. This capability needs to be appropriately directed.
  • engage the elephant; and
    The elephant represents the individual’s emotional/instinctive decision making approach. From the interview, the elephant/rider metaphor has the express purpose of showing that the elephant is far stronger than the rider. In typical situations, the elephant is going to win, unless there’s some engagement.
  • shape the path.
    This represents the physical and related environment in which the change is going to take place. My recollection is that the shaping has to support the first two components, but also be designed to make it easier to traverse the path and get to the goal.

There are two parts of the discussion that stuck with me as I think they connect with the task of improving learning and teaching within universities.

  1. The over-rationalisation of experts.
  2. Small scale wins.

Over-rationalisation of experts

The connection between organisational change and losing weight seems increasingly common; it's one I've used and it's mentioned in the interview. One example used in the interview shows how a major problem with change is that it is driven by experts – experts who have significantly larger "riders" (i.e. rational/analytical knowledge of the problem area/target of change) than the people they are trying to change. This overly large rider leads to change mechanisms that over-complicate things.

The example they use is the recently modified food pyramid from the United States, which makes suggestions something like, "For a balanced diet you should consume X tablespoons of Y a day". While this makes sense to the experts, a normal person has no idea of how many tablespoons of Y are in their daily diet. In order to achieve the desired change, the individual needs to develop all sorts of additional knowledge and expertise. Which is just not likely.

They compare this with some US-based populariser of weight loss who proposes much simpler suggestions e.g. “Don’t eat anything that comes through your car window”. It’s a simpler, more evocative suggestion that appears to be easier for the rider to understand and helps engage the elephant somewhat.

I can see the equivalent of this within learning and teaching in higher education. Change processes are typically conceived and managed by experts – experts who over-rationalise.

Small scale wins

Related to the above is the idea that change always involves barriers that have to be stepped over. Change is difficult. The suggestion is that when shaping the path you want to design it in such a way that the elephant can almost just walk over the barrier. The interviewer gives the example of never being able to get her teenage sons to stop taking towels out of the bathroom and into their bedroom. Eventually what worked was "shaping the path" by storing the sons' underwear in the bathroom, not their bedroom.

When it comes to improving learning and teaching in universities, I don’t think enough attention is paid to “shaping the path” like this. I think this is in part due to the process being driven by the experts, so they simply don’t see the need. But it is also, increasingly, due to the fact that the people involved can’t shape the path. Some of the reasons the path can’t be shaped include:

  • Changing the “research is what gets me promoted” culture in higher education is very, very difficult and not likely to happen effectively if just one institution does it.
  • The L&T "path" (e.g. the LMS product model or the physical infrastructure of a campus) is not exactly set up to enable "shaping".
  • The people involved at a university, especially in e-learning, don’t have the skills or the organisational structure to enable “shaping”.

30% of information about task performance

Over on the Remote Learner blog, Jason Cole has posted some information about a keynote by Dr Richard Clark at one of the US MoodleMoots. I want to focus on one key quote from that talk and its implications for Australian higher education and current trends to “improve” learning and teaching and adopt open source LMS (like Moodle).

It's my argument that this quote, and the research behind it, have implications for the way these projects are conceptualised and run – i.e. they are missing out on a huge amount of potential.

Task analysis and the 30%

The quote from the presentation is

In task analysis, top experts only provide 30% of information about how they perform tasks.

It's claimed that all the points made by Clark in his presentation are supported by research. It appears likely that the support for this claim comes from Sullivan et al (2008). This paper addresses the problem of trying to develop the procedural skills necessary for professions such as surgery.

The above quote arises from the problems experts have in describing what they do. Sullivan et al (2008) offer various descriptions of, and references to, this problem in their introduction

This is often difficult because as physicians gain expertise their skills become automated and the steps of the skill blend together [2]. Automated knowledge is achieved by years of practice and experience, wherein the basic elements of the task are performed largely without conscious awareness [3]. This causes experts to omit specific steps when trying to describe a procedure because this information is no longer accessible to conscious processes [2]

Then later, when describing the findings of their research, they write

The fact that the experts were not able to articulate all of the steps and decisions of the task is consistent with the expertise literature that shows that expertise is highly automated [2,3,5] and that experts make errors when trying to describe how they complete a task [3,6,7]. In essence, as the experts developed expertise, their knowledge of the task changed from declarative to procedural knowledge. Declarative knowledge is knowing facts, events, and objects and is found in our conscious working memory [2]. Procedural knowledge is knowing how to perform a task and includes both motor and cognitive skills [2]. Procedural knowledge is automated and operates outside of conscious awareness [2,3]. Once a skill becomes automated, it is fine-tuned to run on autopilot and executes much faster than conscious processes [2,8]. This causes the expert to omit steps and decision points while teaching a procedure because they have literally lost access to the behaviors and cognitive decisions that are made during skill execution [2,5].

The link to analysis and design

A large number of universities within Australia are either:

  1. Changing their LMS to an open source LMS (e.g. Moodle or Sakai), and using this as an opportunity to “renew” their online learning; and/or
  2. Busy on broader interventions to “renew” their online learning due to changes in government policies such as quality assurance, graduate attributes and a move to demand funding for university places.

The common process being adopted by most of these projects comes from the planning school of process: you undertake analysis to identify all relevant, objective information and then design the solution on that basis. You then employ a project team to ensure that the design gets implemented, and finally you put in a skeleton team that maintains the design. This applies both to information systems (e.g. the selection, implementation and support of an LMS) and to broader organisational change (e.g. strategic plans).

The problem is that the “expert problem” Clark refers to above means that it is difficult to gather all the necessary information. It’s difficult to get the people with the knowledge to tell all that they know.

A related example follows.

The Staff MyCQU Example

Some colleagues and I – over a period of almost 10 years – designed, supported, and evolved an information system called Staff MyCQU. An early part of its evolution is described in the "Student Records" section of this paper. It was a fairly simple web application that provided university staff with access to student records and a range of related services. Over its life cycle, a range of new and different features were added and existing features tweaked, all in response to interactions with the system's users.

Importantly, the system's developers were also generally the people handling user queries and problems on the "helpdesk". Quite often, those helpdesk interactions would result in tweaks and changes to the system. Rather than being designed up front, the system grew and changed with the people using it.

The technology used to implement Staff MyCQU is now deemed ancient and, even more importantly, the system and what it represents is now politically tainted within the organisation. Hence, for the last year or so, the information technology folk at the institution have been working on replacement systems. Just recently, there have been some concrete outcomes of that work, which has resulted in systems being shown to folk, including some of the folk who had used Staff MyCQU. On being shown a particular feature of the new system, it soon became obvious that the system didn't include a fairly common extension of the feature – an extension that had actually been in Staff MyCQU from the start.

The designers of the new system, with little or no direct connection with actual users doing actual work, don’t have the knowledge about user needs to design a system that is equivalent to what already exists. A perfect example of why the strict separation of analysis, design, implementation and use/maintenance that is explicit in most IT projects and divisions is a significant problem.

The need for growing knowledge

Sullivan et al (2008) suggest cognitive task analysis as a way of better "getting at" the knowledge held by the expert, and there's a place for that. However, I also think there needs to be recognition that the engineering/planning method is just not appropriate for some contexts. In some contexts, you need more of a growing/gardening approach. Or, in some cases, you need to include more of the growing/gardening approach in your engineering method.

Rather than seeking to gather and analyse all knowledge separate from practice and prior to implementation, implementation needs to be designed to pay close attention to the knowledge that is generated during implementation, and to build the ability to act upon that knowledge.

Especially for wicked problems and complex systems

Trying to improve learning and teaching within a university is a wicked problem. There are many different stakeholders or groups of stakeholders, each with a different frame of reference, which leads to different understandings of how to solve the problem. Simple techno-rational solutions to wicked problems rely on the adoption of one of those frames of reference and ignorance of the remainder.

For example, implementation of a new LMS is seen as an information technology problem and treated as such. Consequently, success is measured by uptime and successful project implementation, not by the quality of learning and teaching that results.

In addition, as you solve wicked problems, you and all of the stakeholders learn more about the problem. The multiple frames of reference change and consequently the appropriate solutions change. This is getting into the area of complex adaptive systems. Dave Snowden has a recent post about why human complex adaptive systems are different.

Prediction

Universities that lean too heavily on engineering/planning approaches to improving learning and teaching will fail. However, they are likely to appear to succeed due to the types of indicators they choose to adopt as measurements of success, and the capability of actors to game those indicators.

Universities that adopt more of a gardening approach will have greater levels of success, but will have a messier time of it during their projects. These universities will be where the really innovative stuff comes from.

References

Sullivan, M., Ortega, A., et al. (2008). "Assessing the teaching of procedural skills: can cognitive task analysis add to our traditional teaching methods?" The American Journal of Surgery 195: 20-23.

Implications of cognitive theory for instructional design

The following is a summary/reflection on Winn (1990); the abstract follows

This article examines some of the implications of recent developments in cognitive theory for instructional design. It is argued that behavioral theory is inadequate to prescribe instructional strategies that teach for understanding. Examples of how instructional designers have adopted relevant aspects of cognitive theory are described. However, it is argued that such adoption is only a first step. The growing body of evidence for the indeterminism of human cognition requires even further changes in how instructional designers think and act. A number of bodies of scholarly research and opinion are cited in support of this claim. Three implications of cognitive theory for design are offered: instructional strategies need to be developed to counter the reductionism implicit in task analysis; design needs to be integrated into the implementation of instruction; designers should work from a thorough knowledge of theory not just from design procedures.

Summary

Suggests problems arise when decisions within instructional design are driven by cognitive theory rather than behavioural theory – mostly around the assumptions of rationality and predictability, and the subsequent appropriateness of the traditional teleological design process used by instructional design. Suggests some approaches/implications that might help address these somewhat.

Reflection

The ideas expressed here offer support for the ideas I’ve been formulating about how to improve learning and teaching at Universities. Which obviously means I think it is an important bit of work by an intelligent person. It probably does have flaws. Will need to read and reflect more.

Still not sure that these principles have been applied broadly enough (though the conclusion seems to indicate yes). Winn has focused on changes to the practice of instructional designers in how they approach design without talking about how they may have to change how they work with the academics. Instructional design, for me, is as much about staff development as it is about design, at least within the current university context. Instructional design within universities can’t scale unless it builds capacity amongst the academic staff and the system to help in design.

Many of these limitations of instructional design are similar to those I've been trying to highlight around the institutional implementation of e-learning and, more generally, around approaches to improving learning and teaching, e.g. graduate attributes.

Introduction

Starts with a definition of instructional design from Reigeluth (1983) – essentially, it is a set of decision-making procedures which, given the outcomes to be achieved and the conditions under which they are to be achieved, develops the most effective instructional strategies.

Generally done with analysis of outcomes/conditions, selection of strategies, iterative testing until some level of success achieved. The decisions are guided by instructional theory.

Gives examples of instructional design processes informed by cognitive theory.

Suggests there is evidence that cognitive theory is impacting the thinking/actions of instructional designers; however, suggests that cognitive theory requires further changes in the way they think/act. Has problems with the analysis and selection/testing stages. Current approaches are not sufficient.

Suggests that instructional design should be driven by an understanding of theories of learning and instruction, rather than mastery of design techniques.

I'm assuming here he means that the type and nature of the steps within the design process itself should be informed by these theories, not just the decisions made within those steps – which, as he also recognises, are already driven by them. I'm a bit slow this morning.

Instructional design and behavioural theory

Supports/explains the notion that instructional design originated in behavioural theory, the dominant learning theory of the time when ID originated. Shows how instructional design processes evolved to fit the needs of behavioural theory. Examples include the reductionist nature of task analysis, and pilot testing being sufficient to debug instruction that consisted of stimulus-response prescriptions; i.e. behaviourists did not consider that there were "mental operations" within the learner that might mediate between stimulus and response. This resulted in design being separated from implementation.

If instruction can be developed to the point where acceptable student performance is likely to occur, then it matters little whether instruction is implemented immediately after the designer has attained this standard, or at some later time and in some other place.

Connects with literature that acknowledges the separation (Richey, 1986), literature that thinks it creates problems (Nunan, 1983; Streibel, 1989; Winn, 1989), and other literature that thinks it desirable (Heinich, 1970 and 1984) – desirable because "it allows instruction to be brought up to a high standard, and then to be distributed far and wide so that all students can enjoy the advantages of top-rate teaching".

Lastly, suggests that the idea that instructional design can be "done by the numbers" also arises from the behavioural tradition. The idea is that any novice designer can be successful if they just follow the process – do it by the numbers.

In summary, 3 important areas where behaviourism still exerts power over instructional design:

  1. The reductionist premise that if you can identify the parts, then you can teach the whole.
  2. The separation of design from implementation.
  3. The assumption that good procedures, applied correctly, result in good instruction.

Sticking with the behavioural tradition, suggests that these 3 are not a problem if you're limiting yourself to low-level skills. The problems arise when you move to higher levels of cognitive processing.

Cognitive theory

The aim here is to explain why the three assumptions are problematic as informed by cognitive theory – the obvious thought here is: what would constructivism or connectivism suggest?

The description of cognitive theory is

Changes in behavior are seen as indirect rather than direct outcomes of learning. Observable behavior is mediated and controlled by such mental activities as the acquisition, organization and application of knowledge about the world (Neisser, 1976; Rumelhart and Norman, 1981); by the development of skills that allow the encoding, storing and retrieval of information (E. Gagne, 1985; Shuell, 1986); by people's motivation (Keller, 1983); their perception of what a task requires of them (Salomon, 1983a); and their perception of their likelihood of success (Salomon, 1983b; Schunk, 1984). Consequently, students are seen as active in the construction of knowledge and the development of skills, leading to the conclusion that learning is a generative process under the control of the learner (Wittrock, 1979, 1982).

To my somewhat untrained ear, this sounds like it has aspects of constructivism.

References Bonner (1988) as identifying a number of the differences between traditional designers and those informed by cognitive theory, including:

  • task analysis;
    Traditionally aims to identify directly observed behaviours. Cognitive theory requires that "unobservable" tasks be analysed, i.e. the mental tasks to be mastered prior to observable performance being possible. Examples include identifying the declarative and procedural knowledge or schemata required to perform. Also recognition that moving from novice to expert involves many steps that need to be mastered.
  • objectives;
    Statements of what the student is to accomplish, under what conditions, and to what criterion are a behaviourist approach. Cognitive objectives are schematic representations of the knowledge to be acquired and the procedures to apply.
  • learner characteristics;
    Focus on the schemata/mental models students bring to instruction, not their behaviours. There may not be a clear line between what they need to know and what they already know – learner as dirty slate.
    This acknowledges the importance of current knowledge of the world, represented in mental models, for the acquisition of new knowledge and skills. Research (De Kleer and Brown, 1981; Larkin, 1985; and authors contributing to Gentner and Stevens, 1983) has shown that learning occurs as students’ mental models acquire refinement and accuracy.

  • instructional strategies;
    Behaviourism selected instructional strategies based on the type of learning to take place, the type of learning outcome.
    But because the cognitive conception of learning places so much importance on the student's development of adequate knowledge structures, cognitive procedures and mental models, the designer should create opportunities for what Bonner calls a "cognitive apprenticeship" centered around problem-solving, rather than prescribe strategies a priori.

    Some general principles may determine aspects of the strategy; however, it evolves like a conversation.

Rieber (1987) points out that instruction designed from cognitive principles can lead to understanding, rather than just to memorization and skill performance.

This speaks to me because too much of what passes for improving learning and teaching strikes me as most likely to create memorisation and skill performance, not long term change.

The need for further change

While instructional designers are adopting principles from cognitive theory, the idea is that recent thinking in cognitive psychology and related fields brings into question some of the assumptions of cognitive theory as currently accepted. Moving onto the reasons:

  • metacognition;
    Metacognition research shows that students have, or can be trained to acquire, the ability to reflect on their performance and adopt/adapt different learning strategies. This means that the intent of an instructional design can be circumvented if the student finds the chosen strategy problematic. If the instruction is not adaptable, or the student doesn't choose a good strategy, then the instructional design is compromised.

    I wonder what implications this has for constructive alignment and its idea of forcing the student to do the right thing?

  • dynamic nature of learning;
    Very interesting. As learners learn, they develop knowledge and skills that differ from those they started with. The analysis performed at the start to select the instructional strategy no longer holds; if the analysis were done now, a different strategy would be required.
    Nunan (1983) develops this line of reasoning in his argument against the value of instructional design, drawing on arguments against the separation of thought from action put forward by Oakeshott (1962) and Polanyi (1958).

  • emergent properties of cognition;
    This is the argument against reductionism. Emergence is the idea that the properties of the whole cannot be explained solely by examining the individual parts of the whole. The nature of the whole affects the way elements within it behave. The suggestion is that a number of people have claimed that the actions of the human mind exhibit emergent properties (Churchland, 1988; Gardner, 1985).

    The reductionism that underpins task and learner analysis "acts counter to, or at best ignores, a significant aspect of human cognition, which is the creation of something entirely new and unexpected from the 'raw material' that has to be learned".

  • plausible reasoning;
    A designer informed by cognitive theory assumes that the thought processes of a student will be as logical as the instruction itself. In order to learn from a machine, the student has to think like a machine (Streibel, 1986). There is lots of evidence to suggest people are not logical. "Plausible reasoning" is Collins' (1978) idea that people proceed on hunches and incomplete information. Hunt (1982) suggests plausible reasoning has allowed the human species to survive.
    If we waited for complete sets of data before making decisions, we would never make any and would not have evolved into intelligent beings.

  • situated cognition; and
    Somewhat related to previous. Streibel (1989) argues that “cognitive science can never form the basis for instructional design because it proceeds from the assumption that human reasoning is planful and logical when in fact it is not”. References Brown, Collins and Duguid (1989); Lave (1988) and Suchman (1987) – i.e. situated cognition folk – to argue that the way we solve problems is dependent on the situation in which the problem occurs. We do not use formal or mathematical reasoning.
  • unpredictability of human behaviour.
    The 5 previous points suggest that human behaviour is indeterminate. Cziko (1989) gives 5 types of evidence to argue that the unpredictability and indeterminism of human behaviour is central to the debate concerning the epistemology of educational research. Winn (1990) suggests it applies equally well to instructional design:
    1. Individual learner differences interact in complex ways with treatments which make prediction of performance difficult.
    2. Chaos theory suggests the smallest changes in initial states lead to wild and totally unpredictable fluctuations in a system's behaviour – something that is more pronounced in complex cognitive systems.
    3. Much learning is "evolutionary" in that it arises from chance responses to novel stimuli.
    4. Humans have free will which can be exercised and subsequently invalidate any predictions about behaviours made deterministically from data.
    5. Quantum mechanics shows that observing a phenomenon, changes that phenomenon so that the results of observations are probabilities, not certainties.

Though eclectic, this body of argument leads one to question seriously both the assumption of the validity of instructional prescriptions and the assumption that what works for some students will work for others.

While prediction may not be part of instructional design itself, it is part of the theories it depends upon, and Reigeluth (1983) points out that any theory of instruction, while not deterministic, does rely for its validity on the probability that the prescriptions made from it will hold. Without such validity, you may as well rely on trial and error.

Conclusions

Cognitive theory has been incorporated into instructional design, but behaviourism's influence remains, and that causes problems.

Cognitive task analysis to develop objectives is just as reductionist as behaviourist approaches. The whole approach designers take needs to be re-examined. Three directions might include:

  1. Analysis and synthesis;
    Addressing reductionist analysis – instructional strategies need to ensure knowledge/skill components are put back together in meaningful ways, e.g. Reigeluth and Stein's (1983) use of "summarizers" and "synthesizers" in elaboration theory.

    Balance analysis as a design procedure with synthesis as an instructional strategy. Such prescriptions should exist in instructional theories.

  2. Design and implementation;
    For instruction to be successful, it must therefore constantly monitor and adapt to unpredicted changes in student behavior and thinking as instruction proceeds… To succeed, then, instructional decisions need to be made while instruction is under way and need to be based on complete theories that allow the generation of prescriptions rather than on predetermined sets of prescriptions chosen ahead of time by a designer. (p64)

    Requires the teacher to monitor and modify strategies as instruction proceeds. Requires teachers to be well schooled in instructional design and to have a solid knowledge of theories of learning and instruction – so that they can respond in some sort of informed way, e.g. they need methods that will allow them to invent prescriptive principles when the need arises.

    The second recommendation is that the designer needs to monitor the actual use of the instructional system during implementation, or to make provision for the user to change instructional strategies.

  3. Theory and procedure.
    Decisions about instructional strategies need to be based on more than just the application of design procedures. Rather than techniques being taught, the principles should be.
    This problem is made worse by researchers who are content to identify strategies that work on single occasions rather than determine the necessary conditions for their success (Clark, 1983).

Reservations about instructional design

The following is, at first, a rambling diatribe outlining some of my reservations with instructional design as it is practiced. Then it is a summary/reflection on Winn (1990) – "Some implications of cognitive theory for instructional design". The abstract for Winn (1990)

This article examines some of the implications of recent developments in cognitive theory for instructional design. It is argued that behavioral theory is inadequate to prescribe instructional strategies that teach for understanding. Examples of how instructional designers have adopted relevant aspects of cognitive theory are described. However, it is argued that such adoption is only a first step. The growing body of evidence for the indeterminism of human cognition requires even further changes in how instructional designers think and act. A number of bodies of scholarly research and opinion are cited in support of this claim. Three implications of cognitive theory for design are offered: instructional strategies need to be developed to counter the reductionism implicit in task analysis; design needs to be integrated into the implementation of instruction; designers should work from a thorough knowledge of theory not just from design procedures.

Actually, I'm running out of time, so this post will be just the diatribe. The summary/reflection on Winn (1990) will have to wait till later.

Some context

The following line of thought is part of an on-going attempt to identify potential problems in the practice of instructional design because I work within a Curriculum Design & Development Unit at a University. I am trying to identify and understand these problems as an attempt to move toward something that might be more effective (but would likely have its own problems). The current attempt at moving toward a solution will hopefully arise out of some ideas around curriculum mapping.

The diatribe

Back in the mid-1990s I was put in charge of my first courses. The institution I worked at was, at that stage, a true 2nd generation distance education provider bolted onto an on-campus university (the university was a few years old, having evolved from an institute of advanced education). Second generation distance education was "enterprise" print distance education. There was a whole infrastructure, set of processes and resources targeted at the production of print-based study guides and resource materials that were sent to students as their prime means of education. Part of those resources were the instructional designers.

From the start, my experiences with the instructional designers and the system they existed within were not good. The system couldn't see that it was becoming less relevant with the rise of information technology, and the instructional designers seemed more interested in their knowledge about what was the right thing to do than in recognising the realities of my context and abilities. Rather than engaging with me and my context and applying their knowledge to show how I could solve my problems, they kept pushing their own ideal solutions.

Over 15 years on, and not a lot has changed. I still see the same problem in folk trying to improve learning and teaching at that institution. Rather than engaging in an on-going process of improvement and reflection, it's all about big bang changes and their problems. Worse, then as now, only the smallest proportion of the academics are being effectively engaged by the instructional designers, i.e. the academics that are keen, the ones that are willing to engage with the ideas of the designers (and others). This is perhaps my biggest concern/proposition: that the majority of academics are not engaging with this work and that a significant proportion of them (but not all) are not improving their teaching. But there are others:

  • Instructional designers are increasingly the tools of management, not folk helping academics.
    In an increasingly managerialist sector, the “correct” directions/methods for learning and teaching are increasingly being set by government, government funded bodies (e.g. ALTC and AUQA) and subsequently the management and professionals (e.g. instructional designers, staff developers, quality assurance etc.) that are institutionally responsible for being seen to respond effectively to the outside demands.

    There are two problems with this:

    1. the technologists' alliance; and
      The professionals within universities, because of their interactions with the external bodies, and because their success depends on engaging with and responding to the demands of those bodies, start to think more like the external body. For example, many of the folk on the ALTC boards/etc are from university L&T centres. Their agenda internally becomes more about achieving ALTC outcomes than outcomes for the academics. Geoghegan (1994) identified the technologists' alliance around technology; it is increasingly in existence for L&T.
    2. do what management says.
      Similarly, because senior management within universities are being measured on how well they respond to the external demands, they too are suffering the same problem. In addition, because they are generally on short-term contracts, there's increased pressure to respond via short-term approaches that show short-term gain but are questionable in the long term. Instructional designers etc are then directed to carry out these short-term approaches, even if they will hurt in the long term or are seen as nonsensical by academics.

    The end result is that academics perceive instructional designers as people doing change to them, not doing change with them or for them. Not a good foundation on which to encourage change and improvement in something as personal as teaching.

  • Traditional instructional design is not scalable.
    My current institution has about 4 instructional designers. The first term of this year sees the institution offering 400+ courses, which means somewhere around 800 courses a year. That's 200 courses a year per instructional designer. If you're looking at each course being "helped" once every two years, that means each course gets an instructional designer for about 2 days every 2 years, at best (see the back-of-envelope sketch after this list).

    In this environment, traditional ADDIE-type big-bang approaches can't scale.

  • Instructional design seems informed by a great knowledge of ideal learning and teaching, but none of how to effectively bridge the gap between academics and that ideal.
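To make the arithmetic above concrete, here's a back-of-envelope sketch. The designer and course counts come from the list above; the ~220 working days per designer per year is my own illustrative assumption, not an institutional figure.

```python
# Back-of-envelope capacity calculation for instructional design support.
# The designer and course figures are from the post; the working days per
# year value is an assumed round number, not an institutional figure.

designers = 4            # instructional designers at the institution
courses_per_year = 800   # 400+ courses a term, two terms a year
working_days = 220       # assumed working days per designer per year

courses_per_designer = courses_per_year / designers
days_per_course_per_year = (designers * working_days) / courses_per_year

print(f"Courses per designer per year: {courses_per_designer:.0f}")                   # 200
print(f"Designer days per course per year: {days_per_course_per_year:.1f}")           # 1.1
print(f"Designer days per course every 2 years: {2 * days_per_course_per_year:.1f}")  # 2.2
```

However you vary the assumed working days, the result stays in the same ballpark: roughly a day per course per year, which is where the "2 days every 2 years, at best" figure comes from.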

References

Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conference of the International Business Schools Computing Association, Baltimore, MD, IBM.

Winn, W. (1990). “Some implications of cognitive theory for instructional design.” Instructional Science 19(1): 53-69.

Improving university teaching – learning from constructive alignment by *NOT* mandating it

The problem

Most university teaching is crap. Too general, too sweeping? Maybe, but based on my experience I'm fairly comfortable with that statement. The vast majority of what passes for teaching at universities has a number of really significant flaws. It's based more on what the teaching academic is familiar with (generally their own experience of the discipline) than on any idea of what might be effective.

So, how do you improve it? This is not a simple question to answer. However, I also believe that most of the current and proposed answers being used by universities are destined to fail. That is, they will be able to show some good practice amongst a small percentage of academic staff, but the vast majority of learning and teaching will remain less than good.

I should point out that almost all of my attempts to describe why I think this is the case and to outline a more appropriate solution have been, essentially, failures.

The following is an attempt to draw on Biggs' (2001) three levels of teaching to formulate three levels of improving teaching that can be used to understand approaches to improving learning and teaching. I'll briefly outline an important part of what I think is a better solution. I'll also reject the approach Biggs (2001) outlines as being too teleological, too complex, and simply not likely to be effectively implemented – and consequently, likely to fail.

By the end of writing this post, I’ve come up with a name “reflective alignment” for my suggested solution.

Biggs’ three levels of teaching

[Image: Levels of thinking about learning and teaching]

The image above is taken from a short film that explains constructive alignment, an approach developed by John Biggs. (I recommend the film if you want another perspective on this.)

These levels of knowledge about teaching lay the blame for poor student outcomes at the feet of the teachers and what they perceive teaching to be about. The three levels are, in turn, a focus on:

  1. What the student is.
    This is the horrible “blame the student” approach to teaching. I’ll keep doing what I do. If the students can’t learn then it is because they are bad students. It’s not my fault. Nothing I can do.
  2. What the teacher does.
    This is the horrible "look at me and all the neato, innovative teaching that I'm doing". I'm doing lots of good and difficult things in my teaching. But are the students learning?
  3. What the student does.
    Obviously this is the good level. The focus is on teaching that leads to learning. Biggs (2001) uses a quote from Tyler (1949) to illustrate that this is not a new idea
    [learning] takes place through the active behavior of the student: it is what he does that he learns, not what the teacher does

Flowing from these levels is the idea of constructive alignment, which encompasses the type of teaching you'd expect from a level 3 teacher. Constructive alignment is based on the simple steps of:

  • Clearly specifying detailed learning objectives for students.
  • Arranging teaching and learning activities that encourage/require students to carry out tasks that provide them with exposure, practice and feedback on the learning objectives.
  • Designing a grading/marking system that requires the student to demonstrate how well they achieve the stated learning objectives.

Performing these 3 simple steps well results in the situation that Biggs (2001) describes

In aligned teaching, where all components support each other, students are “trapped” into engaging in the appropriate learning activities, or as Cowan (1998) puts it, teaching is “the purposeful creation of situations from which motivated learners should not be able to escape without learning or developing” (p. 112). A lack of alignment somewhere in the system allows students to escape with inadequate learning.

Sounds simple, doesn’t it? So why don’t more people use it?

“Staff development” is crap!

That’s my characterisation of the position Biggs (2001) espouses (SDC = Staff Development Centre). This includes the following comments

…getting teachers to teach better, which is what staff development is all about… staff development… is being minimized in many universities, not only in the UK but also in Australia and New Zealand… Typically, staff development is undertaken in workshops run by the staff development centre… This is the fundamental problem facing SDCs: the focus is on individual teachers, not on teaching

I particularly liked the following comment from Biggs (2001) and find a lot of resonances with local contextual happenings.

Too often SDCs are seen from a Level 2 theory as places providing tips for teachers, or as remedial clinics for poor or beginning teachers. Most recently, they are being replaced by training in educational technology, in the confused belief that if teachers are using IT then they must be teaching properly for the new millennium.

Biggs’ solution

Biggs’ (2001) own summary is hard to argue with

In sum, QE cannot be left to the sense of responsibility or to the priorities of individual teachers. The institution must provide the incentives and support structures for teachers to enhance their teaching, and most importantly, to involve individuals through their normal departmental teaching in QE processes.

However, the detail of his suggested solution is, I think, hideously unworkable – to such an extent that it is likely to have a negative impact on the quality of teaching if any institution of a decent size tried to implement it. As Biggs (2001) says, albeit about a slightly different aspect, "the practical problems are enormous".

I've been involved enough with the underbelly of teaching and learning at universities to have a significant amount of doubt about whether the reality of learning and teaching matches the representation presented to the external world. I've seen institutions struggle with far simpler tasks than the above, and individual academics and managers "game the system" to be seen to comply while not really fulfilling (or even understanding) the requirements.


3 levels of improving teaching

[Image: Leadership: when in doubt, wave a flag]

I’d like to propose that there are 3 levels of improving teaching that have some connection with Biggs’ 3 levels of teaching. My 3 levels are:

  1. What the teacher is.
    This is where management put teachers into good and bad categories. Any problems with the quality of teaching are the fault of the academic staff, not the system in which they work.
  2. What the management does.
    This is the horribly simplistic approach taken by most managers and typically takes the form of fads, i.e. where they think X (where X might be generic skills, quality assurance, problem-based learning or even, if they are really silly, a new bit of technology) will make all the difference, and proceed to take on the heroic task of making sure everyone is doing X. The task is heroic because it usually involves a large project and radical change. It requires the leadership to be "leaders" – to wield power, to re-organise – i.e. complex change that is destined to fail.
  3. What the teacher does.
    The focus is on what the teacher does to design and deliver their course. The aim is to ensure that the learning and teaching system – its processes, rewards and constraints – encourages the teacher to engage in those activities which ensure quality learning and teaching, in a way that makes sense for the teacher, their course and their students.

Reflective alignment – my suggested solution

Biggs' constructive alignment draws on active student construction of learning as the best way to learn – hence the "constructive" bit in the name. I'm thinking that "reflective alignment" would be a good name for what I'm proposing.

This is based on the assumption that what we really want academic staff to be doing in order to ensure that they are always improving their learning and teaching is “being reflective”. That they are engaging in deliberate practice. I’ve talked a bit about this in an earlier post.

I’m just reading a paper (Kreber and Castleden, 2009) that includes some support for my idea

We propose that teaching expertise requires a disposition to engage in reflection on core beliefs… The value attributed to the notion of 'reflective practice' in teaching stems from the widely acknowledged view that reflection on teaching experience contributes to the development of more sophisticated conceptual structures (Leinhardt and Greeno 1986), which in turn lead to enhanced teaching practice and eventually, it is hoped, to improved student learning.

So, simply and without detail: I believe that if a university wants to significantly improve the quality of the majority, if not all, of its learning and teaching, then it must create a context within which academic staff can't help but engage in reflective practice as part of their learning and teaching.

That’s the minimum, and not all that easy. The next step would be to create an environment in which academic staff can receive support and assistance in carrying out the ideas which their reflection identifies. But this is secondary. In the absence of this, but the presence of effective reflection, they will work out solutions without the support.

(There is some potential overlap with Biggs' (2001) solution, but I don't think his focuses primarily on encouraging reflection. It has more in common with Level 2 approaches to improving learning and teaching, especially in how it would be implemented in most universities. Yes, the implementation problem still remains for my solution, which could also most likely be implemented as a Level 2 approach. But any solution should be contextually sensitive.)

References

Biggs, J. (2001). “The Reflective Institution: Assuring and Enhancing the Quality of Teaching and Learning.” Higher Education 41(3): 221-238.

Kreber, C. and H. Castleden (2009). “Reflection on teaching and epistemological structure: reflective and critically reflective processes in ‘pure/soft’ and ‘pure/hard’ fields.” Higher Education 57(4): 509-531.

Good teaching is not innate, it can be “learned” – and what’s wrong with academic staff development

The title to this post is included in a quote from Kane, Sandretto and Heath (2004)

The research team, comprising two teacher educators and an academic staff developer, embarked upon this research confident in the belief that good teaching is not innate, it can be learned. With this in mind, the project sought to theorise the attributes of excellent tertiary teachers and the relationships among those attributes, with the long-term goal of assisting novice academics in their development as teachers.

This coincides nicely with my current task, and also with an idea I came across on the weekend about deliberate practice and the work of Anders Ericsson.

The combination of these "discoveries" is also providing some intellectual structure and support for the REACT idea about how to improve learning and teaching. However, it's also highlighting some flaws in that idea – though the flaws aren't anywhere near as large as those in what passes for the majority of academic staff development around learning and teaching.

The following introduces these ideas and how these ideas might be used to improve academic staff development.

Excellent tertiary teaching

Kane et al (2004) close the introduction of their paper with

We propose that purposeful reflection on their teaching plays a key role in assisting our participants to integrate the dimensions of subject knowledge, skill, interpersonal relations, research/teaching nexus and personality into recognised teaching excellence. We conclude with a discussion of the implications of our model for staff development efforts.

Their proposition about the role of reflection in contributing to excellent teaching matches my long-held belief and perception that all of the best university teachers I've seen are those that engage in on-going reflection about their teaching, keep looking for new knowledge, and keep trying (and evaluating) innovations based on that knowledge in the hope of improving their teaching.

The authors summarise a long history of research into excellent teaching that focused on identifying the attributes of excellent teachers (e.g. well prepared, stimulate interest, show high expectations etc.) but they then suggest a very important distinction.

While these, and other studies, contribute to understanding the perceived attributes of excellent teachers, they have had limited influence on improving the practice of less experienced university teachers. Identifying the elements of “good” university teaching has not shed light on how university teachers develop these attributes.

The model they develop is shown below. They suggest

Reflection lies at the hub of our model and we propose that it is the process through which our participants integrate the various dimensions

Attributes of excellent tertiary teaching

The authors don’t claim that this model identifies any novel set of attributes. But they do suggest that it is

the way in which the participants think about and understand their own practice through purposeful reflection, that has led to their development of excellence

What’s been said about reflection?

The authors have a few paragraphs summarising what’s been said about reflection in connection with tertiary teaching, for example

Day (1999) wrote “it is generally agreed that reflection in, on and about practice is essential to building, maintaining and further developing the capacities of teachers to think and act professionally over the span of their careers”.

They trace reflection back to Dewey and his definition

“an active, persistent, and careful consideration of any belief or supposed form of knowledge in light of the grounds supporting it and future considerations to which it tends”

They also mention a framework of reflection outlined by Hatton and Smith (1995) and use it to provide evidence of reflection from their sample of excellent teachers.

Expertise and deliberate practice

Among the many quotes Kane et al (2004) provide supporting the importance of reflection is this one from Sternberg and Horvath (1995)

in the minds of many, the disposition toward reflection is central to expert teaching

Another good quote comes from Common (1989, p. 385).

“Master teachers are not born; they become. They become primarily by developing a habit of mind, a way of looking critically at the work they do; by developing the courage to recognize faults, and by struggling to improve”

Related to this view is the question “Were Mozart and other child prodigies brilliant because of some innate talent?”. This is a question that this blog post takes up, and the answer it gives is no. Instead, it’s the amount and quality of practice they engaged in that made the difference. Nurture wins the “nature versus nurture” battle.

The blog post builds on the work of Anders Ericsson and the concept of “deliberate practice”. The abstract for Ericsson et al (1993) is

The theoretical framework presented in this article explains expert performance as the end result of individuals’ prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of effortful activities (deliberate practice) designed to optimize improvement. Individual differences, even among elite performers, are closely related to assessed amounts of deliberate practice. Many characteristics once believed to reflect innate talent are actually the result of intense practice extended for a minimum of 10 years. Analysis of expert performance provides unique evidence on the potential and limits of extreme environmental adaptation and learning.

Implications for academic staff development

If reflection and deliberate practice are key to developing mastery or expertise, then how do approaches to academic staff development, and the associated policies, processes and structures around university learning and teaching, encourage and enable this practice?

Seminars and presentations probably help those who are keen to become aware of new ideas that may aid their deliberate practice. However, attendance at such events is minimal. Much of existing practice seems to provide some level of support to the minority who are already engaging in deliberate practice around learning and teaching.

The majority seem to be able to get away without engaging in this sort of practice. Perhaps there’s something here worth exploring?

References

Common, D. L. (1989). “Master teachers in higher education: A matter of settings.” The Review of Higher Education 12(4): 375-387.

Ericsson, K. A., R. T. Krampe, et al. (1993). “The role of deliberate practice in the acquisition of expert performance.” Psychological Review 100(3): 363-406.

Hatton, N. and D. Smith (1995). “Reflection in teacher education: Towards definition and implementation.” Teaching and Teacher Education 11(1): 33-49.

Kane, R., S. Sandretto, et al. (2004). “An investigation into excellent tertiary teaching: Emphasising reflective practice.” Higher Education 47(3): 283-310.

Sternberg, R. and J. Horvath (1995). “A prototype view of expert teaching.” Educational Researcher 24(6): 9-17.

The design of a 6 hour orientation to course analysis and design

It’s that time of year again: next week I get to run a session with 20 or so new CQU academics looking at course analysis and design. The session is part of a four-day program entitled Foundations of University Learning and Teaching (FoULT) and is run twice a year.

The following post gives an overview of some of my thinking behind the session this year. The sessions won’t really be finalised until they’re over, so if you have any feedback or suggestions, fire away.

Constraints

The following constraints apply

  • The session lasts 6 hours.
  • I’m told there will be 24 participants; I expect fewer.
  • I’ll be the only facilitator.
  • The participants are required to do this as part of their employment and some may be less than enthusiastic, though there are generally some very keen participants.
  • The sessions will be held in a computer lab. The computers are arranged around the walls of the room and there is a table without computers in the middle of the room.
  • The 6 hours are split across two days: 3 hours after lunch on one day and 3 hours before lunch the following day.
  • The participants will be a day and a half into the four days by the time they get to this session (information overload kicking in).
  • Earlier on the first day they will have done sessions on “knowledge management” and assessment – moderation and marking.
  • The title of the sessions is “course analysis and design”, so I should probably do something close to that.
  • I don’t have the time to do a lot of preparation because of other responsibilities.
  • I have done this session a few times before (slides from the last time are Introduction, Implementation, Analysis and design), so that experience will constrain my thinking.
  • Theoretically, I don’t believe that there is much chance of radically changing minds or developing expertise in new skills. The best I can hope for is sparking interest, raising awareness and pointing them in the right direction.

The plan

I’m thinking that the session should aim to

  • Make people aware of the tools and support currently available to help with their teaching.
  • Introduce them to some concepts or ideas that may lead them to re-think the assumptions on which they base their course design.
  • Introduce them to some resources and ideas that may help them design their courses.

Activities during the session will include

  • Some presentation of ideas using video and images.
  • Discussion and sharing of responses and their own ideas in class, but perhaps also through the CDDU wiki and/or this blog.
  • A small amount of activity aimed at performing some design tasks.
  • A bit of playing around with various systems and resources.

There won’t be any assessment for this one.

The sessions

I’m planning on having 4 sessions over the 6 hours

  1. Introduction
    Set up who I am and what we’re going to be doing. Find out more about the participants (maybe get them to put this on the wiki or perhaps a WordPress blog; that sounds like an idea). Introduce the Trigwell (2001) model of university teaching that I’ll be using as a basic organising concept, and use it to introduce some of the ideas and explain the aim of the sessions. Introduce them to the technology we’ll be using and get them going.
  2. The T&L Context
    Talk about the details of the CQUni T&L context. What tools and resources are available? What do students see when they use various systems (something staff often don’t see)? Who to ask for help? etc. Also mention “Web 2.0” tools, i.e. that the context and tools for T&L aren’t limited to what the institution provides. Provide an opportunity to play and ask questions about all this. The aim is to be concrete and active, to make folk aware of the tools they can use, and hopefully to keep them awake after lunch.
  3. Teachers’ thinking
    Introduce and “attack” some basic ideas that inform the way people think about learning and teaching, including some ideas about course design and human cognition.
  4. Teachers’ planning
    Talk about the process of actually doing course design and some of the ideas, resources and tools that can be used during this process.

The plan is that the first two would be on the afternoon of the first day with the last two on the following day.

The Trigwell (2001) model of teaching is shown in the following image and is briefly described on the flickr page for the image. You should see the connection between the names of the sessions and the model.

Trigwell's model of teaching

Actually, after posting this I’ve made some changes to expand the use of the Trigwell (2001) model to include teachers’ strategies, and in particular to gather some of the participants’ own strategies.

What’s needed? What would be nice?

I want to provide pointers to additional resources and also make use of good ones during the sessions. The list of what I’ve got is available on del.icio.us.

If you know of any additional resources you’d recommend, please either add them in the comments on this post or tag them in del.icio.us with foult.

Feedback on the above ideas would also be welcome.

References

Trigwell, K. (2001). “Judging university teaching.” The International Journal for Academic Development 6(1): 65-73.

Some things that are broken with the evaluation of university teaching

This article from a training industry magazine raises a number of issues, well known in the research literature, about the significant limitations of the evaluation of university teaching.

Essentially, the only type of evaluation done at most universities is what the article refers to as “level 1 smile sheets”: student evaluation forms that ask students to rate what they felt they learned and how they felt about the course and the teacher. As Will Thalheimer describes

Smile sheets (the feedback forms we give learners after learning events) are an almost inevitable practice for training programs throughout the workplace learning industry. Residing at Donald Kirkpatrick’s 1st level—the Reaction level—smile sheets offer some benefits and some difficulties.

His post goes on to list some problems, benefits and a potential improvement. Geoff Parkin shares his negative view on them.

The highlight for me from the Training mag article was

In some instances, there is not only a low correlation between Level I and subsequent levels of evaluation, but a negative one.

The emphasis on level 1 evaluation – why?

Most interestingly, the article then asks the question, “Why do so many training organisations, including universities, continue to rely on level 1 smile sheets?”

The answer it provides is that they are too scared of what they might find. It’s the ostrich approach of sticking your head in the sand.

What else should be done?

This Google Book Search result offers some background on “level 1” and talks about the other three levels. Another resource provides some insights and points to further resources. I’m sure if I dug further there would be a lot more information about alternatives.

Simply spreading the above findings amongst the folk at universities who rely on and respond to the findings of level 1 smile sheets might be a good start, and is probably necessary to begin moving beyond the status quo.