Should academics manually create course websites?

(In another post there is a response/attempted clarification to comments made by Stephen Downes on the following.)

The focus of the following is not the “evil” LMS. That’s another argument, and I agree with much of it. The question here assumes that your university is going to use, or even require the use of, an LMS. Given that, should the institution expect or even allow academics to manually create course websites in the LMS?

This question arises out of my last post, which reflected on some decisions made back in 2000/2001 and how they compare with existing common practice, especially in connection with Mark Smithers’ recent post about problems with MOPPS.

Back in 2000/2001 the Webfuse system answered this question with a no. Staff could still create their own site, but a default course site was automatically created for every course. Academics could further modify this default course site, but they didn’t have to create it.

The rationale was that having academics manually create the course websites was inefficient, resulted in poor quality outcomes, and limited the ability for institutional control and evolution of the minimum level of service. The following expands on this rationale and relates it to recent experience of using Moodle. Based on the combination of experience with Webfuse and Moodle, I’m tending to answer no: institutions should not be expecting academics to manually create course websites.

What do you think? Are there institutions that don’t expect this? What do they do?

It is Inefficient

A long time ago, I used to teach Systems Administration. One of the lessons we tried to teach was “if you do something more than once, automate it”. I recently had to create a Moodle course site from scratch. It was a simple (some might argue simplistic) site, by no means stretching the capabilities of Moodle. But I found creating even this simple site annoying and inefficient.

The site used the weekly Moodle format and had 10-12 weeks. Each week basically followed the same structure: a pointer to the study guide chapter for that week, a pointer to a discussion forum specific to that week, a reminder to complete a journal entry for the week, and occasionally a reminder for another assessment item. This meant that creating the course site was essentially a matter of repeating the same sequence of steps in the Moodle web interface for each week.
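
To make the “automate it” lesson concrete, here is a minimal, illustrative sketch (in Perl, not Moodle code) of how little information the weekly pattern actually contains once it is described as data rather than clicked together by hand. The week range and item wording are assumptions based on the description above.

    #!/usr/bin/perl
    # Illustrative only: the repeated weekly structure expressed as a loop.
    # Week numbers and item wording are hypothetical examples.
    use strict;
    use warnings;

    foreach my $week (1 .. 12) {
        print "Week $week\n";
        print "  - Read study guide chapter $week\n";
        print "  - Post to the week $week discussion forum\n";
        print "  - Complete your journal entry for week $week\n";
        # occasional extra reminders (e.g. an assignment due date) would come
        # from a simple lookup keyed by week number
    }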

Setting up the entire site probably took me three hours. Once I had become familiar with Moodle course site design, the majority of that time was spent on the manual process of implementing the design. It also helped that I did it on a Moodle instance running on my laptop; trying to do it on the live institutional server could have at least doubled the time.

It gets worse

I’m an advanced computer user, a designer and modifier of e-learning systems, and an experienced academic who’s been doing e-learning since the early 90s. I am not like most academics.

The academic I was working with, if left to their own devices, would have expended more than a week on this task. This would have included becoming familiar with Moodle, figuring out the options in terms of course design and then performing the low level tasks to implement the site. Even worse, this academic is probably middling in terms of skills. There are a significant number of academics at my current institution that would have taken longer. In fact, I heard a number of stories of academics earlier this year spending weeks getting their first Moodle course sites up and going.

The implications

Multiply this out for an entire university with 500+ courses, and there’s a significant expenditure/wastage of resources. Remember, this is for perhaps the simplest, minimum course site you can create. Nothing fancy.

As more course sites are created in Moodle, subsequent terms won’t be quite so bad as academics will tend to copy the previous course site and make some modifications. But this creates other problems addressed below.

Poor quality outcomes

Having academics manually create course sites also contributes to poor quality course sites. There are two main reasons:

  1. Missing skills; and
    Creating a good quality course website requires a good mixture of skills in teaching, technology, design, communication and more. Few academics have the right mixture of all of these.
  2. Human error.
    In creating the simple Moodle course site I had to perform the same sequence of steps 10 or 12 times. I’m almost certain that I made a minor mistake in at least one of those repetitions. Depending on their nature, those mistakes will come back in the future and cause more inefficiencies, especially if they involve an incorrect date for an assignment. An academic with a more limited understanding of Moodle is perhaps even more likely to introduce such mistakes.

Limited institutional control

This may not be a big problem in some universities, but increasingly in the Australian higher education sector institutions are being held accountable for the quality of the education they offer. This is translating into an explosion of minimum service standards (the MOPPS Mark Smithers talks about) where the institution identifies an organisation-wide set of minimum standards for course websites.

In my experience, the expectation for most of these service standards is that the academics will translate the standards into features on their course websites. Some will argue this is so academics can apply their local expertise to develop contextually sensitive implementations of the standards, and that considering the standards encourages more thinking about course design. In my experience it mostly leads to compliance and task corruption.

Either way, it is up to the academics to translate the standards into an actual course site. Given the difficulty and inefficiencies identified above in creating course sites, the potential for misinterpretation of the minimum standards, the potential for those standards to be badly designed or poorly communicated to academics, and the imbalance in importance between teaching and research, it is no great surprise that academics collude to comply and compromise the standards.

Based on this argument, if the aim of an institution is to control the minimum quality of its course websites, expecting academics to manually create course websites is both inefficient and ineffective. It won’t work.

Limited evolution

What’s worse is if the institution then decides that those minimum standards have to change, either to solve a problem or to improve quality. It then has to convince the academics to make these changes. Once an academic has a course site, the common approach is to simply copy what has gone before. Any change in minimum standards that requires significant changes to the basic structure of a course site has little or no chance of widespread, successful adoption.

One solution

This post briefly describes the alternative that was implemented within the Webfuse system in 2001 and also a prior aborted attempt.

Default course sites and wizards

There is now a version 2.0 of this section.

The following is the next section from chapter 5 of my thesis. This one describes attempts made to provide functionality within Webfuse that improved the quality of the default course websites, without increasing workload for academic staff and while retaining some elements of autonomy.

Sadly (to me anyway) the institution in which this work evolved has gone backwards.

Default course sites and wizards

As described in chapter 4, the initial assumption built into Webfuse was that a course website would simply be an empty page. From this single, empty page it was assumed that each individual teaching staff member would then draw on the variety of page types (hypermedia templates) provided as course website building blocks by Webfuse to construct their course website. Very quickly it became obvious that the majority of academics did not have the time, inclination, or skills to engage in this sort of design and construction process. Those staff who did have this combination of skills often wanted to use other tools or approaches with which they were already familiar.

In addition, it became obvious that a significant percentage of the tasks associated with creating a course website were fairly low-level tasks, often involving reuse of information and resources already provided. These observations led to the practice where support staff created initial default course sites by manually editing the initial empty course sites and uploading standard information (e.g. course synopsis, staff contact details, etc.). However, these course sites were of limited quality, failed to encourage further enhancement from academic staff, and required significant workload from faculty administrative staff. It is within this context that it became necessary to better support the concept of a course site and encourage greater engagement with course sites from academic staff.

During 1999 an initial attempt at addressing this problem commenced as the “Wizard” project. Briefly described in Jones and Lynch (1999), the Wizard project planned to provide an interface based on the wizards common to Windows programs of the 1990s. Such an interface would walk the academic through a series of questions about their course, and the provided answers would be combined with the Webfuse page types to create a course site. A particular focus of this plan was an adopter-focused development approach; however, due to organisational uncertainty and limited development resources, the project did not move beyond the prototype and planning stage.

The next attempt to address this problem was the creation of an automated and expanded default course site approach for the second term of 2001. This coincided with the broader roll out of CQU’s new student records system. This new default course site approach was made possible by the expanded Infocom web team and the improvements in the Webfuse process and code-base described in previous sections, and was the primary task of the author during the first half of 2001. The automated and expanded default course site approach evolved to consist of the following components:

  1. An expanded default course site;
    As described in more detail below, the single empty page default course site was replaced with a much expanded site consisting of five separate sections, containing a range of course related data and services within a re-designed interface. Each offering of every course offered by Infocom would have a new default course site.
  2. Integration with existing data sources;
    Much of the data used in the creation of the default course site was drawn from existing organisational data sources created by other processes.
  3. Automatic creation;
    The creation and population of all default course sites was performed by running a script that, given the details of a specific term and year, would combine existing information, Webfuse page types, and related scripts to create default course sites for all courses offered by Infocom (a minimal, illustrative sketch of this type of script follows this list).
  4. A copying process; and
    Teaching staff could further modify the default course site by adding additional course information and services. Rather than re-adding these additional features for each subsequent term, a copying process was developed by which staff could specify what they would like to copy to the new course site.
  5. Support for a real course site.
    There continued to be a number of teaching staff who wished to create their own course website with other tools. Each default course site had support for an optional “real” course site. This added another area of the default course site in which the staff member could place their own course site.
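
To make item 3 above a little more concrete, the following is a minimal, hedged sketch (in Perl, the language Webfuse was implemented in) of the kind of script being described: given a year and term, create and populate a default site for every course. The section names match those described below; the subroutine names are hypothetical and the actual Webfuse code is not reproduced in this excerpt.

    #!/usr/bin/perl
    # Hypothetical sketch of an automated default course site creation script.
    use strict;
    use warnings;

    my ($year, $term) = @ARGV;
    die "usage: create_default_sites.pl YEAR TERM\n" unless $year && $term;

    # Draw on existing organisational data sources (item 2 above)
    my @courses = get_courses_for_offering($year, $term);

    foreach my $course (@courses) {
        # Create the standard sections using the Webfuse page types (item 1 above)
        foreach my $section (qw(Updates StudySchedule Assessment Resources Staff)) {
            create_section($course, $section);
        }
        # Populate with course profile, textbook and staff details etc.
        populate_site($course);
    }

    # Stub implementations so the sketch runs; the real system drew on the
    # student records system and the Webfuse page types.
    sub get_courses_for_offering { return ({ code => 'COIS12073' }); }
    sub create_section { my ($c, $s) = @_; print "$c->{code}: created $s section\n"; }
    sub populate_site  { my ($c) = @_;     print "$c->{code}: populated default data\n"; }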

Figure 5.1 is the home page for a default course site from July 2001. This home page formed the top of the hierarchical structure of the default course site. Underneath the home page there were five sections:

  1. Updates;
    The updates section provided a function that allowed teaching staff to provide and store course wide updates or announcements. The titles and post dates of the most recent updates were also visible from the home page.
  2. Study schedule;
    This section provided a week by week breakdown of the course, its topics, content and assessment.
  3. Assessment;
    Provided access to details about the course assessment. By default this would summarise, for each assessment item, the title, due date and percentage of the overall assessment.
  4. Resources; and
    All remaining course resources and services were made available via this section. By default this included a link to the course profile (syllabus) document, details of the course textbook(s) (including a link to the university bookshop), and, if used, a discussion forum or mailing list.
  5. Staff.
    This provided both the personal details of the staff teaching the course as well as an area restricted to the teaching staff used for discussion or sharing resources. The personal details of teaching staff included name, contact details and where available a photo.

If the teaching staff in charge of a course decided to create a real course site, then a sixth section was added under the default course site home page. The teaching staff could upload and manage their real course site within this section.

Figure 5.1 – Home page for a Webfuse default course site (July 2001)

As identified in chapter 4, since Webfuse was a faculty system, not an institutional system, there were organisational, political and technical limits on how well it could be integrated with institutional systems. These limitations continued to exist post 2001. Consequently, the default course site creation process included a number of workarounds and could not achieve all of the automation and integration desired. For example, study schedules had to be manually entered, even though the course profile (syllabus) already contained such a schedule. It was not until 2008 that CQU’s distance education publications process included a semi-automated process for the provision of electronic versions of course study guides.

The default course site process did create some initial and on-going disquiet around questions of academic independence, consistency and institutional identity. This was particularly a problem for academic staff wanting to create their own course sites. Of the 99 course sites created in the first term of the expanded default course site approach, 8 courses had a real course site. Table 5.4 summarises the total number of default course sites versus real course sites created by Webfuse from 2002 through 2009.

Table 5.4 – Comparison between default course sites and real course sites (2002-2009)
Year Total courses Real course sites % Real course sites
2002 313 27 8.6%
2003 305 29 9.5%
2004 329 16 4.9%
2005 299 15 5.0%
2006 297 15 5.1%
2007 251 4 1.6%
2008 225 2 0.9%
2009 211 0 0.0%

One advantage provided by the expanded default course site approach was the ability for the institution to exercise some control over the minimum level of service provided to students. Changes to the expanded default course site occurred through two means:

  1. Institutional or strategic direction; and
    If the faculty identified some information or a service that it thought should be part of the minimum level of service to students, it could implement this minimum standard across all courses via the default course site process. For example, from the second half of 2001 through 2002 all Infocom courses were required to have a course barometer (Jones 2002) to provide a mechanism to capture student feedback during a term.
  2. Adopter-focused and emergent development.
    The Infocom web team modified the default course site operation in response to lessons learned from supporting academics using the system and observing what innovative staff added to the default site. This is in line with the adopter-focused and emergent development practices discussed in previous sections.

Due to the foundation provided by the Webfuse page types and templates, it was not necessary for all default course sites to have the same structure or the same look and feel. Theoretically, every course site could be completely different. The flexibility of the default course site idea was tested in 2007 with the creation of a “Web 2.0” course site (see Figure 5.2). This “Web 2.0” course site was implemented as a Webfuse default site using Webfuse page types, but with a different appearance and structure from a normal default site. The site is a “Web 2.0” site because all of its functionality (discussion, wiki, blog, portfolio and resource sharing) was provided by freely available Web 2.0 tools hosted on external sites. Webfuse used RSS feeds generated by these external tools to create and maintain the course site, and students could use the same feeds to avoid visiting the course site at all.
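
As a hedged illustration of the mechanism described above, the following Perl sketch pulls the latest items from an external Web 2.0 tool’s RSS feed and emits an HTML list for inclusion in a course page. The feed URL is a placeholder, not the real 2007 course feed, and the real Webfuse code is not reproduced here; the LWP::Simple and XML::RSS modules are standard CPAN modules.

    #!/usr/bin/perl
    # Hypothetical sketch: turn an external tool's RSS feed into course site HTML.
    use strict;
    use warnings;
    use LWP::Simple qw(get);
    use XML::RSS;

    my $feed_url = 'http://example.com/course-blog/feed';   # placeholder URL
    my $content  = get($feed_url) or die "Could not fetch $feed_url\n";

    my $rss = XML::RSS->new;
    $rss->parse($content);

    # Emit a simple HTML list of the feed items for the course page
    print "<ul>\n";
    foreach my $item (@{ $rss->{items} }) {
        print qq{  <li><a href="$item->{link}">$item->{title}</a></li>\n};
    }
    print "</ul>\n";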

Figure 5.2 – Web 2.0 Course site (2007)

The confusion of small and big changes

Over the last couple of days I’ve enjoyed a small discussion that has arisen out of some comments Kevin has made on my blog. This post is an attempt to partially engage with the most recent comment. I echo Kevin’s conclusion: I’d love to hear anyone else’s take on this.

The unanswered question

The main point I’d like to discuss is the question of small versus big changes. I have an opinion on this, but there’s not enough evidence to suggest that it’s an answer. The basic question might be phrased as: how do you achieve significant improvement in the quality of L&T in universities? You could make this much more general, along the lines of “How do you change organisational practices?”, but I’m going to stick with the specific.

I’m familiar with two broad responses:

  • Revolutionary (usually top-down) change; and
    This is where the necessary change is identified by someone who eventually gets agreement and/or the ability to implement the change through some sort of change management process. This usually involves some big change, e.g. the adoption of a new LMS for a university, trashing the LMS and adopting WPMU for L&T, adopting university-wide graduate attributes, requiring every academic to have a formal teaching qualification etc. Or, even more radical, the death of universities and their replacement by something else.
  • Evolutionary (usually bottom-up) change.
    Small-scale changes in practice, usually at the local level.

Kevin’s comment gives a good summary of the common problem with the evolutionary change approach:

In my experience, especially at a large institution, taking the “small changes” route is the road to perdition. For me, this means I have to engage in a million little negotiations to get the small to accumulate to something significant. At the rate I’m going it will take me two lifetimes to bring about real change in the English Department.

As I mentioned above, and as indicated by the heading for this section, I don’t have what I would call an answer. I have an argument for the approach I would take and some evidence to support it, but I don’t think I can call it “the answer” (yet).

What I think is the answer

Last year I gave a presentation called Herding cats, losing weight and how to improve learning and teaching (slides and video are available). In that presentation, the analogy used is that revolutionary change is like herding cats and that evolutionary change is like losing weight. Using this analogy I argue that the herding cats approach to improving the quality of teaching at a university has not worked empirically and that there is significant theory to explain why it will never work. That same theory suggests that an evolutionary approach, informed by lessons learned from weight loss, is much more promising.

The general solution I suggest is on slide 200 or so (it was only a 60 minute presentation), goes under the title “reflective alignment”, and can be summarised as:

All aspects of the learning and teaching environment are aligned to enable and encourage academic staff to reflect on their teaching with the aim of achieving 3rd order change.

Framed another way, in this approach the teaching environment at a university encourages and enables academics to continually change their thinking and practice of teaching. That is, academics essentially do what they do now and make small changes each time they teach a course, but those changes are not constrained by the same old ways of thinking about teaching.

Having academics continually making these sorts of 3rd order changes (within an institution that encourages and enables them to make those 3rd order changes) will result (I think) in radically different and significantly improved learning and teaching.

When small changes won’t work

Like Kevin, I think that universities relying on small changes to improve learning and teaching will not work, mostly because the university environment neither encourages nor enables the type of small-scale changes that are required.

In the herding cats presentation a large part of the time was spent listing the parts of the university teaching environment that actively prevent the type of 3rd order change that is necessary. In fact, much of the bleating in posts on this blog is complaining about these limitations. Some examples include:

  • Rewards that favour research, not teaching.
    No matter how many formal teaching qualifications an academic is forced to acquire, if they get promoted (both at their current university and at others) through the quality of their research, then they will focus on research, not teaching.
  • Pressures arising from quality assurance and simplistic KPIs.
    Since the mid-1990s I’ve observed that it is only the courses with large failure rates or student complaints that get any attention from university management. Students, like most people, get scared when their expectations aren’t met. That means if you try something innovative students will complain. In addition, if you try something innovative you might have problems, which management hate. If you try something different, you are more likely to have to waste time responding to “management concerns”. The presentation references research showing that this is preventing academics from trying innovative work.

    With the rise of quality assurance and corporate approaches to management, this trend is getting worse.

  • Removal of autonomy;
    As I’ve argued in a couple of posts top-down management is removing academic autonomy and perhaps purpose and subsequently reducing academic motivation.
  • Constraining systems;
    Increasingly universities are using information systems to implement learning and teaching. Those systems are designed on particular assumptions that limit the ability to change. The most obvious example is the LMS (be it open or closed source). This recent post includes discussion of this point around the LMS.

    The people, processes and policies within universities are being set up to use these systems. If you use something different, you are being inefficient.

  • Simplistic understandings of innovation.
    When it comes to understanding innovations (e.g. something as simple as a new LMS), universities have naive perspectives of the adoption process. As recognised by Bigum and Rowan (2004) this naive perspective assumes that the innovation passes through the adoption process largely unchanged, which means that the social must conform with the innovation.

    i.e. As the institution starts to adopt Moodle across all its courses, Moodle can and should stay exactly the same. You only need to show people how to use Moodle, nothing more. If what they want to do is not supported by Moodle, then they need to conform to what Moodle does, regardless of the ramifications.

My argument is that all of this and other factors within a university environment actively prevent small changes having broad outcomes. The university environment is actively discouraging 3rd order change and isn’t even very good at achieving 2nd order change.

But small change can’t make a big difference

Ignoring all that, people still get stuck on the idea of lots of small changes creating really big change; they can’t accept that it can happen. They are wrong.

To justify that, first let me draw on people recognised as being much smarter and more important than I am (Weick and Quinn, 1999):

The distinctive quality of continuous change is the idea that small continuous adjustments, created simultaneously across units, can cumulate and create substantial change.

The main reason people have trouble with this idea (I think) is that they believe the world is ordered and predictable, an ordered system in which a small change produces a small effect. However, when you’re talking about a complex system, small changes can create radical outcomes.

I don’t have time to expand on this here; it’s talked about in the presentation I mentioned above. Anyway, Dave Snowden and any number of other people make this point better than I can.

Big and small change in the wrong place

Here’s a new idea. One of the reasons why I think most universities are failing to improve the quality of their teaching is that they are focusing on big and small change in the wrong places.

In my experience, most universities are trying to make big improvements in teaching by introducing big changes in what academics do: use a different system, use a different pedagogy, radically change your teaching so you are constructively aligned, get a teaching qualification etc. But at the same time, there is no radical change in how the teaching environment works. There are no solutions to the above problems with the environment.

What I am suggesting is that there should be big changes in the environment to enable small changes on the part of the academic. In fact, in the presentation I argue that the aim is to help academics do what good teaching academics have always done (Common, 1989):

Master teachers are not born; they become. They become primarily by developing a habit of mind, a way of looking critically at the work they do by developing the courage to recognise faults, and struggling to improve.

References

Bigum, C. and L. Rowan (2004). “Flexible learning in teacher education: myths, muddles and models.” Asia-Pacific Journal of Teacher Education 32(3): 213-226.

Common, D. (1989). “Master teachers in higher education: A matter of settings.” The Review of Higher Education 12(4): 375-387.

Weick, K. and R. Quinn (1999). “Organizational change and development.” Annual Review of Psychology 50: 361-386.

The Wf Framework

Yet another section from Chapter 5 of the thesis describing the various changes made to Webfuse in the period from 2000 onwards. This one (very briefly) describes the Webfuse framework for dynamic web applications.

You can see the impact of this experience in the development practices I’m bringing to my work in PHP and Moodle. There are early glimmers of MVC and the Wf Framework in BIM and the indicators block.

The Wf Framework

The absence of support for dynamic web applications is the second lesson identified in Chapter 4 from the development and use of Webfuse from 1997 through 1999. Rather than just being a publishing environment, the Web was becoming an application development environment. An external consultancy during 1999, which required the development of a web-based helpdesk interface, led to the development of the Wf framework for Webfuse. The Wf framework was based on the Model-View-Controller (MVC) framework, made use of the data mapper pattern described in the previous section, and was used to develop 65 dynamic web applications.

Originally proposed during the 1980s for the development of graphical user interfaces, the MVC framework allows a single object to be presented in multiple ways, with each presentation having a separate style of interaction (Sommerville 2001). Gamma et al (1995) describe MVC as a triad of classes used to build user interfaces in Smalltalk-80 and note that it draws on a number of patterns, including Strategy, Factory, Observer and Composite. The MVC architectural pattern has since become widespread through its use in a number of web application frameworks.

All dynamic web applications using the Wf framework used a URL format that identified the application (the objectName), the method of that application to call (the methodName), and any parameters. For example, the URL for the “Staff MyCQU” application’s course history method for the CQU course COIS12073 identified the StaffMyCQU object, the CourseHistory method, and the course code (COURSE=COIS12073) as a parameter.

To parse and handle such a URL the Apache web server was configured to use a Perl module. That module used the WebfuseFactory class to identify the appropriate application controller (i.e. matching the objectName – StaffMyCQU – from the URL) and to call the appropriate method (i.e. matching the methodName – CourseHistory – from the URL) of that controller class. Any parameters (e.g. COURSE=COIS12073) would be passed to that method, and the controller would also perform authentication and access control checks to ensure that the user had permission to perform the requested method. The method in turn would normally consist of creating a model class (CourseHistory), creating a view class (CourseHistory_View), passing the model to the view, and using the view to generate the HTML to send back to the browser.
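
To illustrate the flow just described, the following is a hedged Perl sketch of the dispatch process. The class names (WebfuseFactory, StaffMyCQU, CourseHistory, CourseHistory_View) come from the description above; everything else (method signatures, the hard-coded request values, the generated HTML) is an illustrative stand-in for what the Apache/mod_perl handler actually did.

    #!/usr/bin/perl
    # Hypothetical sketch of Wf framework dispatch: URL -> controller -> model/view.
    use strict;
    use warnings;

    # Values the Apache handler would have parsed out of the request URL
    my ($object_name, $method_name, %params) =
        ('StaffMyCQU', 'CourseHistory', COURSE => 'COIS12073');

    # The factory maps the objectName in the URL to a controller class
    package WebfuseFactory;
    sub controller_for {
        my ($class, $name) = @_;
        my %known = (StaffMyCQU => 'StaffMyCQU');
        return $known{$name};   # the real code also performed auth/access checks
    }

    # A controller: each method corresponds to a methodName in the URL
    package StaffMyCQU;
    sub new { return bless {}, shift }
    sub CourseHistory {
        my ($self, %params) = @_;
        my $model = 'CourseHistory'->new($params{COURSE});  # model class (quoted to avoid bareword ambiguity)
        my $view  = CourseHistory_View->new($model);         # view class, given the model
        return $view->html;                                   # HTML to send to the browser
    }

    package CourseHistory;
    sub new { my ($class, $course) = @_; return bless { course => $course }, $class }

    package CourseHistory_View;
    sub new  { my ($class, $model) = @_; return bless { model => $model }, $class }
    sub html { my ($self) = @_; return "<h1>Course history for $self->{model}{course}</h1>\n" }

    package main;
    my $controller_class = WebfuseFactory->controller_for($object_name);
    my $controller       = $controller_class->new;
    print $controller->$method_name(%params);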

Sommerville (2001) argues that the inherent complexity of frameworks means that it takes time to learn how to use them and that this can limit their use. Experience within Webfuse in the early 2000s reinforced this perspective as new developers, familiar with the simpler coding approaches of web “scripting” common at this time, took some time to grasp and see the value of this complexity. The consistent metaphor provided by the Wf framework also clashed somewhat with the ethos of Perl (the scripting language used to implement Webfuse) that “there is more than one way to do it” and the tendency for Perl developers to “do it their way” (Jones 2003). However, as shown below (especially in Section 5.3.6 on Workarounds), the consistent metaphor and other advantages provided by this additional, initial complexity were an important part of the ability of Webfuse to respond quickly and effectively to organisational requirements and changes.

References

Gamma, E., R. Helm, et al. (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Reading, Massachusetts, Addison-Wesley.

Jones, D. (2003). How to live with ERP systems and thrive. 2003 Tertiary Education Management Conference, Adelaide.

Sommerville, I. (2001). Software Engineering, Addison-Wesley.

Object orientation and design patterns

The following is the third in a sequence of sections from chapter 5 of my thesis. These sections describe the changes made in the development and support of Webfuse from 2000 through 2004 (and a bit beyond). This post very briefly describes the adoption of a design-patterns-influenced, object-oriented design.

The biggest challenge I faced in moving from Webfuse to Moodle was returning to a procedural approach to software development. It was not just the movement from OO back to procedural, but from a system where I was deeply familiar with 900+ classes that provided a lot of low-level and high-level services for “e-learning” to a collection of procedural spaghetti code with no clean separation of services and no easy way of finding which part did what. Of course, part of this difficulty was simply my newness to Moodle.

I am wondering what the implications might be for Moodle if it had been based on a “better” OO design.

Object orientation and design patterns

While the initial design rationale for Webfuse (Jones and Buchanan 1996) mentions that object-orientation is one of a number of approaches known to maximise adaptability, the initial Webfuse implementation did not make use of object orientation. The key ideas of object-orientation arose during the 1960s, but it was the early 1990s before Fichman and Kemerer (1993) argued that object-orientation was the leading candidate to become “tomorrow’s dominant software process technology”. With object-oriented design, system designers analyse and design in terms of objects or “things” – instead of operations or functions – with an executing system made up of interacting objects that maintain state and provide operations that manipulate that state information (Sommerville 2001). Proponents argue that object-orientation helps avoid the labour-intensive need to build all code from scratch due to its support for constructing software systems through the assembly of previously developed components (Fichman and Kemerer 1993). The independent encapsulation of state and operations enables this reuse, reduces design, programming, and validation costs, and also reduces risk (Sommerville 2001).

However, programming with objects is complex and, in the case of large and complex systems, some of the ramifications are not yet fully mastered or understood (Szyperski 1999). Recognition of this problem contributed to significant interest in the identification, abstraction and use of design patterns for the problem of designing object-oriented systems. In perhaps the most important book on design patterns, Gamma et al (1995) argue that design patterns make it easier to reuse successful designs and architectures by expressing proven techniques in ways that are more accessible to developers and by allowing choice between design alternatives. A pattern is ‘a generic approach to solving a particular problem that can be tailored to specific cases. Properly used, they can save time and improve quality’ (Fernandez 1998). Sommerville (2001) argues that while patterns are a very effective form of reuse, they do have a high cost of introduction in that they can only be used effectively by experienced developers. Given a particular object-oriented design issue, a design pattern will name, abstract and identify key aspects of a common design structure, describe when it might or might not apply given other constraints, and discuss the consequences and trade-offs of the pattern (Gamma, Helm et al. 1995).

The use of object-oriented design and design patterns in Webfuse consisted of the following inter-related uses:

  1. Object-oriented wrapper around database operations;
    Differences in how data is structured mean that the transfer of data between relational databases and objects creates significant complexity (Fowler 2003). A solution to this complexity is a layer of software that isolates the two schemas, a layer commonly called a data mapper (Fowler 2003). Work on the Webfuse “data mapper” began in 1999 and became a foundation for many of the remaining applications of object-oriented design (a minimal sketch of the idea follows this list).
  2. University classes;
    A number of classes were created to match common objects within the University which Webfuse had to manipulate. For example, the People::Campus class provided state and operations around CQU’s various campuses.
  3. Support classes for Webfuse page types;
    As the first part of a planned move to convert the Webfuse page types to a complete object-oriented based approach, a range of support classes were implemented. These classes replaced the use of procedural code within new page types. The length of an average page type was reduced from 1000+ lines of code to less than 250 lines. The move to an object-oriented page type process was never completed due to a focus on other developments. The OO page type process was intended to provide the ability for a single web page to be produced by a combination of multiple page types, thus addressing the first lesson identified in Chapter 4.
  4. Dynamic web applications;
    To address the second lesson identified in Chapter 4 a framework for developing dynamic web applications was developed based on a model-view-controller framework. This work is described in more detail in the next section.
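
As a concrete, if hedged, illustration of item 1 in the list above, the following Perl sketch shows the basic shape of a data mapper: a plain domain object that knows nothing about the database, and a mapper class in which all of the SQL lives. The class, table and column names are hypothetical, and the actual Webfuse data mapper is not reproduced in this excerpt; the example uses the standard DBI module with DBD::SQLite.

    #!/usr/bin/perl
    # Hypothetical data mapper sketch: the mapper isolates the relational schema
    # from the in-memory objects (Fowler 2003).  Requires DBI and DBD::SQLite.
    use strict;
    use warnings;
    use DBI;

    # A plain domain object, ignorant of the database
    package Course;
    sub new {
        my ($class, %args) = @_;
        return bless { code => $args{code}, title => $args{title} }, $class;
    }

    # The mapper: all SQL lives here
    package CourseMapper;
    sub new { my ($class, $dbh) = @_; return bless { dbh => $dbh }, $class }

    sub insert {
        my ($self, $course) = @_;
        $self->{dbh}->do('INSERT INTO courses (code, title) VALUES (?, ?)',
                         undef, $course->{code}, $course->{title});
    }

    sub find_by_code {
        my ($self, $code) = @_;
        my $row = $self->{dbh}->selectrow_hashref(
            'SELECT code, title FROM courses WHERE code = ?', undef, $code);
        return $row ? Course->new(%$row) : undef;
    }

    package main;
    # Usage example against an in-memory SQLite database
    my $dbh = DBI->connect('dbi:SQLite:dbname=:memory:', '', '', { RaiseError => 1 });
    $dbh->do('CREATE TABLE courses (code TEXT PRIMARY KEY, title TEXT)');

    my $mapper = CourseMapper->new($dbh);
    $mapper->insert(Course->new(code => 'COIS12073', title => 'An example course'));
    my $course = $mapper->find_by_code('COIS12073');
    print "$course->{code}: $course->{title}\n";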

By 2010, the Webfuse code-base included 900+ classes, 65 dynamic web applications and 190+ test harnesses. The test harnesses were mostly developed from 2001 through 2003 when the combination of the Webfuse agile development process, the increasing use of object-orientation, and a resourced Infocom web team enabled the adoption of test-driven development. Most importantly, the design-patterns-influenced adoption of object-oriented design underpinned the Webfuse adopter-focused, agile development process that resulted in the following developments.

References

Fernandez, E. B. (1998). Building Systems Using Analysis Patterns. Third International Workshop on Software Architecture, Orlando, Florida, Association for Computing Machinery.

Fichman, R. and C. Kemerer (1993). "Adoption of software engineering process innovations: The case of object orientation." Sloan Management Review 34(2): 7-22.

Fowler, M. (2003). Patterns of Enterprise Application Architecture. Boston, Addison-Wesley.

Gamma, E., R. Helm, et al. (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Reading, Massachusetts, Addison-Wesley.

Jones, D. and R. Buchanan (1996). The design of an integrated online learning environment. Proceedings of ASCILITE’96, Adelaide.

Sommerville, I. (2001). Software Engineering, Addison-Wesley.

Szyperski, C. (1999). Component software: Beyond object-oriented programming. New York, Addison-Wesley.

Emergent and agile development

The following is the second of the sections from my thesis that describe the various changes made to Webfuse and how it was implemented during the years 2000-2004 (and a bit beyond). It’s a rough first draft. The first section describes how the Webfuse development process came to be very focused on the potential adopters of Webfuse. This section describes why and how the Webfuse development process became more emergent/agile.

I continue to hold that the traditional development life cycles (i.e. not adopter focused and not agile) used to implement e-learning within universities are a major problem.

Emergent and agile development

Jones and Buchanan (1996), in describing the initial proposed design guidelines for Webfuse, give the first guideline as “flexibility and the ability to adapt to change”. This was based on the assumption that “the one unchanging characteristic of the Internet, and the computing field in general, is that it never stops changing” (Jones and Buchanan 1996). In describing how to “design for change”, Jones and Buchanan (1996) focus on design factors (e.g. separation of implementation from interface and industry standards) and make no specific mention of development processes. As described in Chapter 4, the adoption of a traditional design process based around big up-front design and a long period of stable use significantly limited the flexibility of Webfuse and its ability to adapt to change. The processes used to support and develop Webfuse had to change to address this limitation.

Initial moves to change the development process for Webfuse commenced with the development model proposed by Jones and Lynch (1999), in particular with its emphasis on design for repair rather than replacement. This particular emphasis arose from increasing familiarity with the design patterns literature (Coplien 1999) that was becoming well known at the time. It described a process by which system evolution was enabled by continual reflection on, and modification of, the patterns and constructive templates by the development team (Jones and Lynch 1999). During 1999, however, additional ideas about how to modify the Webfuse design and support process arose from insights around emergent development (Truex, Baskerville et al. 1999), what would come to be known as agile development (Highsmith 2000; Highsmith and Cockburn 2001), and eXtreme Programming (Beck 1999; Beck 2000).

Based on those insights, Jones (2000) argues the need to move e-learning system development towards emergent development. This argument is based on the long history of failed technology-based innovations in education (Reeves 1999), which fail due to innovators underestimating the consequences of new technologies (Sproull and Kiesler 1991) and a failure to accommodate environmental and contextual factors affecting implementation (Jonassen 1998). The attraction of emergent and agile development approaches is that they are based on the assumption that the system developers are continually striving to achieve alignment between the organisation and its information systems (duPlooy 2003).

To some extent, it can be argued that the template-based structure of Webfuse provided good support for the goals of emergent development described by Truex et al (1999): continual analysis; dynamic requirements negotiation; useful, incomplete specifications; continuous redevelopment; and the ability to adapt. However, it was not until late 2000, when the author took on the lead role of an expanded Infocom web team, that concrete steps were taken to adopt emergent development. In part this was done by adopting many of the practices specified by extreme programming (Beck 2000). Adopted practices included the planning game, small releases, system metaphor, simple design, continuous testing, refactoring, continuous integration, coding standards and collective code ownership. Pair programming was used where possible, but this was not often. Since not all of the practices of extreme programming were adopted it cannot be (strictly) claimed that the Webfuse development process was extreme programming (Beck 2000). Additionally, there are numerous examples where the development team was unable to maintain the discipline extreme programming requires. However, it is argued in Jones (2003) that an emphasis on code reuse, flexibility, closeness to the user, a test-driven coding style and various other practices provided Webfuse with an agile development process.

Important Webfuse developments that enabled this more agile or emergent development process included:

  1. Object orientation and design patterns;
    As described in Section 5.3.3 a new object-oriented design for the Webfuse code, heavily influenced by the design patterns literature, was developed. This change made it significantly easier to adapt and continuously redevelop Webfuse as the Webfuse code became the incomplete specifications.
  2. The wf framework;
    A significant part of the new OO architecture was the wf framework (described in detail in Section 5.3.4) which provided the system metaphor for the development of interactive web applications.
  3. Minimum course websites;
    The minimum course websites (described in detail in Section 5.3.5), as well as addressing problems of adoption, also provided an important part of the system metaphor.
  4. Webfuse’s existing architecture; and
    The Webfuse “micro-kernel” architecture and its use of hypermedia templates (as described in Chapter 4) provided the foundation for many of the above enablers. The ability to add and modify templates independently significantly enabled design for repair, not replacement.

  5. Support of the faculty Dean.
    At the simplest level, the faculty Dean enabled the adoption of this emergent process through the provision of resources necessary to expand the Infocom web team. More importantly, as described in his writings (Marshall 2001; Marshall and Gregor 2002), the emergent development approach matched his beliefs about the higher education environment and how to proceed within it.

Section 5.3.6 describes a series of workarounds that were made possible by the adopter-focused, emergent development approach for Webfuse between 2000 and 2004.

References

Beck, K. (1999). "Embracing change with extreme programming." IEEE Computer 32: 70-77.

Beck, K. (2000). Extreme Programming Explained: Embrace Change, Addison-Wesley.

Coplien, J. (1999). "Reevaluating the architectural metaphor: Toward piecemeal growth." IEEE Software 16(5): 40-44.

duPlooy, N. F. (2003). Information systems as social systems. Critical Reflections on Information Systems: A Systematic Approach. J. Cano. Hershey, IDEA Group Inc.

Highsmith, J. (2000). Adaptive software development: A collaborative approach to managing complex systems. New York, NY, Dorset House Publishing.

Highsmith, J. and A. Cockburn (2001). "Agile software development: Business of innovation." IEEE Computer 34(9): 120-122.

Jonassen, D. (1998). Designing Constructivist Learning Environments. Instructional Theories and Models. C. M. Reigeluth. Mahwah, NJ, Lawrence Erlbaum.

Jones, D. (2000). Emergent development and the virtual university. Learning’2000. Roanoke, Virginia.

Jones, D. (2003). How to live with ERP systems and thrive. 2003 Tertiary Education Management Conference, Adelaide.

Jones, D. and R. Buchanan (1996). The design of an integrated online learning environment. Proceedings of ASCILITE’96, Adelaide.

Jones, D. and T. Lynch (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation and Evolution. First ICSE Workshop on Web Engineering, Los Angeles.

Marshall, S. (2001). Faculty level strategies in response to globalisation. 12th Annual International Conference of the Australian Association for Institutional Research. Rockhampton, QLD, Australia.

Marshall, S. and S. Gregor (2002). Distance education in the online world: Implications for higher education. The design and management of effective distance learning programs. R. Discenza, C. Howard and K. Schenk. Hershey, PA, USA, IGI Publishing: 21-36.

Reeves, T. (1999). A Research Agenda for Interactive Learning in the New Millennium. Proceedings of EdMedia’99, Seattle, Washington, AACE.

Sproull, L. and S. Kiesler (1991). Connections: new ways of working in the networked organization. Cambridge, MIT Press.

Truex, D., R. Baskerville, et al. (1999). "Growing systems in emergent organizations." Communications of the ACM 42(8): 117-123.

The road not taken

A recent post of mine continued the trend of reflecting on the impacts – in my mind negative impacts – of a top-down, compliance-driven culture in higher education. This post has been encouraged by a comment on that post which makes a number of interesting points, at least in terms of encouraging some additional thinking on my part. It has also serendipitously coincided with some recent local events.

My interpretation

I’ve interpreted the post as suggesting that no oversight can lead to a proliferation of chaos or bad practice. In terms of teaching and learning within a university I tend to agree – more on this below. There’s also a point about moving academics beyond some of their existing practices and the suggestion that the top-down chain of command isn’t really a solution. It closes with something that sparked this post:

Yes, this means we have to sell, not try to dictate. Long road.

Free-for-all, top-down compliance and chaos

“Chaos” or complexity is not necessarily a bad thing. However, I do accept that an organisation – like a university – generally has to do something to ensure that the quality of its teaching and learning is improving. (Note: one of my principles is “It’s not how bad you start, but how quickly you get better”. I don’t believe in “being good” as a goal; it’s an on-going process.) At the very least I think a university taking public funds has to demonstrate that it is using those funds somewhat effectively.

This is why, in the post that started this thread, I proposed that the first stage of improving learning and teaching (i.e. “what the teacher is”) is not the way to achieve this. In that stage, each academic is left to their own devices. What they do is up to them and their preferences and capabilities. There is little or no support. In my experience with this stage, there are some examples of very good teaching, but the vast majority is somewhat lacking.

This is where the process/quality/teaching nazis appear. These include consultants, government, educational researchers, senior management, IT folk etc. Each of these folk has the solution: the quality of teaching would be wonderful within the organisation if only every academic used process Y or product X. If every course had mapped its graduate attributes and had a course site that met a minimum service standard, then the quality of teaching would be wonderful. So, let’s set up a project team, specify the outcomes, implement them and then report success. Typically the aim is something along the lines of “Develop a systemic University-wide approach to learning and teaching” or, perhaps even worse, to prove efficiency and control by aiming to “Centralise the strategic planning and managing of funds for learning and teaching support, activities and initiatives”.

In my experience these approaches never work, mainly because the decisions made by the centralised, systemic University approach to learning and teaching are informed by experiences far removed from the realities of the teaching academics. The people making the decisions are generally senior managers who have either no recent teaching experience or only very narrow teaching experiences. Instead, the experience of these folk quickly becomes limited to the “systemic university-wide approach to learning and teaching”. That is, the initiatives they identify as important (e.g. mapping graduate attributes) become their main experience. Everything they think and do arises from that project. Their experience limits what decisions they can make.

What’s worse, the current management environment in Australian universities encourages short-term (5 year) contracts for senior managers. In order to keep their job or move on to a new one, these managers have to prove their “ability to lead”. This means that they have to have successfully “led” completed projects which they can put on their CV. What’s more, those projects have to fit within the current fads within higher education. The priority of these managers is not improving the experience of coal-face teaching academics; it’s about achieving the successful implementation of “systemic University-wide” projects.

This is why senior management can be so confident saying that Project X is a great success, when the coal-face teaching academics will be telling a very different story. This is what Chris Argyris (1990) termed organisational defensive routines and model 1 behaviour in organisations – discussed in this post.

So, in the second way, which I describe as “what management does”, the decisions about how to improve learning and teaching are being made by people who have limited experience of coal-face teaching and who also have significant motivating factors to have successful projects. Is it any surprise that this approach doesn’t create long-term sustainable change?

Rather than create the “proof” of effectiveness required by those providing the funds, this approach creates compliance and task corruption, i.e. the KPIs are met, but by ticking the boxes, not in outcome. For example, I know of an institution that has developed a “checklist” for course websites. It’s a long list of requirements that a minimum course site is expected to fulfil. The idea is that the academics who build these sites, and their colleagues who moderate the course and course site, will work their way through the checklist ensuring that each requirement is fulfilled. In reality, a significant number of academics are asking “Is your site ready?” and then ticking all the boxes.

The road not taken

My argument is that there is a third way that promises better outcomes, but it continues to be the road not taken. This is somewhat surprising for my current institution, given that its strategic plan includes the following in its vision:

We strive to understand their environment and situation, their circumstances and goals, so we can help them achieve what they want to achieve and be who and what they want to be, one person at a time.

This is a brilliant summation of what I’m trying to get at with the “third way”. My post from yesterday gives some background into the origins of this perspective (more to come).

In terms of the third way, I should have mentioned Dave Snowden’s “how to manage a birthday party” story (the video is below), which also fits nicely with the three ways I’ve expressed. Here’s the connection I make:

  • chaotic system == what the teacher is.
  • ordered system == what management does.
  • complex system == what the teacher does.

References

Argyris, C. (1990). Overcoming Organisational Defenses: Facilitating Organisational Learning, Prentice Hall.

Adopter focused development and diffusion theory

The following is a first draft of the next section in Chapter 5 of the thesis. It’s the first section which starts describing the various changes that were made to Webfuse and how it was supported from 2000 onwards. IMHO, this particular change is something that continues to be missing from almost all university attempts to support e-learning. If not missing, they simply haven’t recognised the need for greater levels of skills. It is particularly sad that my current institution, the institution at which these changes/lessons were first implemented and written about 10+ years ago, still hasn’t learnt the lesson – or perhaps the lesson just isn’t important enough.

Adopter focused development and diffusion theory

By 1999, the Webfuse experience led Jones and Lynch (1999) to identify three problems facing the development of web-based learning – appropriation, adoption and evolution – and to propose a model for the design of web-based systems, particularly for web-based learning. The three problems were described as follows:

  1. Appropriation;
    As described in Chapter 4, only a small number of staff were heavily using web-based learning. These innovative staff members were developing a number of interesting and useful applications of web-based learning. However, few of these applications were being appropriated for use by the remaining teaching staff. This mirrored experience reported elsewhere (Geoghegan 1994; Taylor 1998; Mendes and Hall 1999).
  2. Adoption, and;
    Beyond appropriation of innovations, there was a broader problem with limited adoption of web-based learning and instructional technology in general (Surry and Farquhar 1997). Jones and Lynch (1999) identified the distinction between developer-based and adopter-based development methodologies as an explanation for this limited adoption. The predominant developer-based approaches assume that a demonstrably better artifact will automatically replace existing products or practices (Surry and Farquhar 1997). By not paying attention to the users and the context, developer-based approaches create systems that are difficult to use and provide little benefit for the user.
  3. Evolution.
    Traditional development approaches aim to develop an “ideal” or all-encompassing system that meets all needs. Jones and Lynch (1999) argue that the time and resources necessary to achieve this goal generate little pay-off due to the ever-changing requirements of web-based learning. It is suggested that as the context changes, these ideal systems become a burden preventing adaptability (Jones and Lynch 1999).

To address these issues Jones and Lynch drew on insights from diffusion theory (Rogers 1995), related insights from adopter-based development (Surry and Farquhar 1997), the design patterns community (Alexander, Ishikawa et al. 1977; Gamma, Helm et al. 1995), and established links between design patterns and hypermedia templates (Nanard, Nanard et al. 1998) to propose a development model to address these problems. The model can be described as a combination of the following principles or assumptions:

  1. There exists a development team that actively seeks to understand the social context within which web-based learning is occurring. The inter-relationships between the developers of the system, the developed system, the potential adopters of the system and the contexts in which the system is developed and used are of significant importance.
  2. The development team develops a set of constructive templates (Nanard, Nanard et al. 1998) that teaching staff can use to create course websites for use by students.
  3. To encourage adoption and use by teaching staff the design of these templates is informed by insights from diffusion theory.
  4. Template design is informed by design patterns that encapsulate knowledge around learning, teaching and web-based services.
  5. There is recognition that innovative staff members are likely to do unexpected things with the constructive templates, or even ignore them altogether and use other means. This is allowed, and where possible enabled.
  6. The development team continues to observe and support staff throughout the use of the course sites to identify what is and is not working.
  7. Based on this observation the development team abstracts new design patterns, retires those no longer appropriate and does the same for constructive templates.

Jones and Lynch (1999) believed that this development model would provide three major benefits:

  1. Develop systems which are more likely to be adopted. This is achieved through a major emphasis on context, adopter-led development approaches and theory from the diffusion of innovations.
  2. Enable the appropriation and reuse of prior experience. This is gained by a continual process of evaluating the work of innovators for potential abstraction into a pattern repository and implementation as constructive templates.
  3. Enable the continued evolution of the system to meet changing needs. Evolution is provided by the continued application of patterns in a form of piecemeal growth and by emphasising design for repair rather than replacement.

By 1999 it was well understood that a significant majority of academic staff, for whatever reason, were not going to construct elaborate course websites. Jones and Lynch (1999) describe the plan to develop a Webfuse Wizard. Built on top of the Webfuse template library, the Wizard would provide a “wizard” interface that would guide teaching staff through the creation of a course site. The working prototype due in July 1999 was not completed due to various factors, including the need for the author to continue teaching. The eventual replacement for the wizard, in terms of providing an easier method for creating course websites, was the development of the default or minimum course site idea described in Section 5.3.5. The principles and assumptions of the development model described by Jones and Lynch were to form the basis for how the Infocom web team operated from 2000 through 2004.

The use of diffusion theory continued and evolved throughout this period. Jones, Jamieson and Clark (2003) propose a model to “aid educators increase their awareness of potential implementation issues, estimate the likelihood of reinvention, and predict the amount and type of effort required to achieve successful implementation of specific WBE innovations”. The model draws on one aspect of diffusion theory and was informed by the practices of the Infocom web team during the period 2000-2004. Rather than focus on the potential objective benefits of an innovation (e.g. online assignment submission will result in faster turnaround times), the model recommends an evaluation of the innovation in terms of the likely rate of adoption, perceived innovation attributes, the type of innovation decision, the available communication channels, the social system in which it will be implemented, and the nature of the change agents and their efforts. Jones et al (2003) argue that this type of evaluation is highly context specific, to the extent that even for the same innovation at the same institution, the evaluation should be repeated as time passes.

Diffusion theory is not without its limitations or problems. Bigum and Rowan (2004) argue that “the problem for a theory of change that relies on pre-established categories is that it is limited in its capacity to account for new and unanticipated arrangements or orderings”. McMaster and Wastell (2005), in arguing that diffusionism is a myth whose potency is based not on empirical validity but on a synergy with a colonialist mind-set, offer a description of the many flaws that have been identified in diffusion theory. Amongst these is that diffusion theory is deterministic and positivistic in philosophical orientation, which leads its proponents to predict outcomes based on the measurement of a small number of variables (McMaster and Wastell 2005). The “factor approach”, where key features/factors are correlated with outcome measures, is common to diffusionist research in the IS field (McMaster and Wastell 2005). The approach described by Jones et al (2003) is an example of one such approach. McMaster and Wastell (2005) cite numerous authors to support the argument that while such “factor approaches” can “highlight important influences, they necessarily fail to capture the dynamic, processual character of social-technical innovation”.

The development model developed by the Infocom web team over the period from 2000 through 2004 addressed the “dynamic, processual character of social innovation” with its particular focus on evolution. Having been guided by diffusion theory in its selection of a particular set of changes, the web team kept a close eye on how these innovations were being used by individual staff. This was enabled by the fact that the web team were not only the developers of Webfuse; they also provided helpdesk support and training for users of Webfuse. The web team was employed by the faculty, not by the central IT division. This meant the web team had offices in the same building, went to the same tea room and attended faculty retreats. These social interactions provided a deeper insight into the experiences of the academic users of Webfuse. As a result, the Webfuse development model moved towards what McMaster and Wastell (2005) describe:

Here we would argue that the innovation succeeded due to internal development, through a participative process involving strong local leadership, engaged staff and the fortuitous occurrence of a series of local crises that aligned all stakeholders around the need for change.

Bigum and Rowan (2004) argue that a serious problem with diffusion theory is that the innovation is understood to pass through the adoption process largely unchanged, meaning that the social is seen either to conform or not to conform with the requirements of the innovation. As an adopter-focused development process, the Webfuse development model recognised that not only will innovations change during the adoption process, but great benefit can arise from being able to recognise that change and respond to it in ways appropriate to the social setting. The development model understood and sought to enable the uses and consequences of the innovations to emerge from the complex interactions between the social and the technical. As described by Markus and Robey (1988), this type of perspective replaces prediction with a detailed understanding of dynamic organisational processes, the intentions of actors and features of information technology.

References

Alexander, C., S. Ishikawa, et al. (1977). A Pattern Language: Towns, Buildings, Construction, Oxford University Press.

Bigum, C. and L. Rowan (2004). "Flexible learning in teacher education: myths, muddles and models." Asia-Pacific Journal of Teacher Education 32(3): 213-226.

Gamma, E., R. Helm, et al. (1995). Design Patterns: Elements of Reusable Object-Oriented Software. Reading, Massachusetts, Addison-Wesley.

Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conference of the International Business Schools Computing Association, Baltimore, MD, IBM.

Jones, D., K. Jamieson, et al. (2003). A model for evaluating potential Web-based education innovations. 36th Annual Hawaii International Conference on System Sciences, Hawaii, IEEE.

Jones, D. and T. Lynch (1999). A Model for the Design of Web-based Systems that supports Adoption, Appropriation and Evolution. First ICSE Workshop on Web Engineering, Los Angeles.

Markus, M. L. and D. Robey (1988). "Information technology and organizational change: causal structure in theory and research." Management Science 34(5): 583-598.

McMaster, T. and D. Wastell (2005). "Diffusion – or delusion? Challenging an IS research tradition." Information Technology & People 18(4): 383-404.

Mendes, M. and W. Hall (1999). "Hyper-Authoring for Education: A Qualitative Evaluation." Computers and Education 32: 51-64.

Nanard, M., J. Nanard, et al. (1998). Pushing Reuse in Hypermedia Design: Golden Rules, Design Patterns and Constructive Templates. Proceedings of the 9th ACM Conference on Hypertext and Hypermedia, ACM.

Rogers, E. (1995). Diffusion of Innovations. New York, The Free Press.

Surry, D. and J. Farquhar (1997). "Diffusion Theory and Instructional Technology." e-Journal of Instructional Science and Technology 2(1): 269-278.

Taylor, P. (1998). "Institutional Change in Uncertain Times: Lone Ranging is Not Enough." Studies in Higher Education 23(3): 269-278.

The design of a Moodle course site

For a couple of different reasons I am helping someone with the design and implementation of a Moodle course site. I’ve developed an activity module for Moodle but have never created an entire course site. I have thought about how it might be done, but never done it. The following is a description and some reflections on the experience.

I’m particularly interested in:

  • Is the “pragmatic” approach to designing a Moodle course site as widespread as I think? Or are there great swathes of teachers creating Moodle course sites from scratch?
  • What are the experiences of folk who have used the social format for a course? Good? Bad? Happy with the support Moodle provides for this format?
  • What about those using the topic/weekly course formats – how do you deal with the scrolling problem?

Reflection

After doing all of the following, it’s not a great surprise that my opinions, biases and prejudices have been confirmed. In particular, it doesn’t matter what minimum standards or tool affordances are put in place by management and technologists; most academics are looking to tweak how they’ve previously taught their courses and then move on to other things. If you want to improve the quality of L&T, you have to move beyond developing policies or technologies.

It’s also reinforced that Moodle 1.9.x still retains some fairly significant limitations when trying to do something fairly flexible. I’m not sure there’s much here that’s going to break an academic’s “tweaking” behaviour. Don’t get me wrong, in comparison to some other LMSs it’s better, in places. It’s just that this isn’t exactly a very high bar to jump.

It would be tremendously interesting to gather some more qualitative stories of how academics have handled this transition, compare that with what management thinks has gone on, and then compare all of that with what the students thought.

The purpose of the course site

I can hear some educational developers I know wanting to step back, look at the outcomes of the course, do a curriculum mapping, evaluate the alignment of the course and identify appropriate learning activities that would improve it – preferably through the lens of Chickering and Gamson’s (1987) seven principles for good practice in undergraduate education (a particular lens our institution is using). I can hear another staff member rabbiting on about the work of Oliver (2000) and Herrington and colleagues, the learning design crew (Bennett et al 2006), the LAMS folk (Dalziel, 2003) or any of a number of other educationally informed design approaches. Long-term Moodle folk might refer to the “principles” often said to underpin Moodle.

The reality is that this academic has other priorities, which means the academic wants to do the minimum. To quote:

All I essentially want to do is to copy the last time I ran ..the course..

This is in line with what Stark (2000) identified as the dominant setting for most academics, i.e. teaching an existing course, generally one they’ve taught before; consequently they will spend most of their time fine-tuning the course or making minor modifications to material or content.

That is, most academics are not going to design a course from scratch. They are going to recreate what they know. This is one of the reasons why one of the most popular local “innovations” around Moodle at my current institution has been a Moodle site “template” that re-creates the hierarchical structure of a course site from Blackboard, the LMS most staff will have used prior to their move to Moodle.

So, this academic is not alone in simply wanting to re-create what they did before.

Other constraints

This, however, is not the only constraint. The institution has introduced “Minimum Service Standards for course delivery”, which are intended to “provide the pedagogical basis for developing online learning environments and to encourage academic staff to look beyond existing practices and consider the useful features of the new LMS.” (Tickle et al, 2009). Anecdotal evidence suggests that for a significant number of staff the minimum standards have become a “tick the boxes” exercise, i.e. at best, make sure the boxes are ticked; at worst, make sure the boxes are ticked without necessarily having achieved the standard.

So, the site design will have to meet those requirements.

Starting the copy process

If the aim is to copy what was done before, the questions are:

  • What was done before?
    What was in the previous course site and how was it structured?
  • What can be done now?
    i.e. what are the constraints and assumptions built into Moodle?
  • How can you get from one to the other?

The rest of this seeks to answer those questions.

What was done before?

The course was hosted in Webfuse and was implemented as a standard Webfuse minimum course site. Such a site uses a simple hierarchical structure: a home page with a description of the course and a list of updates, plus five sub-sections (a rough sketch of the structure follows the list):

  1. Updates – place where course wide updates were created/archived.
    Only a couple of system wide updates used.
  2. Study Schedule – a week by week breakdown of the course.
    Reasonably complete, with a description of the week’s topic and a collection of basic tasks. The Word documents for the study guide are uploaded in this section. The design assumption (I was the designer of Webfuse) was that they would be uploaded into the Resources section and linked from both here and there.
  3. Assessment – a description of the assessment for the course.
    A description of the assessment pieces.
  4. Resources – a collection of the learning resources.
    In this case, only the discussion forum, mailing list and barometer. Having both a discussion forum and a mailing list is interesting.
  5. Staff – photos and contact details for all staff and also a staff only section.
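
To make that hierarchy explicit, here’s a minimal sketch of the old site’s structure as plain data. It’s purely illustrative – the section names come from the list above, but the Python representation and field names are my own, not how Webfuse actually stored course sites.

    # Illustrative only: the old Webfuse minimum course site as a simple data
    # structure. Section names are from the site; everything else is invented.
    old_course_site = {
        "home": "Course description plus a list of updates",
        "sections": {
            "Updates": "Course-wide updates, created and archived here",
            "Study Schedule": "Week-by-week breakdown; study guide Word documents uploaded here",
            "Assessment": "Description of the assessment pieces",
            "Resources": "Discussion forum, mailing list and barometer",
            "Staff": "Photos and contact details, plus a staff-only area",
        },
    }

    if __name__ == "__main__":
        for name, purpose in old_course_site["sections"].items():
            print(f"{name}: {purpose}")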

A fairly (very) basic course. Two glimmers of hope (if you’re taking the learning nazi approach) are:

  • the Discussion forum.
    Quite detailed in structure; an attempt seems to have been made to think this through. However, there were not many contributions. Given that the vast majority of students in this course are at the international campuses, which (normally) have a heavy focus on face-to-face instruction, this isn’t that surprising.
  • use of BAM.
    Students are expected to maintain individual blogs for reflection as part of the assessment.

What can be done now?

Layout

This page from Uni Ballarat gives a good overview of the layout of a Moodle course site, i.e. a main course area in the middle and a column of blocks on either side (though at least one of the columns can be turned off).

With the blocks, it appears that you can add and remove them as you like. I’ll have to test that out later.

With the layout of the main course area, there are three normal Moodle options and a fourth local kludge. The three normal Moodle layouts are:

  1. Weekly – where the course site is divided into sequential, weekly blocks.
  2. Topical – where the sequential blocks are based on topics.
  3. Social – where the site is structured around the discussion forum.

All three might be options for this course. The original study schedule could be copied into either the weekly or topical formats with little or no modification. The use of the discussion forum in the old course could fit very nicely with the social format, with some of the information from the study schedule woven into messages.

The fourth local kludge appears to be a mutation of the weekly format in which the course area (the non-week first block) is used for a course logo or similar. The first main week’s block is used for a collection of HTML tables that creates:

  • A welcome course description message in one table.
  • A collection of menu items (e.g. the course, resources, discussions, assessment etc.) which give the illusion of a hierarchical site. Under each menu is a collection of links to Moodle activities/resources related to that menu item.
  • A simple weekly navigation menu (week 1, week 2 etc.) that links to a simple web page that summarises the tasks for that week.

The second week’s block is hidden. This, it appears, is where the actual activities and resources are added to the site. They are then linked, as appropriate, to the menu section and the weekly summaries.

I don’t think this will work in this situation. It’s a fair bit of work to set up and appears to break the affordances of Moodle. While I’m keen to minimise the difficulty of the transition, surely you do actually want to move with the affordances of the new tool?

Resources and activities

The requirements for this course are fairly limited: a discussion forum, some Word documents and a BAM equivalent. As there is no standard module in Moodle that implements a BAM service, this could have been a problem. However, the institution is currently using BIM, the activity module I wrote as a port of BAM, so it isn’t.

Helping the academic decide

While I can imagine either the weekly/topic or social format versions of this course, I don’t think that the academic will be able to. In addition, I’ve never really seen a social format Moodle course, so I’m not 100% confident that my predictions will match the reality. Hence the need to create example sites, something concrete for the academic to look at and play with.

Create the site

I’m doing this on a local install of Moodle, so the first step is to create the site.

Mm, there are other formats: LAMS, SCORM and one or two others. Whether they are available on the institutional site is another question.

If you choose “Social format” it still asks for the number of weeks/topics; wouldn’t that no longer be needed?

So, that’s the site created. By default it’s got some pre-defined blocks down the left-hand side – I think these are based on the institutional defaults, but they might just be the normal Moodle defaults. There’s the option to add blocks in the right-hand column and an almost empty space in the middle with an “Add a new discussion topic” button.

Adding the topics

In my head, we’re going to borrow the approach used in the old course site’s discussion forum, i.e. different forums for different purposes, including one for each week. The last time I taught, I used a similar approach. The plan is to use pretty much the same structure.

Oh, that is sad. It appears that rather than separate discussion forums which can be used to separate out tasks, Moodle has by default set the course area as a single forum. Ahh, and the topics are shown with most recent first. Surely there’s got to be an option to change that?

It appears not; it looks like my assumptions don’t match the affordances in Moodle. That’s sad and means we’re back to the bog-standard weekly/topic format.

The general visuals of the Moodle forum were also not that great: somewhat ugly and weak from an interface perspective. I wonder whether that’s inherent or arises from the institutional template? It’s sad because the discussion forum tool in Webfuse (circa 2002) seems to have a better interface (not a great one, but certainly better than Moodle’s).

Creating the weekly format

I’ve chosen weekly because, like it or not, most students and staff think in weeks of term, plus there’s not a lot of difference in Moodle between the weekly and topic formats – at least to my inexperienced eye.

So, edit the options for the course to weekly and start a process of re-creating the study schedule from the old site in Moodle.

From here it’s a fairly manual process, with only a few decisions about how to do it.
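
Given how repetitive the recreation is, I’d be tempted to at least generate the summary text for each weekly section and then paste it into Moodle’s editor, rather than hand-writing it week after week. A rough sketch of what I mean follows – the week data and HTML shape are made up for illustration, and as far as I know Moodle 1.9 offers no supported way to bulk-load this, so it’s copy and paste either way.

    # Hypothetical helper: generate HTML summaries for each weekly section so
    # they can be pasted into Moodle's editor instead of hand-written each time.
    weeks = [
        # (week number, topic, tasks) -- placeholder content, not the real schedule
        (1, "Course introduction", ["Read the course profile", "Post to the week 1 forum"]),
        (2, "Second topic", ["Read study guide chapter 2", "Update your reflective blog"]),
        # ... remaining weeks follow the same pattern
    ]

    def week_summary(number, topic, tasks):
        """Return an HTML snippet summarising one week."""
        items = "\n".join(f"  <li>{task}</li>" for task in tasks)
        return f"<p><strong>Week {number}: {topic}</strong></p>\n<ul>\n{items}\n</ul>"

    if __name__ == "__main__":
        for number, topic, tasks in weeks:
            print(week_summary(number, topic, tasks))
            print()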

Time to wait and see what the academic thinks.

The scrolling problem

I think that the “scrolling problem” is a fairly typical complaint about Moodle sites. If you use the topic/week format and have more than a few fairly complete blocks, people have to start scrolling to get anywhere. That can add to the confusion for some.

This was a problem we faced with the Webfuse study schedule page design. The obvious way to solve it was internal links. If you visit this study schedule page you will see that each weekly block has internal links to each of the other 12 weeks.

Is there an automated process in Moodle that helps do this?

I think for this site, I’ll have to put in a kludge, somewhat like the local kludge course layout.
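
For what it’s worth, the kludge I have in mind is along these lines: a small script that spits out a “jump to week” menu built on internal anchors, plus the matching anchor targets to paste at the top of each weekly summary. This is only a sketch – the anchor names and markup are my own invention, not something Moodle generates for you.

    # Hypothetical sketch of the scrolling kludge: a "jump to week" menu using
    # internal anchors, plus the anchor target for each weekly summary.
    NUM_WEEKS = 12

    def week_menu(num_weeks=NUM_WEEKS):
        """Return an HTML menu linking to an anchor for each week."""
        links = " | ".join(
            f'<a href="#week{week}">Week {week}</a>' for week in range(1, num_weeks + 1)
        )
        return f"<p>{links}</p>"

    def week_anchor(week):
        """Return the anchor target to paste at the start of a week's summary."""
        return f'<a name="week{week}"></a>Week {week}'

    if __name__ == "__main__":
        print(week_menu())
        for week in range(1, NUM_WEEKS + 1):
            print(week_anchor(week))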

References

Bennett, S., S. Agostinho, et al. (2006). “Supporting university teachers create pedagogically sound learning environments using learning designs and learning objects.” IADIS International Journal on WWW/Internet 4(1): 16-26.

Chickering, A. W. and Z. F. Gamson (1987). “Seven principles for good practice in undergraduate education.” AAHE Bulletin 39(7): 3-7.

Dalziel, J. (2003). Implementing learning design: The learning activity management system (LAMS). 20th Annual Conference of the Australasian Society for Computers in Learning in Tertiary Education, Adelaide, SA.

Oliver, R. (2000). When teaching meets learning: Design principles and strategies for Web-based learning environments that support knowledge construction. ASCILITE’2000, Coffs Harbour.

Stark, J. (2000). “Planning introductory college courses: Content, context and form.” Instructional Science 28(5): 413-438.

Tickle, K., N. Muldoon, et al. (2009). Moodle and the institutional repositioning of learning and teaching at CQUniversity. ascilite 2009. Auckland, NZ: 1038-1047.

The role of experience

Peter Albion picked up on an earlier post of mine and offers a brief description of his own experience within Australian universities: in particular, the increasing focus on compliance with bureaucratic systems as a means of assuring quality, a move back to hierarchies of command and control, and an apparent adoption of a Theory X view. It’s a view that resonates with what I see within my current institution and one that others talk about.

This morning I was listening to this talk by Baroness Susan Greenfield. In the end she suggests that online networking is potentially harmful, but I’m going to ignore that. One of the fundamental planks of her argument is brain plasticity, i.e. that the brain is shaped by what we do with it. What we experience and what we think shapes our brain.

What is the current environment of compliance, command and control, and Theory X doing to the thoughts and brains of the academics that work within them?

Dan Pink talks about motivation and suggests that it requires workers to have feelings of autonomy, mastery and purpose. When it comes to learning and teaching within universities, I’ve argued previously that for some the current environment provides anything but that combination.

As it happens, I’m also reading a book by James Zull called The Art of Changing the Brain: Enriching the practice of teaching by exploring the biology of learning. I think this quote is interesting (emphasis in original):

..no outside influence or force can cause a brain to learn. It will decide on its own. Thus, one important rule for helping people to learn is to help the learner feel she is in control.

For me, the lesson here is that if you want to improve learning and teaching at universities, the academics have to feel that they are in control. This does not mean they do their own thing. As Peter wrote:

There is some benefit in ensuring that certain basics are in place but there is also room for some variation that provides scope for the next improvement to emerge.

The academic has to feel like they are in charge of that next improvement, to have the room for some variation. The compliance-driven, top-down culture infecting universities (in Australia at least) is removing that control and is often ineffective in ensuring that the basics are in place, because it has removed the motivation (in the form of autonomy, mastery and purpose) from the academics.