Thinking about evaluating Webfuse (1996 through 1999) – evaluation of an LMS?

For the last couple of weeks I’ve been working on chapter 4 of my thesis. I’ve worked my way through explaining the context (both the general context and the use of e-learning), the design guidelines, and the implementation (parts 1, 2 and 3). I’ve now reached the evaluation section, where I’m meant to describe what happened with the use of Webfuse and make some judgement calls about how it went.

The purpose of this post is to make concrete what I’m thinking about doing. A sort of planning document. I don’t think it will be of much use to most others, though the following section on related work might be of some interest.

Other related work

Indicators project

Col and Ken, two colleagues at CQU, have started the Indicators project, which seeks to provide academics with tools to reflect on their own usage of LMSes. Their most recent presentation is up on Slideshare (where’s the video, Col?).

They are currently drawing primarily on data from the Blackboard LMS, which was used at CQU from about 2004 through 2009. Webfuse was essentially a parallel system, but one that ran from 1997 through 2009. Both are being replaced by Moodle in 2009.

At some stage, I am hoping to mirror the work they are doing with Blackboard on Webfuse. This would complete the picture to encompass all e-learning at CQU and also potentially provide some interesting comparisons between Webfuse and Blackboard. This will be somewhat problematic, as there are differences in assumptions between Webfuse and Blackboard. For example, Webfuse generally doesn’t require students to log in to visit the course website; most course sites are freely available.

Some of the data from Ken’s and Col’s presentation about Blackboard:

  • 5147 courses – it would be interesting to hear the definition of “course”, as a number of Blackboard courses during this period were simply pointers to Webfuse courses.
  • Feature adoption, using a framework adapted from Malikowski et al (2007), as a percentage of online courses from 2005 through 2009:
    • Files: ranging from 50% to 78%
      Which raises the question, what did the other 22-50% of courses have in them, if no files? Just HTML?
    • News/Announcements: ranging from 77% to 91% (with a peak in 2007).
    • Gradebook: ranging from 17% to 41%
    • Forums: ranging from 28% to 61%
    • Quizzes: ranging from 8% to 15%
    • Assignment submission: ranging from 4% to 20%.

    An interesting peak: in most of the “lower level” features there seems to have been a peak, in percentage terms, in 2007. What does that mean? A similar, though less pronounced, peak is visible in the forums, quizzes and assignment submission categories.

    It might be interesting to see these figures as a percentage of students. Or perhaps with courses broken down into categories such as: predominantly AIC (CQU’s international campuses), predominantly CQ campuses, predominantly distance education, large (300+ students), small, complex (5+ teaching staff), simple.

  • Hits on the course site
    There are a couple of graphs that show a big peak at the start of term with a slow drop-off, and the occasional peak during term.

    It might be interesting to see the hit counts for those courses that don’t have discussion forums, quizzes or assignment submission. I suspect these facilities are the only reason there might be peaks as the term progresses, since students use them for assessment.

  • Student visits and grades.
    There are a few graphs that show a potentially clear connection between the number of visits to a course site and the final grade (e.g. High Distinction students – the top grade – average a bit over 500 hits, while students who fail average just over 150 hits). It is more pronounced for distance education students than for on-campus students (e.g. distance education High Distinction students average almost 900 hits).
  • Average hits by campus.
    Distance education students averaged almost 600 hits; students at the AICs, fewer than 150.
  • Average files per course in term 1.
    This has grown from just over 10 in 2005 to just over 30 in 2009.

    I wonder how much of this is due to gradual accretion. In my experience most course sites are created by copying the course site from the previous term and then making some additions/modifications. Under this model, the average number of files could grow simply because the old files aren’t being deleted.

Malikowski, Thompson and Theis

Malikowski et al (2007) proposed a model for evaluating the use of course management systems. The following figure is from their paper. I’ve made use of their work in my thesis when examining the quantity of feature usage in an LMS (read this if you want more information on their work).

Malikowski Flow Chart

Purpose of the evaluation

The design guidelines underpinning Webfuse in this period were:

  • Webfuse will be a web publishing tool
  • Webfuse will be an integrated online learning environment
  • Webfuse will be eclectic, yet integrated
  • Webfuse will be flexible and support diversity
  • Webfuse will seek to encourage adoption

I’m assuming that the evaluation should focus on the achievement (or not) of those guidelines. The limitation is that I’m restricted to archives of websites and system logs. I won’t be asking people, as this was 1996 to 1999.

Some initial ideas, at least for a starting place:

  • Webfuse will be a web publishing tool
    How many websites did it manage? How many web pages were on those sites? How much were they used by both readers and authors? (A minimal counting sketch follows this list.)
  • Webfuse will be an integrated online learning environment
    Perhaps use the model of Malikowski et al (2007) to summarise the “learning” functions that were present in the course sites. This would repeat some of the figures from above.

    I recognise this doesn’t really say much about learning. But automated analysis of system logs can’t really judge learning any more effectively.

  • Webfuse will be eclectic, yet integrated
    This will come down to the structure/implementation of Webfuse itself, i.e. demonstrating that it was eclectic, yet integrated.
  • Webfuse will be flexible and support diversity
    Examine the diversity of usage (there wasn’t much). Flexibility will arise to some extent from the different systems implemented.
  • Webfuse will seek to encourage adoption.
    This will come back to the figures above; it can be a reflection on the statistics outlined for the first two guidelines.

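The counting sketch mentioned above might look something like the following. It assumes the archives are unpacked as one directory per course site under a per-year directory – that layout, and the archives/1997 style paths, are assumptions about how I’ll organise the data, not a description of Webfuse itself.

```python
# A minimal counting sketch. The archives/<year>/<course>/ layout is an
# assumption about how the archives get unpacked, not part of Webfuse.
import os

def site_and_page_counts(year_dir):
    """Count course sites and the HTML pages within them for one year."""
    sites = 0
    pages = 0
    for entry in os.listdir(year_dir):
        site_dir = os.path.join(year_dir, entry)
        if not os.path.isdir(site_dir):
            continue
        sites += 1
        for _root, _dirs, files in os.walk(site_dir):
            pages += sum(1 for f in files if f.endswith((".htm", ".html")))
    return sites, pages

for year in ("1997", "1998", "1999"):
    print(year, site_and_page_counts(os.path.join("archives", year)))
```
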
Process

So, there’s a rough idea of what I’m going to do; what about a rough idea of how to implement it? I have access to copies of the course websites for 1998 and 1999. I’m hoping to have access to the 1997 course sites in the next couple of weeks, but it may not happen – some things are just lost to time – though the Wayback Machine may be able to help out there. I also have the system logs from 1997 onwards.

In terms of applying Malikowski et al’s (2007) framework, I’ll need to (a rough sketch follows the list):

  • Unpack each year’s set of course websites.
  • Get a list of all the page types used in those sites.
  • Categorise those page types into the Malikowski framework.
  • Calculate percentages.

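A rough sketch of those steps. It assumes each Webfuse page records its page type in a meta tag and that the mapping from page types to Malikowski categories is built by hand from the list of types actually found in the archives – both the tag format and the page-type names below are hypothetical placeholders.

```python
# A sketch: tally Malikowski categories across one year's course sites.
import os
import re
from collections import Counter

# Hypothetical mapping of Webfuse page types to Malikowski et al (2007)
# categories; the real mapping has to be built by hand.
MALIKOWSKI_CATEGORY = {
    "ContentPage": "transmitting content",
    "Lecture": "transmitting content",
    "DiscussionBoard": "class interactions",
    "Quiz": "evaluating students",
    "AssignmentSubmission": "evaluating students",
    "ResultsOnline": "evaluating students",
}

# Assumed format for how a page records its type; also hypothetical.
PAGE_TYPE_RE = re.compile(r'name="pagetype"\s+content="([^"]+)"')

def course_categories(course_dir):
    """Return the set of Malikowski categories present in one course site."""
    found = set()
    for root, _dirs, files in os.walk(course_dir):
        for name in files:
            if not name.endswith((".htm", ".html")):
                continue
            with open(os.path.join(root, name), errors="ignore") as page:
                match = PAGE_TYPE_RE.search(page.read())
            if match and match.group(1) in MALIKOWSKI_CATEGORY:
                found.add(MALIKOWSKI_CATEGORY[match.group(1)])
    return found

def adoption_percentages(year_dir):
    """Percentage of a year's course sites using each Malikowski category."""
    courses = [os.path.join(year_dir, d) for d in os.listdir(year_dir)
               if os.path.isdir(os.path.join(year_dir, d))]
    counts = Counter()
    for course in courses:
        for category in course_categories(course):
            counts[category] += 1
    return {cat: 100.0 * n / len(courses) for cat, n in counts.items()}

print(adoption_percentages("archives/1998"))
```
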
In terms of looking at the files uploaded to the sites, I’ll need to repeat the above, but this time over all the files, excluding those that were produced by Webfuse.
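
Something like the following variation on the walk above might work. How Webfuse-generated files are recognised here (a set of generated filenames and a list of upload extensions) is purely an assumption; the real markers will come from inspecting the archives.

```python
# A variation on the walk above: list the files staff uploaded to a
# course site, skipping pages Webfuse itself generated. The markers
# below (WEBFUSE_GENERATED, UPLOAD_EXTENSIONS) are hypothetical.
import os

WEBFUSE_GENERATED = {"index.html", "index.htm"}
UPLOAD_EXTENSIONS = (".pdf", ".doc", ".ppt", ".xls", ".zip", ".ps")

def uploaded_files(course_dir):
    """List files staff uploaded to a course site."""
    uploads = []
    for root, _dirs, files in os.walk(course_dir):
        for name in files:
            if name in WEBFUSE_GENERATED:
                continue
            if name.lower().endswith(UPLOAD_EXTENSIONS):
                uploads.append(os.path.join(root, name))
    return uploads
```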

Author updates – I can parse the web server logs for the staff who are updating pages. The same parsing will be able to get records for any students who had to log in; these will be a minority.
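
A minimal sketch of that parsing, assuming Apache common log format with an authenticated-user field. The idea that page updates go through a path containing “page_update”, and the log file name, are hypothetical; the real pattern has to come from the logs themselves.

```python
# A sketch: count successful page updates per authenticated user,
# assuming Apache common log format. The "page_update" path marker
# and the log filename are assumptions.
import re
from collections import Counter

LOG_RE = re.compile(
    r'(?P<host>\S+) \S+ (?P<user>\S+) \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+'
)

def author_updates(log_path):
    """Count successful page updates per authenticated user."""
    updates = Counter()
    with open(log_path, errors="ignore") as log:
        for line in log:
            m = LOG_RE.match(line)
            if not m:
                continue
            if (m.group("user") != "-"
                    and "page_update" in m.group("path")
                    and m.group("status").startswith("2")):
                updates[m.group("user")] += 1
    return updates

print(author_updates("logs/access_log.1998"))
```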

References

Malikowski, S., Thompson, M., & Theis, J. (2007). A model for research into course management systems: bridging technology and learning theory. Journal of Educational Computing Research, 36(2), 149-173.
