Monthly Archives: February 2009

Patterns for e-learning – a lost opportunity or destined to fail

In the following I reflect on my aborted and half-baked attempts at harnessing design patterns within the practice of e-learning at universities and wonder whether it was a lost opportunity and/or a project that was destined to fail. This is written in the light shed by the work of a number of other folk (Google “patterns for e-learning”), including the current JISC-emerge project and, I believe, the related Pattern Language Network.

I think I’ll end up contending that it was destined to fail and hope I can provide some justification for that. Or at least that’s what I currently think, before writing the following. Any such suggestion will be very tentative.

Context

Way back in 1999 I was a young, naive guy at the crossroads of software development and e-learning, wondering why more academics weren’t being innovative. Actually, the biggest and most troubling question was much simpler: “Why were they repeating the same mistakes I and others had made previously?”. For example, I lost count of the number of folk who tried to use email for online assignment submission in courses with more than 10 or 20 students, even though many folk had tried it before them, had problems, and talked about the additional workload it created.

At the same time I was looking at how to improve the design of Webfuse, the e-learning system I was working upon, and object-oriented programming seemed like a good answer (it was). Adopting OOP also brought me into contact with the design patterns community within the broader OOP community. Design patterns within OOP were aimed at solving many of the same problems I was facing with e-learning.

Or perhaps this was an example of Kaplan’s law of the instrument: patterns were the hammer and the issues around e-learning looked like nails.

Whatever the reason, some colleagues and I tried to start up a patterns project for online learning (I’m somewhat amazed that the website is still operating). The “why page” for the project explains the rationale. We wrote a couple of papers explaining the project (Jones and Stewart, 1999; Jones, Stewart and Power, 1999), gave a presentation (the audio for the presentation is there in RealAudio format, which shows how old this stuff is) and ran an initial workshop with some folk at CQU. One of the publications was also featured in ERIC and on OLDaily.

The project did produce a few patterns before dying out:

There’s also one that was proposed but nothing concrete was produced – “The Disneyland Approach”. This was based on the idea of adapting ideas from how Disney designs their theme parks to online learning.

I can’t even remember all the reasons the project died out. Though I did get married a few months afterwards, and that probably impacted my interest in doing additional work. Not to mention that my chief partner in crime left the university for the paradise of private enterprise around the same time. That was a big loss.

One explanation and a “warning” for other patterns projects?

At the moment I have a feeling (it needs to be discussed and tested to become more than that) that these types of patterns projects are likely to be very difficult to get to work within the e-learning environment, especially if the aim is to get a broad array of academics to, at least, read and use the patterns. If the aim is to get a broad array of academics to contribute to patterns, then I think it becomes almost impossible. This feeling/belief is based on three “perspectives” that I’ve come to draw upon recently:

  1. seven principles for knowledge management that suggest pattern mining will be difficult;
  2. the limitations of using the “Technologists’ Alliance” to bridge the gap; and
  3. people (and academics) aren’t rational, which is why they won’t use patterns when designing e-learning.

7 Principles – difficulty of mining patterns

Developing patterns is essentially an attempt at knowledge management. Pattern mining is an attempt to capture what is known about a solution and its implementation and distill it into a form that is suitable for others to access and read; in short, to abstract that knowledge.

Consequently, I think the 7 principles for knowledge management proposed by Dave Snowden apply directly to pattern mining. To illustrate the potential barriers here’s my quick summary of the connection between these 7 principles and pattern mining.

  1. Knowledge can only be volunteered; it cannot be conscripted.
    The first barrier in pattern mining is getting academics to share knowledge at all, i.e. getting them to volunteer. By nature, people don’t share complex knowledge unless they know and trust you. Even then, if they’re busy… This has been known about for a while.
  2. We only know what we know when we need to know it.
    Even if you get them to volunteer, chances are they won’t be able to give you everything you need to know. You’ll be asking them outside the context in which they designed or implemented the good practice you’re trying to abstract into a pattern.
  3. In the context of real need few people will withhold their knowledge.
    Pattern mining is almost certainly not going to be in a situation of real need. i.e. those asking aren’t going to need to apply the provided knowledge to solve an immediate problem. We’re talking about abstracting this knowledge into a form someone may need to use at some stage in the future.
  4. Everything is fragmented.
    Patterns may actually be a good match here, depending on the granularity of the pattern and the form used to express it. Patterns are generally fairly small documents.
  5. Tolerated failure imprints learning better than success.
    Patterns attempt to capture good practice, which runs counter to this adage. The idea of anti-patterns may be more useful here, though they are not without their own problems.
  6. The way we know things is not the way we report we know things.
    Even if you are given a very nice, structured explanation as part of pattern mining, chances are that’s not how the design decisions were made. This principle has interesting implications for how/if academics might harness patterns to design e-learning. If the patterns become “embedded” amongst academics’ “pattern matching” processes, it might just succeed. But that’s a big if.
  7. We always know more than we can say, and we will always say more than we can write down.
    The processes used to pattern mine would have to be well designed to get around this limitation.

Limitations of the technologists’ alliance

Technology adoption life-cycle - Moore's chasm

Given that pattern mining directly with coal-face academics is difficult for the above reasons, a common solution is to use the “Technologists’ Alliance” (Geoghegan, 1994): the collection of really keen and innovative academics and the associated learning designers and other folk who fit into the left-hand two categories of the technology adoption life cycle, i.e. those to the left of Moore’s chasm.

The problem with this is that the folk on the left of Moore’s chasm are very different to the folk on the right (the majority of academic staff). What the lefties think appropriate is not likely to match what the righties are interested in.

Geoghegan (1994) goes so far as to claim that the “alliance”, and the difference between it and the righties, has been the major negative influence on the adoption of instructional technology.

Patterns developed by the lefties are likely to be expressed in language not understood by the righties, and to solve problems that the righties aren’t interested in and probably weren’t even aware existed. That isn’t going to positively contribute to adoption.

People aren’t rational decision makers

The basic idea of gathering patterns is that coal face academics will be so attracted to the idea of design patterns as an easy and effective way to design their courses that they will actually use the resulting pattern language to design their courses. This ignores the way the human mind makes decisions.

People aren’t rational. Most academics are not going to follow a structured approach to the design of their courses. Most aren’t going to quickly adopt a radically different approach to learning and teaching. Not because they’re recalcitrant mongrels more interested in research (or doing nothing), but because they have the same biases and ways of thinking as the rest of us.

I’ve talked about some of the cognitive biases or limitations on how we think in previous posts including:

In this audio snippet (mp3) Dave Snowden argues that any assumption of rational, objective decision making that entails examining all available data and examining all possible alternate solutions is fighting against thousands of years of evolution.

Much of the above applies directly to learning and teaching, where the experience of most academics is that they aren’t valued or promoted on the value of their teaching. It’s their research that is of prime concern to the organisation, as long as they can demonstrate a modicum of acceptable teaching ability (i.e. there aren’t large numbers of complaints or other events out of the ordinary).

In this environment with these objectives, is it any surprise that they aren’t all that interested in spending vast amounts of time to overcome their cognitive biases and limitations to adopt radically different approaches to learning and teaching?

Design patterns anyone?

It’s just a theory

Gravity, just a theory

Remember what I said above, this is just a theory, a thought, a proposition. Your mileage may vary. One of these days, when I have the time and if I have the inclination I’d love to read some more and maybe do some research around this “theory”.

I have another feeling that some of the above have significant negative implications for much of the practice of e-learning and attempts to improve learning and teaching in general. In particular, other approaches that attempt to improve the design processes used by academics by coming up with new abstractions. For example, learning design and tools like LAMS. To some extent some of the above might partially explain why learning objects (in the formal sense) never took off.

Please, prove me wrong. Can you point to an institution of higher education where the vast majority of teaching staff have adopted an innovative approach to the design or implementation of learning? I’m talking at least 60/70%.

If I were setting the bar really high, I would ask for proof that they weren’t simply being seen to comply with the innovative approach, but that they were actively engaging with and embedding it into their everyday thinking about teaching.

What are the solutions?

Based on my current limited understanding and the prejudices I’ve formed during my PhD, I believe that what I currently understand about TPACK offers some promise. Once I read some more I’ll be more certain. There is a chance that it may suffer many of the same problems, but my initial impressions are positive.

References

Geoghegan, W. (1994). Whatever happened to instructional technology? 22nd Annual Conferences of the International Business Schools Computing Association, Baltimore, MD, IBM.

Jones, D. and S. Stewart (1999). “The case for patterns in online learning.” Proceedings of the WebNet’99 Conference, Honolulu, Hawaii, AACE, pp 592-597.

Jones, D., S. Stewart and L. Power (1999). “Patterns: using proven experience to develop online learning.” Proceedings of ASCILITE’99, Responding to Diversity, Brisbane, QUT, pp 155-162.

“An ISDT for e-learning” – Audio is now synchronized

On Friday the 20th of Feb I gave a talk at the ANU on my PhD. A previous post has some background and an overview of the presentation.

I recorded the presentation using my iPhone and the Happy Talk recorder application. I’ve finally got the audio up and synchronised with the Slideshare presentation.

Hopefully the presentation is embedded below, but I’ve had some problem embedding it in the blog (all the other Slideshare presentations have been OK).

Nope, the embedding doesn’t want to work. Bugger. Here’s a link to the presentation page on Slideshare.

Limitations of Slideshare

In this presentation, along with most of my current presentations, I use an adapted form of the “Lessig” method of presentation. A feature of this method is a large number of slides (in my case 129 slides for a 30 minute presentation) with some of the slides being used for very small time frames – some less than a second.

The Slideshare synchronisation tool appears to have a minimum time allowed for each slide – about 15 seconds. At least that is what I found with this presentation. I think perhaps the limitation is due to the interface, or possibly my inability to use it appropriately.

This limitation means that some of the slides in my talk are not exactly synchronised with the audio.

The Happy Talk Recorder

I’m very happy with it. The quality of the audio is surprisingly good, and I had little or no trouble using it. I think I will use it more.

An information systems design theory for e-learning

Yesterday I gave a presentation at the Australian National University on my PhD. I’m doing it through ANU and this 30 minute presentation is a standard requirement of study. The slides are up on slideshare (embedded below). I recorded the audio and will be trying to put that online later on today and make the slides into a slidecast.

The presentation

It appears embedding the presentation in this post isn’t working at the moment. The slides can be found here on slideshare. — seems the embedding is working now.

The description

It’s been a while since I worked directly on the PhD and creating this presentation was a way to become deeply familiar with the thesis again, in preparation for writing it up. So the presentation is structured in line with the thesis and provides a high level overview of the whole thing.

While the information systems design theory (ISDT) that is the main product of the thesis gets a mention, explaining the design theory is not the primary goal of the presentation. Such descriptions have been given in other papers (Jones and Gregor, 2002; Jones and Gregor, 2004). Instead the emphasis of the presentation is on the other components of the thesis that are in need of some extra work.

Most of the content of the presentation is focused on chapter 2 and the Ps Framework. In fact, much of it relates to the content of a paper I’ve proposed for later in the year.

Essentially the idea is that the practice of e-learning within universities has a definite orthodoxy (which LMS will we adopt). I suggest that for a number of reasons the understandings that underpin that orthodoxy are entirely inappropriate and this is why most university e-learning implementations are plagued by less than widespread use by academics, low quality learning by those that do use it and some concerns around return on investment.

There’s also some early work on the structure of chapter 3 – the research method. But still early days there.

References

Jones, D. and S. Gregor (2004). An information systems design theory for e-learning. Managing New Wave Information Systems: Enterprise, Government and Society, Proceedings of the 15th Australasian Conference on Information Systems, Hobart, Tasmania.

Jones, D. and S. Gregor (2006). The formulation of an Information Systems Design Theory for E-Learning. First International Conference on Design Science Research in Information Systems and Technology, Claremont, CA.

Reliability – an argument against using Web 2.0 services in learning? Probably not.

When you talk to anyone in an “organisational” position (e.g. IT or perhaps some leadership positions) within a university about using external “Web 2.0″ tools to support student learning, one of the first complaints raised is

How can we ensure its reliability, its availability? Do we have as much control as if we owned and managed the service on our servers? Will they be as reliable and available?

My immediate response has been, “Why would we want to limit them to such low levels of service?”. Of course, it’s a little tongue in cheek and, given my reputation in certain circles, not one destined to win friends and influence people. There is, however, an important point underpinning the snide, flippant comment.

Just how reliable and available are the services owned and operated by universities? My anecdotal feeling is that they are not that reliable or available.

What about web 2.0 tools?

Paul McNamara has a post titled “Social network sites vary greatly on availability, Pingdom finds” that points to a Social network downtime in 2008 PDF report from Pingdom. The report discusses uptime for 15 social network tools.

A quick summary of some of the comments from the report

  • Only 5 social networks managed an overall uptime of 99.9% or better: Facebook (99.92%), MySpace (99.94%), Classmates.com (99.95%), Xanga (99.95%) and Imeem (99.95%).
  • Twitter – 99.04% uptime
  • LinkedIn – 99.48% uptime
  • Friendster – 99.5% uptime
  • Reunion.com – 99.52% uptime
  • Bebo – 99.56% uptime
  • Hi5 – 99.75% uptime
  • Windows Live Spaces – 99.81% uptime
  • LiveJournal – 99.82% uptime
  • Last.fm – 99.86% uptime
  • Orkut – 99.87% uptime
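Those percentages sound uniformly impressive until you convert them into hours of downtime per year. A quick back-of-the-envelope sketch (using only the figures from the Pingdom report above):

```python
# Convert an uptime percentage into expected hours of downtime per year.
HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours(uptime_percent):
    """Annual downtime implied by a given uptime percentage."""
    return HOURS_PER_YEAR * (1 - uptime_percent / 100)

# Figures from the Pingdom report quoted above.
for name, uptime in [("Twitter", 99.04), ("Facebook", 99.92)]:
    print(f"{name}: {downtime_hours(uptime):.1f} hours of downtime/year")
```

Twitter’s 99.04% works out to roughly 84 hours of downtime over the year, against about 7 for Facebook, which puts the “99.9% or better” club in perspective.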

Is it then a problem?

The best you can draw from this is that if you’re using one of the “big” social network tools then you are probably not going to have too much of a problem. In fact, I’d tend to think you’re likely to have much more uptime than you would with a similar institutional system.

The social network tool is also going to provide you with a number of additional advantages over an institutionally owned and operated system. These include:

  • A much larger user population, which is very important for networking tools.
  • Longer hours of support.
    I know that my institution struggles to provide 10 or 12 x 5 support. Most big social network sites would do at least 10 or 12 x 7 and probably 24×7.
  • Better support
    Most institutional support folk are going to be stretched trying to maintain a broad array of different systems. Simply because of this spread, their knowledge is going to be weak in some areas. The support for a social network system is targeted at that system; they should know it inside and out. Plus, the larger user population is also going to help. Most of the help I’ve received using WordPress.com has come from users of the service, not the official support.
  • Better service
    The design and development resources of the social network tool are also targeted at that tool. They aim to be the best they can, their livelihood is dependent upon it in a way that university-based IT centres don’t have to worry about.

Down with facebook – why I’m going to minimise my use

I’ve had a Facebook account for about a year. I’ve never really used it beyond making contact with other folk. I’ve never uploaded any content, and tonight I’ve decided to make that permanent. I won’t shut the account down; I’ll keep it open so that friends from the past can find me.

However, I won’t recommend it to folk. Just the opposite, stay away. I also won’t be handing over any of my content.

Why?

Alan Levine has a post that closely resembles my own view. Some long-term reserve about Facebook and some recent additional motivation due to the change to the Facebook ToS.

Original qualms

My original qualms were due to not really seeing the point of an integrated, one stop shop like Facebook and being philosophically (i.e. probably unreasonably for most) opposed to integrated software that doesn’t support sharing.

I’m a small pieces loosely joined (it’s a PhD/Webfuse/UNIX command line thing) sort of guy. I use Twitter, have a blog, use photo sharing and slidecast and all on different services. Why would I use a single integrated system? One where I am stuck with whatever crap tools they’ve decided to provide.

What’s worse, it’s been claimed that Facebook is doing what Microsoft did, and we all hate Microsoft. At least I do.

This qualm applies to any of the similar integrated systems – e.g. MySpace etc.

The terms of service

The concerns about the recent change in the terms of service may not be as bad as some fear. However, for me it’s the straw that broke the camel’s back.

Of course, your mileage may vary.

I’m sticking with collection of Web 2.0 tools that I can pick and choose from and connect in ways that suit me. Small pieces loosely joined.

Update: Amanda French has a post that compares the Facebook ToS with those of other services. Interesting read.

On the plus side

Facebook is a pretty easy system to use and the ease of connection between folk, not to mention the sheer number and spread of people on it are all very positive observations in favour of Facebook.

I’m assuming it’s really easy for the less computer savvy to get into and the size of its user population is a big plus.

Some ideas for e-learning indicators – releasing half-baked ideas

The following is a quick mind dump of ideas that have arisen today about how you might make use of the usage data and content of course websites from course management systems (CMS) to find out interesting things about learning and teaching. i.e. Col is aiming to develop indicators that might be of use to the participants associated with e-learning – management, support staff, academics, students etc.

This post is an attempt to live up to some of the ideas of Jon Udell that I discussed in this post about getting half-baked ideas out there. Col and I have talked a bit today and I’ve also regained some prior thoughts that I’m putting down so we don’t forget them.

The major benefit of getting these half-baked ideas out there are your thoughts. What do you think? Want to share your half-baked ideas?

The fundamental problem

How do you identify/generate useful indicators that might be harnessed to act as weak signal detectors? How can we use all of this data about e-learning to generate something useful?

Disclaimer

It is fully understood that drawing simply upon usage data and other electronic data can never tell you the full story about a student’s learning experience or the quality of the teaching. At best it can indicate that something might be there, in almost all cases further investigation would be required to be certain.

For example, lots of discussion on a course discussion forum with lots of people responding to each other might be indicative of a good learning experience. It might be indicative of an out of control flame war.

However, knowing a little bit more about what’s going on and applying it sensibly will hopefully be of some use.

The following are propositions about what might be interesting indicators. These need to be tested, both quantitatively and qualitatively.

Content correlations?

It’s fairly widely accepted that most CMSes are generally used primarily as content repositories. Academics put lots of documents and other content into them for students to access. In some cases the ability of the CMS to serve some other purpose (e.g. to encourage discussion and collaboration) is significantly limited by the quality and variety of the tools provided for these services, and also by some of the fundamental assumptions built into the CMSes (e.g. you have to login to access the services).

If content is the primary use, is there anything useful that can be gained from it? What I can think of includes:

  • If there is little or no information, then this is bad.
    If the course site doesn’t contain anything, that’s probably a sign of someone who is not engaging with teaching. Courses with little or no content could be a negative indicator.
  • If the information is structured well, then it is good.
    Poor structure again may indicate someone less than knowledgeable. Almost all CMSes use a hierarchical structure for information. If all the content is located within 1 of 7 parts of the hierarchy, things may not be good.
  • If the content is heavily used, then this might imply usefulness.
    If students are using content heavily and that heavy use is consistent across most content this might indicate well designed content, which might be a good thing.
  • If the content is primarily the product of publishers, then this might be bad.
    A course that relies almost entirely for content from a textbook publisher might suggest an experience that is not customised to the local context. It might suggest an academic taking the easy way out. Which might indicate a less than positive outcome.
  • A large average # of hits on course content per student, might be positive indicator.
    If, on average, all of the students use the course content more, this may indicate more appropriate/useful material which might indicate a good learning experience.

Looking for particularly strong courses (see images below) might lead to the following being of interest

  • Percentage of total content per course.
    See images below. Essentially, courses with a greater percentage might be better.
  • Percentage of total requests.
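As a sketch of how a couple of these indicators might be computed, assuming per-course request counts and enrolment numbers can be extracted from the CMS logs (the course codes and numbers below are entirely made up):

```python
# Hypothetical per-course usage data extracted from CMS logs.
courses = {
    "COIT11133": {"requests": 15200, "students": 120},
    "MGMT20001": {"requests": 3400, "students": 210},
    "LAWS19033": {"requests": 980, "students": 45},
}

total_requests = sum(c["requests"] for c in courses.values())

# Two of the proposed indicators: average hits per student, and each
# course's share of all requests on the system.
for code, c in sorted(courses.items()):
    hits_per_student = c["requests"] / c["students"]
    share = 100 * c["requests"] / total_requests
    print(f"{code}: {hits_per_student:.1f} hits/student, "
          f"{share:.1f}% of all requests")
```

Courses sitting at either extreme of these numbers would be the “weak signals” worth a closer, qualitative look.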

Percentage of staff using the system

A simple one, the greater the percentage of the employed teaching staff using the system, the better.

An example

The following images illustrate how this was used in this presentation to compare and contrast usage of Webfuse after a period using the wrong development process and then after a period of using a better development process (remember the disclaimer above).

Results of “bad” process

Usage of an LMS - a measure (1 of 4)

Usage of an LMS - a measure (2 of 4)

Usage of an LMS - a measure (3 of 4)

Usage of an LMS - a measure (4 of 4)

Results of “good” process

Usage of an LMS - staff adoption (1 of 3)

Usage of an LMS - staff adoption (2 of 3)

Usage of an LMS - staff adoption (3 of 3)

Common sense (the things we take for granted) is the big obstacle for innovation

Wesley Fryer has a post summarising a talk given by Sir Ken Robinson. I’m pulling out a few relevant quotes/recollections for later use.

Update: One of the comments on Wesley Fryer’s post points to video of Sir Ken giving a similar talk in another venue.

Common sense and innovation

Common sense (the things we take for granted) is the big obstacle for innovation

This resonates for me because one of the main aims of my research in e-learning is to show that the “common sense” that surrounds current practice in e-learning is a big obstacle for innovation (and adoption, acceptance…). A perspective expanded in this post.

Excellence comes through customizing

The enemy of raising standards is conformity

Quality through consistency has been one of my bugbears for over 10 years. Talked about briefly here. I’m a fan of Oscar Wilde’s take on consistency

Consistency is the last refuge of the unimaginative.

Metaphors for higher education

a better metaphor for education is not manufacturing, but is agriculture

When you consider much of higher education is importing practices from manufacturing…

Frameworks and representation – tidy versus messy

I’m a fan of frameworks and taxonomies, also known as theories for understanding (Gregor, 2006). It’s the understanding part that I like. They provide, or at least good ones do, a leg up in understanding difficult concepts. As Mishra and Koehler (2006, p 1019) say

Having a framework goes beyond merely identifying problems with current approaches; it offers new ways of looking at and perceiving phenomena and offers information on which to base sound, pragmatic decision making.

As it happens, I’m currently doing a lot of work around one framework and its application and the following arises out of that work.

Two of my current most used frameworks are Dave Snowden’s Cynefin framework (Snowden and Boone, 2007) and Mishra and Koehler’s TPACK (2006). Representation is important to frameworks. The Cynefin framework, in particular, has a very specific representation with a very specific meaning and purpose.

TPACK framework

The TPACK crew have just released an updated representation of their framework (see the image to the left). I particularly like the addition of ‘contexts’ around the outside. The use of Venn diagrams is important; one of the contributions of TPACK is the overlaps.

Tidy versus messy

One of the things I don’t like about frameworks is that they have (for very good reasons) to be tidy. This certainly helps understanding, a key purpose of frameworks, but it can also give the false impression of tidiness, of simplicity, of a tame problem. My interest is currently in e-learning within universities, which I consider to be extremely messy. To me it is an example of a wicked problem.

A messy version of TPACK

Last week I ran a session on course analysis and design for some CQUniversity academic staff. I used TPACK as one of the major themes. However, at one point I really wanted to emphasise to the participants that none of our discussions should be taken to assume that this is a neat and simple problem. The image to the left is the one I used to reinforce this (they’d already seen the tidy version of TPACK).

In doing this, I sacrificed much of the representational value of TPACK to highlight the messiness involved.

The Ps Framework – Tidy versus messy

For about 3 years (this presentation is the first public evidence) I’ve been working on what is now known as the Ps Framework as part of my PhD.

The first representation of the Ps Framework, taken from the first presentation is included below. A photo of some frozen peas used as a “pun”. The arrows are included, but don’t really mean anything. Still very early days.

Version 1 of the Ps Framework

The next public iteration of a graphical representation of the Ps Framework was the following one for a more recent presentation (you can even watch the video of this one). In this, “Place” becomes the underlying context for all the other Ps, much like the addition of context in TPACK. The frozen peas disappear in favour of nice tidy circles (to some extent each one is meant to be a pea) and the arrows are still there. The arrows are meant to indicate that each of the Ps impacts upon the others in some unspecified way.

Version 2 of the Ps Framework

I had to prepare the above images to deadlines for presentations. I never liked them. Too tidy, and they appeared to indicate linear or simple connections between the individual Ps. I don’t believe that. The relationships between these Ps, when talking about e-learning implementation within universities, are messy, complex and unpredictable – at least beforehand.

So I had to come up with something else. For the last few months of last year Jocene, Nathaniel and I spent a lot of time discussing and arguing about how to represent the Ps Framework. The following is my current best effort – it’s the effort I’ll be using this week at ANU.

a messy version

I have a range of problems with this representation including:

  • It’s still a little too structured.
    i.e. People only overlaps with Past Experience, Purpose and Process. Those overlaps aren’t intentional. They aren’t meant to represent some specific connection. I’m not sure what the connections are; I have an inkling that each and every one is connected/overlaps with the others, but I am stuck with this current conceptualisation.
  • It’s too static.
    The relationships between these components are forever changing. Universities and the place they inhabit are continually changing, each of the other components is changing, and each change has some unpredictable impact on the other components. In my mind I see a dynamic version of this image where each component is seething and roiling and impacting upon the others.
  • It doesn’t capture perspective.
    Still not certain if this should be another P added to the framework or whether different instantiations of the Ps Framework represent different perspectives. I tend to prefer the latter, but then that leaves unsaid the important point about the perspectives of different groups being very diverse and that this is one of the fundamental problems with e-learning within universities.

Any suggestions?

References

Gregor, S. (2006). “The nature of theory in information systems.” MIS Quarterly 30(3): 611-642.

Mishra, P. and M. Koehler (2006). “Technological pedagogical content knowledge: A framework for teacher knowledge.” Teachers College Record 108(6): 1017-1054.

Snowden, D. and M. Boone (2007). “A leader’s framework for decision making.” Harvard Business Review 85(11): 69-76.

New ways of thinking – quote

Came across the following quote in Mishra and Koehler (2006); storing it here for future use.

The important thing in science is not so much to obtain new facts as to discover new ways of thinking about them.
Sir William Henry Bragg

References

Mishra, P. and M. Koehler (2006). “Technological pedagogical content knowledge: A framework for teacher knowledge.” Teachers College Record 108(6): 1017-1054.

RSS feeds into course management systems – why?

Last night I was looking for some information about recording audio for PowerPoint presentations in order to create a slidecast.

Aside: I like Slideshare and I like creating slidecasts. However, synchronising the audio with each slide is a pain, even using the interface provided by Slideshare. I’d much prefer being able to record the audio while giving the presentation and have it automatically synchronised. A while ago I thought we had a process using PowerPoint, but no. Bloody PowerPoint keeps cutting off the last few seconds of the audio for each slide. To get it to work you have to pause for 5 seconds at the end of each slide. If you have any insight into how to fix this, please let me know. I can’t even find any mention of this problem via Google.

While searching for this information I came across the TLT Group’s WordPress blog, because its collection of low threshold applications included some stuff on narrations. It also had an LTA on integrating RSS feeds into a course management system.

I sent this around to some folk at the PLEs@CQU project and some others. One of them responded with

I am not sure of the advantages of having RSS feeds go through the CMS. It is an easy thing for individuals to set up in their own, online personal learning environments.

It’s easy to do, not

Some of the other low threshold applications included on the TLT site include

Personally, I’d class these tasks as much simpler and more familiar to people than integrating RSS into a CMS.

The definition for an LTA used on the TLT blog is

A Low Threshold Application (LTA) is a teaching/learning application of information technology that is reliable, accessible, easy to learn, non-intimidating and (incrementally) inexpensive. Each LTA has observable positive consequences, and contributes to important long term changes in teaching and/or learning. “… the potential user (teacher or learner) perceives an LTA as NOT challenging, not intimidating, not requiring a lot of additional work or new thinking. LTAs … are also ‘low-threshold’ in the sense of having low INCREMENTAL costs for purchase, training, support, and maintenance.”

Even though they are low threshold, you would be surprised at the number of academics who do not know how to carry out these tasks. Computer literacy amongst academics remains fairly low. I also think the same applies for students. Most of these folk know how to do what they do regularly – email, IM etc. But there are few people who are comfortable with and able to explore applications and think of how they can harness the features of technology to improve education.

Especially if it requires a rethinking of how they teach.

Advantages

The uncertainty about the advantages of this approach is, potentially, one example of the difficulty people have in applying new features of technology to learning and teaching. Some possible examples follow, but they mostly come down to the following description

Incorporating a newsfeed into your WebCT course is a great way to get dynamic, changing content into the password protected environment of WebCT. Potential uses include creating an up to date ‘breaking information’ news source for your class.

which comes from this page, pointed to from the LTA RSS page.

The example used on that page is for the academic to maintain a course blog that they use to keep students aware of events. This is similar to what was done on the EDED11448 website for “latest discussion”.

The EDED11448 website also shows a more interesting example of this practice in the portfolio, weblog and resources sections. Each of these pages shows an example of aggregating individual RSS feeds from students into a single RSS feed, which is then included in the course site.
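As a rough illustration of that aggregation step, the following is a minimal sketch using Python’s standard library. The function name, course title and toy feed contents are my own invention, not anything from the EDED11448 site; a real implementation would also fetch each student’s feed over HTTP and copy across channel metadata such as links and dates.

```python
import xml.etree.ElementTree as ET

def merge_feeds(feed_xml_list, title="Aggregated course feed"):
    """Merge the <item> elements of several RSS 2.0 feeds into one feed."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = title
    for feed_xml in feed_xml_list:
        source = ET.fromstring(feed_xml)
        # RSS 2.0 keeps entries at /rss/channel/item
        for item in source.findall("./channel/item"):
            channel.append(item)
    return ET.tostring(rss, encoding="unicode")

# Two toy feeds standing in for real per-student feeds
student_a = """<rss version="2.0"><channel><title>A</title>
<item><title>A's first post</title></item></channel></rss>"""
student_b = """<rss version="2.0"><channel><title>B</title>
<item><title>B's reflection</title></item></channel></rss>"""

merged = merge_feeds([student_a, student_b])
```

The merged feed can then be rendered into the course site like any other newsfeed.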

As was pointed out above, it is easy enough for students and staff to make use of these RSS feeds in their own personal RSS readers; they don’t need to go to the course site. However, I can think of two reasons why bringing the feeds into the course site is still a good thing:

  1. It helps maintain an identity for the course.
    Like it or not, course websites remain an important contributor to the identity of a course offering and/or of the staff member coordinating a course. Many folk like to have a course website that can be seen as a product of the course, in part because it has become the accepted practice. Having everything distributed into everyone’s personal learning environment removes that sense of identity. There has been some work around learning networks suggesting that identity is one of the requirements of a learning network. For example, look at this paper and search for the section titled “requirements of a learning network”.
  2. It’s still not easy for everyone to use an RSS reader.
    As I pointed out in the previous section, RSS readers are still not commonplace. A lot of people don’t know what they are, and a lot of students have become indoctrinated into the practices associated with a course website. Having the RSS feed in the course website helps the transition. The advantage of this approach is that you can support both the course website and those with RSS readers.

    For example, the EDED11448 website looks like a fairly typical course website, this serves the traditional students. There is also an OPML feed that allows the entire site and all its contents and updates to be tracked via an RSS reader.

    Isn’t a key feature of personal learning environments allowing the students to make their own choices? They choose: course website, RSS reader, or both.
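To make the “both” option concrete: an OPML file is just an XML outline listing feed URLs, which a student can import into their reader in one step while other students keep using the website. A minimal sketch of generating one with Python’s standard library follows; the feed titles and URLs are hypothetical, not the real EDED11448 addresses.

```python
import xml.etree.ElementTree as ET

def build_opml(feeds, title="Course feeds"):
    """Build an OPML 1.0 outline listing a course's RSS feeds."""
    opml = ET.Element("opml", version="1.0")
    head = ET.SubElement(opml, "head")
    ET.SubElement(head, "title").text = title
    body = ET.SubElement(opml, "body")
    for text, url in feeds:
        # type="rss" plus xmlUrl is the convention RSS readers expect
        ET.SubElement(body, "outline", type="rss", text=text, xmlUrl=url)
    return ET.tostring(opml, encoding="unicode")

# Hypothetical feed list for a course site's main sections
opml_doc = build_opml([
    ("Portfolio", "http://example.edu/course/portfolio.rss"),
    ("Weblog", "http://example.edu/course/weblog.rss"),
    ("Resources", "http://example.edu/course/resources.rss"),
])
```

Publishing a file like this alongside the course website lets the RSS-reader students subscribe to everything at once without taking anything away from the rest.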