Minimalism, constructivism and use of Moodle or any other e-learning tool

Ben-Ari (1999) reports an experiment in which 10 members of a university department were asked to verbalise their understandings as they completed tasks in Word. The aim was to explore their conceptual understanding of Word and how it linked with their use of Word. Ben-Ari (1999) writes

Considering the high quality of the subjects, the most surprising result was the low level of use of this very sophisticated software tool.

Another surprising result was

the degree of anthropomorphic volition attributed to the software… “You see that’s what I mean, it behaves strangely.”

Given the widespread recognition of the limited use of institutional e-learning information systems (e.g. Moodle), I wonder what a similar experiment focused on teacher and student conceptual understanding would reveal?

I’m willing to bet there might be some significant similarities. Especially given my finding last year that branding the LMS can hurt “learning”.

I wonder if this offers some explanation about why a tool like Moodle – designed from a socio-constructivist perspective – is rarely used that way?

I wonder what, if anything, could be fruitfully done to confirm and fix this?

I wonder if there’s any correlation between this and the nature of the training provided to teachers and learners? Most of the training I’ve seen seems to rely on what Ben-Ari labels as minimalism

a method for designing manuals for software documentation and for using these manuals in training users of the software. A minimalist manual is short, stresses active learning and considers errors to be opportunities for learning rather than mistakes to be corrected… its insistence that conceptual material not be included in training, or at least that it be deferred until the student is more experienced.

References

Ben-Ari, M. (1999). Bricolage Forever! In Eleventh Workshop on the Psychology of Programming Interest Group (pp. 53–57). Leeds, UK. Retrieved from http://www.ppig.org/papers/11th-benari.pdf

Extending a little thought experiment

David Wiley has posed a little thought experiment that encourages reflection around levels of automation and “personalisation” within a University course. Judging by my Twitter stream it appears to have arisen out of a session or happening at the ELI conference. The experiment describes a particular teacher purpose, outlines four options for fulfilling that purpose, and offers a standard against which to consider those options.

It’s a thought experiment that connects to a practice of mine and the growing status quo around higher education (at least in Australia). It’s also generated some interesting responses.

I’d like to extend that experiment in order to

  1. Reflect on some of the practices I have engaged in.
  2. Highlight some limitations with the current practice of e-learning in Australian higher education.
  3. Point out a potential problem with one perceived future for e-learning (replace the teacher with technology).

First, it would be useful to read Wiley’s original (and short) thought experiment and the responses.

Types of extensions

There are a range of ways in which the original thought experiment could be extended or modified. I’ll be looking at the following variations

  1. Modify the teacher’s purpose. (The support extension)
    In Wiley’s experiment the teacher is seeking to acknowledge success (a score of 80% or higher on an exam). Does a change in purpose impact your thinking?
  2. Clarify the context. (The inappropriate massification extension)
    Does the nature and complexity of the educational context matter? Does it change your thoughts?
  3. Add or modify an option. (The personalisation extension)
    Wiley gives four options ranging on a scale from manual/bespoke/human to entirely automated. Some of the comments on Wiley’s post offer additional options that vary the relationship between what is automated and what is manual/human, generally increasing the complexity of the automation to increase its level of “personalisation”. At what level does automation of personalisation become a problem? Why?
  4. Question the standard
    The standard Wiley sets is that the students “receive a message ‘from their teacher’ and that students will interpret the messages as such”. In a world of increasingly digitally mediated experiences, does such a standard make sense?
  5. Change the standard each practice is being measured against. (The “connection not the message” extension).

The support extension

In Wiley’s experiment the purpose is stated as the faculty member deciding

that each time a student scores 80% or higher on an exam, she’ll send them an email congratulating them and encouraging them to keep up the good work

What if the purpose was to

Identify all those students who have not submitted an assignment by the due date and don’t already have an extension. Send each of those students an email asking if there’s a problem that she can help with.

This is the purpose for which I’ve recently developed and used an option similar to Wiley’s option #3.

Changing the purpose doesn’t appear to really change my thoughts about each of the options, if I use the standard from Wiley’s thought experiment

to ensure that students are actually receiving a message “from their teacher” and that students will interpret the messages as such.

With an option #3-like approach, it’s possible that students may not interpret the message as being “from their teacher”/personal. But that’s not sufficient reason for me to stop (more below).

But it does rule out an automation option suggested by @KateMfD

Email is bad enough, but faux email? Why not make them badges and be done?

A non-submission badge strikes me as problematic.

The inappropriate massification extension

Does the context within which the course is taught have any impact on your thinking?

The context in which I adopted option #3 was a course with 300+ students. About 160 of those students are online students. That is, they are never expected to attend a campus, and the geographic location of most means it would be impossible for them to do so. I’m directly responsible for about 220 of those students and responsible for the course overall. There are two other staff responsible for two different campus cohorts.

The course is 100% assignment based. All assignments are submitted via a version of the Moodle assignment submission activity that has been modified somewhat by my institution. For the assignment described in this post, only 193 of 318 enrolled students had submitted by the due date, leaving 125 non-submissions. Another 78 students had received extensions, meaning that 47 students hadn’t submitted and didn’t have an extension.

The tool being used to manage this process does not provide any method to identify the 47 who haven’t submitted AND don’t have extensions. Someone needs to manually step through the 125 students who haven’t submitted and exclude those who have extensions.
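For what it’s worth, the bricolage involved isn’t complex. Here’s a minimal sketch in Python of the kind of script I mean, assuming hypothetical CSV exports of enrolments, submissions and extensions; the file names and the student_id column are illustrative, not what the institutional system actually produces.

import csv

def load_ids(path, id_column="student_id"):
    # Return the set of student ids appearing in the given CSV column.
    with open(path, newline="") as f:
        return {row[id_column] for row in csv.DictReader(f)}

enrolled = load_ids("enrolled.csv")       # all 318 enrolled students
submitted = load_ids("submissions.csv")   # the 193 who submitted on time
extensions = load_ids("extensions.csv")   # the 78 with extensions

# Non-submitters (318 - 193 = 125) minus those with extensions
# leaves the 47 students to contact.
to_contact = (enrolled - submitted) - extensions
for student_id in sorted(to_contact):
    print(student_id)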

Having done that, the teacher is then expected to personally contact 47 different students? Many of whom the teacher will never meet face-to-face? Many of whom chose the online study option because of how well asynchronous learning fits their busy lives and part-time study? Even though attempting to personally contact these 47 students is going to consume a significant amount of time?

Another problem is that the system provided by the institution doesn’t offer any choice other than Wiley’s option #1 (send them each an email). Not only does the system NOT support the easy identification of non-submit, no-extension students, it provides no support for sending a bulk email to each student within that category (or any other category).

In order to choose Wiley’s other options a teacher would have to engage in a bit of bricolage, just as I did. Which tends not to happen. As an example, consider that my course is a 3rd-year course. The 300+ students in my course have been studying for at least three years in an online mode, many of them for longer because they are studying part-time. They will typically have studied around 16 courses before starting my course. With that in mind, here’s what one student wrote in response to my adopting option #3

Thank you for contacting me in regards to the submission. You’re the first staff member to ever do that so I appreciate this a lot.

Does a teaching context that has seen significant massification unaccompanied by appropriate changes in support for both students and teachers make any difference in your thoughts? If the manual options are seen to take time away from supporting other (or all) students? What if the inappropriate massification of higher education means that the teacher doesn’t (and can’t) know enough personal information about (most of the) individual students to craft a meaningful, personal email message?

The personalisation extension

Wiley’s options and some of the responses tend to vary based on the amount of personalisation, and how much of the personalisation is done by a human or is automated.

A human manually checking the gradebook and writing an individual email to each student seems to strike some as more appropriate (more human?). Manually sending an email chosen from a range of pre-written versions also may be ok. But beyond that, people appear to start to struggle.

What about the option suggested by James DiGioia

scripting the criterion matching step, which informs the teacher which students are above 80%, and pushes her to write bespoke messages for each matching student. This automates the tedious part of the task and lets the teacher do the emotional work of connecting with and supporting her students.
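A quick sketch of what DiGioia suggests might look like the following; the gradebook export, file name and column names are again assumptions for illustration. The point is that the automation stops at identification and the writing stays human.

import csv

THRESHOLD = 80  # the criterion from Wiley's experiment

# Automate only the tedious criterion-matching step; no message is generated.
with open("gradebook.csv", newline="") as f:
    matches = [row for row in csv.DictReader(f)
               if float(row["exam_score"]) >= THRESHOLD]

# The teacher sees who qualifies and writes each congratulatory email herself.
for student in matches:
    print(f"{student['name']} <{student['email']}>: {student['exam_score']}%")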

Is it the type of work that is automated that is important?

What about the apparent holy grail of many: automating the teacher out of the learning experience? Are we fearful that technology will replace teachers? Can technology replace teachers?

Or is it the case that technology can and should

replace many of the routine administrative tasks typically handled by teachers, like taking attendance, entering marks into a grading book

Which brings us back to the question: where do you draw the line?

Question the standard

Wiley’s standard is

our faculty member wants to ensure that students are actually receiving a message “from their teacher” and that students will interpret the messages as such.

The assumption being that there is significant value to the student in the teacher sending and being seen to send a message written specifically for the student. A value evident in some of the responses to Wiley’s post.

In this “digital era” does such a standard/value continue to make sense? @KateMfD suggests that in some cases it may not, but in Wiley’s original case it does

But an email of encouragement strikes me as a different kind of thing. It’s intended either to be a personal message, or to masquerade as one. Political campaigning, marketing, all the discourses that structure our lives, and that we justly dismiss as inauthentic, reach for us with the mimicry of personal communication. “Dear Kate” doesn’t make it so.

Is the “is there a problem? can I help?” message that I use in my context one that can be automated? After all, the message exists precisely because I don’t know enough about the student’s reason for not submitting to personalise it.

What if the massification of higher education means that the teacher doesn’t (and can’t) know enough about (most of) the students to craft a personal message? Alright to automate?

I have some anecdotal evidence to support this. I have been using options at or around Wiley’s option #3 for years. An “email merge” facility was a feature we added to a system I designed in the early 2000s. It was one of the most used features, including by teachers who were otherwise using a different system entirely. The facility mirrored a “mail merge”: you could insert macros in a message that would be replaced with information personal to each individual.
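The mechanics are simple enough to sketch. The following (hypothetical template, addresses and mail server; not the original system’s code) shows the macro substitution at the heart of an email merge.

import smtplib
from email.message import EmailMessage
from string import Template

# A message template with macros ($first_name, $assignment) to be replaced
# with information personal to each individual. All details are illustrative.
TEMPLATE = Template(
    "Hi $first_name,\n\n"
    "I noticed you haven't submitted assignment $assignment and don't have "
    "an extension. Is there a problem I can help with?\n"
)

students = [
    {"first_name": "Alex", "email": "alex@example.com", "assignment": "1"},
    {"first_name": "Sam", "email": "sam@example.com", "assignment": "1"},
]

with smtplib.SMTP("localhost") as smtp:  # assumes a local mail server
    for student in students:
        msg = EmailMessage()
        msg["From"] = "teacher@example.com"  # illustrative address
        msg["To"] = student["email"]
        msg["Subject"] = "Is there a problem I can help with?"
        msg.set_content(TEMPLATE.substitute(student))
        smtp.send_message(msg)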

One example of how I used it was a simple “how’s it going” message that I would send out at key points of the semester. One response I received from a student (which I’m sure I’ve saved somewhere, but can’t find) was along the lines of “I know this is being sent out as a global email, but it still provides a sense of interaction”.

Suggesting that at least for that student there was still value in the message, even though they knew I didn’t hand craft it.

The “connection not the message” extension

Which brings me to my last point. The standard for Wiley’s thought experiment is based on the value of the message being and being seen to be a personal message to the student. That’s not the standard or the value that I see for my practices.

For what it’s worth I think that the “7 Principles of Good Practice for Undergraduate Education” from Chickering and Gamson (1987) are an ok framework for thinking about learning and teaching. The first of their 7 principles is

  1. Encourages Contact Between Students and Faculty
    Frequent student-faculty contact in and out of classes is the most important factor in student motivation and involvement. Faculty concern helps students get through rough times and keep on working.

The standard I use is whether or not the practices I use encourage contact between my students and me. Do they create a connection?

Whether or not the students see the message I sent as being personally written for them is not important. It’s about whether or not it encourages them to respond and helps a connection form between us.

In the case of the non-submitted, no-extension students I’m hoping they’ll respond, explain the reason they haven’t submitted, and provide an opportunity for me to learn a little more about the problems they are having.

While I haven’t done the analysis, anecdotally I know that each time I send out this email I get responses from multiple students. Most, but not all, respond.

For me, this standard is more important than the standard in Wiley’s thought experiment. It’s also a standard by which, my personal experience suggests, moving further up Wiley’s options is okay.

It’s also a standard which argues against the complete automation of the personalisation process. The reasons why students haven’t submitted their assignment, and the interventions that may be needed and appropriate, tend to represent the full richness and variety of the human condition. The type of richness and variety that an automated system can’t (currently?) handle well.


Does branding the LMS hurt learning?

The LMS used by my institution is Moodle, but the institution has “branded” it as the “Study Desk”. Meaning students and teachers talk about finding X on the “Study Desk”; they don’t talk about finding X on Moodle. The following suggests that this branding of the LMS may actually hurt learning.

Update: Via Twitter @georgekroner mentioned his post that has some stats on how institutions are branding their LMS.

Google the name (information literacy?)

The biggest course I teach is aimed at helping pre-service teachers develop knowledge and skills around using digital technology to enhance and transform their students’ learning. Early on in the course a primary goal is to help the students develop the skill/literacy to solve their own digital technology problems. The idea is that we can’t train them on all the technologies they might come across (give them fish), we can only help them learn new technologies and solve their own problems (teach them how to fish).

A key part of that process is the “Tech support cheat sheet” from XKCD. A cheat sheet that summarises what “computer experts” tend to do. One of the key steps is

Google the name of the program plus a few words related to what you want to do. Follow any instructions.

How do you “Google the name of the program” if the institution has branded the LMS?

Does branding the LMS mean that students and teachers don’t know “the name of the program”?

Does this prevent them from following the tech cheat sheet?

What impact does this have on their learning?

A brief investigation

Early in the year I noticed that a few students were having problems with “Google the name”, so I set an optional activity that asked them to create a “technology record”, i.e. a record of the names of all the technology they are using. The idea is that having a record of the technology names can help in solving problems. I included in that “technology record” a requirement that they specify the name of the software that provides the “Study Desk”.

There were 40 (out of ~300) responses including

  • 10 that identified uconnect, the institutional portal;
  • 8 that weren’t sure;
  • 8 that didn’t provide an answer for the Study Desk question;
  • 4 that identified their web browser;
  • 4 that firmly identified Moodle;
  • 3 that identified Moodle but weren’t sure;
  • 2 that answered with the URL – http://usqstudydesk.usq.edu.au.

Only 7 of the 40 respondents (under 20%) were able to identify Moodle.

These are 3rd-year students. Almost all will have completed at least 16 courses using Moodle. These are also students completing an optional activity, perhaps indicating a slightly greater motivation to do well/learn. A quick check reveals that most of the students have a GPA above 5.

They still don’t know the name of the LMS.

I wonder how many teaching staff know the name of the LMS?

Does this hurt learning?

Perhaps if everything works with the LMS then this doesn’t create any problem. But if the students wish to engage with social and information networks beyond the institution, they don’t know the common name for the object they want to talk about. That has to hurt learning.

I imagine that there are librarians and others who can point to research showing how not knowing the correct search term hurts search.

What do you think? Does branding the LMS hurt learning?

Contradictions in adjectives: You can’t be consistent and optimal

One current challenge is attempting to engage productively with institutional strategic/operational planning. The big challenge in doing so is balancing the perceived importance of institutional-level concerns (governance etc.) with those of an individual teacher.

As part of this challenge I was reading a document summarising the aims of a rather large institutional ICT project around learning and teaching. Yesterday I tweeted part of the strategies from that project (it starts with “Ensure the development of”).

As my tweet suggests I see some contradictions in the adjectives.

Here’s a story from the dim dark past to illustrate how it’s essentially impossible to have an online student experience that is both consistent and optimal.

You shall not use single quotes!

Back in the mid-1990s CQU was a fairly traditional second generation distance education provider. As such it had a style guide for print-based materials (almost the only sort) that were distributed to students. In large part the aim of the style guide was to provide a consistent learning experience for students. One such element of the style guide was ‘You shall not use single quotes’. “Double quotes” were the only acceptable option.

So, that’s consistent.

Less than optimal

As it happens, in the mid-1990s I was the tutor for the course 85343, Machine Intelligence. The practical application of the concepts in this course was done in the Prolog programming language. Here’s a brief section of Prolog code taken from here. Can you see the problem this is going to cause in terms of consistency?

move(1,X,Y,_) :-
    write('Move top disk from '),
    write(X),
    write(' to '),
    write(Y),
    nl.

That’s write: Prolog code makes use of single quotes. The distance education study material for 85343 included sections of Prolog code. Do you know what the central distance education organisation did?

Obviously, because ‘You shall not use single quotes’, they automatically converted all of the single quotes into double quotes, printed the materials, and sent them out to students. In Prolog the two aren’t interchangeable: a single-quoted term is an atom, while a double-quoted one is (in standard Prolog) a list of character codes, so the converted code no longer printed what it was meant to.

I don’t know whether the coordinator of the course got to proof the study material before it went out. But he was the Head of School and I’m willing to bet that if he did, he didn’t even think to check the style of quotes used in the Prolog code.

Consistent can’t be optimal

The lesson (for me at least) is that you can’t be consistent across all the courses in a university while at the same time claiming to provide an optimal learning experience for students.

This quote from Dede (2008) picks up on why this is a problem (or you can listen to the man himself)

Educational research strongly suggests that individual learning is as diverse and as complex as bonding, or certainly as eating. Yet theories of learning and philosophies about how to use ICT for instruction tend to treat learning like sleeping, as a simple activity relatively invariant across people, subject areas, and educational objectives. Current, widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant. (p. 58)

And it’s not new

Here’s a quote from Jones (1996) – yep I had a bug in my bonnet about this almost 20 years ago and here I am again

With traditional on-campus teaching academics generally have complete control over what they teach and how it is presented. In CQU’s distance education model the subject matter’s presentation is controlled by DDCE. This results in significant tension between the desire to operate standardised systems for production and distribution of courseware and the desire for course designers to be creative and imaginative (Mark, 1990).

‘It’s like deja vu all over again’

There’s a paper or two here.

References

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43–62). New York: Springer.

Mark, M. (1990). The differentiation of institutional structures. In M. Moore (Ed.), Contemporary Issues in American Distance Education (pp. 30–43).