ICTs for learning design – the first week

So, it’s the end of O-Week and a four day residential school for Graduate Diploma of Learning and Teaching students – a post-graduate, pre-service teaching qualification. About 100 of us had a lot thrown at us over those four days; now it’s time to get into some work. The following is an attempt to get some idea of, and reflect upon, the work I need to do for the “e-learning” course. Others will probably follow.

This course actually requires us to use a blog for reflections, so there will be more of that as well. This is a rough and ready summary of what I was doing.

Overwhelmed and Moodle

It’s the nature of this week and starting a new program that people will feel overwhelmed. Especially when there are some students who aren’t gung-ho technology users. Especially when one of the courses is around the use of Information and Communication Technologies (ICTs) in learning and teaching.

Given my background, I’m not one of those students, but I’m still feeling somewhat overwhelmed. In particular, I have this feeling that there are activities or tasks that I need to complete that I’m not aware of. I feel I’m missing something. That’s part of the reason for this post, to try and develop a list of the tasks I need to complete over the next week or so.

Beyond just the nature of starting a new program, I’m also wondering how much of this is connected with Moodle and, in particular, the design of course sites within Moodle. More on this as I generate the list.

The list

  • Add entry to “My profile” wiki (students placed in groups).
    Had some troubles with the Moodle wiki, got there in the end. Currently only two members of the group have entries. I’ve already experienced the problem of differentiated messaging: the other group member asked a question about the Moodle wiki using the Moodle messaging system, and I didn’t see that message until days later, long after seeing the person at the residential school.

    It is interesting that the course isn’t using some sort of aggregator to bring student blogs together; instead we’re being asked to add blog addresses to a wiki page.

  • Readings
    • Expert jigsaw;
      Contact one of the profile wiki members and choose a reading (about learning theories) each and use a PMI to present findings on the wiki.
    • Read the “study guide” – 3 or 4 pages
      • Do the F&S learning style test and reflect in a blog post.
      • Do a MI test and compare result with other students in a discussion forum.
      • Blog about 21st century students and Prensky.
      • Blog about some of the challenges arising from connectivism.
  • Create blog and add it to the blog list.
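The aggregator idea mentioned in the first task could be little more than a script that pulls each student blog’s RSS feed and merges the items into one list. Here is a minimal sketch using only the Python standard library – the feed contents, blog names and URLs are made up for illustration, and a real aggregator would fetch the feeds over HTTP from the addresses listed on the wiki page.

```python
import xml.etree.ElementTree as ET

# Two hypothetical student feeds, inlined so the sketch is self-contained.
FEEDS = [
    """<rss version="2.0"><channel><title>Student A</title>
    <item><title>Week 1 reflection</title><link>http://example.com/a/1</link></item>
    </channel></rss>""",
    """<rss version="2.0"><channel><title>Student B</title>
    <item><title>Learning styles post</title><link>http://example.com/b/1</link></item>
    </channel></rss>""",
]

def aggregate(feeds):
    """Merge the <item> entries of several RSS feeds into one list of
    (blog, post title, link) tuples."""
    items = []
    for xml_text in feeds:
        channel = ET.fromstring(xml_text).find("channel")
        blog = channel.findtext("title")
        for item in channel.findall("item"):
            items.append((blog, item.findtext("title"), item.findtext("link")))
    return items

for blog, title, link in aggregate(FEEDS):
    print(f"{blog}: {title} ({link})")
```

Even something this crude would save everyone scanning a wiki page of links by hand, which is presumably why tools like BIM exist.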


The “study guide”

Most of the work for this course seems to be described in a “study guide”. A single web page within the Moodle course with some blurb, a reading/activity, and some more blurb. So here’s a summary and some reflection.

Are learners different?

From the guide

Today’s students are fundamentally different from the learners of 40 years ago.

Mmm, not sure I agree. Sure, there is probably a greater percentage/diversity of learners at schools, especially secondary schools. But are students fundamentally different? That’s a pretty strong statement.

There is no difference

The suggestion is that, whether face-to-face or e-learning, knowing the learner and their needs is a key element.

This highlights one of the misconceptions around ICTs/e-learning. People seem to think that it is different. That they haven’t been teaching and learning with technology all along. Most of the face-to-face activities we engaged with at the residential school were mediated by some form of technology. Examples included: paper handouts, sheets of paper cut up and distributed in a bag, whiteboards, rooms and furniture configured in different ways etc. I do wonder when (or perhaps if) we’ll ever get over this tendency to treat ICTs as different?

Learning styles

And so, we’re into learning styles, multiple intelligences etc as the way to know students. I have a vague disquiet about these approaches. My concerns have two sources. First, the tendency for these approaches to pigeonhole people into a small number of categories rather than celebrate the true diversity of people. Second, the attention these approaches take away from more complex and nuanced approaches to understanding diversity. i.e. we apply the test, identify the categories and feel satisfied that we know the students. Perhaps those are really just one source of disquiet intermixed.

Okay, so now I’m doing the survey and finding myself saying “neither a nor b”, “it depends”, “sometimes a bit of a and a bit of b”. Another concern I have about these instruments.

Seems I have to post another entry on this question.

Multiple intelligences

So, we’re meant to take this test based on Gardner’s multiple intelligences. Here are my results.

My "multiple intelligences"

As with most of these tests, it meets my basic expectations, but there continues to be an uncertainty at the back of my mind about this work. I need to read and reflect a bit more. Trying to balance the desire to dig deeper with the desire to complete the study is interesting.

The following quote from Howard Gardner is interesting in this context

I have drafted generic letters that answer the most frequent questions – for example, “Is there a test for multiple intelligences?” (Answer, “Not one that I endorse”)

The associated PDF of FAQs is interesting reading.

The following quote from those FAQs about learning styles and intelligences is interesting (to me)

Educators are prone to collapse the terms intelligence and style. For informal matters, that is no great sin. However, style and intelligence are really fundamentally different psychological constructs. Styles refer to the customary way in which an individual approaches a range of materials—for example, a playful or a planful style. Intelligence refers to the computational power of a mental system: for example, a person whose linguistic intelligence is strong is able readily to compute information that involves language. Speak of styles, speak of intelligences, but don’t conflate the two if you can help it.

And while it’s somewhat off the current topic, I need to save this comment on creativity

Finally, as stressed by my colleague Mihaly Csikszentmihalyi, creativity should not be viewed simply as a characteristic of an individual. Rather, creativity emerges from the interaction of three entities: 1) the individual, with his given talents, personality, and motivation; 2) the domain—the discipline or craft in which the individual is working; 3) the field—the set of individuals and social institutions that render judgments about quality and originality

The resonance between this quote and my thoughts about creativity within education and the negative impact of institutional policies and systems – especially LMSs and how they are supported – is quite strong.

And this on how intelligences are not set in stone

The extent to which intelligences develop is a joint product of biological (genetic potential), the emphasis a culture places on an activity, the excellence of the instruction, and the motivation of the individual. Any individual can strengthen an intelligence if she is well-motivated; if her ambient culture values that intelligence; if there are human and artifactual resources (i.e. texts, computer programs) on which she can draw.

And at the value of the MI tests

As you may know, many other people have devised MI tests. The best known is probably the MIDAS test, developed by Branton Shearer. These tests typically give a rough-and-ready sense of people’s interests and preferences. They suffer from two deficiencies: 1) They don’t actually measure strengths—you would need performance tasks to determine how musically intelligent, or spatially intelligent, or interpersonally intelligent a person is; 2) The tests assume that the person has good intrapersonal intelligence—that is, he or she knows himself well. But many of us think that we know ourselves better than we really do. I doubt that anyone would score herself or himself low in the personal intelligences, but some of us must have lesser personal intelligence than others.

I have nothing against a person using the MIDAS or other measurements to learn about the idea of multiple intelligences. Indeed, people—both young and old—often find it interesting and illuminating to think about their own and/or other person’s intellectual profile. And yet, I must stress, we can only learn about a person’s intelligences if we actually measure how well they perform on tasks that presumably draw on specific intelligences.

My take away from this is that there remain significant questions about the value of using the results of a multiple intelligence test as a direct guide to a learning design, essentially because it’s highly questionable that such tests validly measure what they seek to measure. A true measure, perhaps, is only possible through observing participants performing associated tasks.

There is, however, still value in these sorts of tests as a way of getting learners to think about different intelligences within themselves and their fellow learners.

Another interesting quote is

That is, we reject the focus on a single scholastic intelligence that is measured by a certain kind of short answer test

I find it somewhat ironic that responding to “a single scholastic intelligence that is measured by…short answer test” was one of the reasons for the development of the theory of MIs, and that we are now in a situation where your MIs can be measured by an MCQ.

In summary

Briefly, my theory can reinforce the idea that individuals have many talents that can be of use to society; that a single measure (like a high stake test) is inappropriate for determining graduation, access to college, etc.; and that important materials can be taught in many ways, thereby activating a range of intelligences.

21st century learners

Ahh, Prensky, immigrants and related perspectives.

First up, the Engage me or enrage me article. The suggestion is that today’s technological environment means that all students have technologies of various types that are working hard to engage them. That’s a very broad generalisation. Then there are claims like “Many of today’s third-graders have multiple e-mail addresses” – no evidence is offered to support this, and frankly my experience doesn’t support it either.

Then there’s an argument about how the top 3 selling games of the era offer vast and engaging experiences that kids aren’t getting at school – this is showing how backward school is. But this 2008 US survey suggests that the average age of gamers is 35. Only 25% of gamers are under 18; 75% are over. So it’s perhaps not the students purchasing those games.

Okay, now there is a point about students being able to handle systems or problems much more complex than what they are being given at school. There’s the example of a 4yo gamer being bored to tears by “learning games”. Mmm, final blog post on this section.

A brief overview of learning theory

So, the standard behaviourism, cognitivism and constructivism summary (this seems a useful overview). Interesting that connectivism is included.


There was a lot of reading this week, but even that only barely touched the surface of what is out there. I found it interesting, but without my prior experience this would have been a very daunting week.

Thoughts and applications of connectivism

So, post #3 for week 1 of the ICTs for Learning Design, this time on connectivism. There are two parts to the question; the second is somewhat easier and asks for examples of how connectivism could be used in a classroom/learning context. The first is somewhat more difficult.

Position on connectivism

The question suggests that connectivism is contested by some, and that some of this may arise from the way it challenges broadly accepted, established concepts and practices. The first problem with completing this task is that the guide we’re working from doesn’t really point to the folk who challenge/disagree with connectivism. This can be addressed by a quick search. That reveals a growing body of literature and challenges that can be quite complex and time-consuming to engage with effectively. Even with a history of a few years of engaging (not always deeply) with connectivism and its proponents, I don’t feel comfortable enough to express a firm opinion. I’m beginning to wonder if I am overthinking this, so let’s go quick and dirty for the purposes of assessment.

At the root of these learning theories/paradigms is a description of learning. I guess one approach to expressing a perspective is to examine the validity of the description of learning provided by each theory/paradigm. On this approach, I think connectivism provides a model of learning that is perhaps closer to reality than the others. The black box of behaviourism doesn’t say anything about learning. The cognitivist approach is based on an information processing perspective – a learner-as-computer perspective that I don’t think captures how the brain actually works. The constructivist approach seems to wave its hands and say “learners construct knowledge”, each differently. It strikes me that there is some similarity with the network-based perspective of connectivism. i.e. students are constructing their knowledge by building and pruning networks.

That’s really broad-brush. Perhaps I should take the time to read a bit more (e.g. Kop and Hill, 2008)

One of the points in the question is

It is unsettling to be challenged about existing perceptions of “knowing”, in particular, the lack of purpose in asking our students to KNOW and be able to RECALL what they know in assessment

I’m not sure that connectivism suggests there would be a lack of purpose in asking students to know and recall. Connectivism, at least according to Siemens, suggests that the capacity to know (learn more) is more critical than what is known. One interpretation of this is that it doesn’t mean that what is known is unimportant, it’s just not as important as the capacity to know. In the end, demonstration of the capacity to know would seem to require an ability to demonstrate what is known.

Perhaps the view I expressed above connects more closely with Kop and Hill’s (2008) discussion of epistemological frameworks for learning. The final paragraph of Kop and Hill is interesting in this context

A paradigm shift, indeed, may be occurring in educational theory, and a new epistemology may be emerging, but it does not seem that connectivism’s contributions to the new paradigm warrant it being treated as a separate learning theory in and of its own right. Connectivism, however, continues to play an important role in the development and emergence of new pedagogies, where control is shifting from the tutor to an increasingly more autonomous learner.

Using this theory in the classroom

Currently I have only a theoretical/anecdotal understanding of what a high school classroom context would be like. I don’t fully appreciate the constraints of such a context, so the following will be limited by that lack of understanding.

I’ll draw on Downes’ comments on teaching and learning within connectivism here

to teach is to model and demonstrate, to learn is to practice and reflect

and use these within the context of teaching Information Technology – particularly programming. This is a good opportunity as it allows me to make concrete some vague ideas I’ve had for a while.

One aspect of IT is learning how to program. Often programming is taught through “pretend” authentic projects such as creating a reservation system for a restaurant. Another limitation of these projects is that the student starts from scratch and does the programming by themselves. The trouble is that increasingly most software development occurs within broader frameworks, e.g. developing a plugin for Moodle, for WordPress etc.

I’m interested in exploring how programming could be taught by encouraging students to do exactly this. Pick a modular, open source application – like Moodle or WordPress – and over time develop or modify a plugin. In terms of teaching it would be my task to model and demonstrate the practices and knowledge required to do this (e.g. like my work on BIM) and the students would be required to engage in the existing developer networks around these open source projects. In terms of encouraging reflection and making connections between the students and those external, the students would be expected to maintain blogs on their practice, much like I do with BIM development.

There’s much more to this, but that’s the basics.


Kop, R., and A. Hill. 2008. Connectivism: Learning theory of the future or vestige of the past? The International Review of Research in Open and Distance Learning 9(3). http://www.irrodl.org/index.php/irrodl/article/viewArticle/523/1103

Prensky, immigrants and old problems in new bottles

Okay, so blog post #2 arising from week 1 of an ICTs for Learning Design course I’m doing. This post is intended to address the question of 21st century learners. In part this connects back to a sentence from the start of this week’s study guide with which I had problems. That sentence was

Today’s students are fundamentally different from the learners of 40 years ago.

That struck me as a big claim, and to some extent it still does.

Today’s learners

In terms of today’s learners, I tend to think that they are not fundamentally different. Some (but not all) have certainly had some very significantly different experiences through the widespread availability of ICTs. I don’t, however, agree with Prensky that this experience has fundamentally changed those learners. Not the least because a significant percentage of students don’t have significant ready access to ICTs, so they can’t have changed fundamentally because of technology. Then there are questions about the type of engagement the remaining students have with technology and how much such engagement can fundamentally change them.

I currently think that today’s learners might, at a fundamental level, still have significant commonality with “yesterday’s” learners. At a less fundamental level they certainly do have some significant differences, but I am not yet convinced that those differences are fundamental. I’ll pick up on this in the next section.

Engage or enrage me

Prensky’s argument in this article goes something like this

  • Students are bombarded in their everyday life with technologies (games, phones, apps etc) that work really hard to engage them.
  • Schools just aren’t working that hard to engage them.
  • Schools need “damned good curricula gameplay” to address this problem.

I think this is an example of where there isn’t a fundamental difference between today’s and yesterday’s learners, it’s just that today’s learners have some different experiences that may well be emphasising a long-term problem. Prensky starts off his paper by dividing students into three groups:

  1. Those that are truly self-motivated.
  2. Those who go through the motions.
  3. Those who tune us out.

The last group are described as

These students are convinced that school is totally devoid of interest and totally irrelevant to their life.

I don’t see anything new in these groupings. If I go back 25+ years to my high school experiences I could quite easily place all my fellow students into those three groups. One of the major differences between then and now is that significantly more of the third group are now expected to complete years 11 and 12 of school. 25+ years ago most of the third group would have left at the end of year 10. If I go back to my parents’ days at school then the third group, such as my father, would have been leaving school much earlier.

Based on this, you might suggest the problem with “engage or enrage me” has more to do with changing societal expectations around schooling than with technologies. Society now requires schools to look after the third group of students (and based on some recent discussion it really is a case of “babysit”) for much longer than before, without any fundamental change to schooling.

From another perspective, the importance of learning that engages students’ interest, and its positive effect on outcomes, is fairly well known/accepted. I don’t believe that the importance of engaging students has changed simply because the most recent cohort has significantly expanded experience with technology. i.e. the change in technology is not a fundamental change.

It may be, however, a change that is important. If, and it remains a big if, students are increasingly familiar and comfortable with technologies, then increasing student engagement in learning may require greater and more appropriate use of those technologies.

Another problem I have with Prensky’s argument is that it assumes that the fundamental change required of schools is “damned good curricula gameplay” and not a change in some of the other fundamental characteristics of schools.

Roger Schank (and many others) believes that the entire system is broken. For example, the list of subjects in the US owes more to the thoughts of a 19th century Harvard University professor than to the need to engage 21st century students.

Ira Socol goes to some lengths to show that the fundamental design of the education system (again US focused) is intended to help it fail.

So, I think there is some substance to the “engage or enrage” argument, but the substance isn’t the technology available to today’s students. What is important is the need to engage students and the difficulty of achieving such engagement, especially within the constraints of the existing education system. Technology might help, but then the constraints of the existing system might prevent it from happening.

Learning styles, teaching and digital pedagogy

The following post is for the course EDED20491 and is in response to the following activity

Access the Felder and Solomon website and take the online questionnaire

  1. What is your learning style? What sorts of learning experiences would suit you best with your learning style?
  2. In a traditional classroom of 25 students, how would you support the range of learning styles each lesson?
  3. With your current knowledge of ICT, how could your design and digital pedagogy support your learners better?
  4. What sorts of profiling questions would you be asking about your learners to ensure you cater for everyone’s preferences?
  5. How does ICT support differences in learning styles?
  6. Create an entry in your blog (when it is created), and respond to these questions, and any you wish to pose, in the blog.

What is your learning style

The following image – click on it to see it in a larger form – summarises my results for the questionnaire.

Learning Styles

What that represents is that I am (supposedly)

  • A heavily reflective, rather than active learner.
    Reflective blogging, like this and related posts, is an example of a learning experience likely to suit this learning style. As suggested by Felder and Soloman

    Don’t simply read or memorize the material; stop periodically to review what you have read and to think of possible questions or applications. You might find it helpful to write short summaries of readings or class notes in your own words.

    Along similar lines, I’ve found having to teach something an effective way to learn as the act of teaching has required me to reflect upon what I know, how it is organised, how it can be explained, and what activities best encourage learning in others.

  • A heavily intuitive, rather than sensing learner.
    The explosion in information available via the Internet helps me here. I’m able to use Google and other means to find other interpretations, perspectives or theories on a topic. Establishing practices such as listening to a range of podcasts, following various edubloggers etc also opens me up to alternate perspectives that enable and encourage connections. The capability to search digital information also helps in making connections to prior writings, e.g. searching my blog for posts that I believe connect in some way to a current topic. Both of which link with Felder and Soloman’s advice

    Ask your instructor for interpretations or theories that link the facts, or try to find the connections yourself

  • Slightly more a visual than verbal learner.
    On the verbal side the suggestion is “Write summaries or outlines of course material in your own words”. This seems to link nicely to this particular exercise. I’m essentially paraphrasing the material by Felder and Soloman and linking it to my individual context. For the visual side, the screen shot (though it’s a very text-based visual representation) of the survey results is a step in the right direction. Taking it one step further, I think the generation of slide presentations that have a highly visual component is one strategy that helps. For example, this presentation I did a couple of years ago.
  • Very slightly more global than sequential.
    Another blog post that I’m developing is essentially an attempt to develop a personal overview of what I’m meant to be doing this week for this course. Again, a practice that links with some advice from Felder and Soloman

    Before you begin to study the first section of a chapter in a text, skim through the entire chapter to get an overview

    The post started with me skimming over the readings and activities for this week and generating a summary. When I was happy with that overview, that’s when I started working on this more detailed post in response to one of the activities.
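As I understand the Felder and Soloman questionnaire (the Index of Learning Styles), each of the four dimensions above is scored by tallying the “a” versus “b” answers to its items. The sketch below shows that tallying logic; the 44-item length, the 11-items-per-dimension split and the assumption that the items cycle through the dimensions in order are my reading of the instrument, and the answer data is made up.

```python
# The four ILS dimensions; I assume item 1 maps to active/reflective,
# item 2 to sensing/intuitive, and so on, cycling every four items.
DIMENSIONS = [
    ("active", "reflective"),
    ("sensing", "intuitive"),
    ("visual", "verbal"),
    ("sequential", "global"),
]

def score_ils(answers):
    """answers: 44 'a'/'b' strings. Returns, per dimension, the dominant
    pole and its strength, where strength = |#a - #b| (odd, 1..11)."""
    results = {}
    for i, (a_label, b_label) in enumerate(DIMENSIONS):
        dim_answers = answers[i::4]          # every 4th item: 11 in total
        a_count = dim_answers.count("a")
        b_count = len(dim_answers) - a_count
        label = a_label if a_count > b_count else b_label
        results[f"{a_label}/{b_label}"] = (label, abs(a_count - b_count))
    return results

# A made-up response pattern, purely for illustration.
answers = ["a", "b", "a", "b"] * 11
print(score_ils(answers))
```

Seeing the scoring laid bare like this reinforces the concern above: the whole profile rests on nothing more than counting forced a/b choices, including all those items where my honest answer was “it depends”.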

How would you support a range of learning styles

I’m big on context. Depending on the context of the classroom of 25 students, the answers to this question would vary hugely, at least in the specifics. If I were to attempt something abstract, the two-pronged approach would probably be

  1. Where possible adopt an appropriate array of different activities designed to support different learning styles.
    i.e. rather than simply always (or never) use classroom discussions, use them sometimes and supplement them with strategies that better suit more reflective learners. Or adopt modifications of classroom discussions that incorporate some aspect of reflection or at least time and space for reflective learners to engage.
  2. Capacity building.
    Students are not always going to find themselves in situations where their preferences are catered for. It would seem important to develop within them the capacity to deal with situations like this either through developing their skills on the other side and/or identifying strategies that help them deal with those situations.

The last point may well be essential for their school education as the context of schools may well mean that few of their learning experiences will provide exactly what their preferences require.

Design of digital pedagogy

To a large extent, I think my answer to this question is embodied in the previous two responses. First, I don’t believe there is any fundamental difference between non-digital and digital technology. Digital technology provides just another set of tools/technologies to help with learning and teaching. As it happens, I was listening to an interview of John Seely Brown this morning. In that interview Seely Brown (or it might have been the interviewer, Steve Hargadon) said something that summed up the point I’m trying to make here. The quote went something along the lines of

With digital technology it’s not about the technology, but about the effects it can produce.

So, I would use the same approach to design a digital pedagogy as I would a non-digital pedagogy. I would be aiming to fulfil the two-pronged approach from the previous question. To fulfil those two prongs I would draw on the advice and insights briefly discussed on this page and more deeply in other literature.

Which is essentially what I did in the first question, where I matched the advice from Felder and Soloman against the practices/effects I was already achieving with this blog and other digital technologies. i.e. I was looking for digital technologies that provide the effects suggested by Felder and Soloman’s advice.

Profiling questions

Well, I probably wouldn’t be using the Felder and Soloman learning styles questions, as in the FAQ Felder explains that these questions have only been validated for college-age students. The questions might retain some usefulness for younger students; however, the validity is somewhat questionable. In addition, there is the whole problem of self-reporting. i.e. I’m certain that a percentage of school students might see this as an opportunity to have some fun by answering exactly the opposite of their preference. This raises the question of whether or not there are any similar instruments that have been validated for use with school children. Not to mention validated for use with the cohort of children that I’m likely to teach.

That said, I would probably lean towards drawing on versions of these questions asked in the flow of classroom activity (rather than in a formal survey) combined with classroom observation. If subsequent research revealed validated instruments for school children, I might rely upon them.

I’m guessing that the answer to this question will have to be refined further prior to the completion of assignment 1 for the course.

How does ICT support differences in learning styles?

Again, I’d make the distinction that it is not the technology that supports the differences, but the capabilities for effects that it makes available. (This has me thinking about whether this is too fine/academic a distinction to make; for now I’ll stick with it.) Different types of technology provide effects that were previously not possible. e.g. the blackboard provided a way for written or drawn information to be shared with a whole classroom, while a wiki allows multimedia information to be shared and modified by any group of people from anywhere with Internet access.

So, ICTs support differences in learning styles by providing effects/capabilities that fulfil some of the advice provided by Felder and Soloman. Especially when such capabilities were previously unavailable, too expensive etc. e.g. a personal blog provides a place for a reflective learner to work on and share their reflections in a way that is concrete and visible to others. Something that isn’t easily possible for a paper-based diary.

From a different perspective, ICTs support differences in learning styles because of their capability to manipulate digital information. For example, it’s possible that a verbal learner may not get much out of this blog post, but there are services that will automatically convert blog posts into audio. Hypermedia makes it easier to support both global and sequential learners in their approach to texts. Hypermedia and services like YouTube make it much easier to support visual learners. e.g. 10 years ago (even 5 years ago) the readings for this course would not have been sprinkled with half a dozen videos.

From yet another perspective, ICTs support differences in learning styles to the extent that the learners, teachers, technologies, education systems and policies involved in learning allow ICTs to support differences in learning styles.

How many pages of a course profile are necessary?

This week brings the first formal tasks of my new phase as a teacher in training. We, the students enrolled in the Graduate Diploma in Learning and Teaching, are required to attend four days of a residential school. These four days coincide with Orientation Week and we’ll be attending bits of standard O-Week, but mostly focusing on the res school.

Apart from pen and paper, the only items we’re advised to bring are printed versions of the course profiles. Now there was a time when the university provided print copies of the course profiles to all students. But now, mostly to save costs, these are distributed electronically and it is up to individual students to do the printing. Regardless of the course, the course profile follows a fairly standard template, which has me wondering: how many pages of a course profile do I really need to print?

After all, if it’s okay for the institution to save costs, it must be ok for me. Here’s what I found.

Course 1

3 pages of what appears to be reasonably important, but common, information can go. Surely it is covered in one of the myriad of guides and primers that we’ve been encouraged to read. In addition, having worked at the institution for 20 years in learning and teaching, I’m familiar with most of it.

Each assignment includes a rubric for marking. It appears to be a direct copy of what might be used by the marker, even down to the dotted lines for comments, the lecturer’s name and a list of grades to be circled. Can’t see the need for that, especially when for 2 of the assignments that information is the only thing on the page (it’s the last bit of information before the next assignment, which obviously they want to start on a new page).

16 pages down to 11.

Course 2

This is the minimal course. What I think of as the traditional course: the kind that arises when a long-term member of staff is teaching a topic he/she knows well. Shall be interesting to see the differences evolve as the term progresses.

For starters, the original course profile was only 9 pages long. The same 3 pages first removed from course #1’s profile were removed here.

9 pages down to 6.

Course 3

10 pages down to 7 – same three pages.

Course 4

Again, only the same 3 pages: 13 down to 10.


So, overall not that bad: 14 pages saved. With some work on spacing in the layout of the profile more pages could be saved. Of the 3 pages I removed, there’s an argument for removing them, but there’s also an argument that some students have run into problems by not knowing that information.

A proposed link between academic involvement and student evaluation response rate

In the following I ruminate on a possible correlation between academic participation in a course and response rates on end-of-term student evaluations. There are two points:

  1. It should be a relatively easy correlation to test.
  2. It would offer support for ways in which student evaluation response rates can be increased.

Understanding low response rates for student evaluation is important, at least for some institutions, because they are expending significant resources in an attempt to increase response rates. Personally, I think they are generally barking up the wrong tree and that this sort of work might reveal a better tree.

Background – LMS, analytics, correlation and student participation

One aspect of the Indicators project has been using the data that is available from information systems such as an LMS to test correlations.

For example, in an early publication one of the correlations we explored was the link between student participation within the LMS – measured with clicks on the course site or, better yet, with posts to or reading of the course discussion forum – and student grade. Some work by Dawson (sorry, don’t have the reference) had established the pattern that the greater the student’s LMS participation, the greater the student’s result. i.e. HD or A students used the LMS more.

We found this pattern with distance education students, as shown in the following graph of the connection between participation in a course discussion forum and student grades. (Click on the graph to see it bigger.)

Average posts & replies for FLEX students

HD (high distinction – the top grade) distance education students were, on average, posting 13 replies per course discussion forum compared to 11, 9, 6, 5 and 3 for the lower grades.
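The sort of grouping behind that graph can be sketched in a few lines. This is not the Indicators project code, just an illustration with invented records; only the HD figure (13 replies on average) comes from the text.

```python
# Illustrative sketch: average forum posts+replies per grade band,
# from hypothetical (grade, posts_and_replies) records pulled from LMS logs.
from collections import defaultdict
from statistics import mean

records = [("HD", 13), ("HD", 13), ("D", 11), ("C", 9), ("P", 6), ("F", 3)]

by_grade = defaultdict(list)
for grade, posts in records:
    by_grade[grade].append(posts)

# Average participation for each grade band
averages = {grade: mean(posts) for grade, posts in by_grade.items()}
for grade in ("HD", "D", "C", "P", "F"):
    print(grade, averages[grade])
```

The real analysis worked per course offering and per student, but the shape of the computation is the same: group by final grade, then average the participation measure.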

But we didn’t find it with on-campus, international students, as shown in the following graph using the same measure.

Average posts & replies for AIC students

The HD AIC students were actually making, on average, fewer replies than all other students except those that failed.

It should be remembered that we don’t know why this is happening, we’ve simply established an interesting pattern.

Another interesting pattern we established was that increased staff participation in the LMS course site also influences the correlation between student grade and student LMS participation.

Students in courses with high staff participation (staff clicked on the course site more than 3000 times in a term) gave this pattern.

Average student posts/replies on discussion forums for high staff participation courses

Students in courses with super-low staff participation (staff clicked on the course site less than 100 times in a term) gave this pattern.

Average student posts/replies on discussion forums for super low staff participation courses

Student evaluation response rates

For various reasons it has become important to some university folk to increase response rates on student evaluations. These are the anonymous, generally end-of-term, evaluation forms that ask the student about their course experience.

One institution I’m aware of is reporting figures that suggest an average response rate across all courses of about 16%. 3 of the courses had 100% response rates, but they all had 6 or fewer students. The highest response rate for a decent-sized course (n=99) was 68%.

The most visible strategy being used to increase response rates is lots of visible encouragement from the Vice-Chancellor and a 825×305 “banner” image on the LMS home page when students login.

If they want to increase response rates, I think they are barking up the wrong tree.

The right tree?

Based on my experience, observations and the above results, I believe there’s a strong correlation between evaluation response rates and the sense of responsiveness/participation students get from the academic staff teaching their courses. After all, one (but not the only) reason why I think it’s easy to get 100% response rates for a course with 6 students is that with 6 students a teacher can be very responsive.

It’s hard to get a computable indicator of that sense of responsiveness from a face-to-face classroom situation, but the above patterns from the Indicators project suggest that a proxy figure – though I do admit that it is a somewhat flawed proxy – can be generated from the LMS.

I think it would be interesting to map the response rate on student evaluations with the level of staff participation in the LMS course site. If I’m right, then the best response rates should come from those courses with high staff participation.

There will be confounding situations, but in general I think the pattern would hold. In fact, the confounding situations would be interesting to investigate. What is different about those situations? What lessons might an institution learn? It might be interesting to survey some or all of these staff in order to find out what is the thinking behind their practice, is being responsive high up the list?
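The proposed test is mechanically simple. The sketch below computes a Pearson correlation between per-course staff clicks and evaluation response rates; all the numbers are invented purely to show the shape of the calculation, not results.

```python
# Hypothetical sketch of the proposed test: correlate staff LMS clicks
# with student evaluation response rates, per course. Data is made up.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

staff_clicks   = [120, 800, 1500, 3200, 5000]    # per course, per term (invented)
response_rates = [0.10, 0.14, 0.22, 0.35, 0.40]  # fraction of enrolled students

r = pearson(staff_clicks, response_rates)
print(round(r, 2))
```

If the hunch is right, real data should show a clearly positive r; the confounding courses would be the ones sitting well off that trend line.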

Of course, it’s still the wrong tree

In the end, however, there’s no getting away from the problem that such student evaluations are broken and not really worth it.

At the very least, the results of the above might just encourage a few staff to think about how they can improve student perceptions of their level of responsiveness and participation.

bim2: manage marking services – Part II

Time to continue the implementation of support services for manage marking started in the last post. Services left to be implemented are

  • Releasing marked posts.
  • Registering a blog.

Releasing marked posts

Standard order for this stuff

  1. Controller.
  2. Model.
    In this case, the model will be making changes to the database and storing the outcome.
  3. View.


First, parse the parameters. There are two: marker and question. The possible combinations are

  • question and marker empty == release all marked posts for this activity.
  • question set only == release all marked posts for that question.
  • marker set only == release all marked posts for that marker.
  • marker and question set == release all marked posts for that question and that marker.
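The four combinations above amount to a small dispatch on which parameters are set. Here’s a language-neutral sketch of that logic (Python here, though bim2 itself is PHP/Moodle); the parameter names follow the post and the actual filtering is left abstract.

```python
def release_scope(question=None, marker=None):
    """Describe which marked posts a release request covers,
    based on which of the two parameters are set."""
    if question is None and marker is None:
        return "all marked posts for this activity"
    if question is not None and marker is None:
        return "all marked posts for that question"
    if question is None and marker is not None:
        return "all marked posts for that marker"
    return "all marked posts for that question and that marker"
```

The model would then translate each scope into the matching database update.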

So, have the model processing and updating the database. Also getting a list of changes to report in the view.

The view will need to be able to translate student uid and question id into question title and user name. So need to include students and questions in the model.


The view is a simple one: report success and summarise the changes made. Yep, all done. Need to do a bit more checking.

Misc bugs

On manage marking there are a couple of minor bugs to fix

  • Empty table and email button showing up when there are no unregistered students. FIXED
  • The “release” link is appearing within a question heading when there are no questions to release.
    This one is probably going to be a bit more difficult to fix. Actually, the header code is correct in what it is doing. Time to check out how the rows are calculating whether there is anything marked. The display of marker details appears to be working; it’s the calculation of the stats that seems to be letting things down.

    Ahh, all that and it’s not a code problem, but with the configuration of the users and course members. Mostly.

This is the type of bug that is going to have to be looked for in testing.

Registering a blog

This allows the coordinator to register a blog for unregistered students. It is essentially the same as the process for a student to register, just a slightly different interface. Hopefully, there can be some significant code reuse. Let’s look at how I did it for the student.

The student interface itself is a bit too complex. Registration is handled by some overly complex code within a method of one of the views. i.e. it’s not that standalone. It is complicated because it also includes processing of the form.

Am going to have to put this into a standalone object that can be easily called elsewhere like this

$form = new process_register_form( $this->factory );

There are going to have to be some differences between using it with a student and with a coordinator, including

  • heading – this (including breadcrumbs) will have to change.
  • return URL – successful processing generates a “return to ??”, ?? will change based on where it’s calling from.

In terms of the heading, it’s even more complex as the student and coordinator should see radically different headings when this is called. In the end, the code looks like this

$form = new process_register_form( $this->controller );
$form->set_header( $this, "view_header", "registeration" );
$form->set_return( "?param=regOK&id=".$this->model->factory->cm->id);

The set_header method is used to tell $form which function to call to display the header that is required. set_return provides the URL params which let $form know where to redirect output when registration succeeds.

So, time to add this in the appropriate place for manage_marking. There is some difference here in that it’s not being called from a view, but instead from a controller. Another wrinkle to consider. Let’s create a view just for register and stick the call in there; this lets the normal view machinery handle it.

Ahh, no, another problem: the URL for the “register your blog” form has to be updated. Ahh, that’s the first parameter in the form. Done.

Next problem: when the student is registering their blog, the userid is based on their login, but that can’t happen when the coordinator is doing it; it has to be based on the student parameter. So, will need to pass in the userid for the student with the blog to be registered as a parameter. This means I need to

  • Generate the form with an extra hidden parameter for coordinator. done
  • Same for student – how do I get the id of the browser user? ($this->factory->userid) done
  • Modify processing of the form to use the hidden parameter. done

What’s next?

So, manage marking is essentially done. Time to move onto “Your students”, some new code there, also a fair bit of work to do, but should get quicker.