Adding more student information to a Moodle course

moreStudentDetails.user.js is a Greasemonkey script I’ve written to provide more details about a student when I’m using Moodle. It was originally intended to help when responding to student queries in a course I teach that regularly has 300+ pre-service teachers from a range of backgrounds and locations. The current version produces something like the following image (click on it to see a larger version).

MAV-based more user details

The script adds a link titled [details] to the Moodle page whenever it finds a link to a user profile (see above). When you click on that link a small dialog box pops up with some more student details. For my purposes, I’m particularly interested in what type of pre-service teacher they are and their mode/campus.

This script uses much the same technology as the gradebook fix mentioned in this post and @damoclarky’s Moodle Activity Viewer. The work on these scripts is part of an on-going project to identify some theories/principles that can be used to enhance institutional e-learning (see this paper for early development of these ideas).

The rest of this post is divided into two parts

  1. Recent developments – documents thinking about how to transform this simple script into something that provides more useful and specific process analytics (see this post for a definition of process analytics). Also documents early attempts to share this script via github.
  2. Initial development – a development diary of early steps in developing this script.

Recent developments

Sharing via github

Have just created the BAD repository on github. It currently hosts two scripts

  1. gradebookFix.user.js – briefly mentioned in this post, this script modifies the Peoplesoft gradebook to highlight special cases.
  2. moreStudentDetails.user.js – the script described here. Only the client script, not the server at the moment.

Much of this code is still quite ugly and probably not at all useful to others (though gradebookFix.user.js should be useful to any course examiner from USQ).

Creating the repository at the moment is more about having the scripts under source control, stored off my laptop, and starting to play with the process and mechanisms of sharing these types of scripts.

The name “BAD” is based on the BAD (Bricolage/Affordances/Distributed) mindset formulated in the paper.

Extending it to include process analytics

Lockyer et al (2013) define process analytics as analytics that “provide direct insight into learner information processing and application” (p. 1448), i.e. analysis and representations that provide some additional detail about how the learning is progressing. I’m keen to add more of this to the “more student details” script. The following explains what I’d like to add and some reflection on how this might best be done with the technologies available.

As it happens, @Edu_K has just commented on a post and described nicely what I’m trying to achieve

I like your idea of in-built LA functions into the existing tools. This can help their use to adjust teaching “on-the-go” in response to needs of the particular cohort – which is one of the most important abilities of a good educator

The plan

I’m looking to add two additional groups of information about students specific to this course to the dialog box

  1. Activity completion; and,
    Each week of the course has a learning path of set activities. Students get some marks for completing these activities and Moodle’s activity completion functionality is used to track their work. Having a usable summary of each student’s activity completion available in this dialog would help understand where they are up to in the learning path.
  2. Blog post activity.
    The course requires the students to create and post to their own external blog. The BIM Moodle module is used to mirror blog posts and help award marks to students based on # of posts, word count etc. Adding a summary of the student’s blog posts, related statistics and perhaps other analytics (e.g. emotion etc) could also be useful.
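As a sketch of the kind of blog post summary that might be added to the dialog — the data shape here is my assumption, not BIM’s actual data structures:

```javascript
// Hypothetical summary of a student's mirrored blog posts. "posts" is
// assumed to be an array of plain-text post bodies; BIM's real storage
// will differ.
function blogSummary(posts) {
  var totalWords = posts.reduce(function (sum, post) {
    return sum + post.split(/\s+/).filter(Boolean).length;
  }, 0);
  return {
    posts: posts.length,
    totalWords: totalWords,
    avgWords: posts.length ? Math.round(totalWords / posts.length) : 0
  };
}
```

Sentiment/emotion analytics would layer on top of something like this, but that’s further down the track.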

The mockup

This will probably involve some fairly advanced jQuery work – something I’m new to – hence the need to start with a mockup. Once the design is sort of working I’ll post this and a subsequent post will pick up the coding.

The initial mockup (ugly colour scheme and all) can be seen in the following image. Or you can actually play with the mockup here.


What the mockup above shows is a visual representation of the activities the student has completed (or not). Some explanation:

  • There are 3 modules.
    Each module in the above is coloured from green (most/all complete) through yellow (a fair bit complete) down to red (not much complete). Initially you can only see the summary of the module completion. But you can drill down.
  • Each module has 3 or 4 weeks.
    The above shows Module 1 expanded to its three weeks. Each of the weeks are also colour coded based on the weekly activities that have been completed.
  • Each week has a number of activities.
    The above shows Week 2 expanded to show its 5 activities. 2 are completed and are in green. 3 aren’t. The completed activities include the date/time when they were completed and also the week of semester in which that date occurs. The real version would have those activity names as links to the actual activity.
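The colour coding described above might boil down to something like the following; the thresholds are my guesses and would need tuning against real completion data:

```javascript
// Map a completion fraction (0..1) for a module or week to the
// traffic-light colour used in the mockup. Thresholds are assumptions.
function completionColour(fraction) {
  if (fraction >= 0.8) return "green";   // most/all complete
  if (fraction >= 0.4) return "yellow";  // a fair bit complete
  return "red";                          // not much complete
}
```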

Initial development

The following is a description of a long-gestating approach to solving a problem I have when teaching, i.e. knowing a bit more about my students when I’m replying to a query on a discussion forum in a Moodle course. It describes a modification to the Moodle Activity Viewer (MAV) to solve this problem.

What I did

  1. Fork a new version of the MAV code.
  2. CLIENT: Get MAV running only on my course.
  3. Figure out how it will all work
  4. CLIENT: Get the data to send to the server (user ids) on the current page.
  5. CLIENT: Send that information to the server.
  6. CLIENT: Figure out the popup.
  7. SERVER: Return a collection of HTML to the client.
  8. CLIENT: Add a popup to the moodle page for each user link.

    Yep Damo and Rolley, going with the kludge first up.

Add a new link for people to click on and use that

This does it.

 $(element).after('<a id="user_'+userID+'" class="makealink"><small>&nbsp;more&nbsp;</small></a>' );

But the problem is that there can be multiple such links (e.g. one around the image on a forum). May not want to add a link on all. Plus there are some other issues with passing values. Here’s what works now.

$(element).after('<a data="'+userID+'" class="showMoreUserDetailsLink"><small>&nbsp;[details]&nbsp;</small></a>' );
$(".showMoreUserDetailsLink").click( function() {
   var id = $(this).attr("data");
   // ... show the details dialog for this user
});

OUTSTANDING: Still have to limit the situations where this is added.

Get some data from the server

  1. Create an empty server that returns nothing.
    $html = "<h3>Getting data from the server</h3>" ;
    header('Content-Type: application/json');
    echo json_encode($html) ;
    error_log('html='.json_encode($html)) ;
  2. Update the client to query the server.

    Copied an existing method. Passes the user id and displays information back from the server. Pared back the message length and it’s working well.

  3. Create the database tables for users for the MAV server.

    Main issue here is that I’m dealing with two separate Moodle databases with different user ids. Two steps required here on my local Moodle database:

    1. Create a table to map between ids.

      Need to extract list of user ids from the institution, match with local and stick in database.

      The enrolled users report and some regular expression magic in vi etc gets me a list of ID and name in a text file.

      Rather than create a new table, the kludge is to add a column “usqMoodleId” to the mdl_user table on the local server.

    2. Create the table(s) required to store the additional information.
  4. Have the server extract and return real data.

* Modify the server to return specific data for each user
* Map the ids from study desk to my database
* Only add the [details] link for specific links and only for links associated with this course?
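The client side of this exchange reduces to bundling the user ids found on the page into a request for the MAV server. A sketch of roughly what that looks like — the endpoint is the getUserProfile.php API, but the payload field names beyond mavVersion are my assumptions:

```javascript
// Sketch of bundling the user ids found on the current page into a
// request object for the MAV server. In the script itself this would be
// handed to GM_xmlhttpRequest; field names other than mavVersion are
// assumptions.
function buildUserProfileRequest(serverURL, mavVersion, userIds) {
  return {
    method: "POST",
    url: serverURL + "/getUserProfile.php",
    data: JSON.stringify({ mavVersion: mavVersion, users: userIds })
  };
}
```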

Fork a new version

This is a kludge. Not making this pretty so a new directory and start from scratch.

Only run on my course

If the method balmi.getCoursePageLink returns NULL, MAV doesn’t work.

I’ve modified this to return NULL if the Moodle course ID for the page doesn’t match the ID for my current course. Obviously this would need to be more general in the future.

How will it work

Basic plan is

  • Update the initial Moodle page: detect any links to user/view.php and bind a hover event on that link to a function.
  • That function will pass the user id to the MAV server, get some HTML back and generate a dialog box.
    • Get the dialog box working

      First test is to modify the links and get the dialog box appearing without any interaction with the server.

      Get the data to send to the server

      The idea is that MAV will extract the Moodle user ids that it finds in the current page. If there aren’t any, then there’s nothing to do. If there are, it has to bundle those up and send them to the MAV server to get additional data about the users. To do that I have to recognise the user profile links and then extract the URLs.

      User profile links are typically of the following form

      moodle URL/user/view.php?id=userid&course=courseid

      That should be fairly easy to recognise and the existing balmi.getMoodleLinks should serve as a template.

      Change the name to getMoodleUserLinks and fiddle with the regular expressions to focus on the user links. That’s working.

      Some stuffing around to extract the user id, thanks to my limited knowledge of Javascript.
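The pattern matching involved is roughly the following — a sketch of the sort of thing getMoodleUserLinks does, not its actual code:

```javascript
// Recognise a Moodle user profile link of the form
//   <moodle URL>/user/view.php?id=<userid>&course=<courseid>
// and pull out the ids. Returns null for non-matching links. The
// optional "amp;" handles HTML-escaped ampersands in page source.
function extractUserId(href) {
  var match = href.match(/\/user\/view\.php\?id=(\d+)&(?:amp;)?course=(\d+)/);
  return match ? { userId: match[1], courseId: match[2] } : null;
}
```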

      As it stands with just these changes, the client is sending the following JSON to the server

      {  "mavVersion":"0.5.4",

      In a proper development I’d actually change all this, but I need to get this working. Actually I will change it slightly.


      Modify the request so that it’s going to the right server.

      New server (API) getUserProfile.php

      Figure out the popup

      This is the bit that will stretch my non-existent jQuery skills. How to modify the Moodle page to add the dialog/popup I want for each bit of user data passed back from the server?

      Apparently, I’ll be using the jQuery dialog widget and apparently the getStudentAccess method is a useful template. Of course that threw me a bit until I realised I should use it as a model to modify the requestData method from the original MAV that I’m kludging.
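As a sketch of how each per-user popup might be wired up — the id/class names are placeholders, the .dialog() call in the comment is the standard jQuery UI widget:

```javascript
// Build the container the jQuery UI dialog widget would be invoked on,
// using the HTML returned by the server. Id and class names here are
// placeholders, not the script's real markup.
function dialogContainer(userId, serverHTML) {
  return '<div id="userDetails_' + userId + '" class="userDetailsDialog">' +
         serverHTML + '</div>';
}

// In the script itself, roughly:
//   $("body").append(dialogContainer(id, html));
//   $("#userDetails_" + id).dialog({ autoOpen: true, width: 400 });
```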

Established versus Affordances: part of the reason institutional e-learning is like teenage sex

The following is an attempt to expand and connect some thoughts from a soon to be presented ASCILITE’2014 paper titled “Breaking BAD to bridge the reality/rhetoric chasm” (this link will eventually have a range of additional resources). The “expand” part of this post is me trying to experiment with some approaches to explaining what we’re trying to get at, hopefully with the aim of being convincing. The “connect” part of this post aims to connect with some of the discussion about the LMS that has gone on recently.

Is elearning like teenage sex?

Some context

The paper draws on our experience in an attempt to identify some reasons why institutional e-learning is like teenage sex (you’ll soon see why I’m reluctant to just say bad). In doing so, we’re proposing two mindsets we’ve seen/used that underpin institutional e-learning. They are

  1. the SET mindset – Strategic, Established, and Tree-like.
    The most common (and the cause of all the problems, we think).

  2. the BAD mindset – Bricolage, Affordances, and Distribution.
    Which better describes the mindset we use in our own practice.

Even though these two mindsets are incommensurable, we think that institutional e-learning is too SET in its thinking and needs to break BAD a little (maybe a lot) more often. This table summarises and compares the two mindsets.

This post – as the title suggests – is going to look at the 2nd of the three components of these mindsets, i.e. how Information and Communication Technologies (ICT, mostly software) are perceived.

The SET mindset sees ICT as Established. With this mindset, ICTs are a hard technology and cannot be changed. Instead, people and their practices must be modified to fit the fixed functionality of the technology.

The BAD mindset sees ICT as having Affordances for change. ICT is a soft technology that can and should be modified to meet the needs of its users, their context, and what they would like to achieve.

The paper/presentation seeks to illustrate these mindsets and their components by drawing on challenges that @damoclarky (co-author) and I have faced and how we’ve worked around them. The point is that these examples are indicative of broader problems. These won’t be the only examples of such problems, there will be many more. The paper uses two other much larger examples from our experience.

The point is that the SET mindset underpinning most institutional e-learning makes it difficult, if not impossible, to be aware of let alone respond to these problems.

A practical example – Peoplesoft gradebook

My current institution (and my prior institution) had the misfortune to choose to implement the Peoplesoft ERP at the turn of the century. Many millions of dollars later we are lumbered with using this conglomeration of tools for many tasks, including the processing of final results and grades at the end of semester. A task that is about to start.

In my situation, all of the results for students are entered into a local online assignment management system that is also used for the submission, marking etc of student assignments. Once a student’s assignment is marked, moderated and returned to the student, their mark for that assignment is placed into the Peoplesoft gradebook. Once the final assignment is moderated and returned I can view the Peoplesoft gradebook and see something like the following. Listing all the students, their names/ids (blurred here), the final result and the grade awarded. (Click on the images to see larger versions)


Now the Peoplesoft gradebook has been configured to do something intelligent. It will automatically calculate the student’s grade based on the result. And it is mostly, but NOT always, correct. You see, there are special cases that have not been programmed into the gradebook, including:

  1. rounding up;
    If a student is within 0.5% of a grade borderline (e.g. 84.6%) then they should be upgraded to the next grade.
  2. supplementary;
    If a student is within 5% of a fail mark, then they should be given the appropriate form of supplementary grade.
  3. missing compulsory result;
    If a student has not yet received a mark for a compulsory assignment, then they should be given a “result outstanding” grade.
  4. fail not completed; and,
    If a student has failed a course, but hasn’t completed all of the assessment, then they should get a “fail not completed” grade.
  5. fail not passed;
    If a student hasn’t completed any of the assessment items, then they should get a “fail not pass” grade.

Since none of these special cases are handled by Peoplesoft, the course examiner must do it manually. The course examiner is expected to scroll through the gradebook looking for students who fall into these categories. When identified, the examiner then changes the grade to the appropriate value and may be required to add a note explaining the change.
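To make the manual checking concrete, here is a sketch of the special-case logic in code terms. The thresholds and grade labels follow the descriptions above, but everything else (parameter shapes, ordering of the checks) is my assumption:

```javascript
// Classify a student's row into one of the special cases above, or null
// if Peoplesoft's automatic grade can stand. "borderlines" is the list
// of grade cut-off marks (the example values below are assumptions);
// the boolean flags are things the examiner currently checks by hand.
function specialCase(mark, passMark, borderlines, allComplete, anyComplete, compulsoryMissing) {
  if (compulsoryMissing) return "result outstanding";
  // rounding up: within 0.5% below a grade borderline
  for (var i = 0; i < borderlines.length; i++) {
    if (mark < borderlines[i] && borderlines[i] - mark <= 0.5) return "round up";
  }
  if (mark < passMark && !anyComplete) return "fail not passed";
  if (mark < passMark && !allComplete) return "fail not completed";
  // supplementary: within 5% below the pass mark, all assessment done
  if (mark < passMark && passMark - mark <= 5) return "supplementary";
  return null;
}
```

A human scrolling four long lists is doing exactly this comparison, row after row, by eye.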

The course I teach in first semester has 300+ students spread over four different modes. This means I have to scroll through four different lists (one list per mode), some with 100s of students, manually searching for special cases. As shown in the image above, the gradebook page does not show any other information, such as whether the student has submitted all assessment items. Or, in my case, whether a result for the Professional Experience item (managed by another section) has been received. Not only do I have to manually look through a web page of 100+ students, I have to manually check other data sources in order to make the change.

And my course is by no means the largest course.

To help with this process, various actors within the institution will every semester or so generate sets of instructions about how to use the Peoplesoft gradebook, reminding people of the special cases. Every semester several other staff members have to run checks on the changes made (or not) by the examiners to identify potential problems.

Why is this a problem?

Human beings are very bad at repetitive, low-level tasks like this. Computers are very good at it. In this case, the Peoplesoft system is only doing part of the necessary job.

This isn’t a particularly learner-focused example, but the suggestion here is that these types of problems are littered throughout the learner/teacher experience of institutional e-learning. This particular problem is interesting because I’ve just solved it.

Why can’t it be solved?

But I’m not holding my breath for the institution to solve this problem. My argument is that the SET mindset that underpins how it does business will significantly limit its capability to recognise this problem, let alone fix it.

In part, this is because organisations are increasingly seeing ICT as established, i.e. it’s not something they can change. ICT – especially large complex enterprise systems like an ERP or an LMS – is something that should be implemented vanilla, as is. This is best practice advice in the research and practitioner literature. ICT is seen as very hard and expensive to change. Well illustrated by my favourite quote from an IT person participating in an LMS review process

we should seek to change people’s behavior because information technology systems are difficult to change (Sturgess and Nouwens, 2004, n.p)

Even though I detest Peoplesoft, I have to imagine that there is some mechanism within it to program these sorts of special cases into the gradebook. But the system owner of Peoplesoft is the student administration part of the institution, in a separate branch of the organisational tree (hierarchy) from the academics and other staff who are carrying the cost of doing this manually (this links to the 3rd component of SET/BAD).

But a problem here is also the assumption that it is the Peoplesoft gradebook that has to change. It seems obvious. The problem here is that the gradebook isn’t providing some functionality. Thus, the expensive ERP needs to be blamed and fixed. But ERPs are hard (i.e. expensive) to change and can only be changed by qualified people who are expensive and scarce. They can only be used for important – as defined by the system owner – work.

The net effect is that it is very difficult (if not impossible) to change the gradebook. It is Established. It’s likely that this view is so fundamental that the possibility of using ICT to fix this problem wouldn’t even be considered.

This perspective of ICT reminds me of this quote from Churchill.

Churchill on established mindset

The established mindset results in us and what we can (and can’t) do being shaped by our technologies.

How might the BAD mindset solve the problem?

With a BAD mindset you might get a solution that generates the web page shown in the following image. In that image you can see that two of the rows (students) are now coloured differently. The change in colour represents one of the special cases mentioned above. You’ll also see that the coloured rows have some additional hints to remind the course examiner either: what they need to do; or, what they’ve already done.


This solution did not involve any changes to Peoplesoft. As established above, it is too hard to change and I don’t have the access or knowledge to change it. However, there are other possibilities. At some stage the gradebook generates a web page and sends it to my web browser on my computer. At this stage, I can get some level of affordance for change via Greasemonkey, a plugin for the Firefox browser. Greasemonkey allows you to write scripts (written in Javascript) that can manipulate the presentation and functionality of web pages.
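For instance, the skeleton of such a script — the metadata block that tells Greasemonkey which pages to run on — looks something like this. The @include pattern and names here are placeholders, not the real gradebook URL:

```javascript
// ==UserScript==
// @name        gradebookFix
// @namespace   http://example.com/bad
// @description Highlight special cases in the Peoplesoft gradebook
// @include     https://peoplesoft.example.edu/*gradebook*
// ==/UserScript==

// The script body below the metadata block then runs against the
// matched page's DOM, with full access to its content.
```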

Using Greasemonkey, I’ve been able to write a script that

  1. Recognises when a web page generated by the Peoplesoft gradebook appears;
  2. Looks through each of the students in the table looking for the special cases; and,
  3. When it finds one, the script changes the colour of the row and adds a hint about what should be done by the examiner.
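Steps 2 and 3 reduce to something like the following sketch. Here the rows are plain objects rather than real table rows (the actual script walks jQuery-selected rows), only one special case (the supplementary range) is checked, and the colour and hint wording are my assumptions:

```javascript
// Sketch of steps 2 and 3: check each row for a special case and, when
// found, attach a colour and a hint for the examiner. Supp range,
// colour and hint wording are assumptions.
function annotateRows(rows) {
  return rows.map(function (row) {
    if (row.mark >= 44.5 && row.mark <= 49.5) {
      return { id: row.id, colour: "orange",
               hint: "Supp range: change grade to IM and add a note" };
    }
    return { id: row.id, colour: null, hint: null };
  });
}
```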

The protean nature of ICT has been leveraged to increase the affordances offered to the user.

In theory, I can share this Greasemonkey script and it can be installed by anyone. Assuming that the default institutional operating system hasn’t been SET in stone so that you can’t install a Firefox plugin.

The Affordance mindset

The term affordance comes in many shapes, sizes and arguments. In this context, an affordance is seen as a relation between the abilities of individuals/organisations and the features of ICT. In the BAD mindset there are two important implications around the idea of affordance, they are

  1. ICTs generally have an affordance for being protean/mutable.
    Echoing Kay’s (1984) discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered”. But as affordances are relational, this affordance is only perceived by those who can “program”. Echoing Rushkoff’s (2010) sentiment that digital technology is “biased toward those with the capacity to write code” (Rushkoff, 2010, p. 128). The problem is that most universities are increasingly getting rid of the people who have the ability to “write code” or, more generally, perceive ICT as protean.
    Since I can write code and have access to Greasemonkey, I can use Greasemonkey to manipulate the output of Peoplesoft into a form of expression that is more appropriate.
  2. ICTs should be continually refined to maximise the affordances they offer.
    i.e. people shouldn’t have to reshape their hand to fit the tool. The tools they have should actively help and transform what is done. This process should tend to recognise the diversity inherent in people.

In short, the affordance mindset not only knows that ICT can be changed, it should be changed to suit the purposes of the people using it. Or to adapt the Churchill quote above to the affordance mindset.

Churchill modified - affordance of ICTs

This mindset is not new. People have always done it. There are in fact entire sections of literature devoted to talking about people doing this. For example, there’s literature on “shadow systems” in the information systems field that sees such practices as a threat/risk to organisational integrity, efficiency and apple pie. There’s a similar literature – mostly in computer science and management – under terms such as work-arounds, make-work and kludges (Koopman & Hoffman, 2003).

The increasingly digital nature of life and the increasing availability and functionality of enabling technologies (like Greasemonkey) are only making the affordance mindset more widely available. The problem is that many organisations are yet to recognise it.


Our argument is that unless an institution can adopt more of an Affordance approach, rather than an Established approach, to ICT, it’s unlikely to make any progress in bridging the chasm between the reality and rhetoric around e-learning. Its e-learning will continue to be more like teenage sex.

However, we don’t want to stop there. One of the aims of this work is to try and understand how to bridge the chasm. To explore answers to how institutional e-learning can break BAD. What might that look like and what might be the impacts?

Changing any mindset is far from easy.

Can “scratch my itch” become “scratch our itch”?

The above example – like most current examples of the affordance mindset – arises from the work of an individual or small group scratching an itch particular to them. They are interested in solving their own problem with the tools they have to hand (bricolage, which is the B in the BAD mindset and the topic of a later post). A problem with this approach is: how can that personal scratching of an itch become something that is usable by more (or perhaps most) people in an organisation?

One approach is for organisations to focus less on the assumption that central ICT staff are able to develop/select “perfect” ICT that don’t need to change, and instead focus on “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems – arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305).

This appears to involve at least two steps in terms of technologies

  1. Providing technologies etc. that make it easier to scratch an itch (create).
    e.g. an institution providing appropriate APIs for its data and services.
  2. Providing technologies etc. that make it easier to share “scratches” (document and share)
    e.g. something like User scripts for Greasemonkey. A place where people from across the institution (and perhaps broader) can share, comment and rate “scratches”

The threat of mobile and other closed systems?

In this example, and all the other examples we’ve developed so far, we have relied on Greasemonkey to provide the duct tape we use to renovate the institutional systems. Greasemonkey works in a Web environment. The increasing push to mobile devices and away from the web has some implications for this type of work. Specific apps are not a very open system; they don’t offer much in the way of affordances. The apparent death of RSS is indicative of a preference among some technology providers for integrated/closed/established systems, rather than open systems.

Growing digital renovators and builders

Since affordance is a relational concept, simply providing the technology isn’t enough. Every institution will have its own “Fred in the shed” or “Lone Ranger” who has been scratching their own itch. But the majority of staff can’t, don’t or won’t. The 2014 Horizon Report identifies the apparent low digital fluency of academics as a major impediment to the quality use of ICT in learning and teaching (Johnson, Adams Becker, Estrada, & Freeman, 2014). If academics are perceived as not being able to use the current technology, what hope do they have of being able to modify it?

One perspective might be to consider the implications of what it means for affordance to be relational. Blaming the limited digital fluency of academics is only looking at one side of the relationship. Sure, there may be an issue with digital fluency, and the availability heuristic means that just about everyone has a story about a “dumb academic” to illustrate this. However, there may also be issues (as shown above) with the quality of the technologies (and the support systems, processes and policies surrounding those technologies) they are required to use.

There’s also a question about whether or not current definitions of digital fluency are sufficient. Is a focus on the ability to use the institutionally provided ICT effectively sufficient for fluency? It might appeal to institutional senior management who just want their staff to use the ICT they have been provided with. But, as argued above, ICT is protean; a key advantage of ICT is the ability to be modified and changed to suit requirements. Is there a case to be made that being digitally fluent may extend to include the ability to modify ICT to suit your purpose?

Not to suggest that everyone needs to have an undergraduate degree in software engineering. Not everyone in the digital world needs to be a digital builder. But perhaps being digitally fluent does mean that you should have some skills in digital renovation. This capability may be especially important if you are a teacher. Shulman (1987) described the distinguishing knowledge of a teacher as the capacity

to transform the content knowledge he or she possesses into forms that are pedagogically powerful and yet adaptive to the variations in ability and background presented by the students (p. 15)

If learning and teaching are increasingly taking place in a digital world, then having the ability to transform content knowledge into pedagogically powerful yet adaptive forms would seem to imply some capacity as a digital renovator.

Of course this capacity should not be limited to teachers. Learners and learning in a digital world would also seem likely to benefit from the capacity to digitally renovate their digital spaces. Which in turn brings up the question of “their digital spaces”. Not digital spaces provided by the institution, but spaces that learners and teachers own and control.

This touches on some of the issues raised by @audreywatters in The Future of Education: Programmed or Programmable. The differences between the Established and Affordances views of ICT seem to be a partial contributor to the Programmed Instructions and Programmable Web models of ed-tech Audrey mentions. The Established view “hard-codes” the functionality and content of learning. The Affordances view sees ICT as a space for co-creation and connection.

Agency versus structure

In my current institution it’s quite common for the topic of ICT to arise amongst any gathering of staff. The tone of the conversation is almost always negative. It doesn’t work. Things break at the worst possible time, and there are sessions of woe-sharing arising from the common experience of make-work caused by the limitations of systems. There are on-going complaints about yet another change in this system, an upgrade of Moodle, a new assignment management system or a new look and feel for course sites. A litany of change being done to staff because of the needs of technology. Technology is shaping us.

Such structure would appear to be sapping the agency of teaching staff. Capable and confident face-to-face teaching staff are struggling with systems that require their teaching practice to be shoe-horned into the inappropriate capabilities offered by the functionality of institutional systems. Teaching staff can see what they’d like to be able to do better, but are unable to do it. The ICT is shaping them and their practice, with no apparent capacity for them to shape the digital space in which their learning and teaching takes place.

How widespread is this view? What impact does it have on their identity and capability as a teacher? What impact does it have on the quality of learning and teaching?

What would happen if they had the capacity to shape the ICT? Even if just a little bit more. Would that increased sense of agency add something to the quality of learning and teaching?

Not ready for digitally fluent staff

The Horizon Report’s identification of limited digital literacy of academic staff as a significant barrier begs a range of questions. Accepting the premise, and putting aside questions over what it means to be digitally literate or fluent, I wonder: are universities ready for digitally fluent staff? Would digitally fluent staff be willing to accept an organisation having an Established view of ICT, or would they expect an Affordances view of ICT?

You want digitally fluent faculty?


Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297–309.

Johnson, L., Adams Becker, S., Estrada, V., & Freeman, A. (2014). NMC Horizon Report: 2014 Higher Education Edition. Austin, Texas.

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. IEEE Intelligent Systems, 18(6), 70–75.

Shulman, L. (1987). Knowledge and teaching: Foundations of the new reform. Harvard Educational Review, 57(1), 1–21.

Some more tweaks to gradebook

This is a development log of a few additions to the recent fixes to the Peoplesoft gradebook, documenting attempts to implement the following

  1. Highlight students in the supp range DONE
    Students with a mark between 44.5 and 49.5 need the grade to be set to IM and a note inserted.
  2. PE overrides DONE
    Courses with a professional experience component need to have the default FNC over-ridden when they don’t have a PE mark yet. And a comment also needs inserting.

The second is harder because the page being updated doesn’t contain the PE mark, which makes it a bit more challenging to implement.

And thanks to a “communique” there’s a more complete set of “guidelines on the allocation of marks and final grades” which lists

  1. Round up when total marks are close to grade cut-offs – DONE.
  2. Review where the total marks are close to the passing grade cut-off.

    This is the “supp” range task.

  3. Review where the total marks are close to higher grade cut-offs.

    This appears to be a duplication of #1, but includes the phrase “review the performance”. I wonder why, if the mark has already been rounded up?

  4. Allocation of supplementary grades. – DONE

    Any student within 5% below the passing mark who has completed all assessment can be given a supplementary IS/IM. Similar to the “PE Overrides” above in terms of how it would have to work.
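As a rough illustration, the “supp range” check used later in this post might look something like the following sketch. The exact boundary treatment (whether 49.5 itself rounds up rather than falling into the supp range) is my assumption.

```javascript
// Hypothetical sketch of the isSuppRange() check used later in this
// post. Marks from 44.5 up to (but not including) 49.5 fall in the
// supp range; 49.5 and above is assumed to be handled by the
// round-up rule instead.
function isSuppRange(rawResult) {
    var mark = parseFloat(rawResult);
    if (isNaN(mark)) {
        return false;   // no numeric mark entered yet
    }
    return mark >= 44.5 && mark < 49.5;
}
```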

Left to do

  1. Handle the process of saving
    Once you save the gradebook, it won’t run the update again to show the changes. More kludging to do I feel. Appears to be due to the fact that the “saving” process takes a long time and this defeats the “pause before running” kludge currently used to update the rows.
  2. Give advice on the different types of fails.
    If a student has submitted all assessment items but failed the course, they should get an F.
    If they submitted some, but not all assessment items, then the grade should be a FNC.
    If they didn’t submit any assessment items, the grade should be a FNP.

    The process used to check for PE could be expanded to handle this.

  3. Checking on institutional MOE
    Can they install Firefox/Greasemonkey?
  4. Checking on the process used by others
    The sequence of steps I use in the gradebook may not match what others use. Observe what they do.
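The fail-grade advice in point 2 boils down to a simple mapping from submission counts, something like the following sketch (the function name and parameters are mine):

```javascript
// Decision logic for the three fail types described above. The
// submitted/total counts would have to come from elsewhere in the
// gradebook; this sketch only captures the mapping itself.
function failGradeType(submitted, total) {
    if (submitted === 0) {
        return "FNP";   // didn't submit any assessment items
    }
    if (submitted < total) {
        return "FNC";   // submitted some, but not all
    }
    return "F";         // submitted everything, but failed the course
}
```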

Supp range

Basic task is to

  1. Recognise students in the grade range.

    } else if ( isSuppRange( rawResult )) {
        changeBackground( element, studentNum, "#FFCC00" );
    }
  2. Change the background color.
    Let’s go #FFCC00.
  3. Add in some reminder about adding a note?
    The div win0divSTDNT_GRADE_HDR_EMPLID$3 (where 3 is the student number) seems like a good place to add the warning/explanation.

    Have identified the element in the script. Need to figure out how to add some text into it. Showing up as XrayWrapper object HTMLDivElement. Ahh, just how simple it is when you aren’t ignorant.

        element.getElementById( id ).insertAdjacentHTML('beforeend', newHTML );
  4. Check all the options
    • Award a supplementary grade DONE
    • Upgrade on the border line – no change DONE
    • Upgrade on the border line – change made DONE
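Putting steps 2 and 3 together, adding the reminder into the EMPLID div might look like the following sketch (the wording of the note and the function name are mine; the id pattern is the one identified above):

```javascript
// Hypothetical sketch: add a supp-range reminder to the div that
// holds the student's EMPLID. frameDoc is the gradebook iframe's
// document; studentNum is the row number after the $.
function addSuppNote(frameDoc, studentNum) {
    var id = "win0divSTDNT_GRADE_HDR_EMPLID$" + studentNum;
    var div = frameDoc.getElementById(id);
    if (div) {
        div.insertAdjacentHTML("beforeend",
            "<br /><em>Supp range: set grade to IM and add a note</em>");
    }
}
```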

PE overrides

The requirement here is to

  1. Detect any student that doesn’t have a result for “practicum”.
  2. Advise that the mark needs to change to a “result outstanding”.

This is complicated because the page on which staff can change the result does not contain any information about the practicum result. It does appear on the first gradebook page displayed, but not on the change page. So the script will need to

  1. Save the practicum result for all the students on the first page.
    It appears that the GM_setValue function is a way to do this. When the first page is loaded, the values will need to be extracted and stored. So sub-steps

    • Detect that the first page has been loaded.
      So how to identify the first page?

      Actually, not really interested in if it is the first page. Just look for if it has the word PRACTICUM in the header of a specific table.

          var description = 1;
          var id = "DERIVED_LAM_ASSIGNMENT_DESCR_" + description + "$0";
          var name = frame.getElementById( id );
          // loop through all the assignment descriptions for first row
          while ( name ) {
              var rawResult = name["textContent"];
              if ( rawResult == "PRACTICUM" ) {
                  return true;
              }
              // move on to the next assignment description column
              description++;
              id = "DERIVED_LAM_ASSIGNMENT_DESCR_" + description + "$0";
              name = frame.getElementById( id );
          }
          return false;
    • Extract the practicum values
      Will need to extract the column number for the practicum results …it will be located in an input box with the id DERIVED_LAM_GRADE_1$0 where 1 is the column (first column with results – and matching the column in which practicum was found) and 0 is the student number in the row.

      Will need to extract the matching ID number so that the practicum result is saved for that student. The student’s ID number is located in a span with the id STDNT_GRADE_HDR_EMPLID$0

          var studentNum = 0;
          var peResultID = "DERIVED_LAM_GRADE_" + column + "$" + studentNum;
          var peResultElement = frame.getElementById( peResultID );
          var studentID = "STDNT_GRADE_HDR_EMPLID$" + studentNum;
          var studentElement = frame.getElementById( studentID );
          // loop through all the rows in the table
          while ( peResultElement ) {
              var rawResult = peResultElement.value;
              var studentRaw = studentElement["textContent"];
              var id = "STUDENT_peResult_" + studentRaw;
              GM_setValue( id, rawResult );
              // move on to the next student row
              studentNum++;
              peResultElement = frame.getElementById( "DERIVED_LAM_GRADE_" + column + "$" + studentNum );
              studentElement = frame.getElementById( "STDNT_GRADE_HDR_EMPLID$" + studentNum );
          }
    • Save them.
      The question now is how and what to save. Perhaps the aim here is only to save those students who do NOT have a result? Or should we save them all? i.e. actually save a value for all STUDENT_peResult_id
      The code above has the modified version.
  2. Use that stored information on the change page.
    When we’re checking the other page, need to add in a getValue call to test it.
  3. Should think about deleting the values when the script is on the first page, just to make sure there aren’t any leftovers. But then if you have to go to this page first, it should be ok as everything gets overwritten.
        var keys = GM_listValues();
        for ( var i=0; i < keys.length; i++ ) {
            if ( keys[i].match( /^STUDENT_peResult_/ ) ) {
                GM_deleteValue( keys[i] );
            }
        }
  4. PE results missing for all.
    PE results don’t get entered until late in the process. So if you visit the gradebook before this you get the PE result over-riding everything else. Need to prevent this from happening. i.e. if there are no PE results for anyone, then don’t display this warning.
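Step 2 – using the stored values on the change page – might then be as simple as the following sketch (the function name is mine; the key format matches the GM_setValue call above):

```javascript
// Hypothetical check for the change page: if no practicum result was
// saved for this student on the first page, the default FNC should be
// over-ridden with a "result outstanding". Relies on the Greasemonkey
// GM_getValue( key, default ) API.
function needsPEOverride(studentId) {
    var peResult = GM_getValue("STUDENT_peResult_" + studentId, "");
    return peResult === "";   // no PE mark recorded yet
}
```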

Fixing one part of the peoplesoft gradebook

The following is a development log of an attempt to fix one aspect of the Peoplesoft gradebook used at my current institution.

Why and what?

The problem

At the end of semester all assignment marks end up in the Peoplesoft gradebook. An old school web information system that the academic in charge of a course has to use to do some last minute checks and changes. One of those changes is updating the grade for students who are within 0.5 of a grade level, e.g. a student with a mark of 49.6 shouldn’t get an F, they should get a C (which is the pass grade).

Peoplesoft won’t do this. The academic has to manually scroll through the list of students (ordered alphabetically by student name) looking for those that are in this range. Once found, the new grade has to be manually entered into a textbox. This is a problem, especially if your class has a couple of hundred students.

The solution

The solution developed below is a Greasemonkey script that will automate this process. It will, once installed

  1. Detect that the peoplesoft gradebook is being displayed.
  2. Look for any students within 0.5 of a grade level.
  3. For each of these students found
    • Change the background for that row to red.
    • Place the upgraded grade in the appropriate textbox.
  4. Look for any students who have already been upgraded, change the background of their row to green.
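The core of step 2 is the cut-off check. A minimal sketch, with illustrative cut-offs (the real ones belong to the institution), might be:

```javascript
// Illustrative grade cut-offs: these are assumptions for the sketch,
// not the institution's actual boundaries.
var CUTOFFS = [
    { mark: 50, grade: "C" },
    { mark: 65, grade: "B" },
    { mark: 75, grade: "A" }
];

// Return the upgraded grade if the mark is within 0.5 below a
// cut-off, otherwise null (no change needed).
function suggestedUpgrade(mark) {
    for (var i = 0; i < CUTOFFS.length; i++) {
        // e.g. 49.6 is within 0.5 of the 50 cut-off, so suggest a C
        if (mark < CUTOFFS[i].mark && mark >= CUTOFFS[i].mark - 0.5) {
            return CUTOFFS[i].grade;
        }
    }
    return null;   // not in any upgrade range
}
```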


Identifying the gradebook

The first problem is that the Peoplesoft gradebook uses iframes, which complicates things a little, especially in identifying the appropriate iframe and then getting the script to only activate when the appropriate document is loaded. Not to mention, no great surprise, that we’re talking some really ugly HTML here.

The actual data for each student is spread over a row with XXX main cells each with div elements with specific ids (the $0 appears to increment per student)

  • win0divHCR_PERSON_NM_I_NAME$0 – span HCR_PERSON_NM_I_NAME$0 contains the name
  • win0divSTDNT_GRADE_HDR_EMPLID$0 – span STDNT_GRADE_HDR_EMPLID$0 – contains the EMPLID
  • input text box with id STDNT_GRADE_HDR_CRSE_GRADE_INPUT$0 is where the changed grade might get entered.

It appears to be part of a form with the URL ending in SA_LEARNING_MANAGEMENT.LAM_CLASS_GRADE.GBL and appearing in an IFRAME with id ptifrmtgtframe – which I assume is a generic iframe used on all the pages.

So the plan appears to be for the script to

  1. Only respond for the broad URL associated with the institutional gradebook.
    Done via the standard Greasemonkey approach.
  2. Only kick into action on the loading of the iframe with id ptifrmtgtframe.
    This appears to work.

    var theFrame;
    theFrame = document.getElementById('ptifrmtgtframe');
    theFrame.addEventListener( "load", my_func, true );
  3. Check to see if the form SA_LEARNING_MANAGEMENT.LAM_CLASS_GRADE.GBL OR perhaps the presence of the ids from the table above
    Have modified the above to pass the frame in and was using that to determine the presence of the textbox. The problem is that there is a further complication to the interface. Jumping to the specific page in the gradebook (there are three) is being done by a “javascript:submitAction_win0(document.win0…..)”. This isn’t showing up as an on load for the frame.

    Found this post which talks about one potential solution but also points to someone who’s been doing this for much longer and in more detail.

  4. Have they included the number of students in the HTML? – no, doesn’t look like it.

A rough attempt to understand what is going on

  1. Faculty centre loads with list of courses.
    The standard entry into gradebookFix is run at this stage – alert is shown. And then the iframes load.
  2. Click on gradebook icon trigger the current iframe load event and shows the three different gradebook icons.
    The my_func function is run via an event listener for onLoad for the ptifrmtgtframe iframe. But this is only run the once as….
  3. Click on the “cumulative grades” doesn’t load a new iframe, calls the javascript:submitAction_win0 method.

The aim is to modify the click on the particular link so that something else happens. How about

  1. Modify onload to look for that link and add a onclick event.
    The id for the link is DERIVED_SSR_LAM_SSS_LINK_ANCHOR3. The problem is that attempting to add an event listener to this is not working. i.e. a call to getElementById is not working. Aghh, that’s because these things aren’t normal Javascript type objects, but special Greasemonkey wrapped stuff.

    var theLink = theFrame["contentDocument"].getElementById('DERIVED_SSR_LAM_SSS_LINK_ANCHOR3');
    theLink.addEventListener( "click", function(){ alert( "CLICK ON LINK CUMULATIVE" ); }, false );
  2. Have a function that is called on click.
    The struggle here will be that the click is actually the start of a query that results in the content being changed. But not necessarily recognised by Greasemonkey.

    Perhaps a timeout and then another bit of code like this might work. This could be tested simply by re-adding the on-click. This will sort of work, but again, is only set when the iframe loads for the first time. If any other navigation happens it won’t re-add any changes in.

    Have added it to the other two main links for gradebook. Possible this will be a sufficient kludge for now.

  3. Looks like we need to capture the submitAction_win0 method after all.
    Nope, have figured out a kludge.

Identifying the student rows

The following code segment will change the background/font color of the first student’s name

    function updateResults(element) {
        var name = element.getElementById('win0divHCR_PERSON_NM_I_NAME$0');
        name.style.backgroundColor = 'red';
        name.style.color = 'white';
    }

Above specifies the names of the different student fields. The difference is the number after the dollar sign – 0 up to the last.

Steps required here

  1. Identify how many students are on the page.
    Will be useful for a for loop to go through each. xpath might offer a possibility? JQuery? A simple while loop could also do the trick. Will go with that.
  2. Determine what to change
    Plan is

    • RED – need attention i.e. marks that should be over-ridden with suggested override in place.
    • GREEN – those that have already been over-ridden previously.
    • no colour/change – correct as is.
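Step 1 – the simple while-loop approach – could be sketched like this (the function name is mine; the id pattern is the one listed earlier):

```javascript
// Count student rows by incrementing the number after the $ until a
// name div no longer exists. doc would be the gradebook iframe's
// document.
function countStudents(doc) {
    var studentNum = 0;
    while (doc.getElementById("win0divHCR_PERSON_NM_I_NAME$" + studentNum)) {
        studentNum++;
    }
    return studentNum;
}
```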

All done. Seems to work.

On the difference between “rational”, “possible” and “desirable”

A couple of weeks ago @kateMfD wrote a post asking “What next for the LMS?”. (one of a raft of LMS posts doing the rounds recently). Kate’s point was in part that

The LMS is specifically good at what universities need it to do. Universities have learning management systems for the same reason they have student information systems: because their core institutional business isn’t learning itself, but the governance of the processes that assure that learning has happened in agreed ways.

The brief comment I shared on Kate’s post drew on some discussions @beerc and I had 6 or 7 years ago. Back then we were responsible for helping academic staff use the institution’s LMS. I was amazed at how manual the process was and how limited it was in its use of standard checks. For example, it was quite common for a new course site to be pointing to last year’s course profile/synopsis (a PDF). This wasn’t being picked up until a student or two completed all of the following steps

  1. Actually bothered to use the link to the course profile from the course site.
  2. Figured out that it was pointing to last year’s course profile.
  3. Was bothered enough by this problem to report it to someone.

Rather than be reactive, it seemed sensible to write a Perl script or two that would “mine” the Blackboard database and identify these types of problems very early in the semester so we could proactively fix them.

At that stage, “management” couldn’t grasp the value of this process and it never went anywhere. I never could understand that.

Fear of management rising

Not long after that – as the learning analytics fad started to rise – Col and I were worried about what management would do once they joined the band wagon. In particular, we wondered when they might identify the problems that ideas like “Web 2.0 tools” (blogs, Second Life etc) or Personal Learning Environments (another fad we were playing with at the time) would pose for learning analytics. i.e. to run “learning analytics” you need to have access to the data and a University generally won’t have access to the data from tools that are personal to the learner and external to the institution.

Given Kate’s identification of management’s purpose around learning – “governance of the processes that assure that learning has happened in agreed ways” – Col and I have been waiting to hear of Universities banning the use of external/personal tools for learning/teaching because it broke their “learning analytics”. Around the same time as Kate’s post, I heard that one southern University was indeed going down that route, and that’s the comment I made on Kate’s post.

Why is this a problem?

This morning @drwitty_knitter replied to my comment with

I would think this is quite common. Universities like to be able to track where money is being spent and what the outcomes are for students. Unless tools have some way to report what students are doing, and how that relates to their curricular goals, it would be hard to justify their use.

And I agree, I think it will become increasingly common. But I also still think it’s a really, really bad idea. @beerc, @damoclarky and I offered one explanation why this is a bad idea in this ASCILITE’2012 paper, i.e.

Insight gained over the last four years exploring learning analytics at one university suggest that the assumptions embodied by managerialism may be an inappropriate foundation for the application of learning analytics into tertiary learning environments

In short, believing that analytics can connect what students are doing with their curricular goals requires making a range of assumptions about the nature of people, learning, and universities that fail to engage effectively with reality. No matter how complex the learning analytics algorithms and systems used, the only way you can achieve the stated purpose is to attempt to reduce the variability of learning and teaching to fit the limitations of the capabilities of the technology.

Which is exactly what is happening when institutions ban the use of personal or external tools.

This won’t be enough. As we show in the ASCILITE paper, even if you limit yourself to the LMS, the diversity of learners and learning, and the chasm between what happens in the LMS and actual student learning, are such that there will still be huge questions about what the analytics can tell you. This will lead to at least two likely outcomes

  1. Management will believe what the analytics tells them and plan future action on this poor foundation; and,
  2. Management won’t believe the analytics and thus will further reduce the variability of learning and teaching to fit the limitations of the capabilities of the technology.

The last option contributes to the problem that Chris Dede identifies in this clip:

that the very, very best of our high-end learning environments have less variety than a bad fast food restaurant

The three paths

In an ASCILITE’2014 paper we identify three paths that might be followed with learning analytics

  1. Do it to.
  2. Do it for.
  3. Do it with.

Our argument is that almost all of the learning analytics work (and increasingly much of what passes for learning and teaching support activities) is following the first two paths. We also argue that this will end badly for the quality of learning and teaching and will contribute to learning analytics being yet another fad.

The “Do it to” path is completely rational if your task is to ensure the quality of learning across the institution. But it’s only possible if you assume that there is no diversity in learning and teaching and that “learning” is the data captured in digital trails left in institutional databases. I don’t think it is either possible or desirable, hence I don’t think it’s rational. YMMV.

What do new views of knowledge & thinking have to say about research on teacher learning?

I’m finally getting/creating a smidgin of time to continue exploring what “distributed” views of knowledge and learning might say about understanding and helping teachers (of all ilks) learn more about what they do. The following is a summary of Putnam and Borko (2000).

It gives a straightforward overview of the “situated perspective” and how it links to existing research (from the 90s) into teacher learning.

What it’s about

Lots of attention is being paid to new ideas about the nature of cognition and learning – i.e. situated cognition, distributed cognition and communities of practice aka the “situative perspective”.

Lots of discussion about using this to help students learn. Less attention paid to teachers either

  1. “to their roles in creating learning experiences consistent with the reform agenda”, or
  2. “how they themselves learn new ways of teaching”. (p. 4)

Putnam and Borko (2000) aim to focus on the latter. Which is exactly where my interest lies. The paper’s focus is

  1. Use the “situative perspective” to understand recent research on teacher learning.
  2. “explore new issues about teacher learning and teacher education that this perspective brings to light”
    Which they divide into three issues

    1. Where to situate teachers’ learning experiences
    2. The nature of discourse communities for teaching and teacher learning
    3. the importance of tools in teachers’ work

    Apparently covered in more detail in Putnam & Borko (1997)

Conceptual themes of the situative perspective

Three conceptual themes, that cognition is

  1. situated in particular physical and social contexts;

    the physical and social contexts in which an activity takes place are an integral part of the activity, and that the activity is an integral part of the learning that takes place within it. How a person learns a particular set of knowledge and skills, and the situation in which a person learns, become a fundamental part of what is learned (Putnam and Borko, 2000, p. 4)

    Hence the push to authentic activities in classrooms. Apply it to inservice teacher education/professional development? What makes for an authentic activity?

    we consider the kinds of thinking and problem-solving skills fostered by an activity to be the key criterion for authenticity (p. 5)

  2. social in nature ;

    interactions with the people in one’s environment are major determinants of both what is learned and how learning takes place…..Individuals participate in numerous discourse communities….(which) provide the cognitive tools…that individuals appropriate (p. 5)

    Generates questions about the type of communities to create in a learning situation – disciplinary communities, or “learn to learn” communities?

  3. distributed across the individual, other persons and tools.
    This section is perhaps a little more underdone than the others.

    Rather than considering cognition solely as a property of individuals, situative theorists posit that it is distributed or “stretched over” (Lave, 1988) the individual, other persons, and various artifacts such as physical and symbolic tools (Salomon, 1993a) (p. 5)

    And the problem is that with schools’ focus on “tool-free performance, and on decontextualized skills, educating people to be good learners in school settings alone may not be sufficient to help them become strong out-of-school learners (Resnick, 1987, p. 18)” (p. 5)

Issues arising for teacher learning and teacher education

  1. Where to situate teachers’ learning experiences

    The question is not whether knowledge and learning are situated, but in what contexts they are situated. For some purposes, in fact, situating learning experiences for teachers outside of the classroom may be important – indeed essential – for powerful learning (p. 6)

    The situative perspective encourages a focus on exploring how different settings for teacher learning generate different types of knowing. Broad types

    1. In individual teachers’ classrooms
    2. Teachers bring their classroom experiences to outside workshops.

    The idea of inter-twinning learning with on-going practice.

    Problems include

    1. Scalability
    2. Difficulty of changing mindsets when retaining the connection to the existing situation. To break out of existing mindsets may require entry into a different setting.

    Which generates the problem of integrating new/different ideas back into the existing setting. Which leads to the idea of “follow up”. Brand new experience, and then on-going support to help with integration.

    Moves onto apply this to teacher education and quotes Bird (1992, p 501)

    But this image requires a stable, satisfactory practice that the novice can join. If the aim of teacher education is a reformed practice that is not readily available, and if there is no reinforcing culture to support such practice, then the basic imagery of apprenticeship seems to break down. Teachers’ knowledge is situated, but this truism creates a puzzle for reform. Through what activities and situations do teachers learn new practices that may not be routinely reinforced in the work setting? (p. 501)

    Talks about the case-based approach as one way to address this. Couple of paragraphs on this.

  2. The nature of discourse communities for teaching and teacher learning

    These discourse communities play central roles in shaping the way teachers view their world and go about their work. Indeed, patterns of classroom teaching and learning have historically been resistant to fundamental change, in part because schools have served as powerful discourse communities that enculturate participants (students, teachers, administrators) into traditional school activities and ways of thinking (Cohen, 1989; Sarason, 1990).

    Also draws on the work of Ball (1994) to talk about how the individualism of teaching makes it difficult to agree on common standards, difficult to disagree and hence limits critique and challenge…. “teaching remains a smorgasbord of alternatives with no real sense of community, there is no basis for comparing or choosing from among alternatives, no basis for real and helpful debate. This lack impedes the capacity to grow. (p. 16)” (p. 9)

    Links to various research projects mixing academics and teachers to get the mix of theory and practice to engage in discussions and generate practical solutions. But does mention some problems that arise. e.g. Richardson (1992) “agenda-setting dilemma”

    In looking at pre-service teacher education the suggestion is that they “have focused more on the development of individual knowledge and competencies thought to be important for teaching than on the establishment of discourse communities for prospective teachers” (p. 9). But that if existing professional communities aren’t “reformed” then this causes problems.

  3. the importance of tools in teachers’ work

    The situative perspective provides lenses for examining more thoughtfully the potential of new technologies for supporting and transforming teachers’ work and learning

    Looks at tools from the perspective of: Tools to enhance/transform work of teaching, and Tools to support teachers’ learning. Didn’t find much of interest in that.


Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Unintended consequences of technology in education

My wife is currently studying engineering. One of her fellow engineering students is also studying some mathematics on the side. He shared the following tale of unintended consequences arising from technology in education.

In a particular mathematics course the students are set homework. The lecturer will then work through the solutions of these homework problems in the next lecture. This is done by using the solutions manual (provided by the publisher I assume) and the document camera available in the lecture theatre.

Small problem. Can you pick it?

Apparently the lecture is recorded and made available online. Also, the lecturer has a habit of flicking through the solutions manual to find the right page whilst the manual is under the document camera.

It appears that the students have discovered you can pause the online video recording and take a good hard look at what’s revealed.