OEP, institutions and culture

Some colleagues and I are embarking on a project exploring how teacher education might move toward adopting Open Educational Practices (OEP). A project that is currently being driven by funding from one University, and which might lead to an application for funding from another institution. In part, we’re thinking about how teacher education in each of these two institutions can adopt OEP, what that might look like, what the barriers might be, and how we might go about moving toward something that won’t fade away once the money runs out or we move on.

As it happens, over the last week or so there’s been an on-going discussion about the role of institutions and/or culture in OER. A discussion that started with Mike Caulfield’s reflections and was then picked up by many, including Jim Groom, Stephen Downes and Tim Klapdor. A discussion that provides an interesting way of looking at what we’re thinking about. In the end, I think we may need to draw upon the following from David Wiley and Cable Green, which echoes a discussion Leigh Blackall and I had back in 2010.

Making stuff last – institutions

One of the first posts in this discussion by Mike arose out of a debate around the value of Open Educational Resources as a stepping stone to Open Pedagogy. The idea being that universities are increasingly creating policies etc. that embed OERs (typically in the form of open textbooks) into organisational practice. However, while all this has been happening, open pedagogy (I’ll label this OEP here) has been waiting its turn. That waiting has made those more interested in OEP a touch cranky with the focus on OER, and they’re heading off to do their own thing.

The problem Mike identifies comes from his personal experience

But here’s what I know. The death of the Persona Project was the norm, not the exception. It happens all the time. Where I work right now had a great student and teacher wiki up in 2008. But it got nuked in a transition. The Homelessness Awareness wiki I worked on with Sociology students (and demo’d with Jim Groom in 2011) is ghost-towned. The disability research one has been slammed by spam. And even more than that, each time I work with a professor on these things (most recently on a Colonial Theory wiki) we spin up from scratch, and leave it to rot afterwards.

And it leads to the following

People make things possible. And we have such great, great people in Open Pedagogy….But institutions, they are what make these things last.

And, in another post, to the question

How can we re-architect our institutions to bring open practice into the center of them, rather than see it as a bolt-on?

Making stuff last – culture

Stephen Downes’ response is

You can’t depend on institutions. And in a sense, you don’t need them. Institutions aren’t what make tests and exams happen year after year. Institutions aren’t what guarantee there will be course outlines and reading lists. What makes this last – the only thing that makes this last – is culture.

And in a more detailed post he adds

I’m not saying we should never build things. What I am saying is that we cannot count on institutions – organized economic and political units – to ensure the lasting value of these things is preserved…Because sooner or later someone is going to object (or forget, or simply retire), and the good work goes down the drain.

Local institutional experience

So how does local institutional experience match up with this discussion?

Institutional moves to be open

Peter brings up the experience of our local institution, an early institutional adopter of open within the Australian context. He sums up the situation as

In principle being open is acknowledged as a good thing but in practice it seems not to happen much and to be not easy to accomplish within the institutional processes.

And suggests that at least part of the problem is

It seems likely that is linked to concerns about reputational effects….Thus the interests of the institution seem to be best served by ensuring that what is made open is carefully managed and quality assured to present the best possible impression.

Perhaps indicating that our institution hasn’t yet been successful at achieving what Mike observes

is that OER has done the hard work of bringing OER work to the center of the institution, rooting it in institutional policy and practice in a way that Open Pedagogy hasn’t been able to do

But also highlighting Downes’ point, in that these moves for the institution to be open have been driven by people at the senior levels of the institution. That high-level interest has resulted in a number of different bolt-on projects, but it has yet to translate into changes to organisational policy or practice.

For example, institutional policy still does not make it easy (or even possible) for an academic to place a Creative Commons license on their teaching materials and release them. Institutional policy is such that the university retains copyright. In addition, any such sharing seems to require using the institutional version of Equella, a system not conducive to easy, widespread sharing and discoverability.

My moves within institutions to be open

Archisuit example from Sarah Ross

The 2010 discussion around open and how to get there between Leigh Blackall and me arose out of my work on BIM, a Moodle module that helps teachers manage the use of individual student blogs. BIM is perhaps the ed tech equivalent of an Archisuit: a response to hostile architecture, and the sort of workaround that Mike suggests open pedagogy people have been producing for ages. But he then argues that

Being against the institution may be necessary, but it is not where you ultimately want to be. If you want real change, styrofoam padding isn’t going to cut it. Eventually you have to remove the damn bars from the bench.

The difference with BIM is that it is part of the LMS. It’s an accepted part of the institution. Perhaps indicative of how, while my current institution hasn’t yet succeeded in embedding open into institutional policy, there are glimmers of it within the infrastructure.

However, that still hasn’t encouraged vast swaths of adoption. 8 of the 10 course offerings that used BIM in 2014/2015 were courses I taught. On the plus side, I was surprised to find the other two courses, and I believe they have continued using BIM this year.

The Moodle Open Book project is another “archisuit” example. The aim is to connect the Moodle Book module (used to manage/display collections of web pages within Moodle) to GitHub and thus enable production of OER and, more interestingly, OEP. There’s even some “working” code.

But as I talk about both of these workarounds, what I’m struck by is the huge “cultural” leap required to go from not using blogs/GitHub to thinking about how blogs/GitHub might be leveraged in an interesting OEP way. Even the initial development and application of BAM (BIM’s non-Moodle predecessor) was driven by a fairly uninspired pedagogical application: addressing the student corruption of a “reflective journal” assignment done with Word documents.

The impact of culture

That said, I think the adoption of BIM in two other courses at my institution is largely down to a change in broader culture. In this case, not the idea of open, but the movement of blogs into a common (even passé) part of contemporary culture. My understanding is that the person who has adopted BIM in their teaching has embarked on projects that have used blogs.

Blogs in 2016 aren’t as strange and unknown as they were in 2007 when the ELI Guide to Blogging came out. In 2006, when I tried to explain BAM, most of the time was spent trying to get people’s heads around blogs, blogging, and RSS feeds. In 2016, most people are familiar with the idea of a blog and blog posts. Though I’m guessing they are probably still a bit uncertain about RSS feeds.

If blogs hadn’t caught on like they did, BIM would be dead. Culture plays a part.

Removing the bars from the bench: easy for OER, harder for OEP

As Mike points out, “the assumption of the textbook is baked into every nook and cranny of our institutions”. A bit earlier he identifies the proprietary textbook as “the largest structural barrier to open pedagogy”. He congratulates the Open Textbook folk for being “willing to engage on the fronts of policy and practice at once” and suggests that the open pedagogy folk need to engage more in “issues of policy, law, funding, architecture, institutional support” in order to “remove the bars from the bench”. I think it’s going to be much harder for OEP to do this, perhaps even leaning more towards the impossible end.

Textbooks are a core part of universities. Everyone is familiar with them. The institution can talk about and deal with textbooks at a general level, whether they be proprietary or open. They have a collection of pages, making up chapters, making up the book. There are headings, images, activities, etc. They are a model that is understood across all parts of the institution. Hence textbooks are something that can be easily discussed at an institutional level. Sure, those strange folk in the Arts will have different content than the Engineers, but the notion of a book is general.

OEP, on the other hand, is – I think – far more diverse and contextual. My initial experiments with BAM took place almost 10 years ago at another institution in a different discipline. Today I use BIM – the functionality of which is a direct translation of BAM into Moodle (hence the acronym BIM) – at a different institution in a different discipline. I don’t use the BIM functionality. I have an army of kludges that I employ to support the OEP I think works better for my current students. ds106 makes perfect sense in its context and purpose, but engineers at my institution are not likely to understand it at all. The type of OEP we might engage in with pre-service teachers is likely to be very different from that for nursing students, particularly if our aim with OEP is for the pre-service teachers to engage more with the teaching profession.

The novelty and diversity of OEP would appear to be in stark contrast to the familiarity and standardisation of textbooks and OER. I don’t think institutions (or many people) will deal well with that combination. I’m not sure continuing to ride in the back seat will be sufficient.

That said, if we’re going to do anything around OEP within an institution and want that work to have a chance of surviving, we’re going to have to consider Mike’s question.

How can we re-architect our institutions to bring open practice into the center of them, rather than see it as a bolt-on?

Both/and, not either/or

But at the same time, I think we also need to ask ourselves a similar question about the culture of teachers and teacher education. While there’s been a significant increase in online sharing – amongst edubloggers, on Twitter, through online resource sharing etc. – this still seems to be a minority practice. There are still schools that constrain the use of online technologies and sharing. There are schools where it is assumed that the school retains copyright of teacher-produced material. In an era of standardised testing and concerns about teacher quality, there are issues around sharing resources, and especially around sharing the messy processes involved in figuring out how to teach particular learners effectively.

Even if (and it’s a big if) we’re able to embed OEP into our courses within our institutions, the full benefits won’t flow unless we can connect that work sustainably into teacher practice.

Which has me wondering: where are the sweet spots in teacher practice and our courses where it would be easier to introduce OEP and make the connection between practice and the ivory tower? What about in your teaching, where are those sweet spots? Is there any overlap?

What if our digital technologies were protean?

On Friday the 30th of September 2016 I will present the paper – What if our digital technologies were protean? Implications for computational thinking, learning, and teaching – co-written by Elke Schneider and me, at the ACCE’2016 conference.

Other resources include:

  • A 1-question poll.
    An attempt to explore whether people experience their organisational information systems as protean or not. If you haven’t already, please take the time to complete the poll.
  • Stories of digital modification.
    A copy of the Google doc we originally used to gather the data for the paper. This data was then analysed for themes.

Abstract

Not for the first time, the transformation of global society through digital technologies is driving an increased interest in the use of such technologies in both curriculum and pedagogy. Historically, the translation of such interest into widespread and effective change in learning experiences has been less than successful. This paper explores what might happen to the translation of this interest if the digital technologies within our educational institutions were protean. What if the digital technologies in schools were flexible and adaptable by and to specific learners, teachers, and learning experiences? To provide initial, possible answers to this question, the stories of digital technology modification by a teacher educator and a novice high school teacher are analysed. Analysis reveals that the modification of digital technologies in two very different contexts was driven by the desire to improve learning and/or teaching by: filling holes in the provided digital technologies; modelling effective practice with digital technologies to students; and better mirroring real-world digital technologies. A range of initial implications and questions for practitioners, policy makers, and researchers is drawn from these experiences. It is suggested that recognising and responding to the inherently protean nature of digital technologies may be a key enabler of attempts to harness and integrate digital technologies into both curriculum and pedagogy.

Exploring Moodle Book usage – Part 7a – When are they modified?

In a previous post I generated various representations of when Moodle Book resources were being used and some indications of when they were being created. What I didn’t do in that post was generate a calendar heatmap of when the Book resources were being created and modified. This is of interest because I’m wondering whether these resources (web pages) are being modified throughout the semester, or just at the beginning.

The following corrects that. It starts with calendar heatmaps showing when I’ve edited/created the Book resources in my course. I’ve tended – or at least eventually developed – a practice of developing and changing the books as the semester progresses. I thought I was strange – turns out that I’m apparently not that strange at all.
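
For those curious about the mechanics, here’s a minimal Python sketch (not the code actually behind these posts) of how the daily create/update counts feeding such a heatmap might be derived. It assumes a hypothetical CSV export of the Moodle log table with timecreated (unix epoch), component and action columns; those column and action names are assumptions for illustration, not guaranteed Moodle specifics.

    import csv
    from collections import Counter
    from datetime import datetime, timezone

    def daily_modify_counts(log_csv_path):
        """Map YYYYMMDD date strings to the number of Book create/update events."""
        counts = Counter()
        with open(log_csv_path, newline="") as f:
            for row in csv.DictReader(f):
                # Keep only create/update events on Book resources (names assumed)
                if row["component"] != "mod_book":
                    continue
                if row["action"] not in ("created", "updated"):
                    continue
                day = datetime.fromtimestamp(int(row["timecreated"]), tz=timezone.utc)
                counts[day.strftime("%Y%m%d")] += 1
        return counts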

EDC3100

Each of the following shows some level of change prior to and during semester. Some even show changes after the end of semester.

For most of the semester, the weekends tend to be the busiest days in terms of edits, revealing an unhealthy practice of using weekends to catch up.

In S1 I also teach on-campus students, which is typically done during the week. Perhaps that limits the edits that happen during the week in S1.

S1 typically starts early March and finishes late June/July. S2 typically starts late July and finishes early November.

2012 S2

A fair bit of work before semester and on-going, with a fair bit of it happening on Saturday and Sunday.

2012 S2 EDC3100 modify heatmap

2013 S1

A lot of work in the lead-up. Not so much during the early part of the semester.

2013 S1 EDC3100 modify heatmap

2013 S2

More front-ended activity before and early in semester. Not much late in the semester.

2013 S2 EDC3100 modify heatmap

2014 S1

More weekend editing.

2014 S1 EDC3100 modify heatmap

2014 S2

A generally lighter collection of updates.

2014 S2 EDC3100 modify heatmap

2015 S1

More before semester, lightish during. Much of the work during semester occurs late in the week.

2015 S1 EDC3100 modify heatmap

2015 S2

A more even spread across the week.

2015 S2 EDC3100 modify heatmap

Courses other than EDC3100

So what about updates in all the other courses?

Well, that is a surprise. Indications are that at least someone is modifying a Book resource most days throughout the year – in some cases, even well before or well after the year.

The question now is whether this spread is due to the number of Book resources or the number of courses using the Book. A topic for further exploration, perhaps by doing a heatmap showing the percentage of courses that have books being modified?

2012

2012 all courses modify heatmap

2013

2013 modify - all courses

2014

2014 all courses modify heatmap

2015

2015 all courses modify heatmap

Your experience of organisational digital technology?

What is your experience of the digital technologies provided by the organisations for which you work?

If you’d like to share, please complete the poll below; there’s more detail beneath it.

About the poll

The poll is a semi-serious attempt to gather how people perceive organisational digital technologies. The idea (and the text for the two poll options) comes from this conference paper. The presentation will be on Friday 30th September, with additional presentation resources coming to this blog soon.

Exploring Moodle Book usage – Part 7 – When are they used?

The last post in this series looked briefly at the contents of Moodle Book resources. This post is going to look at when the book resources are used, including:

  • What time of day are the books used?
  • When in the semester are they used?

Toward the end, I spend a bit of time exploring the usage of the Book resources in the course I teach.

What time of day are they used?

This is a fairly simple, perhaps useless, exploration of when during the day the books are used. It’s more out of general interest, and it lays the groundwork for the code for the next question.

Given the huge disparity in the number of views versus prints versus updates, there will be separate graphs for each, meaning three graphs per year. For my own interest and for the sake of comparison, I’ve included a fourth graph showing the same analysis for the big 2015 offering of the course I teach. This is the course that perhaps makes the largest use of the Book, and also the offering in which I did lots of updates.

The graphs below show the number of events that occurred in each hour of the day: 12am to 1am, 1am to 2am, and so on. Click on the graphs to see expanded versions.

There is no graph for prints per hour for 2012 as there were none in the database. This appears likely to be a bug that needs to be addressed.
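
As an aside, the per-hour binning itself is straightforward. The following is a rough Python sketch, assuming a list of unix timestamps (one per view/print/update event) has already been extracted from the Moodle logs; the extraction itself is elided.

    from collections import Counter
    from datetime import datetime, timezone

    def events_per_hour(timestamps):
        """Bin unix-epoch event timestamps into 24 hour-of-day buckets (0 = 12am-1am)."""
        hours = Counter(
            datetime.fromtimestamp(t, tz=timezone.utc).hour for t in timestamps
        )
        return [hours.get(h, 0) for h in range(24)]

One wrinkle the sketch glosses over is the timezone: binning in UTC rather than local time would shift every bucket, so the real analysis needs to convert to the institution’s timezone first.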

Overall findings from time of day

Growth – The maximum number of events has grown each year (as expected given earlier indications of growth).

  • max views per hour: from just under 35K in 2012 to over 150K in 2015
  • max prints per hour: from just over 400 in 2013 to over 1500 in 2015
  • max updates per hour: from just over 500 in 2012 to over 6000 in 2015.

Similarity – The overall shapes of the graphs stay the same, suggesting a consistent pattern of interaction.

This is especially the case for the viewing events: starting with a low number from midnight to 1am, there is an on-going drop in events until 5am, after which activity grows to its hourly maximum between 11am and midday. Then there is a general drop away until 7pm to 8pm, when it grows again before dropping away after 9pm.

Views per hour each year

2012
2012 views per hour

2013
2013 views per hour

2014
2014 views per hour

2015

2015 views per hour

EDC3100 2015 S1

EDC3100 2015 1 views per hour

Prints per hour each year

2012

2012 prints per hour

2013

2013 prints per hour

2014

2014 prints per hour

2015

2015 prints per hour

EDC3100 2015 S1

EDC3100 2015 1 prints per hour

Updates per hour each year

2012

2012 updates per hour

2013

2013 updates per hour

2014

2014 updates per hour

2015

2015 updates per hour

EDC3100 2015 S1

EDC3100 2015 1 updates per hour

Calendar Heatmaps

A calendar heatmap is a fairly common method of representing “how much of something” is happening each day of the year. The following aims to generate calendar heatmaps using the same data shown in the above graphs. The plan is to use the method/code outlined on this page.

It requires the generation of a two-column CSV file: the first column the date in YYYYMMDD format, and the second column the “how much of something” for that day. See the example data on the blog post.
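
Producing that CSV is simple enough. A minimal Python sketch, assuming daily_counts is a dict mapping YYYYMMDD strings to counts (like the one sketched in the Part 7a post above); the exact format expected is whatever the example data on that blog post shows:

    import csv

    def write_heatmap_csv(daily_counts, out_path):
        """Write the two-column (YYYYMMDD, count) CSV the heatmap code reads."""
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f)
            for day in sorted(daily_counts):
                writer.writerow([day, daily_counts[day]])

The heatmap code also looks like it might be smart enough to figure out the dates involved on its own. Let’s see.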

It is, but doing all of the years together doesn’t work all that well, given the significant increase in the number of courses using the Book as time progresses and the requirement for the heatmap to use the same scale for all years. As a result, the 2012 usage doesn’t show up all that well. Hence each year was mapped on a separate heatmap.

The following calendar heatmaps show how often the Book resources were viewed on each day. The events counted are only those for Book resources from courses offered in the given year. In 2012, 2013 and 2014 this means that there is a smattering of views of books early in the following year (semester 3 stretches from November to February). There is no similar usage for the 2015 books because the data does not include any 2016 events.

The darker the colour, the greater the use. In the 2012 image below you should be able to see a tooltip showing a value of 81 (out of 100) that is quite dark, but not the darkest.

2012

The 2012 map seems to establish the pattern: heavy use at the start of semester with a gradual reduction through semester, plus a few upticks during semester and toward the end of semester.

I no longer have easy access to specific dates for 2012 and 2013. The 2014 heatmap has some specific dates which should broadly apply to these earlier years.
2012 Book usage

2013

2013 Book usage - calendar heatmap

2014

The institution maintains a web page showing the important dates for 2014, which include:

  • March 3 – Semester 1 starts.
    Course websites open 2 weeks before this date – 17th Feb
  • June 16 – Semester 1 exams start.
  • July 21 – Semester 2 starts
    Course websites open 2 weeks prior – 7th July.
  • November 3 – Semester 2 exams start.
  • November 17 – Semester 3 starts.

2014 Book usage - calendar heatmap

2015

The semester 1 2015 offering of my course had the following due dates for its 3 assignments

  1. 30th March – which appears to coincide with a heavy usage day.
  2. 4th May – also a slightly heavy usage day, but not as heavy.
  3. 15th June – two somewhat heavy usage days before and on this date.

Raising the question of what the heatmap for that course might look like – see below

2015 Book usage - calendar heatmap

EDC3100 – S1, 2015

Focusing just on my course, the increase in usage just before the due date for each assignment is more obvious. One of the reasons for this is that all the assessment information for the course is included in a Moodle Book resource.
EDC3100 S1 2015 book usage - calendar heatmap
Other time periods relevant to this course are:

  • April 6 to 17 – the two-week mid-semester break,
    which corresponds to two of the lightest periods of usage of the Book resources; and,
  • May 18 to June 5 – a three-week period when most of the students are on Professional Experience within schools,
    which also corresponds to a light period of usage.

The two heaviest days of usage are the 9th and 10th of March. The start of Week 2 of semester. It’s a time when the pressure is on to get a blog created and registered and start completing learning paths.

After the peak of the first three weeks, usage of the Book resources drops to around 50% of that peak per day.

Questions arising from this

  • Does the learning journal assessment item for EDC3100 change when students interact with the course site?
  • Is the pattern of usage (down to 50% a day) indicative of students turning off, or becoming more familiar with the approach?
  • Does the high level of usage indicate

It also raises the question of whether particular offerings of the course show any differences.

2012 – S2

The 2012 S2 pattern is quite a bit different. It is more uneven and appears to continue well after the semester is finished. This is due to it being the first semester the course used the Book module, and also because there was a semester 3 offering of the course for a few students that used the same resources.
EDC3100 2012 2 - Book usage

The 2012 heatmap also shows a trend that continues in later offerings: usage of the Book resources continues well past the end of semester. It’s not heavy usage, but it is still there.

Question: is that just me, or does it include students?

2013 – S1

2013 S1 is a bit different as well. Lighter use at the start of semester. A bit heavier usage around assignment due dates. My guess is that this was still early in the evolution of how the Book was being used.

EDC3100 2013 S1 - Book usage

2013 – S2

This map seems to be evolving toward the heavy use at the start of semester.
EDC3100 2013 S2 - Book usage

2014 – S1

And now the pattern is established. Heavy use at the start of semester and in the lead up to Assignment 1. A slight uptick then for Assignments 2 and 3. With the light usage around Professional Experience evident.

EDC3100 2014 S1 - Book usage

2014 – S2

EDC3100 2014 S2 - Book usage

2015 – S2

EDC3100 2015 S2 - Book usage

What about just the students?

The following shows just the student usage for the 2013 S1 offering. There’s not a huge difference to the “all roles” version above, suggesting that it is students who are doing most of the viewing. But it does confirm that the on-going usage of the Book resources past the end of the semester is by students, who appear to have found some value in the information after the course.

EDC3100 2013 1 - Just students

Which comes first? Pedagogy or technology?

Miranda picks up on a common point around the combination of technology and pedagogy with this post titled Pedagogy First then Technology. I disagree. If you have to think in simple sequential terms, then I think pedagogy should be the last consideration, not the first. The broader problem, though, is our tendency to want to limit ourselves to the sequential.

Here’s why.

The world and how we think isn’t sequential

The learning and teaching literature is replete with sequential processes such as ADDIE, Backwards Design, Constructive Alignment etc. It’s replete with such models because that’s what academics and experts tend to do: develop models. The problem is that all models are wrong, but some of them are useful in certain situations for certain purposes.

Such models attempt to distill what is important from a situation to allow us to focus on that and achieve something useful. The only trouble is that the act of distillation throws something away. It’s an approach that suffers from a problem identified by Sir Samuel Vimes in Feet of Clay by the late Terry Pratchett

What arrogance! What an insult to the rich and chaotic variety of the human experience.

Very few, if any, human beings engage in anything complex or creative (such as designing learning) by following a sequential process.  We are not machines. In a complex task within a complex environment you learn as much, if not more, by engaging in the process as you do planning what you will do beforehand.

Sure, if the task you are thinking about is quite simple, or if it is quite complicated and you have a lot of experience and expertise around that task, then you can perhaps follow a sequential process. However, if you are a teacher pondering how to transform learning through the use of digital technology (or using something else), then your task is neither simple, nor is it complicated, nor is it something you likely have experience or expertise with.

A sequential process to explain why technology first

Technologies for Children is the title of a book that is designed to help teachers develop the ability to help learners engage with the Australian Curriculum – Technologies learning area. A curriculum that defines two subjects: Design and Technologies, and Digital Technologies. In the second chapter (Fleer, 2016) the author shares details of how one year 4/5 teacher integrates this learning area into her class. It includes examples of “a number of key statements that reflected the technological processes and production skills” (Fleer, 2016, p. 37) that are then turned into learner produced wall charts. The following example wall chart is included in Fleer (2016, p. 37). Take note of the first step.

When we evaluate, investigate, generate designs, generate project plans, and make/produce we:

  1. Collaboratively play (investigate) with the materials.
  2. Evaluate the materials and think about how they could be used.
  3. Generate designs and create a project plan for making the item.
  4. Produce or make the item.
  5. Evaluate the item.
  6. Write about the item and talk with others.
  7. Display the item.

Before you can figure out what you are going to do with a digital technology, you need to be fully aware of how the technology works, what it can do, what the costs of doing that are, what it can’t do, etc. Once you’ve got a good handle on what the digital technology can do, then you can figure out interesting and effective ways to transform learning using the technology. i.e. pedagogy is the last consideration.

This is not to suggest that pedagogy is less important because it comes last. Pedagogy is the ultimate goal.

But all models are wrong

But of course all models are wrong. This model is (arguably) only appropriate if you are not familiar with digital technology. If you know all about digital technology, or the specific digital technology you are considering, then your need to play with the digital technology first is lessened. Maybe you can leap straight to pedagogy.

The trouble is that most teachers I know have fairly limited knowledge of digital technologies. In fact, I think many of the supposed IT experts within our institutions, and beyond them, have somewhat limited understandings of the true nature of digital technologies. I’ve argued that this limited understanding is directly impacting the quality of the use of digital technology for learning and teaching.

The broader problem with this “technology first” model – as with the “pedagogy first” model – is the assumption that we engage in any complex task using a simple, sequential process. Even the 7-step sequential process above is unlikely to capture “the rich and chaotic variety” of how we evaluate, investigate and generate designs for using digital technology for learning and teaching. A teacher is just as likely to “play (investigate)” with a new digital technology by trying it out in a small, safe-to-fail experiment to see how it plays out. Perhaps this is repeated over a few cycles until the teacher is more comfortable with how the digital technology works in the specific context, with the specific learners.

References

Fleer, M. (2016). Key ideas in the technologies curriculum. In Technologies for Children (pp. 35–70). Cambridge University Press.

Making course activity more transparent: A proposed use of MAV

As part of the USQ Technology Demonstrator Project (a bit more here) we’ll soon be able to play with the Moodle Activity Viewer. As described by the VC, the Technology Demonstrator Project entails

The demonstrator process is 90 days and is a trial of a product that will improve an educator’s professional practice and ultimately motivate and provide significant enhancement to the student learning journey,

The process develops a case study, which in turn is evaluated by the institution to determine if there is sufficient value to continue or perhaps scale up the project. As part of the process I need to “articulate what it is you hope to achieve/demonstrate by using MAV”.

The following provides some background/rationale/aim on the project and MAV. It concludes with an initial suggestion for how MAV might be used.

Rationale and aim

In short, it’s difficult to form a good understanding of which resources and activities students are engaging with (or not) on a Moodle course site. In particular, it’s difficult to form a good understanding of how they are engaging within those resources and activities. Making it easier for teaching staff to visualise and explore student engagement with resources and activities will help improve their understanding of student engagement. This improved understanding could lead to re-thinking course and activity design. It could enhance the “student learning journey”.

It’s hard to visualise what’s happening

Digital technologies are opaque. Turkle (1995) talks about how what is going on within these technologies is hidden from the user. This is a problem that confronts university teaching staff using a Learning Management System. Identifying which resources and activities within a course website students are engaging with, which they are not, and which students are engaging can take a significant amount of time.

For example, testing at USQ in 2014 (for this presentation) found that, once you knew which reports to run on Moodle, you had to step through a number of different reports. Many of these reports involve waiting for minutes (in 2016 the speed is better) with a blank page while the server responds to the request. After that delay, you can’t actually focus only on student activity (staff activity is included), and it won’t work for all modules. In addition, the visualisation provided is limited to tabular data – like the following.

EDC3100 2016 S1 - Week 0 activity

Other limitations of the standard reports include not being able to:

  • Identify how many students, rather than clicks, have accessed each resource/activity (a sketch of this kind of counting follows this list).
  • Identify which students have/haven’t accessed each resource/activity.
  • Generate the same report within an activity/resource to understand how students have engaged within the activity/resource.
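
For illustration, here is a hypothetical Python sketch of the first of these: counting distinct students (rather than clicks) per activity from exported log rows. The userid and contextinstanceid field names, and the availability of a list of student ids, are assumptions about the export, not a description of Moodle’s actual reports.

    from collections import defaultdict

    def students_per_activity(log_rows, student_ids):
        """Map each activity id to the number of distinct students who accessed it."""
        seen = defaultdict(set)
        for row in log_rows:
            uid = int(row["userid"])
            if uid in student_ids:  # counting students only also excludes staff clicks
                seen[row["contextinstanceid"]].add(uid)
        return {activity: len(users) for activity, users in seen.items()}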

Michael de Raadt has developed the Heatmap block for Moodle (inspired by MAV), which addresses many of the limitations of the standard Moodle reports. However, it does not (yet) enable the generation of an activity report within an activity/resource.

The alternative – Moodle Activity Viewer (MAV)

This particular project will introduce and scaffold the use of the Moodle Activity Viewer (MAV) by USQ staff. The following illustrates MAV’s advantages.

MAV modifies any standard Moodle page by overlaying a heat map on it. The following image shows part of a 2013 course site of mine with the addition of MAV’s heatmap. The “hotter” (more red) a link has been coloured, the more times it has been clicked upon. In addition, the number of clicks on any link has been added in brackets.

A switch of a MAV option will modify the heatmap to show the number of students, rather than clicks. If you visit this page, you will see an image of the entire course site with a MAV heatmap showing the number of students.

EDC3100 S2, 2013 - heat map
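
For the curious, the count-to-colour mapping behind such a heatmap can be as simple as the following Python sketch (an illustration of the general idea, not MAV’s actual implementation):

    def heat_colour(count, max_count):
        """Map a count onto a white-to-red scale, returned as an RGB hex string."""
        if max_count == 0:
            return "#ffffff"
        intensity = min(count / max_count, 1.0)
        fade = int(255 * (1 - intensity))  # green and blue fade out as heat rises
        return "#ff{0:02x}{0:02x}".format(fade)

So heat_colour(0, 100) gives white, heat_colour(100, 100) gives pure red, and values in between give progressively deeper pinks. MAV itself presumably does something more considered (e.g. banded scales), but the principle is the same.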

The current major advantage of MAV is that the heatmap will work on any standard Moodle links that appear on any Moodle page. Meaning you can view a specific resource (e.g. a Moodle Book resource) or an activity (e.g. a discussion forum) and use the MAV heatmap to understand student engagement with that activity.

The following image (click on it to see a larger version) shows the MAV heatmap on a discussion forum from the 2013 course site above. This forum is the “introduce yourself” activity for the course. It shows that the most visited forum post was my introduction, visited by 87 students. Most of the other introductions were visited by significantly fewer students.

This illustrates a potential failure of this activity design. Students aren’t reading many other introductions, perhaps suggesting a need to redesign the activity.
Forum students

Using MAV

At CQU, MAV is installed and teaching staff can choose to use it, or not. I’m unaware of how much shared discussion occurs around what MAV reveals. However, given that I’ve co-authored a paper titled “TPACK as shared practice: Toward a research agenda” (Jones, Heffernan, & Albion, 2015), I am interested in exploring whether MAV can be leveraged in a way that is more situated, social and distributed. Hence the following approach, which is all very tentative and initial. Suggestions welcome.

The approach is influenced by the Visitors and Residents mapping approach developed by Dave White and others. We (I believe I can speak for my co-authors) found using an adapted version of the mapping process for this paper to be very useful.

  1. Identify a group of teaching staff and have them identify courses of interest.
    Staff from within a program or other related group of courses would be one approach. But a diverse group of courses might help challenge assumptions.
  2. Prepare colour printouts of their course sites, both with and without the MAV heatmap.
  3. Gather them in a room/time and ask them to bring along laptops (or run it in a computer lab)
  4. Ask them to mark up the clear (no MAV heatmap) printout of their course site to represent their current thoughts on student engagement.
    This could include

    • Introducing them to the ideas of heatmaps and engagement.
    • Some group discussion about why and what students might engage with.
    • Development of shared predictions.
    • A show and tell of their highlighted maps.
  5. Hand out the MAV heatmap versions of their course site and ask them to analyse and compare.
    Perhaps including:

    • Specific tasks for them to respond to
      1. How closely aligned are the MAV map and your prediction?
      2. What are the major differences?
      3. Why do you think that might be?
      4. What else would you like to know to better explain?
    • Show and tell of the answers
  6. Show the use of MAV live on a course site
    Showing

    1. changing between # of clicks or # students
    2. focus on specific groups of students
    3. generating heatmaps on particular activities/resources and what that might reveal
  7. Based on this capability, engage in some group generation of questions that MAV might be able to help answer.
  8. Walk through the process of installing MAV on their computer(s) (if required)
  9. Allow time for them to start using MAV to answer questions that interest them.
  10. What did you find?
    Group discussion around what people found, what worked, what didn’t etc.  Including discussion of what might need to be changed about their course/learning design.
  11. Final reflections and evaluation