Fixing one part of the PeopleSoft gradebook

The following is a development log of an attempt to fix one aspect of the PeopleSoft gradebook used at my current institution.

Why and what?

The problem

At the end of semester all assignment marks end up in the PeopleSoft gradebook, an old-school web information system that the academic in charge of a course has to use for some last-minute checks and changes. One of those changes is upgrading the grade of students who are within 0.5 of a grade level; e.g. a student with a mark of 49.6 shouldn’t get an F, they should get a C (the pass grade).

PeopleSoft won’t do this. The academic has to manually scroll through the list of students (ordered alphabetically by student name) looking for those in this range. Once found, the new grade has to be manually entered into a textbox. This is a problem, especially if your class has a couple of hundred students.

The solution

The solution developed below is a Greasemonkey script that automates this process. Once installed, it will

  1. Detect that the PeopleSoft gradebook is being displayed.
  2. Look for any students within 0.5 of a grade level.
  3. For each of these students found
    • Change the background for that row to red.
    • Place the upgraded grade in the appropriate textbox.
  4. Look for any students who have already been upgraded, change the background of their row to green.
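The grade-boundary check at the heart of steps 2 and 3 might be sketched as follows. The boundaries and grade labels below are made up for illustration; the real cut-offs would come from the institution’s grading schema.

```javascript
// Hypothetical grade boundaries: the minimum mark for each grade.
// These values are illustrative only, NOT the institution's real schema.
var GRADE_BOUNDARIES = [
    { grade: 'HD', min: 85 },
    { grade: 'A',  min: 75 },
    { grade: 'B',  min: 65 },
    { grade: 'C',  min: 50 }   // the pass grade
];

// If the mark is within 0.5 below a grade boundary, return the grade
// the student should be upgraded to; otherwise return null.
function suggestedUpgrade(mark) {
    for (var i = 0; i < GRADE_BOUNDARIES.length; i++) {
        var b = GRADE_BOUNDARIES[i];
        if (mark < b.min && mark >= b.min - 0.5) {
            return b.grade;
        }
    }
    return null;
}
```

With these made-up boundaries, suggestedUpgrade(49.6) returns 'C', while suggestedUpgrade(49.4) returns null and that student is left alone.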

How?

Identifying the gradebook

The first problem is that the PeopleSoft gradebook uses iframes, which complicates things a little, especially in identifying the appropriate iframe and then getting the script to only activate when the appropriate document is loaded. And, no great surprise, we’re talking some really ugly HTML here.


The actual data for each student is spread over a row with XXX main cells, each containing div elements with specific ids (the $0 increments per student):

  • win0divHCR_PERSON_NM_I_NAME$0 – span HCR_PERSON_NAM_I_NAME$0 contains the name
  • win0divSTDNT_GRADE_HDR_EMPLID$0 – span STDNT_GRADE_HDR_EMPLID$0 – contains the EMPLID
  • win0divSTDNT_GRADE_HDR_GRADE_AVG_CURRENT$0 – span STDNT_GRADE_HDR_GRADE_AVG_CURRENT$0 – has the result.
  • win0divSTDNT_GRADE_HDR_COURSE_GRADE_CALC$0 – span STDNT_GRADE_HDR_COURSE_GRADE_CALC$0 – has the grade
  • input text box with id STDNT_GRADE_HDR_CRSE_GRADE_INPUT$0 is where the changed grade might get entered.
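Putting those ids together, reading the nth student’s row might look something like this sketch. getStudentRow is my name for it; it simply reads the text content of the divs listed above, and returning null when the ids run out also gives a way to count the students.

```javascript
// Read the fields for the nth student row (n starts at 0).
// Returns null when no such row exists.
function getStudentRow(doc, n) {
    var nameDiv = doc.getElementById('win0divHCR_PERSON_NM_I_NAME$' + n);
    if (!nameDiv) {
        return null;   // no more students on the page
    }
    return {
        name:  nameDiv.textContent,
        id:    doc.getElementById('win0divSTDNT_GRADE_HDR_EMPLID$' + n).textContent,
        mark:  parseFloat(doc.getElementById(
                   'win0divSTDNT_GRADE_HDR_GRADE_AVG_CURRENT$' + n).textContent),
        grade: doc.getElementById(
                   'win0divSTDNT_GRADE_HDR_COURSE_GRADE_CALC$' + n).textContent,
        // the textbox where a changed grade might get entered
        input: doc.getElementById('STDNT_GRADE_HDR_CRSE_GRADE_INPUT$' + n)
    };
}
```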

It appears to be part of a form with the URL ending in SA_LEARNING_MANAGEMENT.LAM_CLASS_GRADE.GBL and appearing in an IFRAME with id ptifrmtgtframe – which I assume is a generic iframe used on all the pages.

So the plan appears to be for the script to

  1. Only respond for the broad URL associated with the institutional gradebook.
    Done via the standard Greasemonkey approach.
  2. Only kick into action on the loading of the iframe with id ptifrmtgtframe.
    This appears to work.

    // Run my_func whenever the PeopleSoft content iframe (re)loads
    var theFrame = document.getElementById('ptifrmtgtframe');
    theFrame.addEventListener( "load", my_func, true );
    
  3. Check for the form SA_LEARNING_MANAGEMENT.LAM_CLASS_GRADE.GBL, or perhaps for the presence of the ids from the table above.
    Have modified the above to pass the frame in and was using that to determine the presence of the textbox. The problem is that there is a further complication to the interface. Jumping to a specific page in the gradebook (there are three) is done via a “javascript:submitAction_win0(document.win0…..)” call. This isn’t showing up as an onload for the frame.

    Found this post which talks about one potential solution but also points to someone who’s been doing this for much longer and in more detail.

  4. Have they included the number of students in the HTML? – no, doesn’t look like it.
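For the record, the “standard Greasemonkey approach” in step 1 is the script’s metadata block. The URL pattern below is a placeholder, not the institution’s actual gradebook URL:

```javascript
// ==UserScript==
// @name        gradebookFix
// @namespace   http://example.com/gradebookFix
// @description Highlight and suggest upgrades for near-miss grades in the PeopleSoft gradebook
// @include     https://peoplesoft.example.edu/*SA_LEARNING_MANAGEMENT*
// ==/UserScript==
```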

A rough attempt to understand what is going on

  1. Faculty centre loads with list of courses.
    The standard entry into gradebookFix is run at this stage – alert is shown. And then the iframes load.
  2. Clicking on the gradebook icon triggers the current iframe load event and shows the three different gradebook icons.
    The my_func function is run via an event listener for onload of the ptifrmtgtframe iframe. But this is only run once because…
  3. Click on the “cumulative grades” doesn’t load a new iframe, calls the javascript:submitAction_win0 method.

The aim is to modify the click on the particular link so that something else happens. How about

  1. Modify onload to look for that link and add a onclick event.
    The id for the link is DERIVED_SSR_LAM_SSS_LINK_ANCHOR3. The problem is that attempting to add an event listener to this is not working, i.e. a call to getElementById is not working. Aghh, that’s because these things aren’t normal JavaScript objects, but special Greasemonkey wrapped stuff.

    var theLink = theFrame["contentDocument"].getElementById('DERIVED_SSR_LAM_SSS_LINK_ANCHOR3');
    
    theLink.addEventListener( "click", function(){ alert( "CLICK ON LINK CUMULATIVE" ); }, false );
    
  2. Have a function that is called on click.
    The struggle here will be that the click is actually the start of a query that results in the content being changed. But not necessarily recognised by Greasemonkey.

    Perhaps a timeout and then another bit of code like this might work. This could be tested simply by re-adding the on-click handler. This will sort of work but, again, is only set when the iframe loads for the first time. If any other navigation happens it won’t re-add the changes.

    Have added it to the other two main links for gradebook. Possible this will be a sufficient kludge for now.

  3. Looks like we need to capture the submitAction_win0 method after all.
    Nope, have figured out a kludge.
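The kludge might be sketched like this. Only DERIVED_SSR_LAM_SSS_LINK_ANCHOR3 is confirmed above; the other two link ids are guesses based on that naming, and the delay is a guess at how long the submitAction_win0 refresh takes.

```javascript
// Ids of the three main gradebook navigation links. Only ANCHOR3 is
// confirmed; ANCHOR1/ANCHOR2 are guesses based on the naming pattern.
var GRADEBOOK_LINKS = [
    'DERIVED_SSR_LAM_SSS_LINK_ANCHOR1',
    'DERIVED_SSR_LAM_SSS_LINK_ANCHOR2',
    'DERIVED_SSR_LAM_SSS_LINK_ANCHOR3'
];

// After a link is clicked, wait for the in-place refresh to (hopefully)
// finish, run updateResults, then re-arm the listeners for the next click.
// If the link elements survive the refresh this stacks extra listeners
// (updateResults just runs more than once) - hence "kludge".
function armLinks(doc, updateResults, delayMs) {
    GRADEBOOK_LINKS.forEach(function (id) {
        var link = doc.getElementById(id);
        if (!link) return;
        link.addEventListener('click', function () {
            setTimeout(function () {
                updateResults(doc);
                armLinks(doc, updateResults, delayMs);  // re-add
            }, delayMs);
        }, false);
    });
}
```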

Identifying the student rows

The following code segment will change the background/font color of the first student’s name

function updateResults(element) {
    var name = element.getElementById('win0divHCR_PERSON_NM_I_NAME$0');
    name.style.backgroundColor = 'red';
    name.style.color = 'white';
}

The list above specifies the names of the different student fields. The difference per student is the number after the dollar sign – 0 up to the last student.

Steps required here

  1. Identify how many students are on the page.
    Will be useful for a for loop to go through each. XPath might offer a possibility? jQuery? A simple while loop could also do the trick. Will go with that.
  2. Determine what to change
    Plan is

    • RED – need attention i.e. marks that should be over-ridden with suggested override in place.
    • GREEN – those that have already been over-ridden previously.
    • no colour/change – correct as is.
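A fuller version of the updateResults function above, along these lines, seems to capture the plan. It’s a sketch: the suggest callback stands in for the real grade cut-off logic (which depends on the institution’s grading schema), and the loop uses the absence of the name div to detect the last student.

```javascript
// Walk the student rows until the ids run out, colouring each row:
//   red   = mark within 0.5 of a grade boundary, override suggested
//   green = an override has already been entered
//   no change = correct as is
// suggest(mark) should return the upgraded grade, or null; the real
// cut-off logic is left to the caller.
function updateResults(doc, suggest) {
    var n = 0;
    while (true) {
        var nameDiv = doc.getElementById('win0divHCR_PERSON_NM_I_NAME$' + n);
        if (!nameDiv) break;                         // no more students
        var markDiv = doc.getElementById(
            'win0divSTDNT_GRADE_HDR_GRADE_AVG_CURRENT$' + n);
        var input = doc.getElementById(
            'STDNT_GRADE_HDR_CRSE_GRADE_INPUT$' + n);
        if (input && input.value) {
            nameDiv.style.backgroundColor = 'green'; // already over-ridden
        } else {
            var upgrade = suggest(parseFloat(markDiv.textContent));
            if (upgrade !== null) {
                nameDiv.style.backgroundColor = 'red'; // needs attention
                if (input) input.value = upgrade;      // suggested override
            }
        }
        n++;
    }
    return n;   // number of students processed
}
```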

All done. Seems to work.

On the difference between “rational”, “possible” and “desirable”

A couple of weeks ago @kateMfD wrote a post asking “What next for the LMS?” (one of a raft of LMS posts doing the rounds recently). Kate’s point was in part that

The LMS is specifically good at what universities need it to do. Universities have learning management systems for the same reason they have student information systems: because their core institutional business isn’t learning itself, but the governance of the processes that assure that learning has happened in agreed ways.

The brief comment I shared on Kate’s post shared some discussions @beerc and I had 6 or 7 years ago. Back then we were responsible for helping academic staff use the institution’s LMS. I was amazed at how manual the process was and how limited it was in its use of standard checks. For example, it was quite common for a new course site to be pointing to last year’s course profile/synopsis (a PDF). This wasn’t being picked up until a student or two completed all of the following steps

  1. Actually bothered to use the link to the course profile from the course site.
  2. Figured out that it was pointing to last year’s course profile.
  3. Was bothered enough by this problem to report it to someone.

Rather than be reactive, it seemed sensible to write a Perl script or two that would “mine” the Blackboard database and identify these types of problems very early in the semester so we could proactively fix them.

At that stage, “management” couldn’t grasp the value of this process and it never went anywhere. I never could understand that.

Fear of management rising

Not long after that – as the learning analytics fad started to rise – Col and I were worried about what management would do once they joined the bandwagon. In particular, we wondered when they might identify the problems that ideas like “Web 2.0 tools” (blogs, Second Life etc) or Personal Learning Environments (another fad we were playing with at the time) would pose for learning analytics. i.e. to run “learning analytics” you need to have access to the data, and a University generally won’t have access to the data from tools that are personal to the learner and external to the institution.

Given Kate’s identification of management’s purpose around learning – “governance of the processes that assure that learning has happened in agreed ways” – Col and I have been waiting to hear of Universities banning the use of external/personal tools for learning/teaching because it broke their “learning analytics”. Around the same time as Kate’s post, I heard that one southern University was indeed going down that route, and that’s the comment I made on Kate’s post.

Why is this a problem?

This morning @drwitty_knitter replied to my comment with

I would think this is quite common. Universities like to be able to track where money is being spent and what the outcomes are for students. Unless tools have some way to report what students are doing, and how that relates to their curricular goals, it would be hard to justify their use.

And I agree, I think it will become increasingly common. But I also still think it’s a really, really bad idea. @beerc, @damoclarky and I offered one explanation why this is a bad idea in this ASCILITE’2012 paper i.e.

Insight gained over the last four years exploring learning analytics at one university suggest that the assumptions embodied by managerialism may be an inappropriate foundation for the application of learning analytics into tertiary learning environments

In short, believing it is possible to use analytics to connect what students are doing with their curricular goals requires making a range of assumptions about the nature of people, learning, and universities that fail to engage effectively with reality. No matter how complex the learning analytics algorithms and systems used, the only way you can achieve the stated purpose is to attempt to reduce the variability of learning and teaching to fit the limitations of the capabilities of the technology.

Which is exactly what is happening when institutions ban the use of personal or external tools.

This won’t be enough. As we show in the ASCILITE paper, even if you limit yourself to the LMS, the diversity of learners and learning, and the chasm between what happens in the LMS and actual student learning, mean there will still be huge questions about what the analytics can tell you. This will lead to at least two likely outcomes

  1. Management will believe what the analytics tells them and plan future action on this poor foundation; and,
  2. Management won’t believe the analytics and thus will further reduce the variability of learning and teaching to fit the limitations of the capabilities of the technology.

The last option contributes to the problem that Chris Dede identifies in this clip:

that the very, very best of our high-end learning environments have less variety than a bad fast food restaurant

The three paths

In an ASCILITE’2014 paper we identify three paths that might be followed with learning analytics

  1. Do it to.
  2. Do it for.
  3. Do it with.

Our argument is that almost all of the learning analytics work (and increasingly much of what passes for learning and teaching support activities) is following the first two paths. We also argue that this will end badly for the quality of learning and teaching and will contribute to learning analytics being yet another fad.

The “Do it to” path is completely rational if your task is to ensure the quality of learning across the institution. But it’s only possible if you assume that there is no diversity in learning and teaching and that “learning” is the data captured in digital trails left in institutional databases. I don’t think it is either possible or desirable, hence I don’t think it’s rational. YMMV.

What do new views of knowledge & thinking have to say about research on teacher learning?

I’m finally getting/creating a smidgin of time to continue exploring what “distributed” views of knowledge and learning might say about understanding and helping teachers (of all ilks) learn more about what they do. The following is a summary of Putnam and Borko (2000)

Which gives a straightforward overview of the “situated perspective” and how it links to existing research (from the 90s) into teacher learning.

What it’s about

Lots of attention is being paid to new ideas about the nature of cognition and learning – i.e. situated cognition, distributed cognition and communities of practice aka the “situative perspective”.

Lots of discussion about using this to help students learn. Less attention paid to teachers either

  1. “to their roles in creating learning experiences consistent with the reform agenda”, or
  2. “how they themselves learn new ways of teaching”. (p. 4)

Putnam and Borko (2000) aim to focus on the latter. Which is exactly where my interest lies. The paper’s focus is

  1. Use the “situative perspective” to understand recent research on teacher learning.
  2. “explore new issues about teacher learning and teacher education that this perspective brings to light”
    Which they divide into three issues

    1. Where to situate teachers’ learning experiences
    2. The nature of discourse communities for teaching and teacher learning
    3. the importance of tools in teachers’ work

    Apparently covered in more detail in Putnam & Borko (1997)

Conceptual themes of the situative perspective

Three conceptual themes, that cognition is

  1. situated in particular physical and social contexts;

    the physical and social contexts in which an activity takes place are an integral part of the activity, and that the activity is an integral part of the learning that takes place within it. How a person learns a particular set of knowledge and skills, and the situation in which a person learns, become a fundamental part of what is learned (Putnam and Borko, 2000, p. 4)

    Hence the push to authentic activities in classrooms. Apply it to inservice teacher education/professional development? What makes for an authentic activity?

    we consider the kinds of thinking and problem-solving skills fostered by an activity to be the key criterion for authenticity (p. 5)

  2. social in nature;

    interactions with the people in one’s environment are major determinants of both what is learned and how learning takes place… Individuals participate in numerous discourse communities… (which) provide the cognitive tools… that individuals appropriate (p. 5)

    Generates questions about the type of communities to create in a learning situation – disciplinary communities, or “learn to learn” communities?

  3. distributed across the individual, other persons and tools.
    This section is perhaps a little more underdone than the others.

    Rather than considering cognition solely as a property of individuals, situative theorists posit that it is distributed or “stretched over” (Lave, 1988) the individual, other persons, and various artifacts such as physical and symbolic tools (Salomon, 1993a) (p. 5)

    And the problem is that, with schools’ focus on “tool-free performance, and on decontextualized skills, educating people to be good learners in school settings alone may not be sufficient to help them become strong out-of-school learners” (Resnick, 1987, p. 18) (p. 5)

Issues arising for teacher learning and teacher education

  1. Where to situate teachers’ learning experiences

    The question is not whether knowledge and learning are situated, but in what contexts they are situated. For some purposes, in fact, situating learning experiences for teachers outside of the classroom may be important – indeed essential – for powerful learning (p. 6)

    The situative perspective encourages a focus on exploring how different settings for teacher learning generate different types of knowing. Broad types:

    1. In individual teachers’ classrooms
    2. Teachers bring their classroom experiences to outside workshops.

    The idea of intertwining learning with on-going practice.

    Problems include

    1. Scalability
    2. Difficulty of changing mindsets when retaining the connection to the existing situation. To break out of existing mindsets may require entry into a different setting.

    Which generates the problem of integrating new/different ideas back into the existing setting. Which leads to the idea of “follow up”. Brand new experience, and then on-going support to help with integration.

    Moves on to apply this to teacher education and quotes Bird (1992, p. 501)

    But this image requires a stable, satisfactory practice that the novice can join. If the aim of teacher education is a reformed practice that is not readily available, and if there is no reinforcing culture to support such practice, then the basic imagery of apprenticeship seems to break down. Teachers’ knowledge is situated, but this truism creates a puzzle for reform. Through what activities and situations do teachers learn new practices that may not be routinely reinforced in the work setting? (p. 501)

    Talks about the case-based approach as one way to address this. Couple of paragraphs on this.

  2. The nature of discourse communities for teaching and teacher learning

    These discourse communities play central roles in shaping the way teachers view their world and go about their work. Indeed, patterns of classroom teaching and learning have historically been resistant to fundamental change, in part because schools have served as powerful discourse communities that enculturate participants (students, teachers, administrators) into traditional school activities and ways of thinking (Cohen, 1989; Sarason, 1990).

    Also draws on the work of Ball (1994) to talk about how the individualism of teaching makes it difficult to agree on common standards, difficult to disagree and hence limits critique and challenge… “teaching remains a smorgasbord of alternatives with no real sense of community, there is no basis for comparing or choosing from among alternatives, no basis for real and helpful debate. This lack impedes the capacity to grow. (p. 16)” (p. 9)

    Links to various research projects mixing academics and teachers to get the mix of theory and practice to engage in discussions and generate practical solutions. But does mention some problems that arise. e.g. Richardson (1992) “agenda-setting dilemma”

    In looking at pre-service teacher education the suggestion is that they “have focused more on the development of individual knowledge and competencies thought to be important for teaching than on the establishment of discourse communities for prospective teachers” (p. 9). But that if existing professional communities aren’t “reformed” then this causes problems.

  3. the importance of tools in teachers’ work

    The situative perspective provides lenses for examining more thoughtfully the potential of new technologies for supporting and transforming teachers’ work and learning

    Looks at tools from the perspective of: Tools to enhance/transform work of teaching, and Tools to support teachers’ learning. Didn’t find much of interest in that.

References

Putnam, R., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15. Retrieved from http://www.jstor.org/stable/1176586

Unintended consequences of technology in education

My wife is currently studying engineering. One of her fellow engineering students is also studying some mathematics on the side. He shared the following tale of unintended consequences arising from technology in education.

In a particular mathematics course the students are set homework. The lecturer will then work through the solutions of these homework problems in the next lecture. This is done by using the solutions manual (provided by the publisher I assume) and the document camera available in the lecture theatre.

Small problem. Can you pick it?

Apparently the lecture is recorded and made available online. Also, the lecturer has a habit of flicking through the solutions manual to find the right page whilst the manual is under the document camera.

It appears that the students have discovered you can pause the online video recording and take a good hard look at what’s revealed.

A perspective on why institutional e-learning is so bad

It’s about time to tilt at the same windmill again. For as long as I can remember I’ve thought institutional e-learning was done badly. Here’s another attempt to explain why and map out a way forward. The following is based heavily on this paper that will be presented at ASCILITE’2014 and is a slightly re-worked version of something I shared as part of my current institution’s attempts to formulate operational plans.

The argument is that institutional e-learning is based entirely on the wrong mindset. To see any hope of improvement it needs to integrate a little of another largely incommensurable mindset. I use a problem specific to my practice below to illustrate the argument. My co-author shares a different problem in the paper that illustrates the same point, but his problem is potentially more interesting.

My problem

I teach EDC3100, ICT and Pedagogy, a third-year core course in the Bachelor of Education. The first semester enrolment typically consists of 300-plus pre-service teachers, studying to become every type of teacher from early childhood, primary and secondary through to VET; located at each of USQ’s campuses and in Malaysia, with about 170 of the students studying online. Some of these students – due to exemptions – are in their first semester of University study. Others are into their 6th, 7th and beyond year of study at USQ.

As a course that is teaching teachers about how to use Information and Communication Technologies to enhance/transform their pedagogy, the course requires all students to make heavy use of ICTs. Many of these students are not “digital natives”. Even those with years of online study at USQ show surprising levels of digital illiteracy. Hence there are lots of questions from students that need answering. Almost all of these questions are asked on discussion forums.

When I respond to a question on a course discussion forum it’s often important/useful to tailor the response to the specifics of the student. In particular, it’s not unusual to see the need to customise a response based on

  1. The student’s mode (at which campus or online) of study.
  2. What “type” of teacher are they studying to become (early childhood, primary etc).
  3. Whether they also have a specific learning area/discipline.
    e.g. HPE students often have a specific set of challenges around ICTs, while secondary students focus on the two or so specific learning areas/disciplines they are training to teach.
  4. Whether this is the student’s first semester of study.
  5. The physical location of online students.
    Students in other states or overseas often use different curricula etc.

Challenge: If you are teaching in a University, can you find out this type of information about your students?

At my institution I have access to an LMS and a student records system. This information is mostly not in the LMS, and where it is, it requires navigating to another page to find. While it is in the student records system, the default access provided to academics does not allow them to see it.

The silly solution I’ve used for the last 3 years has been

  1. At the start of semester, find the Faculty staff member with the appropriate permission and ask them to generate a spreadsheet providing this information for all students.
  2. I then have that spreadsheet open when replying to student queries and when needed I check the spreadsheet.

As you might imagine this doesn’t happen as often as it should because it takes time. Of course the spreadsheet is almost straight away out of date as students add and drop the course.

My new solution

What I did instead was modify my web browser so that when it sees any page provided by the Study Desk that contains a link to a user profile it will add a new link to the page that is close by the user profile link. When I click on this new link (and it’s for a student in EDC3100) a dialog box will pop up with additional information about the student (what they are studying, their mode of study, how many courses they’ve completed and their city/post code/country).

The following figure shows what it looks like. Note:

  1. The [details] links near the author’s name and photo.
    I haven’t spent the time to tidy this kludge up.
  2. The dialog box and how the forum post is somewhat greyed out.
    The idea is that I can check this information, click Ok and then reply to the query.
MAV-based more user details by David T Jones, on Flickr


Currently, this only works via a web browser running on my laptop. It’s a personal solution. It is based on a particular set of technologies developed by and currently being used at CQUni for a strategic project around retention.
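For what it’s worth, the browser-side kernel of the kludge boils down to something like the following sketch. Everything here is illustrative: the /user/view.php pattern, the [details] label and the showDetails callback all stand in for the actual MAV-based plumbing.

```javascript
// Moodle's user profile URL pattern - an assumption for illustration.
var PROFILE_HREF = '/user/view.php';

// For each link to a user profile on the page, add a [details] link
// beside it. Clicking [details] calls showDetails(profileUrl), which
// stands in for the MAV-backed lookup and dialog.
function addDetailsLinks(doc, showDetails) {
    var links = doc.getElementsByTagName('a');
    var added = 0;
    for (var i = 0; i < links.length; i++) {
        var link = links[i];
        if (link.href.indexOf(PROFILE_HREF) === -1) continue;
        var details = doc.createElement('a');
        details.textContent = '[details]';
        details.href = '#';
        // capture this link's URL for the click handler
        details.onclick = (function (profileUrl) {
            return function () { showDetails(profileUrl); return false; };
        })(link.href);
        link.parentNode.insertBefore(details, link.nextSibling);
        added++;
    }
    return added;   // number of [details] links added
}
```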

Why can’t USQ have more solutions like this? SET in our ways

The argument is that USQ (or any other university) is generally unable to achieve a solution like this due to the usually implicit mindset that underpins how it operates. My co-author and I have tried to make this mindset explicit as the “SET framework” based on how the institution answers three questions:

  1. What work gets done? – Strategic
    i.e. there is a strategic plan and a sequence of operational plans that define what is acceptable. The assumption is that the organisation has identified some ideal future state (enshrined in the plans) and what work can be done is judged against how well it helps the organisation achieve this pre-defined state. Any work that isn’t in the plan is deemed inefficient or unimportant.
    This particular problem could be aligned with the existing institutional plans, but there’s a question of how easily this could be achieved.
  2. How is ICT perceived? – Established
    This quote from an IT person at CQU from around 2003/4 sums up this perspective nicely, “we should seek to change people’s behavior because information technology systems are difficult to change” (Sturgess & Nouwens, 2004, n.p). This view of ICTs is that it is really hard to change them and instead people and their practices should change. This is especially prevalent with “enterprise systems” where best practice advice is to implement them as “vanilla” (i.e. no change to the technology).
    PeopleSoft (USQ’s student records system) is a horrendously difficult and expensive system to modify. Moodle – as open source software – is theoretically easier to modify, but still requires significant technical skill (i.e. expensive and rare) to modify properly. Even Moodle is very difficult to modify if your modifications require contextually specific changes to the system’s in-built assumptions, e.g. modifying the Moodle discussion forum to show whether an EDC3100 student is studying to become a HPE teacher is not likely to happen.
  3. How is the world perceived? – Tree-like
    By this I mean hierarchical tree in the computer science sense. Strategic approaches will always use logical decomposition to reduce a big, really hard problem into lots of little easy-to-solve problems, and assume that you can just put all those little solutions together again and solve the big problem. You can see this in org charts; in how learning and teaching is broken down into programs, courses, topics, learning objectives, attributes etc; and in information systems.
    Each of the little boxes in the tree becomes responsible for a specific task. e.g. the development and support of IT is meant to be done in the IT box, teacher education is done by the education box, etc. Managing student records is done in the PeopleSoft box, which is the responsibility of the Student Administration box.

    The problem is that it’s really, really hard to move between boxes. If I wanted my problem solved, it would have to be recognised by the folk in my box that are part of the IT governance process. They would take my problem (along with everyone else’s) up the hierarchy to someone/some group who can make a judgement. A small problem like this is almost certainly going to be starved of attention as the focus is on achieving strategic goals. If it does get attention, there’s the challenge of trying to bridge the two different boxes in which PeopleSoft and Moodle reside. Etc.

    The more likely outcome is that I’m not going to bother (at least not with the formal structure).

How is this possible? Breaking BAD

The solution I’ve developed is possible due to a different mindset that provides different answers to the three questions above. We’ve labelled this mindset the “BAD framework”.

It answers the three questions this way

  1. What work gets done? – Bricolage
    Rather than trying to achieve some pre-defined perfect state, bricolage focuses on solving concrete, contextualised problems using the resources that are to hand. I had a problem that I needed to solve, so I figured out how I could solve it with the resources I had to hand. This year I’ve been able to improve my solution because I had access to new and better resources. But not resources that were provided by USQ. A good set of APIs would be a great help.
  2. How is ICT perceived? – Affordances
    ICTs are seen as protean. They can and should be manipulated, modified and re-worked to help people achieve what they want to achieve. There is no such thing as a “perfect design” or “perfect system”; the diversity and rapid change inherent in learning and teaching makes such an idea nonsensical.
    In this case, I’ve been able to use the spreadsheet manually generated from PeopleSoft, Perl, Postgres, PHP, Greasemonkey, the Firefox web browser and the well designed HTML created by Moodle to manipulate and change the appearance and functionality of the Study Desk pages.
  3. How is the world perceived? – Distributed
    The world is (and universities are) complex and dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks. The ability to quickly construct and traverse those connections is essential to learning, understanding and action. Podolny & Page (1998) apply the distributed view to governance and organisations and describe it as meaning that two or more actors are able to undertake repeated interactions over a period of time without having a centralised authority responsible for resolving any issues arising from those interactions.
    Rather than the typical tree-like structure of server (Study Desk) and client (my laptop), my solution draws on a network of technologies, some on my laptop and some on university servers. I’ve used those distributed technologies to make connections not previously possible and hence I’m now able to do more than previously.

Implications

The paper that goes into this in more detail closes with this

The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

Hence the suggestion for institutions is to figure out whether we want to break a little BAD and how that might be done.

However, as argued above it takes more than just having good technology. The fundamental mindset that underpins much of how an organisation does business needs to be questioned. This is hard.

The paper also raises the following as potential examples of how existing conceptions might need to be challenged

rather than require the IT division to formally approve and develop all applications of ICT, their focus should perhaps turn (at least in part) to enabling and encouraging “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305).

rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift to a focus on ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution.

Rather than accept “the over-hyped, pre-configured digital products and practices that are being imported continually into university settings” (Selwyn, 2013, p. 3), perhaps “a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of ‘digital technology for university educators by university educators’” (p. 3).

The argument isn’t that we should throw out Moodle or other systems. Instead there needs to be mechanisms by which we can harness the complete knowledge distributed across the institution to extend and modify those existing technologies into something that is unique to the institutional context.

For me, I’d love to see what happens if institutional e-learning was characterised by

Widespread and on-going bricolage by a widely distributed collection of individuals and groups (students, teachers and others) from across the entire institution, all connected via various means and learning from and building upon each other’s work.

An institutional context that provides a range of functionality that supports and enables this on-going engagement with bricolage and recognises that this is where its competitive advantage will come from.

An institutional context that is actively trying to make it easier to connect to actors from across and outside the institution and grow the knowledge embedded in those connections.

All of this knowledge being used to manipulate and modify technologies to achieve new and interesting learning experiences.

Breaking BAD to bridge the reality/rhetoric chasm

The following is a copy of a paper accepted at ASCILITE’2014, written by myself and Damien Clark (CQUniversity – @damoclarky).

Abstract

The reality of using digital technologies to enhance learning and teaching has a history of falling short of the rhetoric. Past attempts at bridging this chasm have tried: increasing the perceived value of teaching; improving the pedagogical and technological knowledge of academics; redesigning organisational policies, processes and support structures; and, designing and deploying better pedagogical techniques and technologies. Few appear to have had any significant, widespread impact, perhaps because of the limitations of the (often implicit) theoretical foundations of the institutional implementation of e-learning. Using a design-based research approach, this paper develops an alternate theoretical framework (the BAD framework) for institutional e-learning and uses that framework to analyse the development, evolution, and very different applications of the Moodle Activity Viewer (MAV) at two separate universities. Based on this experience it is argued that the reality/rhetoric chasm is more likely to be bridged by interweaving the BAD framework into existing practice.

Keywords: bricolage, learning analytics, e-learning, augmented browsing, Moodle.

Introduction

In a newspaper article (Laxon, 2013) Professor Mark Brown makes the following comment on the quality of contemporary University e-learning:

E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (n.p).

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used, there "has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations" (Selwyn, 2012, n.p.). Writing in the early 1990s Geoghagen (1994) seeks to understand why a three decade long “vision of a pedagogical utopia” (n.p.) promised by instructional technologies has failed to eventuate. Ten years on, Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently, concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that "Australian universities have made very large investments in corporate educational technologies" (Holt et al., 2013, p. 388) it is increasingly important to understand and address the reality/rhetoric chasm around e-learning.

Not surprisingly the literature provides a variety of answers to this complex question. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning, beyond perhaps their personal experience. A situation which may not change significantly given that academics are expected to engage equally in research and teaching and yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make the LMS less than suitable for more effective learner-centred approaches and is contributing to growing educator dissatisfaction (Rahman & Dron, 2012). It’s also been argued that the "limited digital fluency of lecturers and professors is a great challenge" (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely to be Selwyn’s (2008) suggestion that educational technologists have failed to be cognisant of "the more critical analyses of technology that have come to the fore in other social science and humanities disciplines" (p. 83). Of particular interest here is the observation of Goodyear et al (2014) that the "influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted" (p. 138).

This paper reports on the initial stages of a design-based research project that aims to bridge the e-learning reality/rhetoric chasm by exploring and harnessing alternative theoretical foundations for the institutional implementation of e-learning. The paper starts by comparing and contrasting two different theoretical foundations of institutional e-learning. The SET framework is suggested as a description of the mostly implicit assumptions underpinning most contemporary approaches. The BAD framework is proposed as an alternative and perhaps complementary framework that better captures the reality of what happens and, if effectively integrated into institutional practices, may help bridge the chasm. The development of a technology – the Moodle Activity Viewer (MAV) – and its use at two different universities is then used to illustrate the benefits and limitations of the SET and BAD frameworks, and how the two can be fruitfully combined. The paper closes with some discussion of implications and future work.

Breaking BAD versus SET in your ways

The work described here is part of an on-going cycle of design-based research that aims to develop new artefacts and theories that can help bridge the e-learning reality/rhetoric chasm. We believe that bridging this chasm is of theoretical and practical significance to the sector and to us personally. The interventions we describe in the following sections arose out of our day-to-day work and were informed by a range of theoretical perspectives. This section offers a brief description of the theoretical frameworks that have informed and been refined by this work. This is important as design-based research should depart from a problem (McKenney & Reeves, 2013), be grounded in practice, theory-driven and seek to refine both theory and practice (Wang & Hannafin, 2005). The frameworks described here are important because they identify a mindset (the SET framework) that contributes significantly to the on-going difficulty in bridging the e-learning reality/rhetoric chasm, and offers an alternate mindset (the BAD framework) that provides principles that can help bridge the chasm. The SET and BAD frameworks are broadly incommensurable ways of answering three important, inter-related questions about the implementation of e-learning. While the SET framework represents the most commonly accepted mindset used in practice, both frameworks are evident in both the literature and in practice. Table 1 provides an overview of both frameworks.

Table 1: The BAD and SET frameworks for e-learning implementation

What work gets done?
  SET: Strategy – following a global plan intended to achieve a pre-identified desired future state.
  BAD: Bricolage – local piecemeal action responding to emerging contingencies.

How is ICT perceived?
  SET: Established – ICT is a hard technology and cannot be changed. People and their practices must be modified to fit the fixed functionality of the technology.
  BAD: Affordances – ICT is a soft technology that can be modified to meet the needs of its users, their context, and what they would like to achieve.

How is the world perceived?
  SET: Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy of distinct black boxes.
  BAD: Distributed – the world is complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks.

What work gets done: Bricolage or Strategic

The majority of contemporary Australian universities follow a strategic approach to deciding what work gets done. Numerous environmental challenges and influences have led to universities being treated as businesses with an increasing prevalence of managers using "strategic control and a focus on outputs which can be quantified and compared" (Reid, 2009, p. 575) to manage academic activities. A strategic approach involves the creation of a vision identifying a desired future state and the development of operational plans to bring about the desired future state. The only work that is deemed acceptable is that which fits within the established operational plan and is seen to contribute to the desired future state. All other work is deemed inefficient. The strategic approach is evident at all levels of institutional e-learning. Inglis (2007) describes how government required Australian universities to have institutional learning and teaching strategic plans published on their websites. The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also underpins how course design is largely assumed to occur with Visscher-Voerman and Gustafson (2004) finding that it underpins "a majority of the instructional design models in the literature" (p. 77). The strategic approach is so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001), have significant flaws, and that there is at least one alternate perspective.

Bricolage, "the art of creating with what is at hand" (Scribner, 2005, p. 297) or "designing immediately" (Büscher, Gill, Mogensen, & Shapiro, 2001, p. 23) involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete, contextualized problem. Ciborra (1992) argues that bricolage – defined as the "capability of integrating unique ideas and practical design solutions at the end-user level" (p. 299) – is more important in developing organisational applications of ICT that provide competitive advantage than traditional strategic approaches. Scribner (2005) and other authors have used bricolage to understand the creative and considered repurposing of readily available resources that teachers use to engage in the difficult task of helping people learn. Bricolage is not without its problems. There are risks associated with extremes of both the strategic and bricolage approaches to how work gets done (Jones, Luck, McConachie, & Danaher, 2005). In the context of institutional e-learning, the problem is that at the moment the strategic is crowding out bricolage. For example, Groom and Lamb (2014) observe that the cost of supporting an enterprise learning tool (e.g. LMS) limits resources for user-driven innovation, in part because it draws "attention and users away" (n.p) from the strategic tool (i.e. LMS). The demands of sustaining the large and complex strategic tool dominates priorities and leads to "IT organizations…defined by what’s necessary rather than what’s possible" (Groom & Lamb, 2014, n.p). There would appear to be some significant benefit to exploring a dynamic and flexible interplay between the strategic and bricolage approaches to deciding what work gets done.

How ICT is perceived: Affordances or Established

The established view sees ICT as a hard technology (Dron, 2013). What can be done with hard technology is fixed in advance either by embedding it in the technology or "in inflexible human processes, rules and procedures needed for the technology’s operation" (Dron, 2013, p. 35). An example of this is the IT person quoted by Sturgess and Nouwens (2004) as suggesting in the context of an LMS evaluation process that "we should seek to change people’s behavior because information technology systems are difficult to change" (n.p). This way of perceiving ICTs assumes that the functionality provided by technology is established and cannot be changed. This creates the problem identified by Rushkoff (2010) where "instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery" (p. 15). Perhaps in no small way the established view of ICT in e-learning contributes to Dede’s (2008) observation that "widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant" (p. 58). The established view of ICT challenges Kay’s (1984) discussion of the "protean nature of the computer" (p. 59) as "the first metamedium, and as such has degrees of freedom and expression never before encountered" (p. 59). The problem is that digital technology is "biased toward those with the capacity to write code" (Rushkoff, 2010, p. 128) and increasingly those who can code have been focused on avoiding it.

The established view of ICT represents a narrow view of technological change and human agency. When unable to achieve a desired outcome, people will use the available knowledge and resources to create an alternative path, they will create a workaround (Koopman & Hoffman, 2003). For example, Hannon (2013) talks about the "hidden effort" (p. 175) of "meso-level practitioners – teaching academics, learning technologists, and academic developers" (p. 175) to bridge the gaps created by centralised technologies. The established view represents the designer-centred idea of achieving "perfect" software (Koopman & Hoffman, 2003), rather than recognising the need for on-going adaptation due to the diversity, complexity and on-going change inherent in university e-learning. The established view also ignores Kay’s (1984) description of the computer as offering "degrees of freedom and expression never before encountered" (p. 59). The established view does not leverage the affordance of ICT for change and freedom. Following Goodyear et al (2014), affordances are not a feature of a technology; rather, an affordance is a relationship between the technology and the people using the technology. Within university e-learning the affordance for change has been limited due to both the perceived nature of the technology – best practice guidelines for integrated systems such as LMS and ERP recommend vanilla implementation (Robey, Ross, & Boudreau, 2002) – and the people – the apparent low digital fluency of academics (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3). However, this is changing. There are faculty and students who are increasingly digitally fluent (e.g. the authors of this paper) and easily capable of harnessing the advent of technologies that "help to make bricolage an attainable reality" (Büscher et al., 2001, p. 24) such as the IMS LTI standards, APIs (Lane, 2014) and augmented browsing (Dai, Tsai, Tsai, & Hsu, 2011).
An affordances perspective of ICT seeks to leverage the capacity for ICT to be manipulated so that it offers the best possible affordances for learners and teachers. A move away from the established "design of an artefact towards emergent design of technology-in-use, particularly by the users" (Johri, 2011, p. 212).

How you see the world: Distributed or Tree-like

The methods used to solve most of the large and complex problems that make up institutional e-learning rely upon a tree-like or hierarchical conception of the world. To manage a university it is broken up into a tree-like structure consisting of divisions, faculties, schools, and so on. The organisation of the formal learning and teaching done at the university relies upon a tree-like structure of degrees, majors/minors, courses or units, learning outcomes, weeks, lectures, tutorials, etc. The information systems used to enable formal learning and teaching mirror the tree-like structure of the organisation with separation into different systems responsible for student records, learning management, learning content management etc. The individual information systems themselves are broken up into tree-like structures reliant on modular design. These tree-like structures are the result of the reliance on methods that use analysis and logical decomposition to reduce larger complex wholes into smaller more easily understood and manageable parts (Truex, Baskerville, & Travis, 2000). These methods produce tree-like structures of independent, largely black-boxed components that interact through formally approved mechanisms that typically involve oversight or approval from further up the hierarchy. For example, a request for a new feature in an LMS must wend its way up the tree-like governance structure until it is considered at the institutional level, compared against institutional priorities and ranked against other requests, before possibly being passed down to the other organisational black-box that can fulfill that request. There are numerous limitations associated with tree-like structures. For example, Holt et al (2013) identify just one of these limitations when they argue that the growing complexity of institutional e-learning means that no one leader at the top of a hierarchical tree has the knowledge to "possibly contend with the complexity of issues" (p. 389).

The solution suggested by Holt et al (2013) is distributed leadership, which is in turn based on broader theoretical foundations of distributed cognition, social learning, as well as network and activity theories. A theoretical foundation that can be seen in a broad array of distributed ways of looking at the world. For example, in terms of learning, Siemens (2008) lists the foundations of connectivism as: activity theory; distributed and embodied cognition; complexity; and network theory. At the core of connectivism is the "thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks" (Downes, 2011, n.p). Johri (2011) links much of this same foundation to socio-materiality and suggests that it offers "a key theoretical perspective that can be leveraged to advance research, design and use of learning technologies" (p. 210). Poldolny & Page (1998) apply the distributed view to governance and organisations and describe it as meaning that two or more actors are able to undertake repeated interactions over a period of time without having a centralised authority responsible for resolving any issues arising from those interactions. Rather than the responsibility and capability for specific actions being seen as belonging to any particular organisational member or group (tree-like), the responsibility and capability is distributed across a network of individuals, groups and technologies. The distributed view sees institutional e-learning as a complex, dynamic, and interdependent assemblage of diverse actors (both human and not) distributed in complex networks.

It is our argument that being aware of the differences in thinking between the SET and BAD frameworks offers insight that can guide the design of interventions that are more likely to bridge the e-learning reality/rhetoric chasm. The following sections describe the development and adaptation of the Moodle Activity Viewer (MAV) at both CQUni and USQ as an example of what is possible when breaking BAD.

Breaking BAD and the development of MAV

The second author works for Learning and Teaching Services at CQUniversity (CQUni). In late 2012, he was working on a guide for teaching staff titled "How can I enhance my teaching practice?". In contributing to the "Designing effective course structure" section of this guide, the author asked a range of rhetorical questions including "How do you know which resources your students access the most, and the least?". Providing an answer to this question for the reader took more effort than expected. There are reports available in Moodle 2.2 (the version being used by CQUni at the time) that can be used to answer this question. However, they suffer from a number of limitations including: duplicated report names; unclear differences between reports; usage values include both staff and student activity; poor speed of generation; and, a tabular format. It was apparent that these limitations were acting as a barrier to reflection on course design. This was especially problematic, as the institution had placed increased emphasis on generating and responding to student feedback (CQUniversity, 2012). Annual course enhancement reports – introduced in 2010 – required teaching staff to respond to feedback from students and highlight enhancements to be made for the course’s next offering (CQUniversity, 2011). Information about activity and resource usage on the course Moodle site was seen by some to be useful in completing these reports. However, there was no apparent strategic or organisational imperative to address issues with the Moodle reports and it appeared likely that the aging version of Moodle (version 2.2) would persist for some time given other organisational priorities. As a stopgap solution the author and a colleague engaged in some bricolage and began writing SQL queries for the Moodle database and generating Excel spreadsheets. 
Whilst this approach provided more useful data, the spreadsheets were manually generated on request and the teaching staff had to bridge the conceptual gap between the information within the Excel spreadsheet and their Moodle course site.

In the months following, the author started thinking about a better approach. While CQUni had implemented a range of customisations to the institution’s Moodle instance, substantial changes required a clear understanding of the final requirements, alignment with strategic imperatives, and support of the senior management. At this stage of the process it was not overly clear what the final requirements of a solution would be, hence more experimentation was required to better understand the problem and possible solutions, prior to making the case for modifying Moodle.  While the author did not have the ability to change the institution’s version of Moodle itself, he did have access to: a copy of the Moodle database; access to a server computer; and software development abilities. Any bridging of this particular gap would need to draw on available resources (bricolage) and not disturb or impact critical high-availability services such as Moodle. Given uncertainty about what functionality might best enable reflection on course design any potential solution would also need to enable a significant level of agility and experimentation (bricolage).

The technical solution that seemed to best fulfill these requirements was augmented browsing. Dai et al (2011) define augmented browsing as "an effective means for dynamically adding supplementary information to a webpage without having users navigate away from the page" (p. 2418). The use of augmented browsing to add functionality to an LMS is not new. Leony et al (2012) created a browser add-on that embeds learning analytics graphs directly within the Moodle LMS course home page. Dawson et al (2011) used what are known as bookmarklets to generate interactive sociograms to visualise student learning networks as part of SNAPP. The problems that drove SNAPP’s use of augmented browsing – complex and difficult to interpret LMS reports and the difficulty of getting suggestions from teaching staff integrated into an institutional LMS (Dawson et al., 2011) – mirror those faced at CQU.
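Browser add-ons of this kind typically declare which pages they are allowed to augment. In Greasemonkey, for example, a userscript's metadata block restricts it to URLs matching a pattern. The following header is purely illustrative – the script name, namespace and URL pattern are assumptions, not the actual MAV configuration:

```javascript
// ==UserScript==
// @name        Moodle Activity Viewer (illustrative header only)
// @namespace   http://example.edu/mav
// @description Overlays link-usage heat maps on Moodle course pages
// @include     https://moodle.example.edu/course/view.php*
// @grant       GM_xmlhttpRequest
// ==/UserScript==
```

The `@include` pattern is what lets the add-on "recognise" a Moodle course page, while `@grant GM_xmlhttpRequest` would allow the script to request usage data from a server on a different domain than the LMS itself.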

Through a process of bricolage the Moodle Activity Viewer (MAV) was developed as an add-on for the Firefox web browser. More specifically, the MAV is built upon another popular Firefox add-on called Greasemonkey, and in Greasemonkey terms MAV is known as a userscript.  However, for the purposes of this paper, the MAV will be referred to more generally as an add-on to the browser. The intent was that the MAV would generate a heat map and embed it directly onto any web page produced by Moodle. A heat map shades each of the links in a web page with a spectrum of colours where the deeper red shades indicate links that are being clicked on more often (see Figure 1). The implementation of the MAV is completely separate from the institutional Moodle instance meaning its use has no impact on the production Moodle environment. Once the MAV add-on is installed into Firefox, and with it turned on, any web page from a Moodle course site can have a heat map overlaid on all Moodle links in that page. This process starts with the MAV add-on recognising a newly loaded page as belonging to a Moodle course site. When this occurs the MAV will generate a query asking for usage figures associated with every relevant Moodle link on that web page. This query is sent to the MAV server hosted on an available server computer. The MAV server translates the query into appropriate queries that will extract the necessary information from the Moodle database. As implemented at CQU, the MAV server relies on a copy of the Moodle database that is updated daily. While not necessary, use of a copy of the Moodle database ensures that there is no risk of disrupting the production Moodle instance.
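The heat-map idea described above can be sketched in a few lines of JavaScript. This is a hypothetical illustration rather than the MAV source: the colour scale, function names and the shape of the usage data are all assumptions.

```javascript
// Hypothetical sketch of heat-map shading in the style of the MAV
// (not the actual MAV code). Maps a link's usage count to a background
// colour: white for unused links through deep red for the most-clicked.
function heatColour(clicks, maxClicks) {
  if (maxClicks <= 0 || clicks <= 0) {
    return "rgb(255, 255, 255)"; // no usage: leave the link unshaded
  }
  const intensity = Math.min(clicks / maxClicks, 1); // 0..1
  // Fade the green and blue channels as intensity rises,
  // so heavily used links shade toward red.
  const fade = Math.round(255 * (1 - intensity));
  return "rgb(255, " + fade + ", " + fade + ")";
}

// Shade every link on the page for which usage data was returned
// (browser-only; `usage` is assumed to map link URLs to click counts).
function applyHeatMap(usage) {
  const max = Math.max(...Object.values(usage));
  for (const link of document.querySelectorAll("a")) {
    if (link.href in usage) {
      link.style.backgroundColor = heatColour(usage[link.href], max);
    }
  }
}
```

In the real system the `usage` data would come from the MAV server's query against the copy of the Moodle database; here it is simply assumed to be available.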

The MAV add-on can be configured to generate overlays based on the number of clicks on a link, or the number of students who have clicked on a link. It can also be configured to limit the overlays to particular groups of students or to a particular student. When used on the main course page, MAV provides an overview of how students are using all of the course resources. Looking at a discussion forum page with the MAV enabled allows the viewer to analyse which threads or messages are receiving the most attention. Hence MAV can provide a simple form of process analytics (Lockyer, Heathcote, & Dawson, 2013).

An initial proof-of-concept implementation of the MAV was developed by April 2013. A few weeks later this implementation was demonstrated to the “Moodle 2 Project Board” to seek approval to continue development. The plan was to engage in small trials with academic staff and evolve the tool. The intent was that this would generate a blueprint for the implementation of heat maps within Moodle itself. The low-risk nature of the approach contributed to approval to continue. However, by July 2013, the institution downsized through an organisational restructure and resources in the IT department were subsequently reduced. As part of this restructure, and in an effort to reduce costs, the IT Department set out to reduce the level of in-house systems development in favour of more established “vanilla” systems (off-the-shelf with limited or no customisations). This new strategy made it unlikely that the MAV would be re-implemented directly within Moodle, suggesting that the augmented browsing approach might be the viable longer-term option. As the MAV was being developed and refined, it was being tested by a small group of teaching staff within the creator’s team. Then in September 2013, the first official trial was launched making the MAV available to all staff within one of CQUniversity’s schools.

How MAV works by David T Jones, on Flickr

Figure 1: How MAV works (Click on the image to see larger version)

Early in March 2012, prior to the genesis of the MAV, the second author and a colleague developed a proposal for a student retention project. It was informed by ongoing research into learning analytics at the institution and motivated by a strategic institutional imperative to improve student retention (CQUniversity, 2011).  It was not until October 2013 – after the commencement of the first trial of the MAV – that a revised version of the proposal received final approval and the project commenced in November under the name EASICONNECT.  Part of the EASICONNECT project was the inclusion of an early alerts system for disengaged students called EASI (Early Alert Student Indicators) to identify disengaged students early, and provide simple tools to nudge the students to re-engage, with the hope of improving student retention. In 2013, between the proposal submission and final approval of the EASICONNECT Project, EASI under a different name (Student Support Indicators – SSI) was created as a proof-of-concept and used in a series of small term-based trials, evolving similarly to the MAV. One of the amendments made to the approved proposal by the project sponsor (management) was the inclusion of the MAV as a project deliverable in the EASICONNECT project.

Neither EASI nor the MAV were strictly the results of strategic plans. Both systems arose from bricolage being undertaken by two members of CQUni’s Learning and Teaching Services that was later recognised as contributing to the strategic aims of the institution. With the eventual approval of the EASICONNECT project, the creators of EASI and the MAV worked more closely together on these tools and the obvious linkages between them were developed further. Initially this meant modifying the MAV so staff participating in the EASI trial could easily navigate from the MAV to EASI. In Term 1, 2014 EASI introduced links for each student in a course, that when clicked, would open the Moodle course site with the MAV enabled only for the selected student. While EASI showed a summary of the number of clicks made by the student in the course site, the MAV could then contextualise this information, revealing where those clicks took place directly within Moodle. In Term 2, 2014 a feature often requested by teaching staff was added to the MAV that would identify students who had and hadn’t clicked on links. The MAV also provided an option for staff to open EASI to initiate an email nudge to either group of students. Figure 2 provides a comparison of week-to-week usage of MAV between term 1 and 2, of 2014. The graphs show usage in terms of the number of page views and number of staff using the system, with the Term 2 figures including up until the end of Week 10 (of 15).

Both MAV and its sister project EASI were initiated as a form of bricolage. It was only later that both projects enjoyed the supportive environment of a strategic project that provided the space and institutional permission for this work to scale and continue to merge. MAV arose from the limited affordances offered by the LMS and the promise that different ICT could be harnessed to enhance the perceived affordances. Remembering that affordances are not something innate to a tool, but are instead co-constitutive between tool, user and context, the on-going use of bricolage allowed the potential affordances of the tool to evolve in response to use by teaching staff. Through this approach MAV has been able to evolve from offering potential affordances of value to teaching staff as part of "design for reflection and redesign" (Dimitriadis & Goodyear, 2013) to also offering potential affordances for "design for orchestration" (Dimitriadis & Goodyear, 2013).

Figure 2: 2014 MAV usage at CQUni: comparison between Term 1 and Term 2 (two graphs: weekly page views, and weekly number of staff using MAV)

Implementing MAV as a browser add-on also enables a break from the tree-like conceptions that underpin the design of large integrated systems like an LMS. The tree-like conception is so evident in the Moodle LMS that it is visible in the name: Moodle is an acronym for Modular Object-Oriented Dynamic Learning Environment, with Modular capturing the fact that "Moodle is built in a highly modular fashion" (Dougiamas & Taylor, 2003, p. 173), meaning that logical decomposition is used to break the large integrated system into small components or modules. This modular architecture allows the rapid development and addition of independent plugins and is a key enabler of Moodle's flexibility. However, it relies on each of the modules being largely independent of the others, which makes it more difficult to provide functionality that crosses modular boundaries, such as taking usage information from the logging system and integrating it into all of the modules that work together to produce a web page generated by Moodle.
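The cross-cutting nature of this point is easier to see in code. As a minimal sketch (the function names and colour scale here are illustrative, not MAV's actual implementation), a browser add-on side-steps modular boundaries by operating on the final rendered page: it can colour every link by usage regardless of which Moodle module produced it.

```javascript
// Sketch: map a click count onto a heat-map background colour.
// Because this runs over the rendered HTML, it treats links from the
// forum, quiz, resource, etc. modules identically.
function heatColour(clicks, maxClicks) {
  if (maxClicks <= 0) return "rgb(255,255,255)";      // no usage data: white
  const intensity = Math.min(clicks / maxClicks, 1);  // 0..1
  const channel = Math.round(255 * (1 - intensity));  // hotter = redder
  return `rgb(255,${channel},${channel})`;
}

// Apply usage data to every link on the page, whatever module made it.
// `usage` maps href -> click count (e.g. fetched from a logging server).
function paintLinks(links, usage) {
  const max = Math.max(0, ...Object.values(usage));
  for (const link of links) {
    const clicks = usage[link.href] || 0;
    link.style.backgroundColor = heatColour(clicks, max);
  }
}
```

In a browser the `links` argument would simply be `document.querySelectorAll("a")`; the add-on never needs to know, or modify, the server-side module that generated each link.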

Extending MAV at another institution

In 2012 the first author commenced work within the Faculty of Education at the University of Southern Queensland (USQ). The majority of the allocated teaching load involved two offerings of EDC3100, ICTs and Pedagogy. EDC3100 is a large (300+ on-campus and online students in first semester, and ~100 wholly online students in second semester) core, third-year course for Bachelor of Education (BEdu) students. The author expected that USQ would have high quality systems and processes to support large, online courses. This was due to USQ's significant reputation in the practice and research of distance and online education; its then stated vision "To be recognised as a world leader in open and flexible higher education" (USQ, 2012, p. 5); and the observation that "by 2012 up to 70% of students in the Bachelor of Education were studying at least some subjects online" (Albion, 2014, p. 1163). The experience of teaching EDC3100 quickly revealed an e-learning reality/rhetoric chasm.

As a core course, EDC3100 draws students from all of USQ's campuses, a Malaysian partner institution, and online from across Australia and the world. The students are studying to become teachers in early childhood, primary, secondary and VET settings. The course is designed so that the "Study Desk" (the Moodle course site) is an essential source of information and support for all students, and it makes heavy use of discussion forums for a range of learning activities. Given the size and diversity of the student population, there are times when it is beneficial for teaching staff to customise their responses to the student's context and specialisation. For instance, an example from the Australian Curriculum may be appropriate for a primary or lower secondary pre-service teacher based in Australia, but inappropriate for a VET pre-service teacher. Whilst the Moodle discussion forum draws on user profiles to identify authors of posts, the available information is limited to that provided centrally by the institution and by the users. For EDC3100 this means that a student's campus is apparent through their membership of the Moodle groups automatically created by USQ's systems; however, seeing this requires navigating away from the discussion forum. The student's specialisation is not visible in Moodle at all. The only way to obtain this information is to ask an administrative staff member with the appropriate student records access to generate a spreadsheet containing it (and then to update the spreadsheet as students add and drop the course). The lack of easy access to this information constrains the ability of teaching staff to intervene effectively.

One explanation for the existence of this gap is the limitations of the SET approach to institutional e-learning systems. The tree-based practice of logical decomposition results in distinct tasks – such as the management of student demographic and enrolment data (Peoplesoft), and the practice of online learning (Moodle) – being supported by different information systems with different data models, owned by different organisational units. Logical decomposition allows each of these individual systems and their owners to focus on the efficiency of their primary task. However, it comes at the cost of making it more difficult to recognise and respond to requirements that cut across the tasks (e.g. teaching). It is even more difficult when the requirement is specific to a subset of the organisation: ensuring that the specialisation of BEdu students is evident in Moodle, for example, is only of interest to some of the staff teaching into the BEdu. Even if this barrier could be overcome, it is highly unlikely the Moodle discussion forum would be modified to make this type of information more visible, given the cost, difficulty and (quite understandable) reluctance to change enterprise software inherent in the established view of technology.

To address this need the MAV add-on was modified to recognise USQ Moodle web pages that contain links to student profiles (e.g. a forum post). On recognising such a page, the modified version of MAV queries a database populated from the manually provided spreadsheet described above. MAV uses that information to add a popup dialog to each student profile link, providing student information such as specialisation and campus without leaving the page. Adding different information (e.g. activity completion, GPA etc.) to this dialog can proceed without the approval of any centralised authority. The MAV server and the database run on the author's laptop, and the author has the skill to modify the database and write new code for both the MAV server and client. As such it is an example of Podolny and Page's (1998) distributed approach to governance. The only limitation is whether or not the necessary information can be retrieved in a format that can be easily imported into the database.
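The client side of this modification can be sketched in a few lines. This is a hedged illustration only: it assumes Moodle's standard `user/view.php?id=NNN` profile URL shape, and the function names and record fields are invented for the example, not taken from the actual MAV code.

```javascript
// Sketch: pull the Moodle user id out of a profile link.
// Assumes Moodle's conventional profile URL: .../user/view.php?id=NNN
function extractUserId(href) {
  const match = /\/user\/view\.php\?id=(\d+)/.exec(href);
  return match ? Number(match[1]) : null;
}

// Format the popup text from a record returned by the local MAV server,
// whose database is populated from the manually supplied spreadsheet.
// (The field names here are illustrative.)
function formatPopup(record) {
  return `${record.name}: ${record.specialisation} (${record.campus})`;
}
```

In the add-on proper, these helpers would run over `document.querySelectorAll("a")`, querying the local server for each id found and attaching the formatted text as the popup dialog.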

Conclusions, implications and future work

Future work will focus on continuing an on-going cycle of design-based research exploring how, and with what impacts, the BAD framework can be fruitfully integrated into the practice of institutional e-learning. To aid this process we are exploring how MAV, its various modifications, and its descendants can be effectively developed and shared within and between institutions. As a first step, the CQU MAV code has been released on GitHub (https://github.com/damoclark/mav); development is occurring in the open and interested collaborators are welcome. A particular interest is in exploring and evaluating the use of MAV to implement scaffolding and context-sensitive conglomerations. Proposed in Jones (2012), a conglomeration seeks to enhance the affordances offered by any standard e-learning tool (e.g. a discussion forum) with a range of additional and often contextually specific information and functionality. Both uses of MAV described above are simple examples of a conglomeration. Of particular interest is whether these conglomerations can be used to explore whether Goodyear's (2009) idea that "research-based evidence and the fruits of successful teaching experience can be embodied in the resources that teachers use at design time" can be extended to institutional e-learning tools.

Perhaps the biggest challenge to this work arises from the observation that the SET framework forms the foundation for current institutional practice and that the SET and BAD frameworks are largely incommensurable. At CQU, MAV has benefited from the recognition and support of senior management; yet it still challenges the assumptions of those operating solely through the SET framework. The incommensurable nature of the SET and BAD frameworks implies that any attempt to fruitfully merge the two will need to deal with existing, and sometimes strongly held, assumptions and mindsets. For example, rather than require the IT division to formally approve and develop all applications of ICT, their focus should perhaps turn (at least in part) to enabling and encouraging "ways to make work-arounds easier for users to create, document and share" (Koopman & Hoffman, 2003, p. 74) through organisational "settings, and systems … arranged so that invention and prototyping by end-users can flourish" (Ciborra, 1992, p. 305). Similarly, rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift to ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution. Rather than accept "the over-hyped, pre-configured digital products and practices that are being imported continually into university settings" (Selwyn, 2013, p. 3), perhaps universities should instead heed Selwyn's call that "a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of 'digital technology for university educators by university educators'" (p. 3).

Biggs (2012) conceptualises the job of a teacher as being responsible for creating a learning context in which "all students are more likely to use the higher order learning processes which ‘academic’ students use spontaneously" (p. 39). If this perspective is taken one step back, then it is the responsibility of a university to create an institutional context in which all teaching staff are more likely to create the type of learning context which ‘good’ teachers create spontaneously. The on-going existence of the e-learning reality/rhetoric chasm suggests many universities are yet to achieve this goal. This paper has argued that this is due in part to the institutional implementation of e-learning being based on a limited SET of theoretical conceptions. The paper has compared the SET framework with the BAD framework and argued that the BAD framework provides a more promising theoretical foundation for bridging this chasm. It has illustrated the strengths and weaknesses of these two frameworks through a description of the origins and on-going use of the Moodle Activity Viewer (MAV) at two institutions. The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

References

Albion, P. (2014). From Creation to Curation: Evolution of an Authentic 'Assessment for Learning' Task. In M. Searson & M. Ochoa (Eds.), Society for Information Technology & Teacher Education International Conference (pp. 1160-1168). Chesapeake, VA: AACE.

Biggs, J. (2012). What the student does: teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55. doi:10.1080/07294360.2012.642839

Büscher, M., Gill, S., Mogensen, P., & Shapiro, D. (2001). Landscapes of practice: bricolage as a method for situated design. Computer Supported Cooperative Work, 10(1), 1-28.

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297-309.

CQUniversity. (2011). CQUniversity Annual Report 2010 (p. 136). Rockhampton.

CQUniversity. (2012). CQUniversity Annual Report 2011 (p. 84). Rockhampton.

Dai, H. J., Tsai, W. C., Tsai, R. T. H., & Hsu, W. L. (2011). Enhancing search results with semantic annotation using augmented browsing. IJCAI Proceedings – International Joint Conference on Artificial Intelligence, 22(3), 2418-2423.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). "Seeing" networks: visualising and evaluating student learning networks Final Report 2011. Canberra: Australian Learning and Teaching Council.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43-62). New York: Springer.

Dimitriadis, Y., & Goodyear, P. (2013). Forward-oriented design for learning: illustrating the approach. Research in Learning Technology, 21, 1-13. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/20290

Downes, S. (2011). “Connectivism” and Connective Knowledge. Retrieved from http://www.huffingtonpost.com/stephen-downes/connectivism-and-connecti_b_804653.html

Dron, J. (2013). Soft is hard and hard is easy: learning technologies and social media. Form@ Re-Open Journal per La Formazione in Rete, 13(1), 32-43. Retrieved from http://fupress.net/index.php/formare/article/view/12613

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of The International Business Schools Computing Association. Baltimore, MD.

Goodyear, P. (2009). Teaching, technology and educational design: The architecture of productive learning environments (pp. 1-37). Sydney. Retrieved from http://www.olt.gov.au/system/files/resources/Goodyear%2C P ALTC Fellowship report 2010.pdf

Goodyear, P., Carvalho, L., & Dohn, N. B. (2014). Design for networked learning: framing relations between participants’ activities and the physical setting. In S. Bayne, M. de Laat, T. Ryberg, & C. Sinclair (Eds.), Ninth International Conference on Networked Learning 2014 (pp. 137-144). Edinburgh, Scotland. Retrieved from http://www.networkedlearningconference.org.uk/abstracts/pdf/goodyear.pdf

Groom, J., & Lamb, B. (2014). Reclaiming innovation. EDUCAUSE Review, 1-12. Retrieved from http://www.educause.edu/visuals/shared/er/extras/2014/ReclaimingInnovation/default.html

Hannon, J. (2013). Incommensurate practices: sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29(2), 168-178. doi:10.1111/j.1365-2729.2012.00480.x

Holt, D., Palmer, S., Munro, J., Solomonides, I., Gosper, M., Hicks, M., … Hollenbeck, R. (2013). Leading the quality management of online learning environments in Australian higher education. Australasian Journal of Educational Technology, 29(3), 387-402. Retrieved from http://www.ascilite.org.au/ajet/submission/index.php/AJET/article/view/84

Inglis, A. (2007). Approaches taken by Australian universities to documenting institutional e-learning strategies. In R. J. Atkinson, C. McBeath, S.K. Soong, & C. Cheers (Eds.), ICT: Providing Choices for Learners and Learning. Proceedings ASCILITE Singapore 2007 (pp. 419-427). Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/inglis.pdf

Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas. Retrieved from http://www.nmc.org/publications/2014-technology-outlook-au

Johri, A. (2011). The socio-materiality of learning practices and implications for the field of learning technology. Research in Learning Technology, 19(3), 207-217. Retrieved from http://researchinlearningtechnology.net/coaction/index.php/rlt/article/view/17110

Jones, D. (2012). The life and death of Webfuse : principles for learning and leading into the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 414-423). Wellington, NZ.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. In 17th Biennial Conference of the Open and Distance Learning Association of Australia. Adelaide.

Kay, A. (1984). Computer Software. Scientific American, 251(3), 53-59.

Kezar, A. (2001). Understanding and Facilitating Organizational Change in the 21st Century: Recent Research and Conceptualizations. ASHE-ERIC Higher Education Report, 28(4).

Kirkwood, A., & Price, L. (2013). Technology-enhanced learning and teaching in higher education: what is “enhanced” and how do we know? A critical literature review. Learning, Media and Technology, (August), 1-31. doi:10.1080/17439884.2013.770404

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. Intelligent Systems, IEEE, 18(6), 70-75.

Lane, K. (2014). The University of API (p. 28). Retrieved from http://university.apievangelist.com/white-paper.html

Laxon, A. (2013, September 14). Exams go online for university students. The New Zealand Herald.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439-1459. doi:10.1177/0002764213479367

McKenney, S., & Reeves, T. C. (2013). Systematic Review of Design-Based Research Progress: Is a Little Knowledge a Dangerous Thing? Educational Researcher, 42(2), 97-100. doi:10.3102/0013189X12463781

OECD. (2005). E-Learning in Tertiary Education: Where do we stand? (p. 289). Paris, France: Centre for Educational Research and Innovation, Organisation for Economic Co-operation and Development. Retrieved from http://new.sourceoecd.org/education/9264009205

Podolny, J., & Page, K. (1998). Network forms of organization. Annual Review of Sociology, 24, 57-76.

Rahman, N., & Dron, J. (2012). Challenges and opportunities for learning analytics when formal teaching meets social spaces. In 2nd International Conference on Learning Analytics and Knowledge (pp. 54-58). Vancouver, British Columbia: ACM Press. doi:10.1145/2330601.2330619

Reid, I. C. (2009). The contradictory managerialism of university quality assurance. Journal of Education Policy, 24(5), 575-593. doi:10.1080/02680930903131242

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Scribner, J. (2005). The problems of practice: Bricolage as a metaphor for teachers’ work and learning. Alberta Journal of Educational Research, 51(4), 295-310. Retrieved from http://ajer.journalhosting.ucalgary.ca/ajer/index.php/ajer/article/view/587

Selwyn, N. (2008). From state-of-the-art to state-of-the-actual? Introduction to a special issue. Technology, Pedagogy and Education, 17(2), 83-87. doi:10.1080/14759390802098573

Selwyn, N. (2012). Social media in higher education. The Europa World of Learning. Retrieved from http://www.educationarena.com/pdf/sample/sample-essay-selwyn.pdf

Selwyn, N. (2013). Digital technologies in universities: problems posing as solutions? Learning, Media and Technology, 38(1), 1-3. doi:10.1080/17439884.2013.759965

Siemens, G. (2008). What is the unique idea in Connectivism? Retrieved July 13, 2014, from http://www.connectivism.ca/?p=116

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53-79.

USQ. (2012). University of Southern Queensland 2011 Annual Report. Toowoomba. doi:10.1037/e543872012-001

Visscher-Voerman, I., & Gustafson, K. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research and Development, 52(2), 69-89.

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5-23.

Weimer, M. (2007). Intriguing connections but not with the past. International Journal for Academic Development, 12(1), 5-8.

Zellweger, F. (2005). Strategic Management of Educational Technology: The Importance of Leadership and Management. Riga, Latvia.

Searching for a phrase and some research

This is a plea for help. I'm certain I remember a particular phrase/concept that arose from research into educational technology 10+ years ago (it may have been as long as 30 years ago).

It was a phrase/concept used to look critically at the tendency for education to create special education-only versions of real software. i.e. rather than using a standard bit of software – e.g. Word (*shudder*) – students would have to use a word processor made specially for education (mostly K-12, I believe).

Does this ring any bells for you? Can you point me in a useful direction?

Thanks.