Breaking BAD to bridge the reality/rhetoric chasm

The following is a copy of a paper accepted at ASCILITE’2014 (and nominated for best paper) written by myself and Damien Clark (CQUniversity – @damoclarky). The official conference version of the paper is available as a PDF.

Presentation slides available on Slideshare.

The source code for the Moodle Activity Viewer is available on GitHub, as are some of the scripts produced at USQ.

Abstract

The reality of using digital technologies to enhance learning and teaching has a history of falling short of the rhetoric. Past attempts at bridging this chasm have tried: increasing the perceived value of teaching; improving the pedagogical and technological knowledge of academics; redesigning organisational policies, processes and support structures; and, designing and deploying better pedagogical techniques and technologies. Few appear to have had any significant, widespread impact, perhaps because of the limitations of the (often implicit) theoretical foundations of the institutional implementation of e-learning. Using a design-based research approach, this paper develops an alternate theoretical framework (the BAD framework) for institutional e-learning and uses that framework to analyse the development, evolution, and very different applications of the Moodle Activity Viewer (MAV) at two separate universities. Based on this experience it is argued that the reality/rhetoric chasm is more likely to be bridged by interweaving the BAD framework into existing practice.

Keywords: bricolage, learning analytics, e-learning, augmented browsing, Moodle.

Introduction

In a newspaper article (Laxon, 2013) Professor Mark Brown makes the following comment on the quality of contemporary University e-learning:

E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (n.p.)

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used, there “has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations” (Selwyn, 2012, n.p.). Writing in the early 1990s, Geoghegan (1994) seeks to understand why a three decade long “vision of a pedagogical utopia” (n.p.) promised by instructional technologies has failed to eventuate. Ten years on, Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently, concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that “Australian universities have made very large investments in corporate educational technologies” (Holt et al., 2013, p. 388), it is increasingly important to understand and address the reality/rhetoric chasm around e-learning.

Not surprisingly, the literature provides a variety of answers to the complex question of why this chasm persists. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning beyond perhaps their personal experience. This situation may not change significantly given that academics are expected to engage equally in research and teaching, and yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make the LMS less than suitable for more effective learner-centred approaches and are contributing to growing educator dissatisfaction (Rahman & Dron, 2012). It has also been argued that the “limited digital fluency of lecturers and professors is a great challenge” (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely Selwyn’s (2008) suggestion that educational technologists have failed to be cognisant of “the more critical analyses of technology that have come to the fore in other social science and humanities disciplines” (p. 83). Of particular interest here is the observation of Goodyear et al (2014) that the “influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted” (p. 138).

This paper reports on the initial stages of a design-based research project that aims to bridge the e-learning reality/rhetoric chasm by exploring and harnessing alternative theoretical foundations for the institutional implementation of e-learning. The paper starts by comparing and contrasting two different theoretical foundations of institutional e-learning. The SET framework is suggested as a description of the mostly implicit assumptions underpinning most contemporary approaches. The BAD framework is proposed as an alternative, and perhaps complementary, framework that better captures the reality of what happens and, if effectively integrated into institutional practices, may help bridge the chasm. The development of a technology – the Moodle Activity Viewer (MAV) – and its use at two different universities is then used to illustrate the benefits and limitations of the SET and BAD frameworks, and how the two can be fruitfully combined. The paper closes with some discussion of implications and future work.

Breaking BAD versus SET in your ways

The work described here is part of an on-going cycle of design-based research that aims to develop new artefacts and theories that can help bridge the e-learning reality/rhetoric chasm. We believe that bridging this chasm is of theoretical and practical significance to the sector and to us personally. The interventions we describe in the following sections arose out of our day-to-day work and were informed by a range of theoretical perspectives. This section offers a brief description of the theoretical frameworks that have informed, and been refined by, this work. This is important as design-based research should depart from a problem (McKenney & Reeves, 2013), be grounded in practice, be theory-driven, and seek to refine both theory and practice (Wang & Hannafin, 2005). The frameworks described here are important because they identify a mindset (the SET framework) that contributes significantly to the on-going difficulty in bridging the e-learning reality/rhetoric chasm, and offer an alternate mindset (the BAD framework) that provides principles that can help bridge the chasm. The SET and BAD frameworks are broadly incommensurable ways of answering three important, inter-related questions about the implementation of e-learning. While the SET framework represents the most commonly accepted mindset used in practice, both frameworks are evident in the literature and in practice. Table 1 provides an overview of both frameworks.

Table 1: The BAD and SET frameworks for e-learning implementation
What work gets done?
  • SET: Strategy – following a global plan intended to achieve a pre-identified desired future state.
  • BAD: Bricolage – local piecemeal action responding to emerging contingencies.

How is ICT perceived?
  • SET: Established – ICT is a hard technology that cannot be changed. People and their practices must be modified to fit the fixed functionality of the technology.
  • BAD: Affordances – ICT is a soft technology that can be modified to meet the needs of its users, their context, and what they would like to achieve.

How do you see the world?
  • SET: Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy of distinct black boxes.
  • BAD: Distributed – the world is complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks.

What work gets done: Bricolage or Strategic

The majority of contemporary Australian universities follow a strategic approach to deciding what work gets done. Numerous environmental challenges and influences have led to universities being treated as businesses with an increasing prevalence of managers using “strategic control and a focus on outputs which can be quantified and compared” (Reid, 2009, p. 575) to manage academic activities. A strategic approach involves the creation of a vision identifying a desired future state and the development of operational plans to bring about the desired future state. The only work that is deemed acceptable is that which fits within the established operational plan and is seen to contribute to the desired future state. All other work is deemed inefficient. The strategic approach is evident at all levels of institutional e-learning. Inglis (2007) describes how government required Australian universities to have institutional learning and teaching strategic plans published on their websites. The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also underpins how course design is largely assumed to occur with Visscher-Voerman and Gustafson (2004) finding that it underpins “a majority of the instructional design models in the literature” (p. 77). The strategic approach is so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001), have significant flaws, and that there is at least one alternate perspective.

Bricolage, “the art of creating with what is at hand” (Scribner, 2005, p. 297) or “designing immediately” (Büscher, Gill, Mogensen, & Shapiro, 2001, p. 23) involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete, contextualized problem. Ciborra (1992) argues that bricolage – defined as the “capability of integrating unique ideas and practical design solutions at the end-user level” (p. 299) – is more important in developing organisational applications of ICT that provide competitive advantage than traditional strategic approaches. Scribner (2005) and other authors have used bricolage to understand the creative and considered repurposing of readily available resources that teachers use to engage in the difficult task of helping people learn. Bricolage is not without its problems. There are risks associated with extremes of both the strategic and bricolage approaches to how work gets done (Jones, Luck, McConachie, & Danaher, 2005). In the context of institutional e-learning, the problem is that at the moment the strategic is crowding out bricolage. For example, Groom and Lamb (2014) observe that the cost of supporting an enterprise learning tool (e.g. LMS) limits resources for user-driven innovation, in part because it draws “attention and users away” (n.p.) from the strategic tool (i.e. LMS). The demands of sustaining the large and complex strategic tool dominates priorities and leads to “IT organizations…defined by what’s necessary rather than what’s possible” (Groom & Lamb, 2014, n.p.). There would appear to be some significant benefit to exploring a dynamic and flexible interplay between the strategic and bricolage approaches to deciding what work gets done.

How ICT is perceived: Affordances or Established

The established view sees ICT as a hard technology (Dron, 2013). What can be done with hard technology is fixed in advance either by embedding it in the technology or “in inflexible human processes, rules and procedures needed for the technology’s operation” (Dron, 2013, p. 35). An example of this is the IT person quoted by Sturgess and Nouwens (2004) as suggesting in the context of an LMS evaluation process that “we should seek to change people’s behavior because information technology systems are difficult to change” (n.p). This way of perceiving ICTs assumes that the functionality provided by technology is established and cannot be changed. This creates the problem identified by Rushkoff (2010) where “instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery” (p. 15). Perhaps in no small way the established view of ICT in e-learning contributes to Dede’s (2008) observation that “widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant” (p. 58). The established view of ICT challenges Kay’s (1984) discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). The problem is that digital technology is “biased toward those with the capacity to write code” (Rushkoff, 2010, p. 128) and increasingly those who can code have been focused on avoiding it.

The established view of ICT represents a narrow view of technological change and human agency. When unable to achieve a desired outcome, people will use the available knowledge and resources to create an alternative path, they will create a workaround (Koopman & Hoffman, 2003). For example, Hannon (2013) talks about the “hidden effort” (p. 175) of “meso-level practitioners – teaching academics, learning technologists, and academic developers” (p. 175) to bridge the gaps created by centralised technologies. The established view represents the designer-centred idea of achieving “perfect” software (Koopman & Hoffman, 2003), rather than recognising the need for on-going adaptation due to the diversity, complexity and on-going change inherent in university e-learning. The established view also ignores Kay’s (1984) description of the computer as offering “degrees of freedom and expression never before encountered” (p. 59). The established view does not leverage the affordance of ICT for change and freedom. Following Goodyear et al (2014), affordances are not a feature of a technology, but rather a relationship between the technology and the people using the technology. Within university e-learning the affordance for change has been limited due to both the perceived nature of the technology – best practice guidelines for integrated systems such as LMS and ERP recommend vanilla implementation (Robey, Ross, & Boudreau, 2002) – and the people – the apparent low digital fluency of academics (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3). However, this is changing. There are faculty and students who are increasingly digitally fluent (e.g. the authors of this paper) and easily capable of harnessing the advent of technologies that “help to make bricolage an attainable reality” (Büscher et al., 2001, p. 24) such as the IMS LTI standards, APIs (Lane, 2014) and augmented browsing (Dai, Tsai, Tsai, & Hsu, 2011). An affordances perspective of ICT seeks to leverage the capacity for ICT to be manipulated so that it offers the best possible affordances for learners and teachers. A move away from the established “design of an artefact towards emergent design of technology-in-use, particularly by the users” (Johri, 2011, p. 212).

How you see the world: Distributed or Tree-like

The methods used to solve most of the large and complex problems that make up institutional e-learning rely upon a tree-like or hierarchical conception of the world. To manage a university it is broken up into a tree-like structure consisting of divisions, faculties, schools, and so on. The organisation of the formal learning and teaching done at the university relies upon a tree-like structure of degrees, majors/minors, courses or units, learning outcomes, weeks, lectures, tutorials, etc. The information systems used to enable formal learning and teaching mirror the tree-like structure of the organisation with separation into different systems responsible for student records, learning management, learning content management etc. The individual information systems themselves are broken up into tree-like structures reliant on modular design. These tree-like structures are the result of the reliance on methods that use analysis and logical decomposition to reduce larger complex wholes into smaller more easily understood and manageable parts (Truex, Baskerville, & Travis, 2000). These methods produce tree-like structures of independent, largely black-boxed components that interact through formally approved mechanisms that typically involve oversight or approval from further up the hierarchy. For example, a request for a new feature in an LMS must wend its way up the tree-like governance structure until it is considered at the institutional level, compared against institutional priorities and ranked against other requests, before possibly being passed down to the other organisational black-box that can fulfill that request. There are numerous limitations associated with tree-like structures. For example, Holt et al (2013) identify just one of these limitations when they argue that the growing complexity of institutional e-learning means that no one leader at the top of a hierarchical tree has the knowledge to “possibly contend with the complexity of issues” (p. 389).

The solution suggested by Holt et al (2013) is distributed leadership, which is in turn based on broader theoretical foundations of distributed cognition, social learning, and network and activity theories. This theoretical foundation can be seen in a broad array of distributed ways of looking at the world. For example, in terms of learning, Siemens (2008) lists the foundations of connectivism as: activity theory; distributed and embodied cognition; complexity; and network theory. At the core of connectivism is the “thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks” (Downes, 2011, n.p.). Johri (2011) links much of this same foundation to socio-materiality and suggests that it offers “a key theoretical perspective that can be leveraged to advance research, design and use of learning technologies” (p. 210). Podolny and Page (1998) apply the distributed view to governance and organisations, describing it as meaning that two or more actors are able to undertake repeated interactions over a period of time without a centralised authority responsible for resolving any issues arising from those interactions. Rather than the responsibility and capability for specific actions being seen as belonging to any particular organisational member or group (tree-like), the responsibility and capability is distributed across a network of individuals, groups and technologies. The distributed view sees institutional e-learning as complex, dynamic, and interdependent assemblages of diverse actors (both human and not) distributed in complex networks.

It is our argument that being aware of the differences in thinking between the SET and BAD frameworks offers insight that can guide the design of interventions that are more likely to bridge the e-learning reality/rhetoric chasm. The following sections describe the development and adaptation of the Moodle Activity Viewer (MAV) at both CQUni and USQ as an example of what is possible when breaking BAD.

Breaking BAD and the development of MAV

The second author works for Learning and Teaching Services at CQUniversity (CQUni). In late 2012, he was working on a guide for teaching staff titled “How can I enhance my teaching practice?”. In contributing to the “Designing effective course structure” section of this guide, the author asked a range of rhetorical questions including “How do you know which resources your students access the most, and the least?”. Providing an answer to this question for the reader took more effort than expected. There are reports available in Moodle 2.2 (the version being used by CQUni at the time) that can be used to answer this question. However, they suffer from a number of limitations including: duplicated report names; unclear differences between reports; usage values that include both staff and student activity; slow report generation; and a tabular format. It was apparent that these limitations were acting as a barrier to reflection on course design. This was especially problematic, as the institution had placed increased emphasis on generating and responding to student feedback (CQUniversity, 2012). Annual course enhancement reports – introduced in 2010 – required teaching staff to respond to feedback from students and highlight enhancements to be made for the course’s next offering (CQUniversity, 2011). Information about activity and resource usage on the course Moodle site was seen by some to be useful in completing these reports. However, there was no apparent strategic or organisational imperative to address issues with the Moodle reports, and it appeared likely that the aging version of Moodle (version 2.2) would persist for some time given other organisational priorities. As a stopgap solution, the author and a colleague engaged in some bricolage and began writing SQL queries against the Moodle database and generating Excel spreadsheets. Whilst this approach provided more useful data, the spreadsheets were manually generated on request and the teaching staff had to bridge the conceptual gap between the information within the Excel spreadsheet and their Moodle course site.
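
To give a flavour of that bricolage, the sketch below shows the kind of query involved, assuming a copy of a Moodle 2.x database (where clicks are recorded in the standard mdl_log table) and a PostgreSQL connection. It is an illustrative reconstruction, not the actual CQUni scripts, and the role id (Moodle’s default “student” role) is an assumption that can differ between sites.

```python
# Hypothetical reconstruction of the kind of ad hoc reporting query written
# at the time (NOT the actual CQUni scripts). Assumes a copy of a Moodle 2.x
# database, where activity is recorded in the standard mdl_log table;
# roleid 5 (Moodle's default "student" role) is a per-site assumption.
import psycopg2

STUDENT_CLICKS_SQL = """
SELECT l.cmid,
       COUNT(*)                 AS clicks,
       COUNT(DISTINCT l.userid) AS students
FROM mdl_log l
JOIN mdl_role_assignments ra ON ra.userid = l.userid
JOIN mdl_context ctx ON ctx.id = ra.contextid
                    AND ctx.contextlevel = 50      -- course-level context
                    AND ctx.instanceid = l.course
WHERE l.course = %(courseid)s
  AND ra.roleid = %(roleid)s                       -- students only, not staff
GROUP BY l.cmid
ORDER BY clicks DESC
"""

def student_clicks(conn, courseid, roleid=5):
    """Return (cmid, clicks, students) per activity/resource in a course."""
    with conn.cursor() as cur:
        cur.execute(STUDENT_CLICKS_SQL, {"courseid": courseid, "roleid": roleid})
        return cur.fetchall()
```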

In the months following, the author started thinking about a better approach. While CQUni had implemented a range of customisations to the institution’s Moodle instance, substantial changes required a clear understanding of the final requirements, alignment with strategic imperatives, and the support of senior management. At this stage of the process it was not clear what the final requirements of a solution would be, hence more experimentation was required to better understand the problem and possible solutions prior to making the case for modifying Moodle. While the author did not have the ability to change the institution’s version of Moodle itself, he did have: a copy of the Moodle database; access to a server computer; and software development abilities. Any bridging of this particular gap would need to draw on available resources (bricolage) and not disturb or impact critical high-availability services such as Moodle. Given uncertainty about what functionality might best enable reflection on course design, any potential solution would also need to enable a significant level of agility and experimentation (bricolage).

The technical solution that seemed to best fulfill these requirements was augmented browsing. Dai et al (2011) define augmented browsing as “an effective means for dynamically adding supplementary information to a webpage without having users navigate away from the page” (p. 2418). The use of augmented browsing to add functionality to an LMS is not new. Leony et al (2012) created a browser add-on that embeds learning analytics graphs directly within the Moodle LMS course home page. Dawson et al (2011) used bookmarklets to generate interactive sociograms that visualise student learning networks as part of SNAPP. The problems that drove SNAPP’s use of augmented browsing – complex and difficult to interpret LMS reports, and the difficulty of getting suggestions from teaching staff integrated into an institutional LMS (Dawson et al., 2011) – mirror those faced at CQU.

Through a process of bricolage the Moodle Activity Viewer (MAV) was developed as an add-on for the Firefox web browser. More specifically, the MAV is built upon another popular Firefox add-on called Greasemonkey, and in Greasemonkey terms MAV is known as a userscript.  However, for the purposes of this paper, the MAV will be referred to more generally as an add-on to the browser. The intent was that the MAV would generate a heat map and embed it directly onto any web page produced by Moodle. A heat map shades each of the links in a web page with a spectrum of colours where the deeper red shades indicate links that are being clicked on more often (see Figure 1). The implementation of the MAV is completely separate from the institutional Moodle instance meaning its use has no impact on the production Moodle environment. Once the MAV add-on is installed into Firefox, and with it turned on, any web page from a Moodle course site can have a heat map overlaid on all Moodle links in that page. This process starts with the MAV add-on recognising a newly loaded page as belonging to a Moodle course site. When this occurs the MAV will generate a query asking for usage figures associated with every relevant Moodle link on that web page. This query is sent to the MAV server hosted on an available server computer. The MAV server translates the query into appropriate queries that will extract the necessary information from the Moodle database. As implemented at CQU, the MAV server relies on a copy of the Moodle database that is updated daily. While not necessary, use of a copy of the Moodle database ensures that there is no risk of disrupting the production Moodle instance.
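
To make that round trip concrete, here is a minimal sketch of what the server half might look like, assuming Flask, the student_clicks() helper from the earlier sketch, and a daily-updated database copy. The route, parameters, JSON shape and connection details are invented for illustration and are not MAV’s actual API (the real code is on GitHub).

```python
# Hypothetical sketch of the MAV-server half of the round trip described
# above: the add-on asks for usage figures for a course, the server queries
# the daily copy of the Moodle database and returns counts keyed by course
# module id (cmid) so the add-on can match them to links on the page.
# Route, parameters, JSON shape and connection details are assumptions.
import psycopg2
from flask import Flask, jsonify, request

from mav_queries import student_clicks  # earlier sketch, saved as mav_queries.py

app = Flask(__name__)

def moodle_copy():
    """Connect to the daily-updated copy of the Moodle database (details assumed)."""
    return psycopg2.connect(dbname="moodle_copy", user="mav", host="localhost")

@app.route("/mav/usage")
def usage():
    courseid = int(request.args["courseid"])
    with moodle_copy() as conn:
        rows = student_clicks(conn, courseid)
    return jsonify({str(cmid): {"clicks": clicks, "students": students}
                    for cmid, clicks, students in rows})
```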

The MAV add-on can be configured to generate overlays based on the number of clicks on a link, or the number of students who have clicked on a link. It can also be configured to limit the overlays to particular groups of students or to a particular student. When used on the main course page, MAV provides an overview of how students are using all of the course resources. Looking at a discussion forum page with the MAV enabled allows the viewer to analyse which threads or messages are receiving the most attention. Hence MAV can provide a simple form of process analytics (Lockyer, Heathcote, & Dawson, 2013).
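
Underneath these options, the heat map reduces to a simple mapping from a link’s usage count to a shade of red. A toy version of such a mapping follows; the actual MAV colour spectrum and bucketing are not reproduced here.

```python
# Toy illustration only: scale each link's count against the busiest link on
# the page and fade from white (no clicks) to deep red (most clicks).
def heat_colour(count: int, max_count: int) -> str:
    if max_count == 0:
        return "#ffffff"               # nothing clicked yet
    intensity = count / max_count      # 0.0 .. 1.0
    gb = round(255 * (1 - intensity))  # green/blue fade out as clicks rise
    return f"#ff{gb:02x}{gb:02x}"

# e.g. heat_colour(0, 40) -> "#ffffff", heat_colour(40, 40) -> "#ff0000"
```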

An initial proof-of-concept implementation of the MAV was developed by April 2013. A few weeks later this implementation was demonstrated to the “Moodle 2 Project Board” to seek approval to continue development. The plan was to engage in small trials with academic staff and evolve the tool, with the intent that this would generate a blueprint for the implementation of heat maps within Moodle itself. The low-risk nature of the approach contributed to approval to continue. However, by July 2013 the institution had downsized through an organisational restructure and resources in the IT department were subsequently reduced. As part of this restructure, and in an effort to reduce costs, the IT department set out to reduce the level of in-house systems development in favour of more established “vanilla” systems (off-the-shelf with limited or no customisations). This new strategy made it unlikely that the MAV would be re-implemented directly within Moodle, meaning the augmented browsing approach might be viable longer term. As the MAV was being developed and refined, it was being tested by a small group of teaching staff within the creator’s team. Then in September 2013, the first official trial was launched, making the MAV available to all staff within one of CQUniversity’s schools.

Figure 1: How MAV works

Early in March 2012, prior to the genesis of the MAV, the second author and a colleague developed a proposal for a student retention project. It was informed by ongoing research into learning analytics at the institution and motivated by a strategic institutional imperative to improve student retention (CQUniversity, 2011). It was not until October 2013 – after the commencement of the first trial of the MAV – that a revised version of the proposal received final approval, and the project commenced in November under the name EASICONNECT. Part of the EASICONNECT project was the inclusion of an early alert system called EASI (Early Alert Student Indicators), intended to identify disengaged students early and provide simple tools to nudge them to re-engage, with the hope of improving student retention. In 2013, between the proposal submission and final approval of the EASICONNECT project, EASI was created as a proof-of-concept under a different name (Student Support Indicators – SSI) and used in a series of small term-based trials, evolving similarly to the MAV. One of the amendments made to the approved proposal by the project sponsor (management) was the inclusion of the MAV as a project deliverable in the EASICONNECT project.

Neither EASI nor the MAV was strictly the result of strategic plans. Both systems arose from bricolage undertaken by two members of CQUni’s Learning and Teaching Services that was later recognised as contributing to the strategic aims of the institution. With the eventual approval of the EASICONNECT project, the creators of EASI and the MAV worked more closely together on these tools and the obvious linkages between them were developed further. Initially this meant modifying the MAV so staff participating in the EASI trial could easily navigate from the MAV to EASI. In Term 1, 2014 EASI introduced links for each student in a course that, when clicked, would open the Moodle course site with the MAV enabled only for the selected student. While EASI showed a summary of the number of clicks made by the student in the course site, the MAV could then contextualise this information, revealing where those clicks took place directly within Moodle. In Term 2, 2014 a feature often requested by teaching staff was added to the MAV that would identify students who had and hadn’t clicked on links. The MAV also provided an option for staff to open EASI to initiate an email nudge to either group of students. Figure 2 provides a comparison of week-to-week usage of the MAV between Terms 1 and 2 of 2014. The graphs show usage in terms of the number of page views and the number of staff using the system, with the Term 2 figures covering up to the end of Week 10 (of 15).

Both MAV and its sister project EASI were initiated as a form of bricolage. It was only later that both projects enjoyed the synthesised environment of a strategic project that provided the space and institutional permission for this work to scale and continue to merge. MAV arose due to the limited affordances offered by the LMS and the promise that different ICT could be harnessed to enhance the perceived affordances. Remembering that affordances are not something innate to a tool, but are instead co-constitutive between tool, user and context, the on-going use of bricolage allowed the potential affordances of the tool to evolve in response to use by teaching staff. Through this approach MAV has been able to evolve from potentially offering affordances of value to teaching staff as part of “design for reflection and redesign” (Dimitriadis & Goodyear, 2013) to also offering potential affordances for “design for orchestration” (Dimitriadis & Goodyear, 2013).

Figure 2: 2014 MAV usage at CQUni, comparing Term 1 and Term 2 (two graphs: MAV usage as page views; MAV usage as number of staff)

Implementing MAV as a browser add-on also enables a break from the tree-like conceptions that underpin the design of large integrated systems like an LMS. The tree-like conception is so evident in the Moodle LMS that it is visible in the name: Moodle is an acronym for Modular Object-Oriented Dynamic Learning Environment, with Modular capturing the fact that “Moodle is built in a highly modular fashion” (Dougiamas & Taylor, 2003, p. 173), meaning that logical decomposition is used to break the large integrated system into small components or modules. This modular architecture allows the rapid development and addition of independent plugins and is a key enabler of the flexibility of Moodle. However, it is based on each of the modules being largely independent of each other, which has the consequence of making it more difficult to implement functionality that crosses modular boundaries, such as taking usage information from the logging system and integrating it into all of the modules that work together to produce a Moodle web page.

Extending MAV at another institution

In 2012 the first author commenced work within the Faculty of Education at the University of Southern Queensland (USQ). The majority of the allocated teaching load involved two offerings of EDC3100, ICTs and Pedagogy. EDC3100 is a large (300+ on-campus and online students in first semester, and ~100 totally online in second semester) core, third-year course for Bachelor of Education (BEdu) students. The author expected that USQ would have high quality systems and processes to support large, online courses. This was due to USQ’s significant reputation in the practice and research of distance and online education; its then stated vision “To be recognised as a world leader in open and flexible higher education” (USQ, 2012, p. 5); and the observation that “by 2012 up to 70% of students in the Bachelor of Education were studying at least some subjects online” (Albion, 2014, p. 1163). The experience of teaching EDC3100 quickly revealed an e-learning reality/rhetoric chasm.

As a core course, EDC3100 has students studying at all of USQ’s campuses, at a Malaysian partner, and online from across Australia and the world. The students are studying to become teachers in early childhood, primary, secondary and VET settings. The course is designed so that the “Study Desk” (the Moodle course site) is an essential source of information and support for all students. The course design makes heavy use of discussion forums for a range of learning activities. Given the size and diversity of the student population, there are times when it is beneficial for teaching staff to customise their responses to the student’s context and specialisation. For instance, an example from the Australian Curriculum may be appropriate for a primary or lower secondary pre-service teacher based in Australia, but inappropriate for a VET pre-service teacher. Whilst the Moodle discussion forum draws on user profiles to identify the authors of posts, the available information is limited to that provided centrally by the institution and by the users. For EDC3100 this means that a student’s campus is apparent through their membership of the Moodle groups automatically created by USQ’s systems; however, seeing this requires navigating away from the discussion forum. The student’s specialisation is not visible in Moodle at all. The only way this information is available is to ask an administrative staff member with the appropriate student records access to generate a spreadsheet (and then update the spreadsheet as students add and drop the course) that includes this specific information. The lack of easy access to this information constrains the ability of teaching staff to effectively intervene.

One explanation for the existence of this gap is the limitations of the SET approach to institutional e-learning systems. The tree-based practice of logical decomposition results in distinct tasks – such as the management of student demographic and enrolment data (Peoplesoft) and the practice of online learning (Moodle) – being supported by different information systems with different data models, owned by different organisational units. Logical decomposition allows each of these individual systems and their owners to focus on the efficiency of their primary task. However, it comes at the cost of making it more difficult to recognise and respond to requirements that cut across the tasks (e.g. teaching). It is even more difficult when the requirement is specific to a subset of the organisation. For example, ensuring that information about the specialisation of BEdu students is evident in Moodle is only of interest to some of the staff teaching into the BEdu. Even if this barrier could be overcome, modifying the Moodle discussion forum to make this type of information more visible would be highly unlikely due to the cost, difficulty and (quite understandable) reluctance to make changes to enterprise software inherent in the established view of technology.

To address this need the MAV add-on was modified to recognise USQ Moodle web pages that contain links to student profiles (e.g. a forum post). On recognising such a page, the modified version of MAV queries a database populated using the manually provided spreadsheet described above. MAV uses that information to add to each student profile link a popup dialog providing student information, such as specialisation and campus, without leaving the page. Adding different information (e.g. activity completion, GPA, etc.) to this dialog can proceed without the approval of any centralised authority. The MAV server and the database run on the author’s laptop, and the author has the skill to modify the database and write new code for both the MAV server and client. As such it is an example of Podolny and Page’s (1998) distributed approach to governance. The only limitation is whether or not the necessary information can be retrieved in a format that can be easily imported into the database.
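
As an illustration of how small this distributed-governance footprint can be, the sketch below shows one plausible data path for the USQ adaptation, assuming the spreadsheet is exported to CSV. The file name, column names and schema are assumptions for illustration, not the actual implementation.

```python
# Hypothetical sketch of the USQ adaptation's data path: load the manually
# supplied enrolment spreadsheet (exported to CSV) into a local database so
# the MAV server on the author's laptop can answer popup queries keyed by
# the student id in a Moodle profile link. All names/columns are assumptions.
import csv
import sqlite3

def load_students(csv_path="edc3100_students.csv", db_path="mav_usq.db"):
    conn = sqlite3.connect(db_path)
    conn.execute("""CREATE TABLE IF NOT EXISTS student (
                        userid INTEGER PRIMARY KEY,
                        name TEXT, campus TEXT, specialisation TEXT)""")
    with open(csv_path, newline="") as f:
        rows = [(int(r["userid"]), r["name"], r["campus"], r["specialisation"])
                for r in csv.DictReader(f)]
    # Re-loading an updated spreadsheet replaces rows for changed enrolments.
    conn.executemany("INSERT OR REPLACE INTO student VALUES (?, ?, ?, ?)", rows)
    conn.commit()
    return conn

def student_info(conn, userid):
    """The details the popup dialog shows for one student profile link."""
    row = conn.execute("SELECT name, campus, specialisation FROM student "
                       "WHERE userid = ?", (userid,)).fetchone()
    return None if row is None else dict(zip(("name", "campus", "specialisation"), row))
```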

Conclusions, implications and future work

Future work will focus on continuing an on-going cycle of design-based research exploring how, and with what impacts, the BAD framework can be fruitfully integrated into the practice of institutional e-learning. To aid this process we are exploring how MAV, its various modifications, and its descendants can be effectively developed and shared within and between institutions. As a first step, the CQU MAV code has been released on GitHub (https://github.com/damoclark/mav); development is occurring in the open and interested collaborators are welcome. A particular interest is in exploring and evaluating the use of MAV to implement scaffolding and context-sensitive conglomerations. Proposed in Jones (2012), a conglomeration seeks to enhance the affordances offered by any standard e-learning tool (e.g. a discussion forum) with a range of additional and often contextually specific information and functionality. Both uses of MAV described above are simple examples of a conglomeration. Of particular interest is whether these conglomerations can be used to explore whether Goodyear’s (2009) idea that “research-based evidence and the fruits of successful teaching experience can be embodied in the resources that teachers use at design time” can be extended to institutional e-learning tools.

Perhaps the biggest challenge to this work arises from the observation that the SET framework forms the foundation for current institutional practice and that the SET and BAD frameworks are largely incommensurable. At CQU, MAV has benefited from the recognition and support of senior management; yet it still challenges the assumptions of those operating solely through the SET framework. The incommensurable nature of the SET and BAD frameworks implies that any attempt to fruitfully merge the two will need to deal with existing, and sometimes strongly held, assumptions and mindsets. For example, rather than require the IT division to formally approve and develop all applications of ICT, its focus should perhaps turn (at least in part) to enabling and encouraging “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems … arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305). Similarly, rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift towards ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution. Rather than accept “the over-hyped, pre-configured digital products and practices that are being imported continually into university settings” (Selwyn, 2013, p. 3), perhaps universities should instead heed Selwyn’s call that “a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of digital technology for university educators by university educators” (p. 3).

Biggs (2012) conceptualises the job of a teacher as being responsible for creating a learning context in which “all students are more likely to use the higher order learning processes which ‘academic’ students use spontaneously” (p. 39). If this perspective is taken one step back, then it is the responsibility of a university to create an institutional context in which all teaching staff are more likely to create the type of learning context which ‘good’ teachers create spontaneously. The on-going existence of the e-learning reality/rhetoric chasm suggests many universities are yet to achieve this goal. This paper has argued that this is due in part to the institutional implementation of e-learning being based on a limited SET of theoretical conceptions. The paper has compared the SET framework with the BAD framework and argued that the BAD framework provides a more promising theoretical foundation for bridging this chasm. It has illustrated the strengths and weaknesses of these two frameworks through a description of the origins and on-going use of the Moodle Activity Viewer (MAV) at two institutions. The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

References

Albion, P. (2014). From Creation to Curation: Evolution of an Authentic “Assessment for Learning” Task. In M. Searson & M. Ochoa (Eds.), Society for Information Technology & Teacher Education International Conference (pp. 1160-1168). Chesapeake, VA: AACE.

Biggs, J. (2012). What the student does: teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55. doi:10.1080/07294360.2012.642839

Büscher, M., Gill, S., Mogensen, P., & Shapiro, D. (2001). Landscapes of practice: bricolage as a method for situated design. Computer Supported Cooperative Work, 10(1), 1-28.

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297-309.

CQUniversity. (2011). CQUniversity Annual Report 2010 (p. 136). Rockhampton.

CQUniversity. (2012). CQUniversity Annual Report 2011 (p. 84). Rockhampton.

Dai, H. J., Tsai, W. C., Tsai, R. T. H., & Hsu, W. L. (2011). Enhancing search results with semantic annotation using augmented browsing. IJCAI Proceedings – International Joint Conference on Artificial Intelligence, 22(3), 2418-2423.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: visualising and evaluating student learning networks. Final Report 2011. Canberra: Australian Learning and Teaching Council.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43-62). New York: Springer.

Dimitriadis, Y., & Goodyear, P. (2013). Forward-oriented design for learning: illustrating the approach. Research in Learning Technology, 21, 1-13. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/20290

Downes, S. (2011). “Connectivism” and Connective Knowledge. Retrieved from http://www.huffingtonpost.com/stephen-downes/connectivism-and-connecti_b_804653.html

Dron, J. (2013). Soft is hard and hard is easy: learning technologies and social media. Form@ Re-Open Journal per La Formazione in Rete, 13(1), 32-43. Retrieved from http://fupress.net/index.php/formare/article/view/12613

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of The International Business Schools Computing Association. Baltimore, MD.

Goodyear, P. (2009). Teaching, technology and educational design: The architecture of productive learning environments (pp. 1-37). Sydney. Retrieved from http://www.olt.gov.au/system/files/resources/Goodyear%2C P ALTC Fellowship report 2010.pdf

Goodyear, P., Carvalho, L., & Dohn, N. B. (2014). Design for networked learning: framing relations between participants’ activities and the physical setting. In S. Bayne, M. de Laat, T. Ryberg, & C. Sinclair (Eds.), Ninth International Conference on Networked Learning 2014 (pp. 137-144). Edinburgh, Scotland. Retrieved from http://www.networkedlearningconference.org.uk/abstracts/pdf/goodyear.pdf

Groom, J., & Lamb, B. (2014). Reclaiming innovation. EDUCAUSE Review, 1-12. Retrieved from http://www.educause.edu/visuals/shared/er/extras/2014/ReclaimingInnovation/default.html

Hannon, J. (2013). Incommensurate practices: sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29(2), 168-178. doi:10.1111/j.1365-2729.2012.00480.x

Holt, D., Palmer, S., Munro, J., Solomonides, I., Gosper, M., Hicks, M., … Hollenbeck, R. (2013). Leading the quality management of online learning environments in Australian higher education. Australasian Journal of Educational Technology, 29(3), 387-402. Retrieved from http://www.ascilite.org.au/ajet/submission/index.php/AJET/article/view/84

Inglis, A. (2007). Approaches taken by Australian universities to documenting institutional e-learning strategies. In R. J. Atkinson, C. McBeath, S.K. Soong, & C. Cheers (Eds.), ICT: Providing Choices for Learners and Learning. Proceedings ASCILITE Singapore 2007 (pp. 419-427). Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/inglis.pdf

Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas. Retrieved from http://www.nmc.org/publications/2014-technology-outlook-au

Johri, A. (2011). The socio-materiality of learning practices and implications for the field of learning technology. Research in Learning Technology, 19(3), 207-217. Retrieved from http://researchinlearningtechnology.net/coaction/index.php/rlt/article/view/17110

Jones, D. (2012). The life and death of Webfuse : principles for learning and leading into the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 414-423). Wellington, NZ.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. In 17th Biennial Conference of the Open and Distance Learning Association of Australia. Adelaide.

Kay, A. (1984). Computer Software. Scientific American, 251(3), 53-59.

Kezar, A. (2001). Understanding and Facilitating Organizational Change in the 21st Century: Recent Research and Conceptualizations. ASHE-ERIC Higher Education Report, 28(4).

Kirkwood, A., & Price, L. (2013). Technology-enhanced learning and teaching in higher education: what is “enhanced” and how do we know? A critical literature review. Learning, Media and Technology, (August), 1-31. doi:10.1080/17439884.2013.770404

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. Intelligent Systems, IEEE, 18(6), 70-75.

Lane, K. (2014). The University of API (p. 28). Retrieved from http://university.apievangelist.com/white-paper.html

Laxon, A. (2013, September 14). Exams go online for university students. The New Zealand Herald.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439-1459. doi:10.1177/0002764213479367

McKenney, S., & Reeves, T. C. (2013). Systematic Review of Design-Based Research Progress: Is a Little Knowledge a Dangerous Thing? Educational Researcher, 42(2), 97-100. doi:10.3102/0013189X12463781

OECD. (2005). E-Learning in Tertiary Education: Where do we stand? (p. 289). Paris, France: Centre for Educational Research and Innovation, Organisation for Economic Co-operation and Development. Retrieved from http://new.sourceoecd.org/education/9264009205

Podolny, J., & Page, K. (1998). Network forms of organization. Annual Review of Sociology, 24, 57-76.

Rahman, N., & Dron, J. (2012). Challenges and opportunities for learning analytics when formal teaching meets social spaces. In 2nd International Conference on Learning Analytics and Knowledge (pp. 54-58). Vancouver, British Columbia: ACM Press. doi:10.1145/2330601.2330619

Reid, I. C. (2009). The contradictory managerialism of university quality assurance. Journal of Education Policy, 24(5), 575-593. doi:10.1080/02680930903131242

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Scribner, J. (2005). The problems of practice: Bricolage as a metaphor for teachers’ work and learning. Alberta Journal of Educational Research, 51(4), 295-310. Retrieved from http://ajer.journalhosting.ucalgary.ca/ajer/index.php/ajer/article/view/587

Selwyn, N. (2008). From state‐of‐the‐art to state‐of‐the‐actual? Introduction to a special issue. Technology, Pedagogy and Education, 17(2), 83-87. doi:10.1080/14759390802098573

Selwyn, N. (2012). Social media in higher education. The Europa World of Learning. Retrieved from http://www.educationarena.com/pdf/sample/sample-essay-selwyn.pdf

Selwyn, N. (2013). Digital technologies in universities: problems posing as solutions? Learning, Media and Technology, 38(1), 1-3. doi:10.1080/17439884.2013.759965

Siemens, G. (2008). What is the unique idea in Connectivism? Retrieved July 13, 2014, from http://www.connectivism.ca/?p=116

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53-79.

USQ. (2012). University of Southern Queensland 2011 Annual Report. Toowoomba. doi:10.1037/e543872012-001

Visscher-Voerman, I., & Gustafson, K. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research and Development, 52(2), 69-89.

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5-23.

Weimer, M. (2007). Intriguing connections but not with the past. International Journal for Academic Development, 12(1), 5-8.

Zellweger, F. (2005). Strategic Management of Educational Technology: The Importance of Leadership and Management. Riga, Latvia.

Leadership as defining what’s successful

After spending a few days visiting friends and family in Central Queensland – not to mention enjoying the beach – a long 7+ hour drive home provided an opportunity for some thinking. I’ve long had significant qualms about the notion of leadership, especially as it is increasingly being understood and defined by the current corporatisation of universities and schools. The rhetoric is increasingly strong amongst schools, with the current fashion for assuming that principals can be the saviours of schools that have broken free from the evils of bureaucracy. I even work within an institution where a leadership research group is quite active amongst the education faculty.

On the whole, my experience of leadership in organisations has been negative. At best, under bad leadership the institution merely bumbles along. I’m wondering whether or not questioning this notion of leadership might form an interesting future research agenda. The following is an attempt to make concrete some thinking from the drive home, spark some comments, and set me up for some more (re-)reading. It’s an ill-informed mind dump sparked somewhat by some early experiences on return from leave.


In the current complex organisational environment, I’m thinking that “leadership” is essentially the power to define what success is, both prior to and after the fact. I wonder whether any apparent success attributed to the “great leader” is solely down to how they have defined success? I’m also wondering how much of that success is due to less than ethical or logical definitions of success?

The definition of success prior to the fact is embodied in the model of process currently assumed by leaders, i.e. teleological processes, in which the great leader must define some ideal future state (e.g. adoption of Moodle, Peoplesoft, or some other system; an organisational restructure that creates “one university”; or, perhaps even worse, a new 5-year strategic plan) behind which the weight of the institution will then be thrown. All roads and work must lead to the defined point of success.

This is the Dave Snowden idea of giving up the evolutionary potential of the present for the promise of some ideal future state. It’s a point he’ll often illustrate with this quote from Seneca:

The greatest loss of time is delay and expectation, which depend upon the future. We let go the present, which we have in our power, and look forward to that which depends upon chance, and so relinquish a certainty for an uncertainty.

Snowden’s use of this quote comes from the observation that some systems/situations are examples of Complex Adaptive Systems (CAS). These are systems where traditional expectations of cause and effect don’t hold. When you intervene in such systems you cannot predict what will happen, only observe it in retrospect. In such systems the idea that you can specify up front where you want to go is little more than wishful thinking. So defining success – in these systems – prior to the fact is a little silly. It questions the assumptions of such leadership, including the assumption that leaders can make a difference.

So when the Executive Dean of a Faculty – one that includes programs in information technology and information systems – is awarded “ICT Educator of the Year” for the state because of the huge growth in student numbers, is it because of the changes he’s made? Or is it because he was lucky enough to be in power at (or just after) the peak of the IT boom? The assumption is that this leader (or perhaps his predecessor) made logical contributions and changes to the organisation to achieve this boom in student numbers, or at least made changes that enabled the organisation to be better placed to handle and respond to the explosion in demand created by external changes.

But perhaps, rather than there being a single reason for success (great leadership), there were simply a large number of small factors – with no central driving intelligence or purpose – that enabled this particular institution to achieve what it achieved. Similarly, when a few years later the same group of IT-related programs had few if any students, it wasn’t because this “ICT Educator of the Year” had failed. Nor was it because of any other single factor, but instead hundreds and thousands of small factors, both internal and external (some larger than others).

The idea that there can be a single cause (or a single leader) responsible for anything in a complex organisational environment seems faulty. But because it is demanded of them, leaders must spend more time attempting to define and convince people of their success. In essence then, successful leadership becomes more about your ability to define success and promulgate wide acceptance of that definition.

KPIs and accountability galloping to help

This need to define and promulgate success is aided considerably by simple numeric measures: the number of student applications; DFW rates; numeric responses on student evaluations of courses – did you get 4.3?; journal impact factors and article citation metrics; and many, many more. These simple figures make it easy for leaders to define specific perspectives on success. This is problematic, and its many problems are well known. For example:

  • Goodhart’s law – “When a measure becomes a target, it ceases to be a good measure.”
  • Campbell’s law – “The more any quantitative social indicator (or even some qualitative indicator) is used for social decision-making, the more subject it will be to corruption pressures and the more apt it will be to distort and corrupt the social processes it is intended to monitor.”
  • the Lucas critique.
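
To make Goodhart’s law a little more concrete, here’s a toy simulation (my own illustration, with entirely hypothetical numbers, not anything from the literature above): a noisy indicator tracks true quality reasonably well until the indicator becomes a target, at which point those with the most to gain optimise the indicator itself and its relationship with quality collapses.

    # Toy illustration of Goodhart's law (all numbers hypothetical).
    import random

    random.seed(42)
    N = 1000

    def correlation(xs, ys):
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    # True (unobservable) teaching quality for N courses.
    quality = [random.random() for _ in range(N)]

    # Before the measure becomes a target: indicator = quality + noise.
    before = [q + random.gauss(0, 0.5) for q in quality]

    # After it becomes a target: the weakest have the strongest
    # incentive to game the indicator (teach to the test, inflate marks).
    after = [q + random.gauss(0, 0.5) + 2 * (1 - q) for q in quality]

    print("before target:", round(correlation(quality, before), 2))  # ~0.5
    print("after target: ", round(correlation(quality, after), 2))   # negative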

In practice, you get the problem identified by Tutty et al (2008), where rather than improve teaching, institutional quality measures “actually encourage inferior teaching approaches” (p. 182). It’s why you have an LMS migration project receiving an institutional award for quality even though, for the first few weeks of the first semester, it was largely unavailable to students due to dumb technical decisions by the project team, decisions that required a large additional investment in consultants to fix.

Would this project have received the award if a senior leader in the institution (and the institution itself) had not been heavily reliant upon the project being seen as a success?

Would the people involved in giving the project the award have had reasonable grounds for thinking it award-winning? Is the success of the project, and of leadership, all about who gets to define which perspective is important?

Some other quick questions

Some questions for me to consider.

  • Where does this perspective sit within the plethora of literature on leadership and organisational studies, especially the education literature? How much of this is influenced by my earlier reading of “Managing without Leadership: Towards a Theory of Organizational Functioning”?
  • Given the limited likelihood of changing how leadership is practiced within the current organisational and societal context, how do you act upon any insights this perspective might provide? i.e. how the hell do I live (and heaven forbid thrive) in such a context?

References

Tutty, J., Sheard, J., & Avram, C. (2008). Teaching in the current higher education environment: perceptions of IT academics. Computer Science Education, 18(3), 171–185.

Documenting the gap between “state of the art” and “state of the actual”

I came across Perrotta and Evans (2013) in my morning random ramblings through my PLN and was particularly struck by this:

a rising awareness of a gap between ‘state of art’ experimental studies on learning and technology and the ‘state of the actual’ (Selwyn, 2011), that is, the messy realities of schooling where compromise, pragmatism and politics take centre stage, and where the technological transformation promised by enthusiasts over the last three decades failed to materialize. (pp. 261-262)

For my own selfish reasons (i.e. I have to work within the “state of the actual”) my research interests lie in understanding and figuring out how to improve the “state of the actual”. My Moodlemoot’AU 2013 presentation next week is an attempt to establish the rationale and map out one set of interventions I’m hoping to undertake. This post is an attempt to make explicit some on-going thinking about this and related work. In particular, I’m trying to come up with a research project to document the “state of the actual”, with the aim of figuring out how to intervene, but also, hopefully, of informing policy makers.

Some questions I need to think about

  1. What literature do I need to look at that documents the reality of working with current generation university information systems?
  2. What’s a good research method – especially data capture – to get the detail of the state of the actual?

Why this is important

A few observations can be – and have been – made about the quality of institutional learning and teaching, especially university e-learning:

  1. It’s not that good.

    This is the core problem. It needs to be better.

  2. The current practices being adopted to remedy these problems aren’t working.

    Doing more of the same isn’t going to fix this problem. It’s time to look elsewhere.

  3. The workload for teaching staff is high and increasing.

    This is my personal problem, but I also think it’s indicative of a broader issue, i.e. many of the current practices aimed at improving quality assume a “blame the teacher” approach. Sure, there are some pretty poor academics, but most of the teachers I know are trying the best they can.

My proposition

Good TPACK == Good learning and teaching

Good quality learning and teaching requires good TPACK – Technological Pedagogical and Content Knowledge. The quote I use in the abstract for the Moodlemoot presentation offers a good summary:

Quality teaching requires developing a nuanced understanding of the complex relationships between technology, content, and pedagogy, and using this understanding to develop appropriate, context-specific strategies and representations. Productive technology integration in teaching needs to consider all three issues not in isolation, but rather within the complex relationships in the system defined by the three key elements. (Mishra & Koehler, 2006, p. 1029)

For some people the above is obvious. You can’t have quality teaching without a nuanced and context-specific understanding of the complex relationships between technology, pedagogy and content. Beyond this simple statement there are a lot of different perspectives on the nature of this understanding, the nature of the three components, and their relationships. For now, I’m not engaging with those. Instead, I’m simply arguing that

the better the quality of the TPACK, then the better the quality of the learning and teaching

Knowledge is not found (just) in the teacher

The current organisational responses to improving the quality of learning and teaching are almost entirely focused on increasing the level of TPACK held by the teacher. This is done by a variety of means:

  1. Require formal teaching qualifications for all teachers.

    Because obviously, if you have a teaching qualification then you have better TPACK and the quality of your teaching will be better. Which is obviously why the online courses taught by folk from the Education disciplines are the best.

  2. Running training sessions introducing new tools.
  3. “Scaffolding” staff by requiring them to follow minimum standards and other policies.

This is where I quote Loveless (2011)

Our theoretical understandings of pedagogy have developed beyond Shulman’s early characteristics of teacher knowledge as static and located in the individual. They now incorporate understandings of the construction of knowledge through distributed cognition, design, interaction, integration, context, complexity, dialogue, conversation, concepts and relationships. (p. 304)

Better tools == Better TPACK == Better quality learning and teaching

TPACK isn’t just found in the head of the academic. It’s also found in the tools they use, the interactions they engage in, and so on. The problem that interests me is that the quality of the tools found in the “state of the actual” of university e-learning is incredibly bad, especially in terms of helping the generation of TPACK.

Norman (1993) argues “that technology can make us smart” (p. 3) through our ability to create artifacts that expand our capabilities. Due, however, to the “machine-centered view of the design of machines and, for that matter, the understanding of people” (Norman, 1993, p. 9), our technology, rather than aiding cognition, “more often interferes and confuses than aids and clarifies” (p. 9). Without appropriately designed artifacts “human beings perform poorly or cannot perform at all” (Dickelman, 1995, p. 24). Norman (1993) points to the long history of tool/artifact making amongst human beings and suggests that

The technology of artifacts is essential for the growth in human knowledge and mental capabilities (p. 5)

Documenting the “state of the actual”

So, one of the questions I’m interested in is just how well are the current artifacts being used in institutional e-learning helping “the growth in human knowledge and mental capabilities”?

For a long time I’ve talked with a range of people about a research project that would aim to capture the experiences of those at the coal face in order to answer this question. The hoops I’m currently having to jump through to bring together a raft of disparate information systems and finalise results for 300+ students have really got me thinking about this process.

As a first step, I’m thinking I’ll take the time to document this process. Not to mention my next task which is the creation/modification of three course sites for the courses I’m teaching next semester. The combination of both these tasks at the same time could be quite revealing.

References

Mishra, P., & Koehler, M. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108(6), 1017–1054.

Norman, D. A. (1993). Things that make us smart: Defending human attributes in the age of the machine. Reading, MA: Addison Wesley.

Perrotta, C., & Evans, M. A. (2013). Instructional design or school politics? A discussion of “orchestration” in TEL research. Journal of Computer Assisted Learning, 29(3), 260–269. doi:10.1111/j.1365-2729.2012.00494.x

Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

Thanks to @cj13 for the heads up about the EDUCAUSE analytics sprint – in the midst of moving, conferences, the end/start of term and grant writing, I’d missed it. I found it interesting that the first thing that struck my eye was a link to a discussion titled “Faculty need how-to information for the data they do have”. It’s interesting because the grant application I’m writing is aimed directly at this general area, though we perhaps have a slightly different take on the problem.

As it happens, I’m about to reframe the “outcomes and rationale” section of the grant. So, rather than lose the existing content I’m going to post it below to share the thoughts and see what interesting connections arise. Some early thoughts on the project are here and we’re aiming for OLT funding.

The project team for this application includes myself, Prof Lorelle Burton (USQ), Dr Angela Murphy (CQUni), Prof Bobby Harreveld (CQUni), Colin Beer (CQUni), Damien Clark (CQUni), and last, but by no means least, Dr Shane Dawson (UBC).

Abstract

Learning analytics is the use of tools and techniques to gather data about the learning process and the use of that data to design, develop and evaluate learning and teaching practice. A significant challenge for learning analytics is the complexity of transforming the data it reveals into informed pedagogical action. This project will investigate how and with what impacts learning analytics can be used to inform individual pedagogical practice. Using Participatory Action Research the project will support groups of academics from two participating universities in using learning analytics to improve their learning and teaching. From this the project will generate insight into how best to use learning analytics to inform pedagogical practice, the impacts of such action, and the types of tools and organisational policies that enable this practice. These insights will be made available through an evolving, online knowledge base and appropriate design guidelines, and encapsulated in a number of supporting tools for an open source LMS.

Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4)

the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Interest in learning analytics has been growing rapidly over recent years, with Ferguson (2012) suggesting it is amongst the fastest-growing areas of research within technology-enhanced learning, driven by a combination of technological, pedagogical and political/economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development, it is likely to see widespread adoption within higher education in the next 2-3 years. The Horizon Report’s technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics in the “one year or less” adoption horizon for the first time, suggesting that Australian universities are particularly interested in “finding new ways to measure student performance, ideally in real-time” (Johnson, Adams, & Cummins, 2012, p. 1).

The rise of learning analytics as a field of research is timely given the broadening participation agenda of the Australian higher education sector. Commonwealth government targets set in response to the Bradley review (Bradley et al., 2008) are ambitious, and are in line with the move from elite to mass higher education globally, particularly in OECD countries (Santiago et al., 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) bachelor degrees, we will need to enrol and graduate more students. Many of these students will be from non-traditional backgrounds, and would have been considered ‘high-risk’ students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions that support the success of this larger and more diverse body of students?

To date much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012). The dominant use of learning analytics within higher education has largely been by management (Dawson et al., 2011) or support staff. The larger promise of learning analytics is when it is used “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted this practice has implications not only for student performance, but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). The ability to correctly apply and interpret the findings of learning analytics into practice is, however, difficult and time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able “to readily and accurately interpret the data and translate such findings into practice” (Dawson & McWilliam, 2008, p. 12).

Outcomes

This project seeks to address this challenge. It seeks to explore how and with what impacts educators can be enabled and encouraged to effectively and appropriately use learning analytics to inform individual pedagogical practice. In doing so, the project aims to develop the following outcomes:

  1. An online knowledge base that guides educators and institutions in the use of learning analytics to inform individual pedagogical practice.
  2. Enhancements to at least 12 courses across the two participating institutions through the application of learning analytics.
  3. A “harnessing analytics model” that explains how and with what impacts learning analytics can be used to inform individual pedagogical practice, including identification of enablers, barriers and critical success factors.
  4. Design guidelines explaining how to modify e-learning information systems to better enable the application of learning analytics to inform pedagogical practice.
  5. Enhanced and new learning analytics tools for the Moodle LMS based on those design guidelines and integrated with the knowledge base.
  6. Further testing and enhancement of existing – and identification of new – trends, correlations and patterns evident in usage data.

How

The project aims to develop these outcomes by directly helping groups of teaching academics harness learning analytics to observe and intervene in their teaching. This direct engagement in practice will take the form of cycles of Participatory Action Research (PAR) at the University of Southern Queensland and CQUniversity. The use of PAR will provide the “opportunity for codeveloping processes with people rather than for people” (McIntyre, 2008, p. xii). The PAR cycles will be supported by, and contribute to, the evolution of the knowledge base, and will inform enhancements to learning analytics tools for Moodle – the LMS at both institutions and at over 18 Australian universities. The institution- and LMS-specific interventions developed during the PAR cycles will be reviewed, tested and abstracted into broader models and guidelines that can be used at other institutions and within other e-learning tools. This sharing of insight between context-specific outcomes and broader knowledge will be supported by the active involvement of learning analytics experts from other institutions and the project reference group.

References

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from http://www.deewr.gov.au/HigherEducation/Review/Documents/PDF/Higher Education Review_one document_02.pdf

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final report 2011. Canberra. Retrieved from http://research.uow.edu.au/content/groups/public/@web/@learnnet/documents/doc/uow115678.pdf

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Melbourne: Australian Learning and Teaching Council.

Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Milton Keynes, UK: Knowledge Media Institute, The Open University. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: An integrated & modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

Light-weight analytics tools as part of scaffolding, context-sensitive conglomerations

A couple of days ago I floated the idea of scaffolding, context-sensitive conglomerations as one idea/model/suggestion for how e-learning systems (currently mostly the LMS, though hopefully other models will arise) might be structured.

George Siemens has posted about light-weight analytics tools such as SNAPP. Both of the comments on that post are, from my current somewhat focused/biased perspective, suggestions of the need for scaffolding conglomerations. Both comments are from practitioners who talk about how they supplement their use of discussion forums with other forms of representation. It would appear obvious that these combinations of tools are useful. I’m pretty sure you could find quite a few talented and motivated academics across the world who are using this combination. I’m also pretty sure that few of them would be located within the same institution.

Are there any discussion forum tools in e-learning systems that already provide this sort of scaffolding for users? Are there any IT departments in universities that have recognised this need and are helping academics make this connection?

I’m not aware of any, and this suggests to me that there are some fundamental problems with the way these systems are being supported and structured, i.e. current approaches make it unlikely that these sorts of networks of tools/conglomerations will arise.

SNAPP has taken a good approach to making it simpler to create these conglomerations: browser plugins. But the advantages of that approach come with a negative, i.e. I don’t believe you can currently generate a SNAPP visualisation for a group of courses (e.g. to see how the students/staff in a program are interacting), or compare visualisations between different courses. You also can’t easily combine SNAPP with other context-specific data sources such as the student records system.

Consequently, SNAPP is a great example of a tool that enables a “scaffolding conglomeration” when combined with an LMS discussion forum, but it remains difficult to add the “context-sensitive” component. A sketch of the sort of multi-course view that’s currently missing follows.
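
To make that concrete, here’s a minimal sketch (not SNAPP itself) of a reply network built across a group of courses. It assumes direct, read-only access to a Moodle database; the mdl_forum_posts and mdl_forum_discussions tables are part of the stock Moodle schema, but the connection details, course ids and output choices are all hypothetical.

    # Sketch: a reply network across a *group* of courses, built straight
    # from Moodle's forum tables (something a per-page browser plugin
    # can't currently do). Assumes read-only DB access.
    import networkx as nx
    import psycopg2  # assuming a Postgres-backed Moodle install

    COURSE_IDS = (101, 102, 103)  # hypothetical: all courses in one program

    conn = psycopg2.connect(dbname="moodle", user="report")
    cur = conn.cursor()

    # Each reply is an edge from the replying author to the author of
    # the post replied to (parent = 0 marks a discussion starter).
    cur.execute("""
        SELECT child.userid, parent.userid
          FROM mdl_forum_posts child
          JOIN mdl_forum_posts parent ON child.parent = parent.id
          JOIN mdl_forum_discussions d ON child.discussion = d.id
         WHERE d.course IN %s
    """, (COURSE_IDS,))

    g = nx.DiGraph()
    for replier, author in cur.fetchall():
        if g.has_edge(replier, author):
            g[replier][author]["weight"] += 1
        else:
            g.add_edge(replier, author, weight=1)

    # The kinds of things SNAPP shows visually, now program-wide.
    print("participants:", g.number_of_nodes())
    print("most connected:", sorted(g.degree, key=lambda p: -p[1])[:5])
    nx.write_gexf(g, "program_forum_network.gexf")  # inspect in Gephi

Combining this with the student records system would then just be another join, which is exactly the “context-sensitive” component the browser-plugin approach struggles to reach.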

I think there’s value in exploring how SNAPP and similar tools can be used as both scaffolding and context-sensitive conglomerations and, more importantly, what impact they can have on the practice of teachers and, subsequently, the quality of learning.

Adding advice

While I remember, there’s a next step that I’d like to see a “scaffolding, context-sensitive conglomeration” take: advice, examples and connections.

For example, assume I’m teaching a course, I use a discussion forum, and I’ve designed its use for a specific pedagogic purpose. I’ve installed SNAPP and, to my horror, discover a problem. What do I do? What strategies can I employ to address this problem? What strategies have other academics in similar (or even different) situations used? What happened? Can I get their contact details so I can have a chat?

I would imagine that this addition could also be implemented for students who discovered a “bad” pattern in their own practice. They could receive advice, examples and connections from other students.

The learning analytics community seems to lean towards “intelligent”/”adaptive” software to provide this sort of service. I’m more interested in how we can use these tools to connect people and provide the scaffolding that enables and encourages them to take some action, as the sketch below tries to illustrate.
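
As a rough sketch of that non-“intelligent” alternative: nothing more than a shared lookup from detected patterns to strategies, examples and people to talk to. Every pattern name, example and contact below is a hypothetical placeholder.

    # Sketch: advice, examples and connections as a simple shared lookup,
    # rather than "intelligent"/"adaptive" software. Details hypothetical.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Advice:
        pattern: str   # the network pattern a tool like SNAPP detected
        strategy: str  # a strategy someone has tried in response
        example: str   # where/when it was tried
        contact: str   # who to have a chat with

    ADVICE_BASE: List[Advice] = [
        Advice("teacher-centred star",
               "hand weekly summarising duties to rotating student pairs",
               "a large first-year course", "colleague.a@example.edu"),
        Advice("isolated students",
               "seed small groups with named first responders",
               "a postgraduate online course", "colleague.b@example.edu"),
    ]

    def advise(detected: str) -> List[Advice]:
        """Return strategies, examples and contacts for a detected pattern."""
        return [a for a in ADVICE_BASE if a.pattern == detected]

    for a in advise("teacher-centred star"):
        print(f"Try: {a.strategy} (seen in {a.example}; chat to {a.contact})")

The point isn’t the code; the hard part is populating and maintaining the base of strategies and contacts, which is a people problem, not a software one.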

Misc. reflections on reading about situated cognition

For various reasons, mostly PhD related (and somewhat related to procrastination), I’m taking the time to read a bit more about situated cognition. Not sure how far it will go. The following are some ad hoc reflections and essentially a diary of what I’m reading. Not aiming for this post to fulfil any purpose beyond being a place to dump observations.

The Wikipedia page

So far, the Wikipedia page on situated cognition seems fairly extensive and a reasonable place to start.

Misapplication of community of practice?

The Wikipedia page has the following definition of a community of practice:

The concept of a community of practice (often abbreviated as CoP) refers to the process of social learning that occurs and shared sociocultural practices that emerge and evolve when people who have common goals interact as they strive towards those goals.

I find this somewhat interesting in that my experience with CoPs around university learning and teaching has been with special groups set up for specific purposes above and beyond normal teaching, i.e. rather than a CoP around teaching at university X, where the common purpose is to teach, a CoP is set up around attrition, graduate attributes etc. and focuses on that as the goal, rather than the teaching.

I do wonder whether this on-going ad hoc creation of CoPs around special topics that are important, but haven’t been embedded into common institutional practice is a symptom of CoP misuse. i.e. if normal practice of teaching within an institution was more like a CoP, would you really need a separate CoP on retention etc? Does the need for separate CoPs indicate that the normal practice of teaching within an institution isn’t like a CoP and hence, perhaps there isn’t a sense of a common purpose amongst those involved in the process of normal teaching? Instead of a common purpose, do the actors within an institution’s normal practice of teaching and learning have their own different purposes?

The glossary from which this definition comes highlights the amount of thought – and consequently special language – that has arisen around situated cognition. It also has some interesting resonance with the need for design theories to have a constructs section.

Affordances

It’s interesting to see some of the origins of the concept of affordances beyond Norman and HCI/usability. The idea that affordances are the individual’s interpretations of what action is possible within a given environment, formed through their perception of that environment, connects strongly with some of the problems I see in the stereotypical university teaching environment and the nature of e-learning systems, i.e. I think the affordances perceived by many teaching staff aren’t good in terms of improving learning and teaching.

The relationship between affordances and schemata seems an area of some disquiet, and one for more reading – Glenberg and Robertson (1999).

Perception

This section was a little disappointing: it only mentions visual perception. While this is apparently an important influence on situated cognition, even from my limited reading and knowledge there appears to have been a lot more work done on perception.

Memory

Again I feel there could be more here. But it does pose the interesting question of how situated cognition downplays the importance of stored, symbolic representations in memory, instead holding that perception and action are co-determined by effectivities and affordances. This raises the question of what learning and knowing are (which are covered next), and links back to the disquiet about the link between affordances and schemata.

Knowing and learning

So knowing is not a thing, a memory, but a verb: it is the action/participation of an agent in an environment. This is where the idea that knowledge cannot be separated from context comes from, and it gives rise to the importance of context.

This is interesting, challenging and somewhat comforting. It is comforting because to some extent it represents an idea embedded in how Webfuse worked. For Webfuse, context was important: Webfuse wasn’t a general purpose tool that could be used elsewhere; it could not be separated from its context. Mmm, situated cognition as a kernel theory for the ISDT and Webfuse?

It does seem that some of these sections have been made very narrow, focusing specifically on language/literacy learning.

Pedagogical implications

So it appears that situated cognition is the theory of “mind/knowing/learning etc.”, while cognitive apprenticeship and the like are instructional design theories drawing on that theory.

Critiques

A small section on critiques closes off the page. On the face of it, the critiques do a reasonable job of undermining many of the assumptions on which situated cognition is based. I need to have a further look at Anderson et al (2000).

Of course, the criticisms come from cognitivists – more information-processing types – who hold a perspective that has also been challenged. Interesting that Herbert Simon appears to be one of the critics (he’s the last author on Anderson et al).

References

Anderson, J. R., Greeno, J. G., Reder, L. M., & Simon, H. A. (2000). Perspectives on learning, thinking, and activity. Educational Researcher, 29, 11-13.

The end of management – lessons for universities?

Yet another “death of X” article is the spark for this post. This one comes from the Wall Street Journal and is titled “The end of management”. There’s been a wave of these articles recently, but I like this one because it caters to my prejudice that most of the problems in organisations – especially in universities around learning and teaching – arise from an inappropriate management paradigm. The following has some connections to the oil sheiks thread.

Some choice quotes

Corporations are bureaucracies and managers are bureaucrats. Their fundamental tendency is toward self-perpetuation. They are, almost by definition, resistant to change. They were designed and tasked, not with reinforcing market forces, but with supplanting and even resisting the market.

and

The weakness of managed corporations in dealing with accelerating change is only half the double-flanked attack on traditional notions of corporate management. The other half comes from the erosion of the fundamental justification for corporations in the first place.

And a quote from Gary Hamel which summarises much of the problem with real innovation, including innovation around management

That thing that limits us is that we are extraordinarily familiar with the old model, but the new model, we haven’t even seen yet.

Moving on to the question of resources:

In corporations, decisions about allocating resources are made by people with a vested interest in the status quo. “The single biggest reason companies fail,” says Mr. Hamel, “is that they overinvest in what is, as opposed to what might be.”

The challenge that strikes at the heart of improving learning and teaching within universities is captured in this quote:

there’s the even bigger challenge of creating structures that motivate and inspire workers. There’s plenty of evidence that most workers in today’s complex organizations are simply not engaged in their work.

Does your university have large numbers of academic staff that are actively engaged in teaching? How does it do it?

I’d like to work for a university that gets this, or at least is trying to.

Oil sheiks, Lucifer and university learning and teaching

The following arises from a combination of factors, including Mark’s post (quoted below) and a radio interview with Philip Zimbardo (discussed later).

Old wine in new bottles

Perhaps the key quote from Mark’s post is

This post is simply to try and say what many people don’t want to say and that is, that most universities really don’t care about educational technology or elearning.

My related perspective is that the vast majority of university learning and teaching is, at best (trying to be very positive), just ok. There’s a small bit that is really, really bad; and a small bit that is really, really good. In addition, most interventions to improve learning and teaching are not doing anything to change this distribution. At best, they might change the media, but the overall distribution is the same.

There’s a quote from Dutton and Loader (2002) that goes something like

without new learning paradigms educators are likely to use technology to do things the way they have always done; but with new and more expensive technology.

I am currently of the opinion that without new management/leadership paradigms to inform how universities improve learning and teaching, the distribution is going to remain the same, just with new and more expensive organisational structures. This article from the Goldwater Institute about administrative bloat at American universities might be an indicator of that.

Don’t blame the academics

The “When good people turn bad” radio program is an interview with Philip Zimbardo, the guy responsible for the Stanford prison experiment, an example of good people turning really bad because of the situation in which they were placed. The interview includes the following from Prof Zimbardo:

You no longer can focus only on individual freedom of will, individual rationality. People are always behaving in a context, in a situation, and those situations are always created and maintained by powerful systems, political systems, cultural, religious ones. And so we have to take a more complex view of human nature because human beings are complex.

This resonates somewhat with a point that Mark makes

the problem of adoption is primarily not a technical one but one of organisational culture

I agree. It’s the culture, the systems, the processes and the policies within universities that are encouraging/enshrining this distribution where most university learning and teaching is, at best, just ok.

The culture/system doesn’t encourage or enable this to change. When management do seek to do something about it, their existing “management paradigm” encourages an emphasis on requiring change without doing anything effective to change the culture/system.

The proposition and the interest

Which is why I am interested in, and propose, the following:

If you really wish to improve the majority of learning and teaching within a university, then you have to focus on changing the culture/system so that academic staff are encouraged and enabled to engage in learning about how to teach.

In addition, I would suggest that requiring learning (e.g. through requiring all new academic staff to obtain a formal qualification in learning) without aligning the entire culture/system to enable academic staff to learn and experiment (some of these characteristics are summarised here) is doomed to failure.

I’d also suggest that there is no point at which you can say a university’s culture/system has been “aligned” to enable and encourage academic staff learning about teaching. At best you can engage in a continual process of “aligning” the culture/system, as that process of “aligning” is itself a learning process.

Easy to say

I can imagine some university leaders saying “No shit Sherlock, what do you think we’re doing?”. My response is that you aren’t really doing this. Your paradigm is fundamentally inappropriate, regardless of what you claim.

However, actually achieving this is not simple and I don’t claim to have all the answers. This is why it’s phrased as a proposition: it’s an area requiring more work.

I am hoping that within a few days, I might have a small subset of an answer in the next, and hopefully final, iteration of the design theory for e-learning that is meant to be the contribution of my thesis.

References

Dutton, W., & Loader, B. (2002). Introduction. In W. Dutton & B. Loader (Eds.), Digital Academe: The New Media and Institutions of Higher Education and Learning (pp. 1-32). London: Routledge.