Breaking BAD to bridge the reality/rhetoric chasm

The following is a copy of a paper accepted at ASCILITE’2014 (and nominated for best paper), written by Damien Clark (CQUniversity – @damoclarky) and me. The official conference version of the paper is available as a PDF.

Presentation slides available on Slideshare.

The source code for the Moodle Activity Viewer is available on github. As are some of the scripts produced at USQ.

Abstract

The reality of using digital technologies to enhance learning and teaching has a history of falling short of the rhetoric. Past attempts at bridging this chasm have tried: increasing the perceived value of teaching; improving the pedagogical and technological knowledge of academics; redesigning organisational policies, processes and support structures; and, designing and deploying better pedagogical techniques and technologies. Few appear to have had any significant, widespread impact, perhaps because of the limitations of the (often implicit) theoretical foundations of the institutional implementation of e-learning. Using a design-based research approach, this paper develops an alternate theoretical framework (the BAD framework) for institutional e-learning and uses that framework to analyse the development, evolution, and very different applications of the Moodle Activity Viewer (MAV) at two separate universities. Based on this experience it is argued that the reality/rhetoric chasm is more likely to be bridged by interweaving the BAD framework into existing practice.

Keywords: bricolage, learning analytics, e-learning, augmented browsing, Moodle.

Introduction

In a newspaper article (Laxon, 2013) Professor Mark Brown makes the following comment on the quality of contemporary University e-learning:

E-learning’s a bit like teenage sex. Everyone says they’re doing it but not many people really are and those that are doing it are doing it very poorly. (n.p.)

E-learning – defined by the OECD (2005) as the use of information and communications technology (ICT) to support and enhance learning and teaching – has been around for so long that there have been numerous debates about replacing it with other phrases. Regardless of the term used, there “has been a long-standing tendency in education for digital technologies to eventually fall short of the exaggerated expectations” (Selwyn, 2012, n.p.). Writing in the early 1990s, Geoghegan (1994) seeks to understand why a three-decade-long “vision of a pedagogical utopia” (n.p.) promised by instructional technologies has failed to eventuate. Ten years on, Salmon (2005) notes that e-learning within universities is still struggling to move beyond projects driven by innovators and engage a significant percentage of students and staff. Even more recently, concerns remain about how much technology is being used to effectively enhance student learning (Kirkwood & Price, 2013). Given that “Australian universities have made very large investments in corporate educational technologies” (Holt et al., 2013, p. 388), it is increasingly important to understand and address the reality/rhetoric chasm around e-learning.

Not surprisingly, the literature provides a variety of answers to the complex question of why this chasm exists. Weimer (2007) observes that academics come to the task of teaching with immense amounts of content knowledge, but little or no knowledge of teaching and learning beyond perhaps their personal experience. This situation may not change significantly, given that academics are expected to engage equally in research and teaching and yet work towards promotion criteria that are perceived to primarily value achievements in research (Zellweger, 2005). It has been argued that the limitations of the Learning Management System (LMS) – the most common university e-learning tool – make the LMS less than suitable for more effective learner-centred approaches and contribute to growing educator dissatisfaction (Rahman & Dron, 2012). It’s also been argued that the “limited digital fluency of lecturers and professors is a great challenge” (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3) for the creative leveraging of emerging technologies. Another contributing factor is likely to be Selwyn’s (2008) suggestion that educational technologists have failed to be cognisant of “the more critical analyses of technology that have come to the fore in other social science and humanities disciplines” (p. 83). Of particular interest here is the observation of Goodyear et al. (2014) that the “influence of the physical setting (digital and material) on learning activity is often important, but is under-researched and under-theorised: it is often taken for granted” (p. 138).

This paper reports on the initial stages of a design-based research project that aims to bridge the e-learning reality/rhetoric chasm by exploring and harnessing alternative theoretical foundations for the institutional implementation of e-learning. The paper starts by comparing and contrasting two different theoretical foundations of institutional e-learning. The SET framework is suggested as a description of the mostly implicit assumptions underpinning most contemporary approaches. The BAD framework is proposed as an alternative, and perhaps complementary, framework that better captures the reality of what happens and, if effectively integrated into institutional practices, may help bridge the chasm. The development of a technology – the Moodle Activity Viewer (MAV) – and its use at two different universities is then used to illustrate the benefits and limitations of the SET and BAD frameworks, and how the two can be fruitfully combined. The paper closes with some discussion of implications and future work.

Breaking BAD versus SET in your ways

The work described here is part of an on-going cycle of design-based research that aims to develop new artefacts and theories that can help bridge the e-learning reality/rhetoric chasm. We believe that bridging this chasm is of theoretical and practical significance to the sector and to us personally. The interventions we describe in the following sections arose out of our day-to-day work and were informed by a range of theoretical perspectives. This section offers a brief description of the theoretical frameworks that have informed and been refined by this work. This is important as design-based research should depart from a problem (McKenney & Reeves, 2013), be grounded in practice, theory-driven and seek to refine both theory and practice (Wang & Hannafin, 2005). The frameworks described here are important because they identify a mindset (the SET framework) that contributes significantly to the on-going difficulty in bridging the e-learning reality/rhetoric chasm, and offers an alternate mindset (the BAD framework) that provides principles that can help bridge the chasm. The SET and BAD frameworks are broadly incommensurable ways of answering three important, inter-related questions about the implementation of e-learning. While the SET framework represents the most commonly accepted mindset used in practice, both frameworks are evident in both the literature and in practice. Table 1 provides an overview of both frameworks.

Table 1: The BAD and SET frameworks for e-learning implementation

What work gets done?
- SET: Strategy – following a global plan intended to achieve a pre-identified desired future state.
- BAD: Bricolage – local piecemeal action responding to emerging contingencies.

How is ICT perceived?
- SET: Established – ICT is a hard technology and cannot be changed. People and their practices must be modified to fit the fixed functionality of the technology.
- BAD: Affordances – ICT is a soft technology that can be modified to meet the needs of its users, their context, and what they would like to achieve.

How do you see the world?
- SET: Tree-like – the world is relatively stable and predictable. It can be understood through logical decomposition into a hierarchy of distinct black boxes.
- BAD: Distributed – the world is complex, dynamic, and consists of interdependent assemblages of diverse actors (human and not) connected via complex networks.

What work gets done: Bricolage or Strategic

The majority of contemporary Australian universities follow a strategic approach to deciding what work gets done. Numerous environmental challenges and influences have led to universities being treated as businesses, with an increasing prevalence of managers using “strategic control and a focus on outputs which can be quantified and compared” (Reid, 2009, p. 575) to manage academic activities. A strategic approach involves the creation of a vision identifying a desired future state and the development of operational plans to bring that state about. The only work that is deemed acceptable is that which fits within the established operational plan and is seen to contribute to the desired future state. All other work is deemed inefficient. The strategic approach is evident at all levels of institutional e-learning. Inglis (2007) describes how the government required Australian universities to have institutional learning and teaching strategic plans published on their websites. The strategic or planning-by-objectives (e.g. learning outcomes, graduate attributes) approach also shapes how course design is largely assumed to occur, with Visscher-Voerman and Gustafson (2004) finding that it underpins “a majority of the instructional design models in the literature” (p. 77). The strategic approach is so ingrained that it is often forgotten that these ideas have not always existed (Kezar, 2001), have significant flaws, and that there is at least one alternate perspective.

Bricolage, “the art of creating with what is at hand” (Scribner, 2005, p. 297) or “designing immediately” (Büscher, Gill, Mogensen, & Shapiro, 2001, p. 23), involves the manipulation and creative repurposing of existing, and often unlikely, resources into new arrangements to solve a concrete, contextualized problem. Ciborra (1992) argues that bricolage – defined as the “capability of integrating unique ideas and practical design solutions at the end-user level” (p. 299) – is more important in developing organisational applications of ICT that provide competitive advantage than traditional strategic approaches. Scribner (2005) and other authors have used bricolage to understand the creative and considered repurposing of readily available resources that teachers use to engage in the difficult task of helping people learn. Bricolage is not without its problems. There are risks associated with extremes of both the strategic and bricolage approaches to how work gets done (Jones, Luck, McConachie, & Danaher, 2005). In the context of institutional e-learning, the problem is that at the moment the strategic is crowding out bricolage. For example, Groom and Lamb (2014) observe that the cost of supporting an enterprise learning tool (e.g. LMS) limits resources for user-driven innovation, in part because it draws “attention and users away” (n.p.) from the strategic tool (i.e. the LMS). The demands of sustaining the large and complex strategic tool dominate priorities and lead to “IT organizations…defined by what’s necessary rather than what’s possible” (Groom & Lamb, 2014, n.p.). There would appear to be significant benefit in exploring a dynamic and flexible interplay between the strategic and bricolage approaches to deciding what work gets done.

How ICT is perceived: Affordances or Established

The established view sees ICT as a hard technology (Dron, 2013). What can be done with hard technology is fixed in advance either by embedding it in the technology or “in inflexible human processes, rules and procedures needed for the technology’s operation” (Dron, 2013, p. 35). An example of this is the IT person quoted by Sturgess and Nouwens (2004) as suggesting in the context of an LMS evaluation process that “we should seek to change people’s behavior because information technology systems are difficult to change” (n.p). This way of perceiving ICTs assumes that the functionality provided by technology is established and cannot be changed. This creates the problem identified by Rushkoff (2010) where “instead of optimizing our machines for humanity – or even the benefit of some particular group – we are optimizing humans for machinery” (p. 15). Perhaps in no small way the established view of ICT in e-learning contributes to Dede’s (2008) observation that “widely used instructional technology applications have less variety in approach than a low-end fast-food restaurant” (p. 58). The established view of ICT challenges Kay’s (1984) discussion of the “protean nature of the computer” (p. 59) as “the first metamedium, and as such has degrees of freedom and expression never before encountered” (p. 59). The problem is that digital technology is “biased toward those with the capacity to write code” (Rushkoff, 2010, p. 128) and increasingly those who can code have been focused on avoiding it.

The established view of ICT represents a narrow view of technological change and human agency. When unable to achieve a desired outcome, people will use the available knowledge and resources to create an alternative path; they will create a workaround (Koopman & Hoffman, 2003). For example, Hannon (2013) talks about the “hidden effort” (p. 175) of “meso-level practitioners – teaching academics, learning technologists, and academic developers” (p. 175) to bridge the gaps created by centralised technologies. The established view represents the designer-centred idea of achieving “perfect” software (Koopman & Hoffman, 2003), rather than recognising the need for on-going adaptation due to the diversity, complexity and on-going change inherent in university e-learning. The established view also ignores Kay’s (1984) description of the computer as offering “degrees of freedom and expression never before encountered” (p. 59). The established view does not leverage the affordance of ICT for change and freedom. Following Goodyear et al. (2014), an affordance is not a feature of a technology, but rather a relationship between the technology and the people using it. Within university e-learning the affordance for change has been limited due to both the perceived nature of the technology – best practice guidelines for integrated systems such as the LMS and ERP recommend vanilla implementation (Robey, Ross, & Boudreau, 2002) – and the people – the apparent low digital fluency of academics (Johnson, Adams Becker, Cummins, & Estrada, 2014, p. 3). However, this is changing. There are faculty and students who are increasingly digitally fluent (e.g. the authors of this paper) and easily capable of harnessing technologies that “help to make bricolage an attainable reality” (Büscher et al., 2001, p. 24), such as the IMS LTI standards, APIs (Lane, 2014) and augmented browsing (Dai, Tsai, Tsai, & Hsu, 2011). An affordances perspective of ICT seeks to leverage the capacity for ICT to be manipulated so that it offers the best possible affordances for learners and teachers – a move away from the established “design of an artefact towards emergent design of technology-in-use, particularly by the users” (Johri, 2011, p. 212).

How you see the world: Distributed or Tree-like

The methods used to solve most of the large and complex problems that make up institutional e-learning rely upon a tree-like or hierarchical conception of the world. To manage a university it is broken up into a tree-like structure consisting of divisions, faculties, schools, and so on. The organisation of the formal learning and teaching done at the university relies upon a tree-like structure of degrees, majors/minors, courses or units, learning outcomes, weeks, lectures, tutorials, etc. The information systems used to enable formal learning and teaching mirror the tree-like structure of the organisation with separation into different systems responsible for student records, learning management, learning content management etc. The individual information systems themselves are broken up into tree-like structures reliant on modular design. These tree-like structures are the result of the reliance on methods that use analysis and logical decomposition to reduce larger complex wholes into smaller more easily understood and manageable parts (Truex, Baskerville, & Travis, 2000). These methods produce tree-like structures of independent, largely black-boxed components that interact through formally approved mechanisms that typically involve oversight or approval from further up the hierarchy. For example, a request for a new feature in an LMS must wend its way up the tree-like governance structure until it is considered at the institutional level, compared against institutional priorities and ranked against other requests, before possibly being passed down to the other organisational black-box that can fulfill that request. There are numerous limitations associated with tree-like structures. For example, Holt et al (2013) identify just one of these limitations when they argue that the growing complexity of institutional e-learning means that no one leader at the top of a hierarchical tree has the knowledge to “possibly contend with the complexity of issues” (p. 389).

The solution suggested by Holt et al. (2013) is distributed leadership, which is in turn based on broader theoretical foundations of distributed cognition, social learning, and network and activity theories. This theoretical foundation can be seen in a broad array of distributed ways of looking at the world. For example, in terms of learning, Siemens (2008) lists the foundations of connectivism as: activity theory; distributed and embodied cognition; complexity; and network theory. At the core of connectivism is the “thesis that knowledge is distributed across a network of connections and therefore learning consists of the ability to construct and traverse those networks” (Downes, 2011, n.p.). Johri (2011) links much of this same foundation to socio-materiality and suggests that it offers “a key theoretical perspective that can be leveraged to advance research, design and use of learning technologies” (p. 210). Podolny and Page (1998) apply the distributed view to governance and organisations, describing it as meaning that two or more actors are able to undertake repeated interactions over a period of time without a centralised authority responsible for resolving any issues arising from those interactions. Rather than the responsibility and capability for specific actions being seen as belonging to any particular organisational member or group (tree-like), the responsibility and capability are distributed across a network of individuals, groups and technologies. The distributed view sees institutional e-learning as a complex, dynamic, and interdependent assemblage of diverse actors (both human and not) distributed in complex networks.

It is our argument that being aware of the differences in thinking between the SET and BAD frameworks offers insight that can guide the design of interventions that are more likely to bridge the e-learning reality/rhetoric chasm. The following sections describe the development and adaptation of the Moodle Activity Viewer (MAV) at both CQUni and USQ as an example of what is possible when breaking BAD.

Breaking BAD and the development of MAV

The second author works for Learning and Teaching Services at CQUniversity (CQUni). In late 2012, he was working on a guide for teaching staff titled “How can I enhance my teaching practice?”. In contributing to the “Designing effective course structure” section of this guide, the author asked a range of rhetorical questions including “How do you know which resources your students access the most, and the least?”. Providing an answer to this question for the reader took more effort than expected. There are reports available in Moodle 2.2 (the version being used by CQUni at the time) that can be used to answer this question. However, they suffer from a number of limitations, including: duplicated report names; unclear differences between reports; usage values that combine staff and student activity; slow generation; and a tabular format. It was apparent that these limitations were acting as a barrier to reflection on course design. This was especially problematic, as the institution had placed increased emphasis on generating and responding to student feedback (CQUniversity, 2012). Annual course enhancement reports – introduced in 2010 – required teaching staff to respond to feedback from students and highlight enhancements to be made for the course’s next offering (CQUniversity, 2011). Information about activity and resource usage on the course Moodle site was seen by some to be useful in completing these reports. However, there was no apparent strategic or organisational imperative to address issues with the Moodle reports, and it appeared likely that the aging version of Moodle (version 2.2) would persist for some time given other organisational priorities. As a stopgap solution the author and a colleague engaged in some bricolage, writing SQL queries against the Moodle database and generating Excel spreadsheets. Whilst this approach provided more useful data, the spreadsheets were manually generated on request and the teaching staff had to bridge the conceptual gap between the information within the Excel spreadsheet and their Moodle course site.
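By way of illustration, the following is a minimal sketch (not the authors’ actual scripts) of this style of ad hoc query, run over a copy of the Moodle database via a Python DB-API connection. The table and column names follow the standard Moodle 2.x log schema (mdl_log); the real queries also had to exclude staff activity, which is omitted here for brevity.

```python
# A minimal sketch (not the authors' actual scripts) of an ad hoc query over a
# copy of the Moodle 2.2 database: count clicks per activity in a course and
# dump the result as a spreadsheet-friendly CSV for teaching staff.
import csv

CLICKS_PER_ACTIVITY = """
SELECT l.cmid,
       COUNT(*)                 AS clicks,
       COUNT(DISTINCT l.userid) AS distinct_users
FROM mdl_log l
WHERE l.course = %s
  AND l.cmid <> 0
GROUP BY l.cmid
ORDER BY clicks DESC
"""

def export_usage(conn, course_id, outfile="course_usage.csv"):
    """Run the query over a copy of the Moodle database (any DB-API connection)
    and write the per-activity usage to a CSV file."""
    cur = conn.cursor()
    cur.execute(CLICKS_PER_ACTIVITY, (course_id,))
    with open(outfile, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["course_module_id", "clicks", "distinct_users"])
        writer.writerows(cur.fetchall())
```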

In the months following, the author started thinking about a better approach. While CQUni had implemented a range of customisations to the institution’s Moodle instance, substantial changes required a clear understanding of the final requirements, alignment with strategic imperatives, and the support of senior management. At this stage of the process it was not clear what the final requirements of a solution would be, hence more experimentation was required to better understand the problem and possible solutions prior to making the case for modifying Moodle. While the author did not have the ability to change the institution’s version of Moodle itself, he did have access to a copy of the Moodle database, access to a server computer, and software development skills. Any bridging of this particular gap would need to draw on available resources (bricolage) and not disturb or impact critical high-availability services such as Moodle. Given uncertainty about what functionality might best enable reflection on course design, any potential solution would also need to enable a significant level of agility and experimentation (bricolage).

The technical solution that seemed to best fulfill these requirements was augmented browsing. Dai et al. (2011) define augmented browsing as “an effective means for dynamically adding supplementary information to a webpage without having users navigate away from the page” (p. 2418). The use of augmented browsing to add functionality to an LMS is not new. Leony et al. (2012) created a browser add-on that embeds learning analytics graphs directly within the Moodle LMS course home page. Dawson et al. (2011) used what are known as bookmarklets to generate interactive sociograms that visualise student learning networks as part of SNAPP. The problems that drove SNAPP’s use of augmented browsing – complex and difficult to interpret LMS reports and the difficulty of getting suggestions from teaching staff integrated into an institutional LMS (Dawson et al., 2011) – mirror those faced at CQU.

Through a process of bricolage the Moodle Activity Viewer (MAV) was developed as an add-on for the Firefox web browser. More specifically, the MAV is built upon another popular Firefox add-on called Greasemonkey, and in Greasemonkey terms MAV is known as a userscript.  However, for the purposes of this paper, the MAV will be referred to more generally as an add-on to the browser. The intent was that the MAV would generate a heat map and embed it directly onto any web page produced by Moodle. A heat map shades each of the links in a web page with a spectrum of colours where the deeper red shades indicate links that are being clicked on more often (see Figure 1). The implementation of the MAV is completely separate from the institutional Moodle instance meaning its use has no impact on the production Moodle environment. Once the MAV add-on is installed into Firefox, and with it turned on, any web page from a Moodle course site can have a heat map overlaid on all Moodle links in that page. This process starts with the MAV add-on recognising a newly loaded page as belonging to a Moodle course site. When this occurs the MAV will generate a query asking for usage figures associated with every relevant Moodle link on that web page. This query is sent to the MAV server hosted on an available server computer. The MAV server translates the query into appropriate queries that will extract the necessary information from the Moodle database. As implemented at CQU, the MAV server relies on a copy of the Moodle database that is updated daily. While not necessary, use of a copy of the Moodle database ensures that there is no risk of disrupting the production Moodle instance.
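As a rough illustration of the exchange just described, the server side could look something like the following sketch, assuming a Flask-style web service. It is not the production MAV server; the endpoint path, payload shape and the usage_for_links() lookup are assumptions for illustration.

```python
# Sketch only (not the production MAV server): the add-on posts the Moodle
# links found on the current course page, the server looks up usage in the
# copy of the Moodle database and returns a count and heat-map colour per link.
from flask import Flask, request, jsonify

app = Flask(__name__)

def heat_colour(count, max_count):
    """Map a click count onto a white-to-red shade for the heat map overlay."""
    if max_count == 0:
        return "#ffffff"
    intensity = count / max_count               # 0.0 (unused) .. 1.0 (hottest)
    fade = int(255 * (1 - intensity))           # green/blue fade out as clicks rise
    return f"#ff{fade:02x}{fade:02x}"

def usage_for_links(course_id, links):
    """Placeholder for the lookup against the copy of the Moodle database."""
    return {link: 0 for link in links}          # real implementation omitted

@app.route("/mav/usage", methods=["POST"])
def usage():
    payload = request.get_json()                # e.g. {"course": 1234, "links": [...]}
    counts = usage_for_links(payload["course"], payload["links"])
    max_count = max(counts.values(), default=0)
    return jsonify({link: {"clicks": n, "colour": heat_colour(n, max_count)}
                    for link, n in counts.items()})
```

Because the lookup runs against a copy of the Moodle database, as described above, the add-on can be used without any risk of disrupting the production Moodle environment.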

The MAV add-on can be configured to generate overlays based on the number of clicks on a link, or the number of students who have clicked on a link. It can also be configured to limit the overlays to particular groups of students or to a particular student. When used on the main course page, MAV provides an overview of how students are using all of the course resources. Looking at a discussion forum page with the MAV enabled allows the viewer to analyse which threads or messages are receiving the most attention. Hence MAV can provide a simple form of process analytics (Lockyer, Heathcote, & Dawson, 2013).
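Purely as an illustration, and under the same assumptions as the sketch above, these configuration options map onto small variations of the underlying log query: counting clicks versus distinct students is the difference between COUNT(*) and COUNT(DISTINCT userid), with the group and single-student options becoming extra WHERE conditions.

```python
# Illustrative only: how the counting modes and filters could translate into
# variations of a query over the standard Moodle 2.x schema (mdl_log,
# mdl_groups_members). Placeholders use named DB-API parameters.
def usage_query(mode="clicks", group_id=None, user_id=None):
    measure = "COUNT(*)" if mode == "clicks" else "COUNT(DISTINCT l.userid)"
    sql = (f"SELECT l.cmid, {measure} AS total "
           "FROM mdl_log l "
           "WHERE l.course = %(course)s AND l.cmid <> 0")
    if group_id is not None:   # overlay restricted to one Moodle group
        sql += (" AND l.userid IN (SELECT gm.userid FROM mdl_groups_members gm"
                " WHERE gm.groupid = %(group)s)")
    if user_id is not None:    # overlay restricted to a single student
        sql += " AND l.userid = %(user)s"
    return sql + " GROUP BY l.cmid"
```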

An initial proof-of-concept implementation of the MAV was developed by April 2013. A few weeks later this implementation was demonstrated to the “Moodle 2 Project Board” to seek approval to continue development. The plan was to engage in small trials with academic staff and evolve the tool. The intent was that this would generate a blueprint for the implementation of heat maps within Moodle itself. The low-risk nature of the approach contributed to approval to continue. However, by July 2013 the institution had downsized through an organisational restructure and resources in the IT department were subsequently reduced. As part of this restructure, and in an effort to reduce costs, the IT department set out to reduce the level of in-house systems development in favour of more established “vanilla” systems (off-the-shelf with limited or no customisations). This new strategy made it unlikely that the MAV would be re-implemented directly within Moodle, meaning the augmented browsing approach might need to remain viable over the longer term. As the MAV was being developed and refined, it was being tested by a small group of teaching staff within the creator’s team. Then in September 2013, the first official trial was launched, making the MAV available to all staff within one of CQUniversity’s schools.

Figure 1: How MAV works

Early in March 2012, prior to the genesis of the MAV, the second author and a colleague developed a proposal for a student retention project. It was informed by ongoing research into learning analytics at the institution and motivated by a strategic institutional imperative to improve student retention (CQUniversity, 2011). It was not until October 2013 – after the commencement of the first trial of the MAV – that a revised version of the proposal received final approval and the project commenced in November under the name EASICONNECT. Part of the EASICONNECT project was an early alert system, EASI (Early Alert Student Indicators), intended to identify disengaged students early and provide simple tools to nudge them to re-engage, with the hope of improving student retention. In 2013, between the proposal submission and final approval of the EASICONNECT project, EASI was created under a different name (Student Support Indicators – SSI) as a proof-of-concept and used in a series of small term-based trials, evolving similarly to the MAV. One of the amendments made to the approved proposal by the project sponsor (management) was the inclusion of the MAV as a project deliverable in the EASICONNECT project.

Neither EASI nor the MAV was strictly the result of strategic plans. Both systems arose from bricolage undertaken by two members of CQUni’s Learning and Teaching Services that was later recognised as contributing to the strategic aims of the institution. With the eventual approval of the EASICONNECT project, the creators of EASI and the MAV worked more closely together on these tools and the obvious linkages between them were developed further. Initially this meant modifying the MAV so staff participating in the EASI trial could easily navigate from the MAV to EASI. In Term 1, 2014 EASI introduced links for each student in a course that, when clicked, would open the Moodle course site with the MAV enabled only for the selected student. While EASI showed a summary of the number of clicks made by the student in the course site, the MAV could then contextualise this information, revealing where those clicks took place directly within Moodle. In Term 2, 2014 a feature often requested by teaching staff was added to the MAV: identifying students who had and hadn’t clicked on links. The MAV also provided an option for staff to open EASI to initiate an email nudge to either group of students. Figure 2 provides a comparison of week-to-week usage of the MAV between Terms 1 and 2 of 2014. The graphs show usage in terms of the number of page views and the number of staff using the system, with the Term 2 figures covering up to the end of Week 10 (of 15).

Both the MAV and its sister project EASI were initiated as a form of bricolage. It was only later that both projects enjoyed the synthesised environment of a strategic project that provided the space and institutional permission for this work to scale and continue to merge. The MAV arose due to the limited affordances offered by the LMS and the promise that different ICT could be harnessed to enhance the perceived affordances. Remembering that affordances are not something innate to a tool, but are instead co-constitutive between tool, user and context, the on-going use of bricolage allowed the potential affordances of the tool to evolve in response to use by teaching staff. Through this approach the MAV has been able to evolve from potentially offering affordances of value to teaching staff as part of “design for reflection and redesign” (Dimitriadis & Goodyear, 2013) to also offering potential affordances for “design for orchestration” (Dimitriadis & Goodyear, 2013).

Figure 2: 2014 MAV usage at CQUni: Comparison between T1 and T2
[Graph: MAV usage – page views]
[Graph: MAV usage – number of staff]

Implementing the MAV as a browser add-on also enables a break from the tree-like conceptions that underpin the design of large integrated systems like an LMS. The tree-like conception is so evident in the Moodle LMS that it is visible in the name: Moodle is an acronym for Modular Object-Oriented Dynamic Learning Environment. Modular captures the fact that “Moodle is built in a highly modular fashion” (Dougiamas & Taylor, 2003, p. 173), meaning that logical decomposition is used to break the large integrated system into small components or modules. This modular architecture allows the rapid development and addition of independent plugins and is a key enabler of the flexibility of Moodle. However, it is based on each of the modules being largely independent of each other, which has the consequence of making it more difficult to provide functionality that crosses modular boundaries, such as taking usage information from the logging system and integrating that information into all of the modules that work together to produce a Moodle-generated web page.

Extending MAV at another institution

In 2012 the first author commenced work within the Faculty of Education at the University of Southern Queensland (USQ). The majority of the allocated teaching load involved two offerings of EDC3100, ICTs and Pedagogy. EDC3100 is a large (300+ on-campus and online students in first semester, and ~100 totally online in second semester) core, third-year course for Bachelor of Education (BEdu) students. The author expected that USQ would have high quality systems and processes to support large, online courses. This was due to USQ’s significant reputation in the practice and research of distance and online education; its then stated vision “To be recognised as a world leader in open and flexible higher education” (USQ, 2012, p. 5); and the observation that “by 2012 up to 70% of students in the Bachelor of Education were studying at least some subjects online” (Albion, 2014, p. 1163). The experience of teaching EDC3100 quickly revealed an e-learning reality/rhetoric chasm.

As a core course, EDC3100 has students studying at all of USQ’s campuses, at a Malaysian partner, and online from across Australia and the world. The students are studying to become teachers in early childhood, primary, secondary and VET settings. The course is designed so that the “Study Desk” (the Moodle course site) is an essential source of information and support for all students. The course design makes heavy use of discussion forums for a range of learning activities. Given the size and diversity of the student population there are times when it is beneficial for teaching staff to customise their responses to the student’s context and specialisation. For instance, an example from the Australian Curriculum may be appropriate for a primary or lower secondary pre-service teacher based in Australia, but inappropriate for a VET pre-service teacher. Whilst the Moodle discussion forum draws on user profiles to identify the authors of posts, the available information is limited to that provided centrally via the institution and by the users. For EDC3100 this means that a student’s campus is apparent through their membership of the Moodle groups automatically created by USQ’s systems; however, seeing this requires navigating away from the discussion forum. The student’s specialisation is not visible in Moodle at all. The only way this information is available is to ask an administrative staff member with the appropriate student records access to generate a spreadsheet (and then update the spreadsheet as students add and drop the course) that includes this specific information. The lack of easy access to this information constrains the ability of teaching staff to intervene effectively.

One explanation for the existence of this gap is the limitations of the SET approach to institutional e-learning systems. The tree-based practice of logical decomposition results in distinct tasks – such as the management of student demographic and enrolment data (Peoplesoft), and the practice of online learning (Moodle) – being supported by different information systems with different data models and owned by different organisational units. Logical decomposition allows each of these individual systems and their owners to focus on the efficiency of their primary task. However, it comes at the cost of making it more difficult to both recognise and respond to requirements that cut across those tasks (e.g. teaching). It is even more difficult when the requirement is specific to a subset of the organisation. For example, ensuring that information about the specialisation of BEdu students is evident in Moodle is only of interest to some of the staff teaching into the BEdu. Even if this barrier could be overcome, modifying the Moodle discussion forum to make this type of information more visible would be highly unlikely due to the cost, difficulty and (quite understandable) reluctance to make changes to enterprise software inherent in the established view of technology.

To address this need the MAV add-on was modified to recognise USQ Moodle web pages that contain links to student profiles (e.g. a forum post). On recognising such a page, the modified version of the MAV queries a database populated using the manually provided spreadsheet described above. The MAV uses that information to add to each student profile link a popup dialog that provides student information such as specialisation and campus without leaving the page. Adding different information (e.g. activity completion, GPA, etc.) to this dialog can proceed without the approval of any centralised authority. The MAV server and the database run on the author’s laptop and the author has the skills to modify the database and write new code for both the MAV server and client. As such it is an example of Podolny and Page’s (1998) distributed approach to governance. The only limitation is whether or not the necessary information can be retrieved in a format that can be easily imported into the database.
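A minimal sketch of how this variation could be wired together, under the same assumptions as the earlier server sketch: the manually supplied spreadsheet is imported into a small local database on the author’s laptop and a simple endpoint returns the details the add-on shows in the popup. The file name, column headings and endpoint path are illustrative assumptions.

```python
# Sketch only (not the actual USQ MAV modification): load the manually
# supplied spreadsheet/CSV into a small local database and expose the details
# the browser add-on displays in the popup for each student profile link.
import csv
import sqlite3

from flask import Flask, jsonify

app = Flask(__name__)
DB = sqlite3.connect("edc3100_students.db", check_same_thread=False)

def import_spreadsheet(csv_path="edc3100_students.csv"):
    """Import (or refresh) the enrolment spreadsheet into the local database."""
    DB.execute("CREATE TABLE IF NOT EXISTS students "
               "(userid INTEGER PRIMARY KEY, specialisation TEXT, campus TEXT)")
    with open(csv_path, newline="") as f:
        rows = [(r["userid"], r["specialisation"], r["campus"])
                for r in csv.DictReader(f)]
    DB.executemany("INSERT OR REPLACE INTO students VALUES (?, ?, ?)", rows)
    DB.commit()

@app.route("/mav/student/<int:userid>")
def student_details(userid):
    """Return the extra details the add-on shows in the popup dialog."""
    row = DB.execute("SELECT specialisation, campus FROM students "
                     "WHERE userid = ?", (userid,)).fetchone()
    if row is None:
        return jsonify({"error": "unknown student"}), 404
    return jsonify({"specialisation": row[0], "campus": row[1]})
```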

Conclusions, implications and future work

Future work will focus on continuing an on-going cycle of design-based research exploring how, and with what impacts, the BAD framework can be fruitfully integrated into the practice of institutional e-learning. To aid this process we are exploring how the MAV, its various modifications, and its descendants can be effectively developed and shared within and between institutions. As a first step, the CQU MAV code has been released on GitHub (https://github.com/damoclark/mav), development is occurring in the open and interested collaborators are welcome. A particular interest is in exploring and evaluating the use of the MAV to implement scaffolding and context-sensitive conglomerations. Proposed in Jones (2012), a conglomeration seeks to enhance the affordances offered by any standard e-learning tool (e.g. a discussion forum) with a range of additional and often contextually specific information and functionality. Both uses of the MAV described above are simple examples of a conglomeration. Of particular interest is whether these conglomerations can be used to explore whether Goodyear’s (2009) idea that “research-based evidence and the fruits of successful teaching experience can be embodied in the resources that teachers use at design time” can be extended to institutional e-learning tools.

Perhaps the biggest challenge to this work arises from the observation that the SET framework forms the foundation of current institutional practice and that the SET and BAD frameworks are largely incommensurable. At CQU, the MAV has benefited from the recognition and support of senior management; yet it still challenges the assumptions of those operating solely through the SET framework. The incommensurable nature of the SET and BAD frameworks implies that any attempt to fruitfully merge the two will need to deal with existing, and sometimes strongly held, assumptions and mindsets. For example, rather than require the IT division to formally approve and develop all applications of ICT, its focus should perhaps turn (at least in part) to enabling and encouraging “ways to make work-arounds easier for users to create, document and share” (Koopman & Hoffman, 2003, p. 74) through organisational “settings, and systems … arranged so that invention and prototyping by end-users can flourish” (Ciborra, 1992, p. 305). Similarly, rather than academic staff development focusing on ensuring that the appropriate knowledge is embedded in the heads of teaching staff (e.g. formal teaching qualifications), there should be a shift towards ensuring that the appropriate knowledge is embedded within the network of actors – both people and artefacts – distributed within and perhaps outside the institution. Rather than accept “the over-hyped, pre-configured digital products and practices that are being imported continually into university settings” (Selwyn, 2013, p. 3), perhaps universities should instead heed Selwyn’s argument that “a genuine grassroots interest needs to be developed in the co-creation of alternative educational technologies. In short, mass participation is needed in the development of digital technology for university educators by university educators” (p. 3).

Biggs (2012) conceptualises the job of a teacher as being responsible for creating a learning context in which “all students are more likely to use the higher order learning processes which ‘academic’ students use spontaneously” (p. 39). If this perspective is taken one step back, then it is the responsibility of a university to create an institutional context in which all teaching staff are more likely to create the type of learning context which ‘good’ teachers create spontaneously. The on-going existence of the e-learning reality/rhetoric chasm suggests many universities are yet to achieve this goal. This paper has argued that this is due in part to the institutional implementation of e-learning being based on a limited SET of theoretical conceptions. The paper has compared the SET framework with the BAD framework and argued that the BAD framework provides a more promising theoretical foundation for bridging this chasm. It has illustrated the strengths and weaknesses of these two frameworks through a description of the origins and on-going use of the Moodle Activity Viewer (MAV) at two institutions. The suggestion here is not that institutions should see the BAD framework as a replacement for the SET framework, but rather that they should engage in some bricolage and explore how contextually appropriate mixtures of both frameworks can help bridge their e-learning reality/rhetoric chasm. Perhaps universities need to break a little BAD?

References

Albion, P. (2014). From Creation to Curation: Evolution of an Authentic ‘Assessment for Learning’ Task. In M. Searson & M. Ochoa (Eds.), Society for Information Technology & Teacher Education International Conference (pp. 1160-1168). Chesapeake, VA: AACE.

Biggs, J. (2012). What the student does: teaching for enhanced learning. Higher Education Research & Development, 31(1), 39-55. doi:10.1080/07294360.2012.642839

Büscher, M., Gill, S., Mogensen, P., & Shapiro, D. (2001). Landscapes of practice: bricolage as a method for situated design. Computer Supported Cooperative Work, 10(1), 1-28.

Ciborra, C. (1992). From thinking to tinkering: The grassroots of strategic information systems. The Information Society, 8(4), 297-309.

CQUniversity. (2011). CQUniversity Annual Report 2010 (p. 136). Rockhampton.

CQUniversity. (2012). CQUniversity Annual Report 2011 (p. 84). Rockhampton.

Dai, H. J., Tsai, W. C., Tsai, R. T. H., & Hsu, W. L. (2011). Enhancing search results with semantic annotation using augmented browsing. IJCAI Proceedings – International Joint Conference on Artificial Intelligence, 22(3), 2418-2423.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks : visualising and evaluating student learning networks Final Report 2011. Canberra: Australian Learning and Teaching Council.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), International Handbook of Information Technology in Primary and Secondary Education (pp. 43-62). New York: Springer.

Dimitriadis, Y., & Goodyear, P. (2013). Forward-oriented design for learning : illustrating the approach. Research in Learning Technology, 21, 1-13. Retrieved from http://www.researchinlearningtechnology.net/index.php/rlt/article/view/20290

Downes, S. (2011). “Connectivism” and Connective Knowledge. Retrieved from http://www.huffingtonpost.com/stephen-downes/connectivism-and-connecti_b_804653.html

Dron, J. (2013). Soft is hard and hard is easy: learning technologies and social media. Form@re – Open Journal per la Formazione in Rete, 13(1), 32-43. Retrieved from http://fupress.net/index.php/formare/article/view/12613

Geoghegan, W. (1994). Whatever happened to instructional technology? Paper presented at the 22nd Annual Conference of The International Business Schools Computing Association. Baltimore, MD.

Goodyear, P. (2009). Teaching, technology and educational design: The architecture of productive learning environments (pp. 1-37). Sydney. Retrieved from http://www.olt.gov.au/system/files/resources/Goodyear%2C P ALTC Fellowship report 2010.pdf

Goodyear, P., Carvalho, L., & Dohn, N. B. (2014). Design for networked learning: framing relations between participants’ activities and the physical setting. In S. Bayne, M. de Laat, T. Ryberg, & C. Sinclair (Eds.), Ninth International Conference on Networked Learning 2014 (pp. 137-144). Edinburgh, Scotland. Retrieved from http://www.networkedlearningconference.org.uk/abstracts/pdf/goodyear.pdf

Groom, J., & Lamb, B. (2014). Reclaiming innovation. EDUCAUSE Review, 1-12. Retrieved from http://www.educause.edu/visuals/shared/er/extras/2014/ReclaimingInnovation/default.html

Hannon, J. (2013). Incommensurate practices: sociomaterial entanglements of learning technology implementation. Journal of Computer Assisted Learning, 29(2), 168-178. doi:10.1111/j.1365-2729.2012.00480.x

Holt, D., Palmer, S., Munro, J., Solomonides, I., Gosper, M., Hicks, M., … Hollenbeck, R. (2013). Leading the quality management of online learning environments in Australian higher education. Australasian Journal of Educational Technology, 29(3), 387-402. Retrieved from http://www.ascilite.org.au/ajet/submission/index.php/AJET/article/view/84

Inglis, A. (2007). Approaches taken by Australian universities to documenting institutional e-learning strategies. In R. J. Atkinson, C. McBeath, S.K. Soong, & C. Cheers (Eds.), ICT: Providing Choices for Learners and Learning. Proceedings ASCILITE Singapore 2007 (pp. 419-427). Retrieved from http://www.ascilite.org.au/conferences/singapore07/procs/inglis.pdf

Johnson, L., Adams Becker, S., Cummins, M., & Estrada, V. (2014). 2014 NMC Technology Outlook for Australian Tertiary Education: A Horizon Project Regional Report. Austin, Texas. Retrieved from http://www.nmc.org/publications/2014-technology-outlook-au

Johri, A. (2011). The socio-materiality of learning practices and implications for the field of learning technology. Research in Learning Technology, 19(3), 207-217. Retrieved from http://researchinlearningtechnology.net/coaction/index.php/rlt/article/view/17110

Jones, D. (2012). The life and death of Webfuse : principles for learning and leading into the future. In M. Brown, M. Hartnett, & T. Stewart (Eds.), Future challenges, sustainable futures. Proceedings ascilite Wellington 2012 (pp. 414-423). Wellington, NZ.

Jones, D., Luck, J., McConachie, J., & Danaher, P. A. (2005). The teleological brake on ICTs in open and distance learning. In 17th Biennial Conference of the Open and Distance Learning Association of Australia. Adelaide.

Kay, A. (1984). Computer Software. Scientific American, 251(3), 53-59.

Kezar, A. (2001). Understanding and Facilitating Organizational Change in the 21st Century: Recent Research and Conceptualizations. ASHE-ERIC Higher Education Report, 28(4).

Kirkwood, A., & Price, L. (2013). Technology-enhanced learning and teaching in higher education: what is “enhanced” and how do we know? A critical literature review. Learning, Media and Technology, (August), 1-31. doi:10.1080/17439884.2013.770404

Koopman, P., & Hoffman, R. (2003). Work-arounds, make-work and kludges. Intelligent Systems, IEEE, 18(6), 70-75.

Lane, K. (2014). The University of API (p. 28). Retrieved from http://university.apievangelist.com/white-paper.html

Laxon, A. (2013, September 14). Exams go online for university students. The New Zealand Herald.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist, 57(10), 1439-1459. doi:10.1177/0002764213479367

McKenney, S., & Reeves, T. C. (2013). Systematic Review of Design-Based Research Progress: Is a Little Knowledge a Dangerous Thing? Educational Researcher, 42(2), 97-100. doi:10.3102/0013189X12463781

OECD. (2005). E-Learning in Tertiary Education: Where do we stand? (p. 289). Paris, France: Centre for Educational Research and Innovation, Organisation for Economic Co-operation and Development. Retrieved from http://new.sourceoecd.org/education/9264009205

Podolny, J., & Page, K. (1998). Network forms of organization. Annual Review of Sociology, 24, 57-76.

Rahman, N., & Dron, J. (2012). Challenges and opportunities for learning analytics when formal teaching meets social spaces. In 2nd International Conference on Learning Analytics and Knowledge (pp. 54-58). Vancourver, British Columbia: ACM Press. doi:10.1145/2330601.2330619

Reid, I. C. (2009). The contradictory managerialism of university quality assurance. Journal of Education Policy, 24(5), 575-593. doi:10.1080/02680930903131242

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17-46.

Rushkoff, D. (2010). Program or be programmed: Ten commands for a digital age. New York: OR Books.

Salmon, G. (2005). Flying not flapping: a strategic framework for e-learning and pedagogical innovation in higher education institutions. ALT-J, Research in Learning Technology, 13(3), 201-218.

Scribner, J. (2005). The problems of practice: Bricolage as a metaphor for teachers’ work and learning. Alberta Journal of Educational Research, 51(4), 295-310. Retrieved from http://ajer.journalhosting.ucalgary.ca/ajer/index.php/ajer/article/view/587

Selwyn, N. (2008). From state‐of‐the‐art to state‐of‐the‐actual? Introduction to a special issue. Technology, Pedagogy and Education, 17(2), 83-87. doi:10.1080/14759390802098573

Selwyn, N. (2012). Social media in higher education. The Europa World of Learning. Retrieved from http://www.educationarena.com/pdf/sample/sample-essay-selwyn.pdf

Selwyn, N. (2013). Digital technologies in universities: problems posing as solutions? Learning, Media and Technology, 38(1), 1-3. doi:10.1080/17439884.2013.759965

Siemens, G. (2008). What is the unique idea in Connectivism? Retrieved July 13, 2014, from http://www.connectivism.ca/?p=116

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3). Retrieved from http://tojde.anadolu.edu.tr/tojde15/articles/sturgess.htm

Truex, D., Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53-79.

USQ. (2012). University of Southern Queensland 2011 Annual Report. Toowoomba. doi:10.1037/e543872012-001

Visscher-Voerman, I., & Gustafson, K. (2004). Paradigms in the theory and practice of education and training design. Educational Technology Research and Development, 52(2), 69-89.

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5-23.

Weimer, M. (2007). Intriguing connections but not with the past. International Journal for Academic Development, 12(1), 5-8.

Zellweger, F. (2005). Strategic Management of Educational Technology: The Importance of Leadership and Management. Riga, Latvia.

The IRAC framework: Locating the performance zone for learning analytics #ascilite

The following is a draft version of the presentation I’ll be giving at ASCILITE tomorrow. Hopefully this will go close to fitting within the 12 minute time frame.

Other resources

The paper on which this presentation is based is available. As is @cfellows’ insightful and interesting annotated response (an accessory every #ascilite paper should come with).

The slides are also available on Slideshare.

The presentation


Slide01

The aim of this talk is to give an introduction to, and rationale for, the IRAC framework. In short, the rationale is that use of the IRAC framework – especially once it’s more complete – provides a useful lens to improve the likelihood that learning analytics interventions will actually be used by learners, teachers and others, and subsequently be more likely to improve learning.

Slide02

The motivation for this work is our observation that most of what universities are doing to implement learning analytics relates metaphorically to the steaming pile in this image. It’s also based on our beliefs and observations that the literature around learning analytics has some areas of over- and under-emphasis.

Slide03

We’ve been especially annoyed/frustrated with the influx of business intelligence folk and vendors into the learning analytics area. In no small part because they haven’t really been able to get business intelligence working effectively in its much simpler and older business settings. If they couldn’t get it to work effectively there, they certainly won’t be able to get it to work in an area as complex and different as learning and teaching.

But I wouldn’t want to limit my criticism to these folk. I don’t think that many of the folk responsible for the transformational improvements to the quality of learning and teaching from the wildly successful application of technologies such as the LMS, lecture capture and copy detection within universities are likely to do significantly better with learning analytics.

And finally, I have some significant cognitive limitations. I need some help thinking about how to design learning analytics interventions.

Slide04

Which leads to the question of how we can help do this better? How do you do it better?

This is where the IRAC framework comes in. We think that there is value in having a framework that can be used to scaffold the analysis, evaluation and design of learning analytics interventions. In fact, we found ourselves wanting and needing such a framework. Especially one which didn’t necessarily suffer some of the limitations of existing frameworks.

The IRAC framework is our early attempt to achieve this.

Slide05

Because it is still early days, we are still questioning much of the IRAC framework and hope to hear from you what your questions might be about it.

Slide06

So, rather than define learning analytics, I’m going to start by defining what we think the purpose of learning analytics is.

For us learning analytics is yet another tool in our arsenal to help improve learning. There is no point in learning analytics unless it can help improve learning.

In order for this to happen, the learning analytics interventions we design have to be integrated into the practice of learning and teaching. If the students, teachers and others involved aren’t using these learning analytics interventions, then there is no way the interventions can help improve learning.

To be clear, like much of institutional e-learning, we don’t think most institutional learning analytics interventions are destined to be used to any great level of quantity or quality.

Slide07

There are already a range of models looking at various aspects of learning analytics. In fact, the Atif paper at this conference adds a nifty little conceptual analysis of one part.

The model we’re going to use here is one by George Siemens from his journal article earlier in the year. Throughout the presentation, as we introduce the IRAC framework, we’ll show how it relates to the Siemens model. This will help make one of our points about the apparent over-emphasis on certain topics that perhaps aren’t as important as others.

Slide08

To illustrate the value of the IRAC framework we’re going to use it to analyse an example learning analytics intervention.

In applying the IRAC framework, we currently think it should always start by thinking of it applied to a particular context – a nuanced appreciation of context is the main defence against the adoption of faddish IT innovations – and with a particular purpose.

The context we’re going to consider is CQUniversity and the purpose is identifying at-risk students – a purpose that is currently being considered by most institutions.

Slide09

Of course, in keeping with my identified purpose of learning analytics, I think the purpose should be rephrased as helping at-risk students. Identifying them is pointless if nothing is done to help them.

Slide10

This is the SSI system. This particular view is what the course coordinator – the person in charge of a single course or unit – would see. It lists all of the students in their course ranked from lowest to highest on the basis of an “Estimation of Success” indicator.

I imagine your institution has, or is developing, something that fulfills a similar purpose. The Atif paper also at this conference actually looked at three such tools: ECU’s C4S, UNE’s AWE and OUA’s PASS. Everyone is doing it.

So how do we use IRAC to analyse and think about SSI?

Slide11

Let’s start at the beginning. IRAC is an acronym. The first part is I for Information: what information are you drawing on, how are you analysing it, and what considerations around ethics, privacy and so on exist?

As you can see from this representation, at least three-quarters of Siemens’ model is focused on Information. No great surprise that we think this emphasis is not helpful. It’s a necessary first step, but it is by no means a sufficient step if your purpose for learning analytics is to improve learning.

Slide12

In fact, given that most of the people involved in the origins of learning analytics come from a data mining, business intelligence or computer science background, this emphasis on information is no great surprise. It’s a perfect example of Kaplan’s law of the instrument, i.e. if all you have is a business intelligence system, then every task looks like…..

Slide13

The IRAC framework is still very much under development and this is the first public run of the “Information – SAO” model. The idea is that when you’re pondering the Information component of a learning analytics intervention, a useful way of thinking about it might be the combination of Source, Analysis and Output – SAO. Let’s demonstrate with SSI.

The source information from which the EOS is calculated includes demographic and enrolment information about the student (e.g. # of courses enrolled, # passed, their GPA etc) and their activity in the LMS. This is a fairly standard combination of basic data used in most similar systems.

The Analysis – how the raw information is transformed into the output – is by way of a formula. Essentially, points are “awarded” based on the value of each piece of information, i.e. if you’ve failed 100% of your courses and are enrolled in 8 courses you are going to get a very large negative number of points.

The output is a single number, which is eventually represented (the second component of IRAC) as a traffic light.
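
To make the SAO idea a little more concrete, below is a minimal Python sketch of what a points-based Estimation of Success calculation and its traffic-light representation might look like. To be clear, the variables, weights and thresholds are invented for illustration – this is not SSI’s actual formula.

    # Hypothetical sketch of an EOS-style Analysis and Output step.
    # Weights and thresholds are invented, not taken from SSI.

    def estimation_of_success(gpa, courses_enrolled, fail_rate, lms_clicks):
        """Combine source data into a single score (the Output)."""
        points = 0.0
        points += (gpa - 4.0) * 10            # credit/penalty around a mid-range GPA
        points -= courses_enrolled * 2        # heavier enrolment load adds risk
        points -= fail_rate * 50              # proportion of past courses failed
        points += min(lms_clicks, 200) * 0.1  # capped credit for LMS activity
        return points

    def traffic_light(points):
        """Represent the single number as a traffic light."""
        if points < -20:
            return "red"
        if points < 10:
            return "yellow"
        return "green"

    # e.g. a student who has failed every course so far and is enrolled in 8 courses
    score = estimation_of_success(gpa=2.0, courses_enrolled=8, fail_rate=1.0, lms_clicks=5)
    print(score, traffic_light(score))  # a large negative number, shown as "red"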

Slide14

The idea here is that you can start to compare other systems. So drawing on the Atif paper from yesterday it’s possible to say that the AWE and PASS systems (similar to SSI) draw on discussion forum and social media pages as additional sources of information. Similar comparisons can be done with the other parts of the SAO model and of the IRAC framework.

Slide15

It’s possible to make other observations about SSI. The proxy it uses for learner behaviour/activity is clickstream data: how many times they’ve clicked on the course website. This focus on the clickstream as the source of information about the learner is problematic for a number of reasons that have been explained in the literature.

Slide16

We are doing work that extends beyond the clickstream but, due to the constraints of a 12 minute presentation, I’m not getting into details here. Very quickly: I’m the author of a Moodle module called BIM – find out more at the URL on the slide. BIM aggregates, mirrors and manages students’ blogging. The data BIM focuses on is not clickstreams, but what students are writing – the students’ blog posts. It is through this that BIM allows moving away from simple behaviour – which is all you can get from the clickstream – to information that has some cognitive component.
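
As a rough illustration of the difference between clickstream counts and information with a cognitive component, here is a small hypothetical Python sketch that derives simple content-based features from a student’s blog posts (post counts, word counts, presence of reflective phrases). This is not BIM’s code, and the marker phrases are invented – it just shows the kind of information blog posts make available that clicks cannot.

    # Hypothetical sketch: content-based (not click-based) features from blog posts.
    REFLECTIVE_MARKERS = ["i learned", "i realised", "i wonder", "next time i"]

    def post_features(posts):
        """posts: list of blog post strings written by one student."""
        text = " ".join(posts).lower()
        return {
            "num_posts": len(posts),
            "total_words": len(text.split()),
            "reflective_phrases": sum(text.count(m) for m in REFLECTIVE_MARKERS),
        }

    sample_posts = [
        "This week I learned how feedback loops shape motivation.",
        "I wonder whether my tutoring plan suits weaker students.",
    ]
    print(post_features(sample_posts))
    # {'num_posts': 2, 'total_words': 18, 'reflective_phrases': 2}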

Slide17

In Sunday’s keynote at the A-LASI workshop Dragan Gasevic used the term “predict-o-mania” to label the practice of using the same predictive model for all situations with a complete ignorance of context. He then showed a range of research demonstrating just how silly such a practice is.

This is a significant weakness of SSI. The Estimation of Success is calculated using the same formula for all courses regardless of context. The formula has been tested with past data at the institution and is somewhat generalisable, but this is still a significant limitation.

This limitation has been identified by the Desire2Learn folk and their system – alongside other tools like the Moodle engagement block – provides support for more context-specific variety in the models being used.

Slide18

Another observation from the Desire2Learn folk identifies a further limitation with the EOS calculation: it is a black box or, in the words of Judy Kay from Sunday, it’s not scrutable. In fact, Colin reports that academics using SSI have asked first to see the detail of the EOS analysis/formula so that they could understand what it is telling them.

Slide19

From this we can start populating the IRAC framework for SSI. The idea is that if you did this for different learning analytics interventions you could start making some judgements about how well an intervention might suit your context and, if you’re the developer of such a system, where you might like to make improvements. You could also use this type of analysis to compare different learning analytics interventions and draw conclusions about their appropriateness for your context and purpose.
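
As a hypothetical illustration, “populating the framework” could be as simple as recording observations against each IRAC component per intervention so they can be compared side by side. The entries below are examples drawn from this talk, not a complete or authoritative analysis.

    # Hypothetical sketch: IRAC observations recorded per intervention for comparison.
    irac_notes = {
        "SSI": {
            "Information": ["demographics + enrolment + LMS clicks",
                            "single formula for all courses; not scrutable"],
            "Representation": ["integrated", "tabular"],
            "Affordances": ["intervention log", "mail merge"],
            "Change": ["evolutionary development embraced"],
        },
        "Course Signals + PassNote": {
            "Information": ["formula producing red/yellow/green"],
            "Representation": ["traffic lights"],
            "Affordances": ["suggested messages by colour and topic"],
            "Change": ["PassNote grew out of Course Signals experience"],
        },
    }

    # e.g. compare the affordances offered by each intervention
    for name, notes in irac_notes.items():
        print(name, "->", notes["Affordances"])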

Slide20

The R in IRAC is representation. You’ve gathered your information and analysed it; now you have to represent it so that people can understand and act upon it. The trouble is that most institutional implementations of learning analytics pay too little attention to representation, though the research literature does have some people doing some interesting things.

This is perhaps the first adoption barrier for learning analytics. If the representation is hard to understand or hard to access (or inappropriate) it won’t be used.

Slide21

One of the advantages SSI provides is that the representation of the data does effectively integrate a range of information that was previously not available in one place. Too often the information silos are a result of different institutional systems (and their support and control processes) that never talk to each other. The value of bringing this information together in a form that is easily accessible to teaching staff is not to be under-estimated.

The representation in this case is tabular. It hasn’t made significant use of advanced visualisation techniques. This might be seen as a problem.

Slide22

So that adds integrated and tabular to the IRAC framework for SSI. A tick here indicates presence in the tool, not that the particular feature is good or bad.

Slide23

A good time to mention my opinion that “dashboards suck”.

Dashboards are the other focus for institutional learning analytics projects and they suck. But they also illustrate the other limitation of much of the learning analytics work. It stops at representation.

i.e. a dashboard represents the finding, but it doesn’t help you do anything. Dashboards tend to have few or no affordances for action.

Slide24

As mentioned above, we believe that learning analytics is only useful if it leads to changes in learning and teaching. It has to lead to action. The A in IRAC is affordances. What sort of actions does the learning analytics application afford? What does it help people do in response to the insight it represents?

Slide25

Just quickly, the theoretical foundations of the IRAC framework arise from Don Norman’s work on cognitive artefacts, in particular how the literature around Electronic Performance Support Systems (EPSS) has used these principles to develop the idea of the performance zone. This is talked about more in the paper, if you’re interested.

Affordances are not a new topic for the ASCILITE crowd. In short, the idea here is that a learning analytics tool should help or make easy certain actions on the part of the appropriate individuals. If action isn’t afforded, then it is likely that nothing will get done. If nothing gets done, then how does learning improve?

With recent literature highlighting the increasing workload associated with online/blended delivery in university learning and teaching, the idea of systems that make the right tasks easier sounds good, though it does raise a range of questions about what is “right”, and many, many more.

Slide26

So what affordances for action does SSI provide? One is the idea of interventions. There’s an intervention log that allows course coordinators to record details of various types of interventions and to show when those interventions occurred in relation to the students involved. It also provides a mail merge facility that makes it easier to provide apparently personal messages to groups of students.

Slide27

After selecting students from the SSI interface, the mail merge allows the course coordinator to frame an email message. The message can include variables – picked from several provided lists – that will be replaced with student-specific values when the email is sent. Experience shows that many students currently see these emails as personal.
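
In principle, that kind of mail merge is just template substitution per student. The following Python sketch shows the idea; the placeholder syntax, field names and message text are invented for illustration and are not taken from SSI.

    # Hypothetical mail-merge sketch: per-student values substituted into a template.
    from string import Template

    template = Template(
        "Hi $first_name,\n\n"
        "I noticed you haven't accessed the $course_code site recently. "
        "If anything is getting in the way, please get in touch.\n"
    )

    students = [
        {"first_name": "Alex", "course_code": "EDU12345", "email": "alex@example.com"},
        {"first_name": "Sam", "course_code": "EDU12345", "email": "sam@example.com"},
    ]

    for student in students:
        body = template.substitute(student)  # student-specific values swapped in
        print("To:", student["email"])
        print(body)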

Slide28

Affordances are not a given. What is afforded by an artefact depends on the people doing the perceiving. In addition, exaptation – the use of the artefact for unintended purposes – may play a role.

For example, experience has shown that up to 30% of the messages being sent through the SSI mail merge are not related to at risk students. Instead, course coordinators are using it to distribute information to students. Obviously, there is something about the affordance offered by the SSI email merge tool that is missing from the range of other available tools.

Slide29

There are other, less obvious affordances built into SSI. One is the default presentation of the information. The students at the top of the table are those at risk, and we know that those at the bottom of the list are less likely to receive attention. SSI affords a focus on those students at risk. This may be a good thing, but it also means that students in the middle, or those who are doing very well, are likely to receive less attention, if any at all.

Slide30

SSI’s email merge facility is arguably an example of an important type of affordance for these systems that I’ve labelled “CRM”, as in Customer Relationship Management. It’s not a great name, but it links to the closest common functionality that many in higher education are currently familiar with: the idea of something that scaffolds appropriate communication.

The idea of affordances for action is somewhat under-represented in work around learning analytics, but it’s coming. There’s much work to be done identifying what affordances might be useful in a range of contexts and exploring what might make sense within something called “CRM” functionality.

Slide31

Arguably related to the idea of affordances is the proposal from Lockyer et al (2013) of checkpoint and process analytics – analytics that are specific to particular learning designs. This is obviously something that a system like SSI does not provide, but when integrated into tools that support specific types of learning designs it opens up the possibility of specific affordances. I’m particularly interested in the affordances learning analytics offers to a tool like BIM and its intent to encourage students to engage in reflection and to construct a PLN.

Slide32

The PassNote app is another example of what “CRM” affordances might include. PassNote is from the folk at Purdue University who produce Course Signals. Course Signals is perhaps the most famous SSI-like learning analytics intervention – especially in recent times for perhaps not the best of reasons.

Course Signals uses a formula to identify students as “red”, “yellow” or “green” based on their level of “at-riskness”. PassNote is designed to help the teacher frame the messages to send to the student(s).

Slide33

PassNote provides a range of potential messages that can be chosen on the basis of the students’ “colour” and the topic of the message. The content of these messages has apparently been designed on the basis of research findings.

This page shows the range of possible messages that a teacher might select for a student showing up as “red” if the topic of concern is “Attendance”. It appears that at this stage the teacher must copy and paste the suggested message content into whatever communication mechanism they are using. A merger between this functionality and the email functionality of SSI might be useful.

The four labels added to this page are summaries of some of the principles underpinning the content of these messages and are taken from the PassNote page.
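
To illustrate how “message by colour and topic” selection could be wired into a CRM-style affordance, here is a small hypothetical Python sketch. The colours, topics and message text are placeholders; PassNote’s actual, research-informed content is not reproduced here.

    # Hypothetical sketch: suggested messages looked up by (status colour, topic).
    SUGGESTED_MESSAGES = {
        ("red", "attendance"): [
            "I've noticed you've missed several classes recently. Can we talk "
            "about what support would help you get back on track?",
        ],
        ("yellow", "attendance"): [
            "Your attendance has slipped a little - attending regularly makes "
            "a big difference to results in this course.",
        ],
    }

    def suggestions(colour, topic):
        return SUGGESTED_MESSAGES.get((colour, topic), [])

    for message in suggestions("red", "attendance"):
        print(message)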

Slide34

Perhaps stretching a bit, but PassNote is encroaching on the idea of “pedagogic scaffolding”, i.e. the system – like PassNote – draws on a range of theories or research findings to improve and scaffold any action the person might take. SSI doesn’t provide this affordance.

Slide35

Just quickly, another example of “pedagogic advice”. This is a screenshot from a project led by Dan Meyer to help students use mathematics to model a situation and make predictions. All of the activity takes place within this environment and, based on what the student does, the system offers some pedagogical scaffolding to the teacher in the form of questions they may wish to ask particular students.

Slide36

The C in IRAC stands for change, and the loop in George’s model captures this nicely. However, in my experience the people involved with university e-learning pay almost no attention whatsoever to the need to respond productively to on-going change – even with all the rhetoric within the university sector about the constancy of change. In fact, my ASCILITE paper from last year argues that the conceptions of product (e.g. the LMS as an enterprise system) and process (big up-front design) that are endemic to university e-learning are completely and utterly unsuitable for the nature of the task.

For us, it is fundamental that any learning analytics intervention has built into it the ability to change – and to change everything: the information it collects, the analysis methods it uses, how it represents the insight, and the affordances it provides. It’s also important how quickly it can change, how fine-grained that change can be, and who can change it.

Slide37

PassNote also offers an example of one of the main rationales for change.

The PassNote app wasn’t originally part of the Course Signals project. PassNote arose out of the experience with Course Signals, especially the observation that the actions being taken by teaching staff in response to the Course Signals information were less than optimal.

The experience of using Course Signals generated new requirements that had to be addressed.

Slide38

This is not a new trend. HCI research even has a name for this type of situation. It’s called the task-artifact cycle.

The task of identifying at risk students generated some requirements that led to the development of the Course Signals artifact. The use of Course Signals created some new possibilities and tasks, i.e. the need to communicate effectively with the at risk students to ensure something helpful was done. This in turn generated a new set of requirements that led to the development of PassNote.

The important point here is that the cycle doesn’t end with PassNote.

What happens when 30%, 50% or 100% of the teaching staff at a University start using Course Signals and PassNote? What possibilities might this create? What new tasks? What new requirements?

Slide39

So the task-artifact cycle gets added under change.

One of the strengths of SSI is that the designers are very much aware of this need. SSI is not a traditional Information Systems project where the assumption is that the smart people running the project can predict what will be needed before the project is underway. The initial “identification of at risk students” purpose was mostly a label to get buy-in from senior management, because everyone else is doing it. The actual intent of the project is much more ambitious.

For this reason, the project is not implemented within a heavy-weight, enterprise-level IT infrastructure, because such infrastructures may be reliable but they are also incredibly static. They can’t respond to change.

i.e. the SSI Project has rejected the conceptions of process and product that infect institutional e-learning.

Slide40

The need for change is further supported by 30-odd years of research into Decision Support Systems (DSS) – of which data warehouses are a part, and with which I assume learning analytics has a close relationship. That research has established some fundamental design principles, including evolutionary development.

Arguably the ability to change is more important than all of the other components of the IRAC framework. After all, how can you learn if you are unable to change?

Slide41

The need for change has also been identified in the learning analytics literature.

One of the big dangers an inability to change brings to learning analytics is that much of what learning analytics is based on – e.g. the clickstream – is data that is simple to log, data that comes from the systems and the pedagogical practices we currently have. Those pedagogical practices are perhaps not all that they should be, now or into the future. If we are not to be caught in the morass of historical and crappy processes, then evolutionary development of learning analytics – and of all e-learning – is essential, and for us it is largely absent.

What I think will be most interesting about what Colin and Damien are doing at CQUniversity is that they have embraced evolutionary development – and, for now, are being somewhat allowed by the institution to do so. The technologies and approaches they are adopting allow them to evolve SSI, MAV and the other systems they are working on much more quickly, and more in step with the needs of the students and teachers at CQUniversity, than other approaches I’ve seen. It’s this that is going to give them a much greater chance of getting their learning analytics interventions integrated into practice.

It’s not a question of how badly (or how well) you start, it’s a question of how quickly you can get better.

Slide42

One illustration of this ability to change is a recent addition to SSI. This column in the SSI output summarises the total number of clicks the student has made on the Moodle course site for this course for the entire semester. This total has always been there; what hasn’t been there is the link. Each number is now a link.

Slide43

If you click on that link you get taken to the Moodle course site for the course – but not the standard Moodle course site like the one you see here.

Slide44

Instead, SSI uses another system under development at CQU (MAV – the Moodle Activity Viewer) that modifies the entire Moodle course site to generate a heat map of clicks. In this case, the heat map MAV generates shows where a particular student has clicked and how many times. This particular student hasn’t clicked many times, so you can’t actually see much difference in the heat map, i.e. there are no red areas indicating large numbers of clicks.

Now this still relies on the clickstream, i.e. behavioural and not cognitive data. However, it is a representation that can help the teacher make more informed decisions than simply having the number of clicks. An example of learning analytics helping by augmenting human decision making rather than replacing it – in this case, helping the teacher draw on their knowledge of the course structure and what is happening to leverage the limited value of the clickstream.
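
As a rough idea of the kind of transformation a click heat map involves, here is a minimal Python sketch that scales each link’s click count to a background colour. MAV itself modifies the live Moodle page (its source is on github); this is not its implementation, just an illustration of the representation.

    # Minimal heat-map sketch: more clicks = redder background. Not MAV's code.
    def heat_colour(clicks, max_clicks):
        """Return a CSS rgb() string for a link's background."""
        if max_clicks == 0:
            return "rgb(255, 255, 224)"        # no activity anywhere
        intensity = clicks / max_clicks        # 0.0 .. 1.0
        green = int(255 - 155 * intensity)     # fade from yellow towards red
        return f"rgb(255, {green}, 100)"

    link_clicks = {"Week 1 overview": 120, "Assignment 1": 45, "Week 10 readings": 3}
    top = max(link_clicks.values())

    for link, clicks in link_clicks.items():
        print(f"{link:20s} {clicks:4d} clicks  background: {heat_colour(clicks, top)}")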

Slide45

This adds the idea of “in place” representation. It also provides a link back to the Electronic Performance Support Systems literature that underpins IRAC and the idea that we should be providing help to folk at the point they need it most – not separate to the learning environment, but embedded within it. Not a dashboard implemented in some data warehouse that means I need another application to access it, but embedded in the learning environment. By providing the information as part of the learning environment – rather than somewhere else – the cognitive load is reduced and the existing knowledge of the student/teacher can be more easily leveraged.

Slide46

MAV was actually developed separately from SSI. Its original purpose was to allow teaching staff to generate a heat map of the clicks on their course site for all students, groups of students or individual students. This particular representation is of MAV working for all students on a different course site, in this case showing the number of students rather than the number of clicks.

Since the work at CQUni recognises the importance of change, it was possible for them to quickly combine these two systems.

Slide47

Time to finish – perhaps past time. Our hypothesis here is that the IRAC framework – especially in its completed state – offers insights that can help in the analysis, design and implementation of learning analytics interventions. We think that this analysis, done with a particular context and purpose in mind, will lead to learning analytics interventions that are more likely to be used and thus more likely to actually improve learning.

Slide48

And, in short, provide a way in which university learning analytics interventions are less like a stinking pile of …..

Slide49

Still very early days. We’ve got lots of work still to do and lots of questions to ask. It would be great to start with yours – questions?

Some of the short-term work we have planned includes an analysis of the learning analytics literature, for two reasons:

  1. Identify a range of topic lists, frameworks, models and examples that fit under each of the IRAC components, and
  2. Explore what, if any, components of the IRAC framework are under-represented in the literature.

We’re also keen to undertake some design-based research using the IRAC framework to design modifications to a range of learning analytics interventions.

Of course, this doesn’t capture the full scope of the potential questions of interest in all of the above.

Slide50

Slide51

Slide52

Image attribution

Slide 5, 49: “Question Everything / Nullius in verba / Take nobody’s word for it” by Duncan Hull available at http://flickr.com/photos/dullhunk/202872717 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 53: “University of Michigan Library Card Catalog” by David Fulmer available at http://flickr.com/photos/dfulmer/4350629792 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 3: “Warehouse” by Michele Ursino available at http://flickr.com/photos/micurs/6118627854 under Attribution-ShareAlike License http://creativecommons.org/licenses/by-sa/2.0/

Slide 15: “Stream” by coniferconifer available at http://flickr.com/photos/coniferconifer/9535872266 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 50, 51, 52: “The British Library” by Steve Cadman available at http://flickr.com/photos/stevecadman/486263551 under Attribution-ShareAlike License http://creativecommons.org/licenses/by-sa/2.0/

Slide 17: “Lawyer Crystal Ball” by CALI – Center for Computer-Assisted Legal Instruction available at http://flickr.com/photos/cali.org/6150105185 under Attribution-NonCommercial-ShareAlike License http://creativecommons.org/licenses/by-nc-sa/2.0/

Slide 23: “Dashboard” by Marko Vallius available at http://flickr.com/photos/markvall/3892112410 under Attribution-NonCommercial-ShareAlike License http://creativecommons.org/licenses/by-nc-sa/2.0/

Slide 4: “framework” by kaz k available at http://flickr.com/photos/kazk/198640938 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 47: “25.365” by romana klee available at http://flickr.com/photos/romanaklee/5391995939 under Attribution-ShareAlike License http://creativecommons.org/licenses/by-sa/2.0/

Slide 18: “The Internet” by Martin Deutsch available at http://flickr.com/photos/MartinDeutsch/3190769121 under Attribution-NonCommercial-NoDerivs License http://creativecommons.org/licenses/by-nc-nd/2.0/

Slide 8, 9: “day 140” by mjtmail (tiggy) available at http://flickr.com/photos/mjtmail(tiggy)/2518317362 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 6: “Purpose” by Seth Sawyers available at http://flickr.com/photos/sidewalkflying/3534131757 under Attribution License http://creativecommons.org/licenses/by/2.0/

Slide 12: “Making Omelettes” by PhotoGraham available at http://flickr.com/photos/PhotoGraham/260939952 under Attribution-NonCommercial-ShareAlike License http://creativecommons.org/licenses/by-nc-sa/2.0/

Slide 2, 48: “Smoking pile of sh*t” by David Jones available at http://flickr.com/photos/DavidTJones/3626888438 under Attribution-NonCommercial-ShareAlike License http://creativecommons.org/licenses/by-nc-sa/2.0/

Slide 40: “Change Allley sign” by Matt Brown available at http://flickr.com/photos/MattFromLondon/3163571645 under Attribution License http://creativecommons.org/licenses/by/2.0/

Reviewing the past to imagine the future of elearning #ascilite

Cathy Gunn, Reviewing the past to imagine the future of elearning

The technologies that make a difference aren’t those that are hyped.

Comment: But the example given – access to online journals – is an example of the Web, the information superhighway; perhaps even Vannevar Bush. It was/is hyped, but it wasn’t hyped as being specific to education.

1993 – the idea that “the fundamental nature of teaching and learning is shifting”. Link to constructivist etc. Hypertext, multimedia. The start of massification/diversification.

2013 – shift still happening. But now Connectivism – shifting to – collectivism and peer learning.

From review/recommend to collaborate/critique

Specialised use to mass engagement e.g. ride on a plane and see the use.

Comment: But the type of “mass” engagement is still very unique/individual. The platform enables self-customisation. Not something that applies to institutional systems – is that a factor in their limited use?

50 years of learning technology research required to find that “the type of media has no reliable effect on learning”

So, what should we study and what methods should be used?

Links to Gunn & Steel (2012)

Sticky problems in 2013?

  • Most teachers still don’t make use of the potential.
  • Push to standardise, secure and control at odds with open access, free tools & experimentation
  • Research methods are not established enough to move field forward
  • Funding models don’t provide for sustainable development

Predictions

  • LA + the ability to explore learning design intent will add a missing link to methodology
  • Discoverable OERs, MOOCs etc will realise the dream of a learning object economy
  • Imperatives for change & the affordances of technology will synergise developments.
  • Dominant designs will emerge and re-engineer IT industries
  • Power of collective consciousness will transform education and knowledge creation.

Enhancing learning analytics by understanding the needs of teachers #ascilite

Linda Corrin, Gregor Kennedy, Raoul Mulder, Enhancing learning analytics by understanding the needs of teachers

Looking at the needs of lecturers.

LA is still new and emerging.

Research focus is on tools or specific problems.

Note: How does that link to the IRAC idea of affordances being limited?

All this research is being fed down to the teachers. This research is trying to go the other way.

Based on committee work at Uni Melbourne. What do lecturers really need to know?

  1. What are the key L&T problems/situations that teachers face for which learning analytics could be useful?
  2. What data could be used to address these problems?
  3. ???

Ran focus groups with selected samples of undergraduate degrees. ADL&T and Program Coordinators nominate “important” teachers.

Findings

  • student performance – at risk students, attendance, access to learning resources, participation in communication in class settings, performance in assessment

    A lot of staff wanted to see what the data would say about the combination of student performance + engagement. Leading to the concept of the “ideal student”.

    Different groups had different thoughts on whether students should have access to data

    Provision of feedback – combining performance and engagement. How do/can the students interpret the feedback.

  • student engagement –
  • the learning experience – greater understanding of how students develop knowledge; track prior knowledge and its development through learning activities. Data??
  • quality of teaching and the curriculum – automated textual analysis of messages students sending to student support services/discussion forums; formative and summative assessment to identify areas for review; access to support resources
  • administrative functions associated with L&T – assessment of consistency of student placements; enrolment and profiling tutorial groups; tracking safety requirements for field trips; student selection of units

Issues

  • needs not currently met by available presentations – level of detail; timing; multiple data sources
  • Skills and time to interpret
  • How to measure learning
  • privacy/ethics
  • Impact on curriculum design – i.e. management saying you must use tool X so we can measure

Actions

  • professional development on LA
  • policy guidelines

Summary

Interesting findings, but limited by the constraints of the “requirements analysis” process, i.e. the assumption that people can think of all the factors in a situation where they haven’t had much experience with a system or indeed aren’t in the process of using one. Especially given the early comments that the learning analytics field itself is still at an early stage of development.

A window into lecturers’ conversations #ascilite

Negin Mirriahi – A window into lecturers’ conversations: With whom are they speaking about technology and why does it matter?

How can HE institutions enhance technology adoption?

How can the top down initiatives be improved. How do we engage academics in using the technologies we want them to?

Comment: Well choosing it for them is perhaps not a good start. (Though unavoidable).

Interviews with 23 lecturers in foreign language teaching.

How did they hear about the LMS? A large proportion heard from colleagues. The aim is to drill down on what those discussions were about.

Questionnaires were used to identify a range of things, including who they are talking with. Network maps show connections about who is talking to whom.

So what types of conversations are they having?

Informal conversations –
Formal conversations – formal meetings for a project team, formal discipline networks

Finding (not sure how this was concluded): it’s the formal and informal conversations that make the difference, i.e. not the workshops etc. that are put in place.

What about those who don’t have conversations? Some quotes

We already have the tools, I explore it myself, I’m good at learning computer things

Seems to indicate that these folk might be good mentors – how to do that?

What can we do?

Provide opportunities for

  • informal conversations – shared offices, e-learning showcase, conferences
  • formal conversations – regular meetings, mentorship, CoP
  • other – workshops, educational technologists support, online resources

Question: What about the informal conversations at the moment of need? i.e. a helpdesk process that hasn’t had the informal chat nature abstracted away by the adoption of IT enterprise helpdesk system. Helping connections between the different people across an organisation be created.

Using the e-learning Maturity Model to Identify Good Practice in E-Learning #ascilite

Live blogging from a talk by Stephen Marshall – Using the e-learning Maturity Model to Identify Good Practice in E-Learning

Different ways of talking about quality as

  • perfection
  • exception – surpassing of standards
  • functionality – degree of utility.
  • adequate return – cost benefit.

Comment: Quality as a big stick.

Focus here is quality as sensemaking. Not as ranking, ordering etc.

We shall never be able to escape the ultimate dilemma that all our knowledge is about the past, and all our decisions are about the future.

Wilson (2000), From scenario thinking to strategic action

Brief description of maturity models – assumes “continuous improvement” – optimising is the ultimate

Comment: Does this model actually apply in a dynamic environment? Can an organisation ever know

Showing the reports for Oz Unis against the eMM.

Universities not strong on self-criticism. Focused on looking good in public.

Comment: Surprise, surprise.

Without these conversations – acknowledging things need to be improved – limits way forward.

eMM based on heuristics and the idea that we don’t know yet how – as institutions – to do e-learning well.

Now using various elements of the eMM to illustrate examples of good practice at various Oz universities.

Summary

Perhaps the most useful application and perspective on quality and eMM that I’ve seen. Of course, when most senior management think about quality, sensemaking is perhaps the last thing they are thinking of. Especially given the observation that Universities aren’t good at being self-critical.

Not to mention “most universities don’t measure what they do” and the comment that this sort of work goes around in cycles as accountability becomes more important.

Sensemaking – #ascilite

Live blogging of workshop run by Associate Professor Gregor Kennedy – early work from MM mentioning audit trail. Something that Reeves and Hedberg (2003) criticise as being hard to impossible without the students themselves explaining.

Talked about as early skepticism which disappeared with the arrival of big data.

Comment: But perhaps the skepticism has just been swamped by the fad.

A fair bit of time on workshop activities

How are learning analytics used?

In order of prevalence

  • Detect at risk students – majority here
  • Teaching and learning research and evaluation
  • Student feedback for adaptive learning
  • Track students’ skills development within curricula

Sensemaking – fundamental issues

Process of analytics: measure, parse, analyse, interpret, report

Note: Sticking to the analytics as simply information, not a foundation for action as suggested in IRAC.

Each of the steps requires decisions to be made: metric selection, granularity of analysis, analysis sophistication, meaning making (behaviour != cognition), timely representation, provision to multiple audiences

Behaviour != cognition

Basic level analytics data record the behavioural responses of users

Some – free-text responses – can have a cognitive component

Cognitive component is absent

Thus easy to answer what, but not why.

metric selection

Dashboard views provide aggregated student or class view. Done in a way that is known or not known.

Typical metrics

  • How many times did they do something
  • How much time did they spend.
  • Some sort of standardised score – assessment etc.

granularity

At what level do you collect and analyse data

  • Every click
  • Key components of a task – particular aspects specific to a learning task
  • Key aspects of your online subject
  • Key aspects of your online course

Top down and bottom up

Computer science – bottom up – data mining for meaningful patterns.

L&T folk – top down – pedagogical models and specific learning designs

Hard to do it only one way. Usually a combination of both required

The IMS white paper on learning measurement for analytics identified as an example of someone starting to do both.

Note: this model might be useful for the 2009 extension paper.

The affordances of the tool also influence the analysis.

Has an iterative model of analysis from the macro down to the specific.

Interfaces for learning data visualisations – #ascilite.

Live blogging a workshop from Prof Judy Kay – a computer scientist from the user modelling, AIED and pervasive computing background. A focus on personalisation: putting people in control of personal data.

Interest in open learner models.

Learning analytics seen as a form of learner/user modeling – with interfaces.

How to create interfaces in LA?

  • User-centred approaches – start with where people are, hence the need to understand mental models: stakeholders, mental models, the problem
  • core tools and principles – starts to influence mental models
  • Case studies

It’s an exciting time as we can influence the shape of the core tools and principles.

Interfaces..visualisations

Why?

Fekete, van Wijk, Stasko, North (2008) – The value of information visualisation.

We’re hardwired – some visual tasks are preattentively processed.

How? No simple rules

Some principles

  • Individual data takes on more meaning when comparisons are supported: others, temporal, contextual

Note: Application for MAV

Patina: Dynamic heatmaps for visualising application usage – Matejka, Grossman, Fitzmaurice (2013)

  • Ability to show different footprints – allowing comparison

Case study

The problem – group work is hard and important, and creates problems. Stakeholders – the learner as an individual, team leaders, facilitators. Students with a capstone project, working in teams for a client. Also used by Masters of Education students. Using trac.

Build a tool – Narcissus – Upton and Kay (2009).

Integrated into trac – showing a comparison of user participation. Different colours for different types. Click on a cell and see the details about participation from that user for that cell. Since using this, they have never had to fail a group. They see the data; it’s visible. It worked as a conversation starter. Students were told the data would never be used for assessment.

Navigating the information space in an entirely new way based on what the people are doing.

Note: Potentially useful for BIM – self-regulation – comparisons

Sequence mining – identify individuals and what they are doing and group them into categories – managers, developers, loafers, other.

Current problems

  • Teacher – early identification of at-risk individuals
  • Learner – decision support: Am I doing well enough? Am I doing what is expected of me?
  • Institution – effectiveness of learning and teaching.

General principles

Bull and Kay (2007) – Student models that invite the learner in: The SMILI:() open learner modelling framework

Using this work as the foundation/source of principles

OLM – any interface to data that a system keeps about the learner.

Note: this literature would have some general principles for Information in IRAC.

What is open? How is it presented? Who controls access?

Purposes

  • Improving accuracy
  • Promoting learner
  • Helping learners to plan and/or monitor learning
  • Facilitating collaboration and/or competition
  • Facilitating navigation of the learning system
  • Assessment

Scrutable user models and personalised systems. Systems are deterministic.

Note: links to IRAC

Interfaces to substantial learner models – analysis of an SPOOC.

Mental models

The set of user beliefs. Kay doesn’t see enough about mental models in the learning analytics literature.

The importance here is that mental models influence what a user can “see” and “hear”, and how they interpret information. Clashes exist between user, programmer and expert mental models.

Pervasive technologies

Mention of orchestration as some of what drives this work.

Principles

  • Skill meters
  • Game elements
  • Good match to mental models

Summary

Some pointers to interesting research from the AI/HCI fields that could help inform learning analytics and prevent a lot of reinventing the wheel. But the observation made by the presenter that there are no principles does mean that the promise of how to build these visualisations hasn’t been answered directly in the workshop.