Category Archives: thesis

Tertiary course design is very poor, and we solve it by “blame the teacher”

The following is inspired by this tweet

which links to this newspaper article titled “Tertiary course design ‘very poor’”. An article certain to get a rise out of me because it continues the “blame the teacher” refrain common to certain types of central L&T people.

After 33 years of working in higher education in all parts in NZ, the US and UK, the one thing we’ve become very clear about in curriculum design is that our people in higher education need to actually be educated as educators to work at that level

This seems to imply then that all of the courses taught by those with teaching qualifications should be beacons of quality learning experiences. My observations of courses at a number of universities taught by graduates of higher education teaching certificates and by those in Faculties of Education would seem to indicate otherwise. Not to mention reports of “ticking the box” from colleagues at top universities required to complete graduate certificates in higher education teaching. i.e. they have to complete the certificate to have a job, so they complete it. They are successful products of formal education, they know how to successfully jump through the required hoops.

This is not to suggest there is no value in these courses. But it’s not the solution to the problem. It’s not even the best way to build knowledge of teaching and learning amongst academics.

The following figure is from Richardson (2005)

[Figure: Integrated model of teachers' approaches to teaching (Richardson, 2005)]

The finding from this research is that there can be significant differences between the espoused theories of teaching and learning and the theories in use (Leveson, 2004). Teachers can know all the “best” learning theory but not use that in their teaching. While teachers may hold higher-level views of teaching, other contextual factors may prevent the use of those conceptions (Leveson, 2004). Environmental, institutional, or other issues may impel teachers to teach in a way that is against their preferred approach (Samuelowicz & Bain, 2001). Prosser and Trigwell (1997) found that teachers with a student-focused approach were more likely to report that their departments valued teaching, that their class sizes were not too large, and that they had control over what was taught and how it was taught. In examining conceptions of e-learning held by academic staff, Gonzalez (2009) found that institutional factors and the nature of the students were the most relevant contextual factors influencing teaching.

Now, consider the world of Australian (and New Zealand?) Universities as we move into 2013. Do you think the environmental factors have gotten any better in terms of enabling teachers to teach in the ways they want? An increasing focus on retention, an increasingly diverse intake of students, decreasing funding, increasing use of e-learning, decreasing quality of institutional e-learning systems, increasing casualisation of the academic work-force, research versus teaching, increasing managerialisation and increasingly rapid rounds of restructuring… are any of these factors destined to encourage quality approaches to teaching and learning?

My argument is that given this environment, even if you could get every academic at a university to have a formal qualification in learning and teaching, there wouldn’t be any significant increase in the quality of student learning because the environment would limit any chance of action and only encourage academics to “tick” the qualifications box.

On the other hand, if the teaching and learning environment at a university wasn’t focused on the efficient performance of a set of plans (which limit learning) and instead focused on encouraging and enabling academics and the system to learn more about teaching and learning within their specific context…….

References

Leveson, L. (2004). Encouraging better learning through better teaching: a study of approaches to teaching in accounting. Accounting Education, 13(4), 529–549.

Prosser, M., & Trigwell, K. (1997). Relations between perceptions of the teaching environment and approaches to teaching. British Journal of Educational Psychology, 67(1), 25–35.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673–680.

Samuelowicz, K., & Bain, J. (2001). Revisiting academics’ beliefs about teaching and learning. Higher Education, 41(3), 299–325.

#ascilite2012 technical support and the tail wagging the dog

I’m slowly recovering from a week at conferences: first ASCILITE’2012 (#ascilite2012) and then the SoLAR Southern Flare Conference (#FlareAus). I was going to spend the week before preparing, but marking and other tasks intervened, so I spent much of the week writing presentations, which meant a couple of late nights and limited social interaction. Add in a couple of early flights and I’m a little tired and frustrated. This may come through in the following.

Perhaps the biggest frustration this week was the audio-visual support at #ascilite2012. This is summed up nicely by the following quote from the “Information for Presenters” page from the conference website

All authors are required to email their final PowerPoint presentation (with all embedded images and videos in the same folder) by no later than 20 November 2012.

Just to be clear on the point, the conference started on the 25th of November. That’s right, the expectation was that we’d have our presentations completed 5 days before the conference started.

This probably wasn’t going to happen for most people. So a follow-up option was provided (from the same page)

We would prefer that presenters use the equipment we provide in the venue as each venue will have mac and pc capability, therefore we ask for your presentation before hand, or at least 5 hours before your presentation at the event.

According to an overheard comment from one of the people organising the presentation support, this is how all conferences work.

Sorry, but no.

Tail wagging the dog

To me this is a perfect example of the tail represented by technology and the technologists wagging the dog.

[Image: Edu Doggy]

For at least the last 10 years I’ve been taking laptops to conferences. For me – and many others I know – our process is to work on the presentations until the very last minute due to two main factors. First, we’re busy. I didn’t get a chance to work directly on my presentation for #ascilite2012 until I left home to travel to Wellington. I didn’t really get into my #FlareAus presentation until the night before. Second, we like to incorporate insights, comments and events from the conference. In the hour or so before my #ascilite2012 presentation, the conference chair introduced the idea of FOMO to describe MOOCs and other hypes, and Neil Selwyn decried the absence of a focus on the present in educational technology research. Both points resonated strongly with my presentation, so I had to work them into it.

According to the conference’s AV requirements, these necessary changes were not possible.

Which is somewhat ironic given that the aim of the presentation, the paper and my thesis was to argue that university e-learning suffers from exactly the same problem.

Especially when the #ascilite2012 call for papers was talking about

Recent waves of global uncertainty coupled with local crises and government reforms are reshaping the tertiary education landscape.

Doing it with academics not possible

An extension to this proposition is that since the people, processes and products of university e-learning are inflexible, university e-learning is by definition done either “to” or “for” the academics, i.e. the tail wags the dog. The practice of e-learning is constrained by the people, process and product. This prevents university e-learning from being done “with” the academics, i.e. as a learning process. This was the theme picked up in our #FlareAus presentation.

The proposal is that learning analytics within universities will largely be done “to” and “for” academics, rather than “with” them. From this will follow a whole range of pitfalls, with the likely end result that learning analytics becomes yet another fashion, fad or band-wagon.

Evidence of workarounds

Just as I chose to ignore the requirements of the audio-visual folk at #ascilite2012, there was evidence at #FlareAus of people working around the requirements and constraints of university e-learning.

The presentation from Abelardo Pardo used the client-side (browser) approach to working around the inflexibility of the LMS (Moodle). i.e. staff install a browser plugin that identifies when a particular LMS web page arrives in the browser and adds something useful to the page.
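To make the client-side approach concrete, the following is a minimal sketch (in TypeScript, written as a browser extension content script) of what such a workaround can look like. It is not Abelardo’s actual plugin: the URL check, the analytics endpoint (example.edu) and the data it returns are all invented for illustration.

```typescript
// Sketch of a content script that recognises a Moodle course page and injects
// some locally useful analytics into it. Everything specific here (the URL
// pattern, the endpoint, the summary format) is hypothetical.

function isMoodleCoursePage(): boolean {
  // Moodle course pages typically look like .../course/view.php?id=1234
  return /\/course\/view\.php\?id=\d+/.test(
    window.location.pathname + window.location.search
  );
}

async function addActivitySummary(): Promise<void> {
  if (!isMoodleCoursePage()) return;

  const courseId = new URLSearchParams(window.location.search).get("id");

  // Hypothetical service the academic (or a friendly developer) hosts outside
  // the LMS; the point of the workaround is that no change to Moodle is needed.
  const response = await fetch(`https://example.edu/analytics/summary?course=${courseId}`);
  const summary: { activeStudents: number; totalStudents: number } = await response.json();

  // Add a simple banner to the top of the course page.
  const banner = document.createElement("div");
  banner.textContent =
    `${summary.activeStudents} of ${summary.totalStudents} students active in the last week`;
  banner.style.padding = "0.5em";
  banner.style.background = "#ffffcc";
  document.body.prepend(banner);
}

addActivitySummary().catch(console.error);
```

The attraction is that the teacher gets something useful inside the LMS page without waiting for the institution to change the LMS itself; the obvious cost is that only staff who install the plugin see it.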

Susan Tull presented on the University of Canterbury’s LearnTrak system (more detail in this EDUCAUSE Review article). LearnTrak is a customised version of GISMO, a “Graphical Interactive Student Monitoring Tool for Moodle”. Susan’s presentation was before mine at #FlareAus. I liked the idea because they were working with their academics to provide a system that worked for them, one that responded to local needs. At least that was the impression.

GISMO takes the Moodle plugin approach but appears to break away from Moodle’s interface fairly quickly in order to present a fairly detailed collection of reports, mostly charts.

Both these approaches have their limitations. But I am now wondering if there is a vein (rich or otherwise) of research opportunities in developing better and different approaches to breaking the inflexibility of the product and the process of university e-learning. This might become a theme.

Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

The following is an excerpt from an unsuccessful 2012 second round OLT grant. We’re currently pondering what the next step is with the idea.

A recent presentation at the SoLAR Southern Flare conference places the following idea in a broader context of learning analytics and how universities are implementing it.

Project Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4)

..the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

One example of learning analytics is the Social Networks Adapting Pedagogical Practice (SNAPP) tool developed by an ALTC project (Dawson, Bakharia, Lockyer, & Heathcote, 2011) to visualise interaction patterns in course discussion forums to support a focus on learner isolation, creativity and community formation. While SNAPP’s network diagrams were found to be effective in promoting reflection on teaching activities, teachers had difficulty understanding the relationship between their pedagogical approaches and the insights revealed by SNAPP (Dawson et al., 2011, p. 4). Being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able “to readily and accurately interpret the data and translate such findings into practice” (Dawson & McWilliam, 2008, p. 12). This project aims to address this challenge.
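As a rough illustration of the kind of analysis SNAPP automates (this is a sketch, not SNAPP’s code, and the forum data is invented), the snippet below turns forum “reply to” records into a directed interaction network and flags students who are connected to no-one – the kind of pattern a teacher then has to interpret against their pedagogical intent.

```typescript
// Build a directed "who replied to whom" network from forum posts and list
// students with no connections. Data and names are made up for illustration.

interface ForumPost {
  id: number;
  author: string;
  replyTo?: number; // id of the post being replied to, if any
}

function buildReplyNetwork(posts: ForumPost[]): Map<string, Set<string>> {
  const authorOf = new Map<number, string>(posts.map(p => [p.id, p.author] as [number, string]));
  const edges = new Map<string, Set<string>>();

  for (const post of posts) {
    if (post.replyTo === undefined) continue;
    const target = authorOf.get(post.replyTo);
    if (!target || target === post.author) continue; // ignore self-replies
    if (!edges.has(post.author)) edges.set(post.author, new Set());
    edges.get(post.author)!.add(target); // edge: replier -> original poster
  }
  return edges;
}

// Students who neither reply nor receive replies are potentially isolated.
function isolatedStudents(students: string[], edges: Map<string, Set<string>>): string[] {
  const connected = new Set<string>();
  for (const [from, targets] of edges) {
    connected.add(from);
    targets.forEach(t => connected.add(t));
  }
  return students.filter(s => !connected.has(s));
}

const posts: ForumPost[] = [
  { id: 1, author: "alice" },
  { id: 2, author: "bob", replyTo: 1 },
  { id: 3, author: "alice", replyTo: 2 },
  { id: 4, author: "carol" }, // posts, but nobody replies and she replies to nobody
];

const network = buildReplyNetwork(posts);
console.log(isolatedStudents(["alice", "bob", "carol", "dan"], network)); // ["carol", "dan"]
```

SNAPP draws this network as a diagram; the interpretive work – why is carol isolated, and does it matter for this course design? – is what remains with the teacher, which is exactly the difficulty reported above.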

Ferguson (2012) identifies learning analytics as one of the fastest-growing areas of research within technology-enhanced learning with interest being driven by a combination of technological, pedagogical, political and economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within the higher education sector in the next 2-3 years. The recent Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics – for the first time anywhere in the world – into the “one year or less” time-frame for adoption. This finding suggests that with the imminent and widespread adoption of learning analytics, Australian universities may be particularly interested in “finding new ways to measure student performance, ideally in real-time” (Johnson et al., 2012, p. 1).

An interest perhaps driven in part by the broadening participation agenda of the Commonwealth government and its targets set in response to the Bradley review (Bradley, Noonan, Nugent, & Scales, 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) bachelor degrees, we will need to enrol and graduate more students. Many of these students will be from non-traditional backgrounds, and would have been considered ‘high-risk’ students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions to support the success of this larger and more diverse body of students?

To date much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012). The dominant use of learning analytics within higher education has largely been by University administrators (Dawson et al., 2011) or support staff. The larger promise of learning analytics is when it is used “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted, this practice has implications not only for student performance, but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). There is, however, “a dearth of studies that have investigated the relationship between learning analytics and data requirements that would better assist teachers in the design and evaluation of learning and teaching practice” (Dawson et al., 2011, p. 4). Consequently, it remains uncertain exactly how to best enable teachers to apply learning analytics to inform their individual pedagogical practices, and, if successful, what impacts such practices would have on the perceptions of learning, teaching and assessment held by those teachers, and ultimately on student performance. This project seeks to provide answers to these “How?” and “With what impacts?” questions.

Project outcomes

This project’s six outcomes are summarised below and described in more detail in the sections that follow.

(A course (aka unit or subject) is the smallest stand-alone offering; OKB refers to the project’s Online Knowledge Base.)

Course modifications
Format: Modified courses; case studies of the changes disseminated via presentations, the web (OKB) and publications.
Audience: Students and staff in the courses; other teaching staff and institutional policy makers; researchers.
Contribution: Improvements to student learning and changes to teacher perceptions; data about the impacts of using learning analytics to inform teaching; concrete examples of using learning analytics that can be duplicated and modified in other contexts.

Modifications to Moodle tools
Format: Changes made available to the broader Moodle community by contribution back to the Moodle code base.
Audience: The 17 Australian universities using Moodle and the broader Moodle community.
Contribution: Improved tools will lead to greater use of learning analytics by Moodle-using teaching staff; concrete examples that improve understanding of the tool design guidelines.

Tool design guidelines
Format: Design guidelines and theory disseminated via presentations, the web (OKB) and publications.
Audience: Tool developers; researchers.
Contribution: Better tools in other LMS; a foundation for further research.

Harnessing analytics models
Format: Models for how both teaching staff and institutions can enable the use of learning analytics to inform pedagogical practice, disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; policy makers and institutional leaders; teaching support staff; researchers.
Contribution: Aid teaching staff and institutions in harnessing learning analytics, contributing to more widespread, effective use; a foundation for further research.

Refinements to learning analytics patterns
Format: New or modified patterns disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; learning analytics researchers.
Contribution: Improved understanding of what is happening and suggestions for possible interventions; a foundation for further research.

Online knowledge base (OKB)
Format: A website providing a variety of learning paths through the project’s outcomes, resources, discussions, and processes.
Audience: Teaching staff; policy makers and institutional leaders; researchers.
Contribution: Widespread and varied dissemination of project outcomes.

Course modifications

At the centre of this project are two cycles of Participatory Action Research (PAR) with teaching academics at the University of Southern Queensland (USQ) and CQUniversity (CQUni). The aim of these cycles will be to work with the academics to explore how learning analytics can be used to inform their pedagogical practice – the methods they use for learning and teaching. Learning analytics tools and methods will be used to examine prior offerings of courses taught by these academics, share and explore perceptions of learning and teaching, identify potential course modifications, and examine the outcomes of those modifications. The modifications will be supported and evaluated by a range of additional means. The modifications made, the reasons for them, and their impact will inform other project outcomes and will be available as case studies disseminated in various ways.

Modifications to Moodle tools

USQ and CQUni are two of the 17 Australian Universities using Moodle as their institutional Learning Management System (LMS). Moodle provides an array of current and newly developed learning analytics tools of varying capabilities and qualities. Informed by prior research, the project’s theoretical framework, and the insights gained during the project’s two PAR cycles, a range of modifications will be made to these tools with the intent of better enabling the use of learning analytics to inform and share pedagogical practice. NetSpot – an e-learning services company that hosts Moodle for 10 Australian Universities – will make the necessary changes to these Moodle tools. All changes will be made available to the broader Moodle community.

Tool design guidelines

The rationale for and changes made to the Moodle-based learning analytics tools will be captured in design guidelines. The design guidelines are intended to make this knowledge available to the developers of other e-learning systems and thereby enable them to make improvements to their systems to better enable the use of learning analytics to inform pedagogical practice. The design guidelines will be made available via the OKB and will also be published in peer-reviewed outlets.

Harnessing analytics models

Changes to pedagogical practice will not arise simply because of the availability of new tools. A significant body of research has found that changes in approaches to teaching are influenced by disciplinary characteristics, conceptions of teaching, situational factors and perceptions of the teaching environment (Richardson, 2005). In addition, enabling increased use of learning analytics is likely to modify these factors and their relationship. Through its PAR cycles the project will explore these changes and work with academics and institutional leaders to identify factors that constrain and enable the on-going use of learning analytics to inform pedagogical practice. The results of this work will be combined with extant literature to develop a range of design models intended to aid institutional policy makers, teaching support staff and teaching staff in deciding how best to enable the use of learning analytics to inform pedagogical practice in their context.

Refinements to learning analytics patterns

Learning analytics often involves transforming large amounts of usage data into useful patterns, indicators or visualisations – for example, the common correlation between increasing levels of LMS activity and increasing grades (Dawson & McWilliam, 2008, p. 2). These patterns are often used to inform decision-making. However, the patterns that are identified are directly influenced by what is being looked for (e.g. the emphasis on student retention in learning analytics work focuses attention on patterns identifying success factors) and the contexts being explored. For example, Beer, Jones & Clark (2009) found that the increasing activity/increasing grades correlation did not exist for certain groups of students and courses with certain characteristics.
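To make that point concrete, here is a toy example (invented numbers, not the Indicators project’s data or code) of how a strong pooled correlation between LMS clicks and grades can appear even though there is almost no relationship within either of two sub-groups of students; the between-group difference does all the work.

```typescript
// Toy demonstration: the pooled clicks/grades correlation is strong, but within
// each enrolment mode it is close to zero. All figures are fabricated.

function pearson(x: number[], y: number[]): number {
  const mean = (v: number[]) => v.reduce((a, b) => a + b, 0) / v.length;
  const mx = mean(x), my = mean(y);
  let num = 0, dx = 0, dy = 0;
  for (let i = 0; i < x.length; i++) {
    num += (x[i] - mx) * (y[i] - my);
    dx += (x[i] - mx) ** 2;
    dy += (y[i] - my) ** 2;
  }
  return num / Math.sqrt(dx * dy);
}

interface StudentRecord { clicks: number; grade: number; mode: "on-campus" | "distance"; }

const records: StudentRecord[] = [
  // Distance students: the LMS is their main point of contact, so many clicks.
  { clicks: 300, grade: 84, mode: "distance" },
  { clicks: 350, grade: 86, mode: "distance" },
  { clicks: 400, grade: 86, mode: "distance" },
  { clicks: 450, grade: 84, mode: "distance" },
  // On-campus students: face-to-face contact, so far fewer clicks.
  { clicks: 50, grade: 64, mode: "on-campus" },
  { clicks: 80, grade: 66, mode: "on-campus" },
  { clicks: 100, grade: 66, mode: "on-campus" },
  { clicks: 120, grade: 64, mode: "on-campus" },
];

for (const mode of ["distance", "on-campus"] as const) {
  const group = records.filter(r => r.mode === mode);
  const r = pearson(group.map(g => g.clicks), group.map(g => g.grade));
  console.log(`${mode}: r = ${r.toFixed(2)}`); // roughly 0.00 and 0.10
}
const overall = pearson(records.map(r => r.clicks), records.map(r => r.grade));
console.log(`overall: r = ${overall.toFixed(2)}`); // roughly 0.95
```

Which of those numbers a teacher should act on depends entirely on knowing which group of students they are looking at – precisely the sort of contextual knowledge the project’s PAR cycles are intended to surface.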

During its PAR cycles, this project will be working directly with a diverse collection of teaching academics to achieve their purposes with learning analytics. This set of different perspectives will lead to the identification of new and the refinement of existing learning analytics patterns. These patterns will be distributed through the OKB, traditional peer-reviewed publications, and, where appropriate, incorporation into the learning analytics tools.

Online knowledge base

The Online Knowledge Base (OKB) fulfils two functions. First, it will be the primary means of communication and collaboration between the project team, reference group and project participants both within this and continuing work. Second, the OKB will serve as one of the major components of the project’s dissemination strategy. Both these functions will rely heavily on social media and related software (e.g. Mendeley, Diigo, blogs, Twitter etc.). The use of these tools will in turn be aggregated and curated to form an online knowledge base. The OKB will provide access to all project outcomes as both formal reports and as a variety of learning paths designed explicitly for different stakeholders.

Value and need for the project

The need for this project is well established in the learning analytics literature, including the findings and recommendations from two prior learning analytics related ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008). Samples from that literature showing the need for this project include:

  • Learning analytics is still at the early stages of implementation and experimentation (Siemens & Long, 2011).
  • The analytical tools provided by most LMS are poorly utilised (Dawson & McWilliam, 2008) in large part because the “tools and presentation format are complex and far removed from the specific learning context” (Dawson & McWilliam, 2008, p. 8).
  • Australian academics have limited understanding of what is available from these tools and how that data is related to pedagogical practice (Dawson & McWilliam, 2008).
  • There is a dearth of studies examining the use and impact of learning analytics to inform the design, delivery and future evaluations of individual teaching practices (Dawson et al., 2011; Dawson, Heathcote, & Poole, 2010).

This project builds directly upon prior work including prior ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008), the development of Moodle learning analytics tools, and broader learning analytics research. The project is, through the project team and the members of the reference group, directly connected to the broader learning analytics communities. By working with a diverse range of teaching staff to explore the use of learning analytics to inform pedagogical practice the project will fill an identified need and produce outcomes of immediate use to a significant portion of the Australian higher education sector. Beyond immediate use the project’s outcomes create a platform for on-going work.

Approach

The project will take place over 24 months and includes four main stages, summarised below.

Stage 1: Formation and initial design (Jan 2013 – Jul 2013)
Outcomes: Project established; ethics approval granted; recruitment of research assistant. Initial version of OKB. Initial tool design guidelines and enhancements to LMS tools. Project evaluation plan.
Evaluation & dissemination: Two Reference group meetings. Appointment of and initial meetings with project evaluator. Presentations at CQU and USQ. Identification of PAR cycle #1 participants and institutional influencers. Promotion of OKB and broaden connections with learning analytics community.

Stage 2: PAR cycle #1 (Aug 2013 – Feb 2014)
Outcomes: Modifications to 4 courses. Changes to OKB. Enhancements to tool design guidelines and LMS tools. Initial draft of harnessing analytics guidelines. Refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members. Invited presentations integrated into OKB. Reference group meeting. Work with project evaluator. Contribution of tool modifications to Moodle community.

Stage 3: PAR cycle #2 (Jan 2014 – Sep 2014)
Outcomes: Modification to a further 8 courses. Changes to OKB. Further enhancements to tool design guidelines and LMS tools. Enhancements of harnessing analytics guidelines. Refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members. Invited presentations integrated into OKB. Initial publications. Reference group meeting. Work with project evaluator. Contribution of tool modifications to Moodle community.

Stage 4: Project finalisation (Oct 2014 – Dec 2014)
Outcomes: Final enhancements to all project outcomes. Final project report. Publications and presentations.
Evaluation & dissemination: Summative evaluation by project evaluator. Promotion of OKB with final project outcomes. Planning for future work.

Methodology and framework

The project will be using a combination of Participatory Action Research (PAR) and design-based research (DBR). This combination is used in part because of the close connection between the two methods – e.g. Wang and Hannafin (2005, p. 6) suggest that DBR is “akin” but slightly different to PAR – but also because of two slightly different project tasks. Firstly, the project aims, through the lens of situated cognition, to engage fully in the specifics of two institutional contexts for the purposes of helping individual academics address their needs. PAR is the best fit for this purpose and is seen as being better known across disciplines than DBR. To support the use of PAR, the project will adopt the lessons learned by Fraser & Harvey (2008) in a previous ALTC-funded project. These include: supported reflection sessions, the provision of theoretical sparks, and the pairing of academic participants with institutional influencers. The project also has a second task in that it must formulate artefacts (e.g. the knowledge base and enhanced Moodle tools) and design theory (e.g. the harnessing analytics model and the tool design guidelines) useful to the broader community. Design-based research both draws on and is conducted in order to generate design theory (Wang & Hannafin, 2005). Design-based research also involves the use of “multiple research methods in real-world learning environments” (Wang & Hannafin, 2005, p. 20).

In terms of theoretical frameworks, the project’s design draws upon situated cognition, distributed cognition and the role of conceptions and reflection in changing teaching and learning. A brief explanation of each of these and its application to the project design follows.

Seely Brown and Duguid (1989) argue that the tendency for education, training and technology design to focus on abstract representations that are detached from practice actually distorts the intricacies of practice. This distortion hinders how well practice can be understood, engendered, or enhanced. It hinders learning. The design and development of many e-learning systems tend to suffer from this limited understanding of the intricacies of practice involved in modern e-learning. Dawson et al.’s (2011) observation that university administrators have been the dominant users of learning analytics in higher education indicates a similar problem with learning analytics. The use of Participatory Action Research will provide the “opportunity for codeveloping processes with people rather than for people” (McIntyre, 2008, p. xii). By situating the project within a shared, collective purpose, PAR will improve the understanding and learning gained about how learning analytics can be used to inform pedagogical practice.

The majority of e-learning systems provide direct support for the implementation of technical tasks such as posting a message to a discussion forum. The difficult cognitive task of combining these technical tasks to create an effective and appropriate pedagogical design is left almost entirely to the teacher. Dawson & McWilliam (2008) identify this problem with most LMS learning analytics tools which have presentation formats that are too complex and far removed from the specific learning context.

Hollan, Hutchins and Kirsh (2000) describe how distributed cognition expands what is considered cognitive beyond an individual to encompass interactions between people, their environment and the tools therein. Boland, Ramkrishnan and Te’eni (1994, p. 459) define a distributed cognition system as one that “supports interpretation and dialogue among a set of inquirers by providing richer forms of self-reflection and communication”. This project will make enhancements to learning analytics tools that make such tools an effective part of a distributed cognition system. A particular focus of the enhancements will be on reducing the difficulty of the task and offering greater support to teacher self-reflection and collaboration.

There is a significant body of literature that has established a link between the conceptions of learning and teaching held by academics and the quality of student learning outcomes (cf. Richardson, 2005). It has also been found that environmental, institutional, or other issues may impel academics to teach in a way that is against their preferred approach (Richardson, 2005). There is a similarly widely acknowledged view that reflection on teaching contributes to changes in conceptions of teaching that lead to enhanced teaching practice and possibly improved student learning (Kreber & Castleden, 2009). Through its use of situated cognition and participatory action research this project aims to develop significant insight into the factors that constrain and enable adoption of learning analytics. By working within a PAR process with teaching academics and their accompanying institutional influencers the project aims to respond to these factors. Lastly, the combination of PAR with distributed cognition will encourage teaching staff to engage in a range of reflective processes that could lead to changes in their conceptions of learning and teaching and subsequently the quality of student learning outcomes.

References

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/beer.pdf

Boland, R., Ramkrishnan, V., & Te’eni, D. (1994). Designing information technology to support distributed cognition. Organization Science, 5(3), 456–475.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from http://moourl.com/iax5c

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: visualising and evaluating student learning networks. Final Report 2011. Canberra. Retrieved from http://moourl.com/zwkpf

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116–128. doi:10.1108/09513541011020936

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council. Retrieved from http://moourl.com/hpds8

Ferguson, R. (2012). The State of Learning Analytics in 2012: A Review and Future Challenges. Milton Keynes, UK. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

Fraser, S., & Harvey, M. (2008). Leadership and assessment: Strengthening the nexus. Final report. Strawberry Hills, NSW: Australian Learning and Teaching Council. Retrieved from http://moourl.com/02pk6

Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7(2), 174–196.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas. Retrieved from http://www.nmc.org/publications/2012-technology-outlook-au

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition (p. 42). Austin, Texas.

Kreber, C., & Castleden, H. (2009). Reflection on teaching and epistemological structure: reflective and critically reflective processes in “pure/soft” and “pure/hard” fields. Higher Education, 57(4), 509–531.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673–680.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: an integrated & modularized platform. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5). Retrieved from http://moourl.com/j6a5d

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5–23.

Moving beyond a fashion: likely paths and pitfalls for learning analytics

The following resources are for a presentation given at the SoLAR Southern Flare Conference on the 30th November, 2012.

The premise of the talk is that learning analytics shows all the hallmarks of a management fashion, fad, or bandwagon and to avoid this we need to talk more realistically about its implementation. The talk identifies three paths that Universities might use to implement learning analytics and identifies pitfalls for each of these paths. The argument is that there are one or two dominant paths and a forgotten path. It’s the forgotten path that my co-author and I are most interested in. It’s the path which we think will allow learning analytics to have the most impact upon learning and teaching.

There is an abstract on the conference site and an extended abstract.

This presentation evolved from an unsuccessful OLT grant application that attempted to engage with the forgotten path.

Slides

References

Abrahamson, E., & Fairchild, G. (1999). Management fashion: Lifecycles, triggers and collective learning processes. Administrative Science Quarterly, 44(4), 708–740.

Arnott, D., & Pervan, G. (2005). A critical analysis of decision support systems research. Journal of Information Technology, 20(2), 67–87.

Beer, C., Clark, K., & Jones, D. (2010). Indicators of engagement. Curriculum, technology and transformation for an unknown future. Proceedings of ASCILITE Sydney 2010 (pp. 75–86). Sydney.

Beer, C., Jones, D., & Clark, D. (2012). Analytics and complexity: Learning and leading for the future. The hidden complexity behind simple patterns. In M. Brown (Ed.), Future Changes, Sustainable Futures. Proceedings of ascilite 2012. Wellington, NZ.

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning, adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From, What They Do, Why They Fail. San Francisco: Jossey-Bass.

Campbell, G. (2012). Here I Stand. Retrieved April 2, 2012, from https://sas.elluminate.com/site/external/jwsdetect/playback.jnlp?psid=2012-03-01.1231.M.0728C08DFE8BF0EB7323E19A1BC114.vcr&sid=2008104

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Clark, K., Beer, C., & Jones, D. (2010). Academic involvement with the LMS : An exploratory study. In C. Steel, M. Keppell, P. Gerbic, & S. Housego (Eds.), Curriculum, technology & transformation for an unknown future. Proceedings ascilite Sydney 2010 (pp. 487–496).

Convery, A. (2009). The pedagogy of the impressed: how teachers become victims of technological vision. Teachers and Teaching, 15(1), 25–41.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: visualising and evaluating student learning networks. Final Report 2011. Canberra.

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116–128.

Findlow, S. (2008). Accountability and innovation in higher education: a disabling tension? Studies in Higher Education, 33(3), 313–329.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.

Hirschheim, R., Murungi, D. M., & Peña, S. (2012). Witty invention or dubious fad? Using argument mapping to examine the contours of management fashion. Information and Organization, 22(1), 60–84.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. New Media Consortium. Austin, Texas.

Johnson, L., Smith, R., Levine, A., & Haywood, K. (2010). The horizon report: 2010 Australia-New Zealand Edition. Austin, Texas.

Jones, D. (2012). The life and death of Webfuse: Principles for learning and leading into the future. ASCILITE’2012.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Lattuca, L., & Stark, J. (2009). Shaping the college curriculum: Academic plans in context. San Francisco: John Wiley & Sons.

Macfadyen, L., & Dawson, S. (2012). Numbers Are Not Enough. Why e-Learning Analytics Failed to Inform an Institutional Strategic Plan. Educational Technology & Society, 15(3), 149–163.

Marsh, J., Pane, J., & Hamilton, L. (2006). Making sense of data-driven decision making in education: Evidence from recent RAND research. Santa Monica, CA.

Pollock, N. (2005). When is a work-around? Conflict and negotiation in computer systems development. Science Technology Human Values, 30(4), 496–514.

Ramamurthy, K., Sen, A., & Sinha, A. P. (2008). Data warehousing infusion and organizational effectiveness. IEEE Transactions on Systems, Man, and Cybernetics – Part A: Systems and Humans, 38(4), 976–994.

Rogers, E. (1995). Diffusion of Innovations (4th ed.). New York: The Free Press.

Schiller, M. J. (2012). Big Data Fail: Five Principles to Save Your BI. CIO Insight. Retrieved from http://www.cioinsight.com/c/a/Expert-Voices/Big-Data-Fail-Five-Principles-to-Save-Your-BI-Butt-759074/

Shaw, P. (1997). Intervening in the shadow systems of organizations: Consulting from a complexity perspective. Journal of Organizational Change Management, 10(3), 235–250.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Snowden, D. (2002). Complex Acts of Knowing. Journal of Knowledge Management, 6(2), 100–111.

Stark, J. (2000). Planning introductory college courses: Content, context and form. Instructional Science, 28(5), 413–438.

Swanson, E. B., & Ramiller, N. C. (2004). Innovating mindfully with information technology. MIS Quarterly, 28(4), 553–583.

Trigwell, K. (2001). Judging university teaching. The International Journal for Academic Development, 6(1), 65–73.

Tutty, J., Sheard, J., & Avram, C. (2008). Teaching in the current higher education environment: perceptions of IT academics. Computer Science Education, 18(3), 171–185.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

White, N. (2006). Tertiary education in the Noughties: the student perspective. Higher Education Research & Development, 25(3), 231–246.

The life and death of Webfuse: What’s wrong with industrial e-learning and how to fix it

The following is a collection of presentation resources (i.e. the slides) for an ASCILITE’2012 presentation of this paper. The paper and presentation are a summary of the outcomes of my PhD work. The thesis goes into much more detail.

Abstract

Drawing on the 14-year life and death of an integrated online learning environment used by tens of thousands of people, this paper argues that many of the principles and practices underpinning industrial e-learning – the current dominant institutional model – are inappropriate. The paper illustrates how industrial e-learning can limit outcomes of tertiary e-learning and limits the abilities of universities to respond to uncertainty and effectively explore the future of learning. It limits their ability to learn. The paper proposes one alternate set of successfully implemented principles and practices as being more appropriate for institutions seeking to learn for the future and lead in a climate of change.

Slides

The slides are available on Slideshare and should show up below. These slides are the extended version, prior to the cutting required to fit within the 20 minute time limit.

References

Arnott, D. (2006). Cognitive biases and decision support systems development: a design science approach. Information Systems Journal, 16, 55–78.

Behrens, S., Jamieson, K., Jones, D., & Cranston, M. (2005). Predicting system success using the Technology Acceptance Model: A case study. 16th Australasian Conference on Information Systems. Sydney.

Brews, P., & Hunt, M. (1999). Learning to plan and planning to learn: Resolving the planning school/learning school debate. Strategic Management, 20(10), 889–913.

Cecez-Kecmanovic, D., Janson, M., & Brown, A. (2002). The rationality framework for a critical study of information systems. Journal of Information Technology, 17, 215–227.

Central Queensland University. (2004). Faculty teaching and learning report. Rockhampton, Australia.

Davenport, T. (1998). Putting the Enterprise into the Enterprise System. Harvard Business Review, 76(4), 121–131.

Dede, C. (2008). Theoretical perspectives influencing the use of information technology in teaching and learning. In J. Voogt & G. Knezek (Eds.), (pp. 43–59). New York: Springer.

Dillard, J., & Yuthas, K. (2006). Enterprise resource planning systems and communicative action. Critical Perspectives on Accounting, 17(2-3), 202–223.

Fleming, P., & Spicer, A. (2003). Working at a cynical distance: Implications for power, subjectivity and resistance. Organization, 10(1), 157–179.

Haywood, T. (2002). Defining moments: Tension between richness and reach. In W. Dutton & B. Loader (Eds.), (pp. 39–49). London: Routledge.

Hutchins, E. (1991). Organizing work by adaptation. Organization Science, 2(1), 14–39.

Introna, L. (1996). Notes on ateleological information systems development. Information Technology & People, 9(4), 20–39.

Jamieson, K., & Hyland, P. (2006). Factors that influence Information Systems decisions and outcomes: A summary of key themes from four case studies. Adelaide, Australia.

Jones, D. (1996). Solving Some Problems of University Education: A Case Study. In R. Debreceny & A. Ellis (Eds.), Proceedings of AusWeb’96 (pp. 243–252). Gold Coast, QLD: Southern Cross University Press.

Jones, D. (2002). Student Feedback, Anonymity, Observable Change and Course Barometers. In P. Barker & S. Rebelsky (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2002 (pp. 884–889). Denver, Colorado: AACE.

Jones, D. (2003). Course Barometers: Lessons gained from the widespread use of anonymous online formative evaluation. QUT, Brisbane.

Jones, D., & Buchanan, R. (1996). The design of an integrated online learning environment. In A. Christie, B. Vaughan, & P. James (Eds.), Making New Connections, ascilite’1996 (pp. 331–345). Adelaide.

Jones, D., & Luck, J. (2009). Blog Aggregation Management: Reducing the Aggravation of Managing Student Blogging. In G. Siemns & C. Fulford (Eds.), World Conference on Educational Multimedia, Hypermedia and Telecommunications 2009 (pp. 398–406). Chesapeake, VA: AACE.

Jones, N., & O’Shea, J. (2004). Challenging hierarchies: The impact of e-learning. Higher Education, 48, 379–395.

Katz, R. (2003). Balancing Technology and Tradition: The Example of Course Management Systems. EDUCAUSE Review, 38(4), 48–59.

Kurtz, C., & Snowden, D. (2007). Bramble Bushes in a Thicket: Narrative and the intangibles of learning networks. In M. Gibbert & T. Durand (Eds.). Blackwell.

Laurillard, D. (2002). Rethinking University Teaching: A Conversational Framework for the Effective Use of Learning Technologies. London: Routledge.

Light, B., Holland, C. P., & Wills, K. (2001). ERP and best of breed: a comparative analysis. Business Process Management Journal, 7(3), 216–224.

March, J. (1991). Exploration and exploitation in organizational learning. Organization Science, 2(1), 71–87.

Mintzberg, H. (1989). Mintzberg on Management, Inside our Strange World of Organisations. New York: Free Press.

Morgan, Glenda. (2003). Faculty use of course management systems. Educause Centre for Applied Research.

Morgan, Glenn. (1992). Marketing discourse and practice: Towards a critical analysis. In M. Alvesson & H. Willmott (Eds.), (pp. 136–158). London: SAGE.

Pozzebon, M., Titah, R., & Pinsonneault, A. (2006). Combining social shaping of technology and communicative action theory for understanding rhetorical closure in IT. Information Technology & People, 19(3), 244–271.

Robey, D., Ross, W., & Boudreau, M.-C. (2002). Learning to implement enterprise systems: An exploratory study of the dialectics of change. Journal of Management Information Systems, 19(1), 17–46.

Rossi, D., & Luck, J. (2011). Wrestling, wrangling and reaping: An exploration of educational practice and the transference of academic knowledge and skill in online learning contexts. Studies in Learning, Evaluation, Innovation and Development, 8(1), 60–75.

Seely Brown, J., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Seely-Brown, J., & Hagel, J. (2005). From push to pull: The next frontier of innovation. The McKinsey Quarterly. McKinsey & Company.

Simon, H. (1991). Bounded rationality and organizational learning. Organization Science, 2(1), 125–134.

Sturgess, P., & Nouwens, F. (2004). Evaluation of online learning management systems. Turkish Online Journal of Distance Education, 5(3).

Thomas, J. (2012). Universities can’t all be the same – it’s time we embraced diversity. The Conversation. Retrieved June 28, 2012, from http://theconversation.edu.au/universities-cant-all-be-the-same-its-time-we-embraced-diversity-7379

Truex, Duane, Baskerville, R., & Travis, J. (2000). Amethodical systems development: the deferred meaning of systems development methods. Accounting Management and Information Technologies, 10, 53–79.

Truex, Duane, Baskerville, R., & Klein, H. (1999). Growing systems in emergent organizations. Communications of the ACM, 42(8), 117–123.

Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.

Underwood, J., & Dillon, G. (2011). Chasing dreams and recognising realities: teachers’ responses to ICT. Technology, Pedagogy and Education, 20(3), 317–330. doi:10.1080/1475939X.2011.610932

Wagner, E., Scott, S., & Galliers, R. (2006). The creation of “best practice” software: Myth, reality and ethics. Information and Organization, 16(3), 251–275.

Weick, K., & Quinn, R. (1999). Organizational change and development. Annual Review of Psychology, 50, 361–386.

The Texas sharpshooter fallacy and other issues for learning analytics

Becoming somewhat cynical about the headlong rush toward learning analytics, I’m commencing an exploration of the problems associated with big data, data science and some of the other areas which form the foundation for learning analytics. The following is an ad hoc collection of some initial resources I’ve found and need to engage with.

Feel free to suggest some more.

The Texas sharpshooter fallacy

This particular fallacy gets a guernsey mainly because of the impact of its metaphoric title. From the Wikipedia page

The Texas sharpshooter fallacy often arises when a person has a large amount of data at their disposal, but only focuses on a small subset of that data. Random chance may give all the elements in that subset some kind of common property (or pair of common properties, when arguing for correlation). If the person fails to account for the likelihood of finding some subset in the large data with some common property strictly by chance alone, that person is likely committing a Texas Sharpshooter fallacy.
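A quick way to see why this matters for learning analytics is to simulate it. The sketch below (TypeScript, entirely synthetic data) scans a pile of deliberately meaningless indicators for the one that best “predicts” passing a course; with enough indicators, something always looks predictive.

```typescript
// Texas sharpshooter in miniature: generate 50 indicators of pure noise and
// report the one with the biggest apparent effect on pass rates.

const students = 200;
const indicators = 50; // e.g. 50 arbitrary ways of slicing LMS click data

// Random outcome, independent of everything below: did the student pass?
const passed: boolean[] = Array.from({ length: students }, () => Math.random() < 0.7);

let bestGap = 0;
let bestIndicator = -1;

for (let i = 0; i < indicators; i++) {
  // Each indicator is just a coin flip per student - pure noise.
  const flagged: boolean[] = Array.from({ length: students }, () => Math.random() < 0.5);

  const passRate = (group: boolean[]): number => {
    const members = group.map((inGroup, j) => (inGroup ? j : -1)).filter(j => j >= 0);
    return members.filter(j => passed[j]).length / Math.max(members.length, 1);
  };

  const gap = Math.abs(passRate(flagged) - passRate(flagged.map(f => !f)));
  if (gap > bestGap) { bestGap = gap; bestIndicator = i; }
}

// With 50 noise indicators the "best" one typically shows a pass-rate gap of
// ten or more percentage points - a pattern that looks meaningful but is not.
console.log(`Indicator ${bestIndicator}: pass-rate gap of ${(bestGap * 100).toFixed(1)} points`);
```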

Critical questions for big data

Boyd, D., & Crawford, K. (2012). Critical questions for big data. Information, Communication & Society, 15(5), 662–679.

Abstract

The era of Big Data has begun. Computer scientists, physicists, economists, mathematicians, political scientists, bio-informaticists, sociologists, and other scholars are clamoring for access to the massive quantities of information produced by and about people, things, and their interactions. Diverse groups argue about the potential benefits and costs of analyzing genetic sequences, social media interactions, health records, phone logs, government records, and other digital traces left by people. Significant questions emerge. Will large-scale search data help us create better tools, services, and public goods? Or will it usher in a new wave of privacy incursions and invasive marketing? Will data analytics help us understand online communities and political movements? Or will it be used to track protesters and suppress speech? Will it transform how we study human communication and culture, or narrow the palette of research options and alter what ‘research’ means? Given the rise of Big Data as a socio-technical phenomenon, we argue that it is necessary to critically interrogate its assumptions and biases. In this article, we offer six provocations to spark conversations about the issues of Big Data: a cultural, technological, and scholarly phenomenon that rests on the interplay of technology, analysis, and mythology that provokes extensive utopian and dystopian rhetoric.

The headings give a good idea of the provocations:

  • Big data changes the definition of knowledge.
  • Claims to objectivity and accuracy are misleading.
  • Bigger data are not always better data.
  • Taken out of context, Big data loses its meaning.
  • Just because it is accessible does not make it ethical.
  • Limited access to big data creates new digital divides.

Effects of big data analytics on organisations’ value creation

Mouthaan, N. (2012). Effects of big data analytics on organizations’ value creation. University of Amsterdam.

A Master’s thesis that, amongst other things, is

arguing that big data analytics might create value in two ways: by improving transaction efficiency and by supporting innovation, leading to new or improved products and services

and

this study also shows that big data analytics is indeed a hype created by both potential users and suppliers and that many organizations are still experimenting with its implications as it is a new and relatively unexplored topic, both in scientific and organizational fields.

The promise and peril of big data

Bollier, D., & Firestone, C. (2010). The promise and peril of big data. Washington DC: The Aspen Institute.

Some good discussion of issues reported by a rapporteur. Issues included:

  • How to make sense of big data?
    • Data correlation or scientific methods – Chris Anderson’s “Data deluge makes the scientific method obsolete” and responses, e.g. “My TiVo thinks I’m gay”, gaming, the advantage of theory/deduction etc.
    • How should theories be crafted in an age of big data?
    • Visualisation as a sense-making tool.
    • Bias-free interpretation of big data. Cleaning data requires decisions about what to ignore, a problem that increases when data comes from different sources. Quote: “One man’s noise is another man’s data”.
    • Is more actually less? Does it yield new insights or create confusion and false confidence? “Big data is driven more by storage capabilities than by superior ways to ascertain useful knowledge”.
    • Correlations, causality and strategic decision making.
  • Business and social implications of big data
    • Social perils posed by big data
  • How should big data abuses be addressed?

Research ethics in emerging forms of online learning

Esposito, A. (2012). Research ethics in emerging forms of online learning: issues arising from a hypothetical study on a MOOC. Electronic Journal of e-Learning, 10(3), 315–325.

Will hopefully give some initial insights into the thorny issue of ethics.

Data science and prediction

Dhar, V. (2012). Data Science and Prediction. Available at SSRN. New York City.

Appears to be slightly more “boosterish” than some of the other papers.

Abstract

The world’s data is growing more than 40% annually. Coupled with exponentially growing computing horsepower, this provides us with unprecedented basis for ‘learning’ useful things from the data through statistical induction without material human intervention and acting on them. Philosophers have long debated the merits and demerits of induction as a scientific method, the latter being that conclusions are not guaranteed to be certain and that multiple and numerous models can be conjured to explain the observed data. I propose that ‘big data’ brings a new and important perspective to these problems in that it greatly ameliorates historical concerns about induction, especially if our primary objective is prediction as opposed to causal model identification. Equally significantly, it propels us into an era of automated decision making, where computers will make the bulk of decisions because it is infeasible or more costly for humans to do so. In this paper, I describe how scale, integration and most importantly, prediction will be distinguishing hallmarks in this coming era of Data Science. In this brief monograph, I define this newly emerging field from business and research perspectives.

Codes and codings in crisis: Signification, performativity and excess

Mackenzie, A., & Vurdubakis, T. (2011). Codes and Codings in Crisis: Signification, Performativity and Excess. Theory, Culture & Society, 28(6), 3–23.

Three likely paths for learning analytics and academics

The following is an early attempt to write and share some thoughts on what, why and with what impacts Australian universities are going to engage with learning analytics over the next couple of years. Currently it’s fairly generic and the same structure could be used with any fad or change process.

You could read the next section, but it’s basically an argument as to why it’s important to consider how learning analytics will impact academics. The three likely paths section describes the paths.

Context and rationale

By all indications learning analytics is one of the next big things in university learning and teaching. Ferguson (2012) identifies learning analytics as one of the fastest-growing areas of research within technology-enhanced learning with interest being driven by a combination of technological, pedagogical, political and economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within the higher education sector in the next 2-3 years. The recent Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics – for the first time anywhere in the world – into the “one year or less” time-frame for adoption. Given what I hear around the traps, it would appear that every single Australian university is doing something (or thinking about it) around learning analytics.

My interest is in how these plans are going to impact upon academics and their pedagogical practice. It’s a fairly narrow view, but an interesting, self-serving and possibly important one. Johnson and Cummins (2012) suggest that the larger promise of learning analytics is when it is used “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (p. 23). I don’t think automated-tutoring information systems are going to be up to that task anytime soon, at least not across a broad cross-section of what is taught at Universities. So academics/teachers/instructors will be involved in some way.

But we don’t know much about this and it appears to be difficult. Dawson et al. (2011) observe “a dearth of studies that have investigated the relationship between learning analytics and data requirements that would better assist teachers in the design and evaluation of learning and teaching practice” (p. 4). Not only that, it has been found that being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming and requires additional support (Dawson et al., 2011; Dawson & McWilliam, 2008). So I wonder how, and with what impacts, the almost inevitable strategies adopted by Australian Universities will help with this.

Not surprisingly, I am not optimistic. So I’m trying to develop some sort of framework to help think about the different likely paths they might adopt, the perspectives which underpin these paths, and what the likely positives, problems and outcomes might be of those paths.

The three likely paths

For the moment, I’ve identified three likely paths which I’ve labelled as

  1. Do it to the academics.
  2. Do it for the academics.
  3. Do it with the academics.

There are probably other paths (e.g. do nothing, ignore the academics) that might be adopted, but I feel these are probably the most likely.

They are listed in the order in which I think they are most likely to happen. There may be examples of path #3 spread throughout institutions, but I fear they will be few and far between.

Currently, it’s my theory that organisations probably need to travel all three paths. The trouble is that the 3rd path will probably be ignored and this will reduce the impact and adoption of learning analytics.

The eventual plan is to compare and contrast these different paths by the different assumptions or perspectives on which they are based. The following gives a quick explanation of each of the paths and an initial start on this analysis.

Those of you who know me can probably see some correspondence between these three paths and the 3 levels of improving teaching. There is a definite preference in the following for the 3rd path, but this is not to suggest that it should (or can) be the only path explored, or that the other paths have no value. All three have their part to play, but I think it would be wrong if the 3rd path was ignored.

Perhaps that’s the point here, to highlight the need for the 3rd path to complement the limitations of the other two. Not to mention helping surface some of the limitations of the other two so they can be appropriately addressed.

Some questions for you

  • Is there any value in this analysis?
  • What perspectives/insights/theories do I need to look at to better inform this?
  • What might be some useful analysis lenses for the three paths?
  • Are there other paths?
  • What am I missing?

Do it to the academics

It seems a fair bit of the interest in learning analytics is being driven by non-teaching folk: student administration and IT folk are amongst the foremost, with senior management in there somewhere as well. Long and Siemens (2011) define this level as academic analytics rather than learning analytics, but I believe it belongs here because of the likelihood that if senior managers use academic analytics to make decisions, some of the decisions they make will have an impact on academics (i.e. do it to them).

I can see this path leading to outcomes like

  • Implementation of a data warehouse, various dashboards and reports.
  • Some of these may be used to make data-driven decisions.
  • The implementation of various strategies such as “at-risk” processes that are done independently of academics (a minimal sketch of such a process follows this list).
  • At its worst, the creation of various policies or processes that require courses to meet certain targets or adopt certain practices (e.g. the worst type of “common course site” policy), i.e. performativity.
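
As an illustration of the sort of centrally-run process this path tends to produce, here is a minimal sketch. Everything in it is an assumption made for the example: the CSV export of LMS activity, its column names and the thresholds are hypothetical, not any particular system’s interface or any institution’s actual process.

```python
# Minimal sketch of a centrally-run "at-risk" process done independently of
# academics. The CSV export, column names and thresholds are hypothetical.
import csv

THRESHOLD_LOGINS = 3   # arbitrary cut-offs chosen far from the course
THRESHOLD_POSTS = 1

def flag_at_risk(activity_csv):
    """Return ids of students whose LMS activity falls below central thresholds."""
    at_risk = []
    with open(activity_csv, newline="") as f:
        for row in csv.DictReader(f):
            logins = int(row["logins_last_fortnight"])
            posts = int(row["forum_posts_last_fortnight"])
            # The academic teaching the course never sees this rule, only its output.
            if logins < THRESHOLD_LOGINS and posts < THRESHOLD_POSTS:
                at_risk.append(row["student_id"])
    return at_risk

if __name__ == "__main__":
    for student in flag_at_risk("lms_activity_export.csv"):
        print("Send the standard 'are you OK?' email to", student)
```

The design problem the sketch is meant to show is that the rule, the data and the intervention are all decided a long way from the course itself.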

In terms of analysing/characterising this type of approach, you might suggest

  • Will tend to actually be academic analytics, rather than learning analytics (as defined by Long and Siemens, 2011) but may get down to learning analytics at the departmental level.
  • It’s based on an “If I tell them to do it, they will…” assumption.
    i.e. what is written in the policy is what the academics will actually do.
  • A tendency to result in task corruption and apparent compliance.
  • It assumes academics will change their teaching practice based on what you told them to do.
  • Is based on the assumptions of teleological processes.
    i.e. the system is stable and predictable, the designers can manipulate the system’s behaviour, and the designers can determine the goals/criteria for success.
  • It is located a long way from the actual context of learning and teaching and assumes that big data sets and data mining algorithms will enable the identification of useful information that can guide decision making.
  • It does not recognise the diversity inherent in teaching/teachers and learning/learners.
    Assumes learning is like sleeping
  • It is based on the assumption of senior managers (or of people in general) as rational decision makers, if only they had the right data.
  • What is actually done will rely heavily on which vendor gets chosen to implement.
  • Will be largely limited to the data that is already in the system(s).

Do it for the academics

There are possibly two sub-paths within this path

  1. The researcher path.
    Interested researchers develop theoretically-informed, research-based approaches to how learning analytics can be used by academics to improve what they do. They are developing methods for the use of academics.
  2. The support division path.
    This is where the Information Technology or Learning and Teaching support division of the university notes the current buzz-word (learning analytics) and implements a tool, some staff development, etc. to enable academics to harness the buzz-word.

In terms of analysing/characterising this approach, I might identify the following (a small sketch of the kind of generic tool this path tends to produce follows the list)

  • It’s based on the “If I build it, they will come” assumption.
  • It assumes you can improve/change teaching by providing new tools and awareness.
  • Which generally hasn’t worked for a range of reasons including perhaps the chasm
    i.e. the small number of early adopter academics engage, the vast majority don’t.
  • It assumes some level of commonality in teaching/teachers and learning/learners.
    At least at some level, as it assumes implementing a particular tool or approach may be applicable across the organisation. Assumes learning is perhaps more like eating?
  • It assumes that the researchers or the support division have sufficient insight to develop something appropriate.
  • It assumes we know enough about learning analytics and helping academics use learning analytics to inform pedagogical practice to enshrine practice around a particular set of tools.
  • Is based on the assumptions of teleological processes.
    i.e. the system is stable and predictable, the designers can manipulate the system’s behaviour, and the designers can determine the goals/criteria for success.
  • It will be constrained by the institution’s existing systems and the support division’s people and their connections.
  • The support division path can be heavily influenced by the perspective of the academic (or others) as a client/customer, which assumes that the client/customer knows what they want. Because they generally don’t, this often sinks to a process of “managing the customer” rather than helping.
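
The following is a minimal sketch of the kind of generic, one-size-fits-all tool this path tends to produce. The clickstream export, its column names and the idea that clicks equal engagement are all inventions for the example, choices made on the academics’ behalf rather than anything drawn from a real institution.

```python
# Minimal sketch of a one-size-fits-all "engagement report" a support division
# might roll out. The export format and the clicks-equal-engagement assumption
# are hypothetical choices made on the academics' behalf.
import csv
from collections import Counter

def engagement_report(clickstream_csv):
    """Rank students by raw LMS click counts, regardless of how the course is taught."""
    clicks = Counter()
    with open(clickstream_csv, newline="") as f:
        for row in csv.DictReader(f):
            clicks[row["student_id"]] += 1
    # The same ranking is produced for a studio course, a maths course or a MOOC.
    return clicks.most_common()

if __name__ == "__main__":
    for student, total in engagement_report("lms_clickstream_export.csv"):
        print(student, total)
```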

Do it with the academics

In this path the application of learning analytics is treated as something that needs to be learned about. Folk work with the academics to explore how learning analytics can be best used to inform individual pedagogical practice. Perhaps drawing on insights from the other paths, but also modifying the other paths based on what is learned.

In terms of analysing/characterising this approach, I might identify

  • It assumes that if you want to change/improve teaching, then the academics need to learn and be helped to learn.
    (That probably sounds more condescending than I would like).
  • Based on a “If they learn it, they will do it” premise.
    Which doesn’t have to be true.
  • It assumes learning/learners and teaching/teachers are incredibly diverse.
  • It assumes we don’t know enough about what might be found with learning analytics and how we might learn how to use it.
  • Assumption that the system(s) in place will change in response to this learning, which in turn means more learning … and the cycle continues (a small sketch of how this contrasts with the first path follows).
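
To make the contrast with the first path concrete, here is a small, purely hypothetical sketch: the course data and the indicator functions are invented, and the point is only that the definition of what is worth looking at sits with the academic and is expected to change as they learn.

```python
# Hypothetical sketch: the academic defines and revises their own indicators.
# The data structure and indicator functions are invented for illustration.
from typing import Callable, Dict, List

Indicator = Callable[[dict], bool]

def review(students: List[dict], indicators: Dict[str, Indicator]) -> Dict[str, List[str]]:
    """Apply the academic's own indicators; the output is for discussion, not automated action."""
    return {
        name: [s["id"] for s in students if check(s)]
        for name, check in indicators.items()
    }

# Offering 1: the academic starts with a hunch about silent students.
indicators: Dict[str, Indicator] = {
    "quiet in forums": lambda s: s["forum_posts"] == 0,
}

# Offering 2: after exploring the data with some help, the hunch is revised.
indicators["quiet but watching"] = lambda s: s["forum_posts"] == 0 and s["video_views"] > 5

students = [
    {"id": "s1", "forum_posts": 0, "video_views": 12},
    {"id": "s2", "forum_posts": 4, "video_views": 2},
]
print(review(students, indicators))
```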

References

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final Report 2011. Canberra.

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council.

Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Milton Keynes, UK.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: New Media Consortium.

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition (p. 42). Austin, Texas.

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5).