Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

The following is an excerpt from an unsuccessful 2012 second round OLT grant. We’re currently pondering what the next step is with the idea.

A recent presentation at the Southern Solar Flare conference places the following idea in a broader context of learning analytics and how universities are implementing it.

Project Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4)

…the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

One example of learning analytics is the Social Networks Adapting Pedagogical Practice (SNAPP) tool, developed by an ALTC project (Dawson, Bakharia, Lockyer, & Heathcote, 2011) to visualise interaction patterns in course discussion forums and thereby support a focus on learner isolation, creativity and community formation. While SNAPP’s network diagrams were effective in promoting reflection on teaching activities, teachers had difficulty understanding the relationship between their pedagogical approaches and the insights revealed by SNAPP (Dawson et al., 2011, p. 4). Being able to interpret the patterns provided by learning analytics and apply them to practice is difficult, time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able “to readily and accurately interpret the data and translate such findings into practice” (Dawson & McWilliam, 2008, p. 12). This project aims to address this challenge.
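To make the kind of pattern SNAPP surfaces concrete, the following is a minimal, illustrative sketch (not SNAPP’s actual implementation) that builds a reply network from a hypothetical export of forum (author, replied-to) pairs and flags learners with no interactions:

```python
# Illustrative sketch only (not SNAPP itself): build a forum reply network
# and flag isolated learners. Assumes a hypothetical export of
# (author, replied_to) pairs from a course discussion forum.
import networkx as nx

replies = [
    ("alice", "bob"),
    ("bob", "alice"),
    ("carol", "alice"),
    ("dave", "carol"),
]
enrolled = {"alice", "bob", "carol", "dave", "erin"}  # erin never posted

g = nx.DiGraph()
g.add_nodes_from(enrolled)
g.add_edges_from(replies)

# Learners with no forum interactions at all are candidates for follow-up.
isolated = [n for n in g.nodes if g.degree(n) == 0]
print("Isolated learners:", isolated)

# Degree centrality gives a rough sense of who anchors the discussion.
for student, score in sorted(nx.degree_centrality(g).items(),
                             key=lambda kv: -kv[1]):
    print(f"{student}: {score:.2f}")
```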

Ferguson (2012) identifies learning analytics as one of the fastest-growing areas of research within technology-enhanced learning with interest being driven by a combination of technological, pedagogical, political and economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development it is likely to see widespread adoption within the higher education sector in the next 2-3 years. The recent Horizon technology outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics – for the first time anywhere in the world – into the “one year or less” time-frame for adoption. This finding suggests that with the imminent and widespread adoption of learning analytics, Australian universities may be particularly interested in “finding new ways to measure student performance, ideally in real-time” (Johnson et al., 2012, p. 1).

This interest is perhaps driven in part by the broadening participation agenda of the Commonwealth government and the targets it set in response to the Bradley review (Bradley, Noonan, Nugent, & Scales, 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) a bachelor degree, universities will need to enrol and graduate more students. Many of these students will come from non-traditional backgrounds and would have been considered ‘high-risk’ students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions needed to support the success of this larger and more diverse body of students?

To date, much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012), and its dominant users have been university administrators (Dawson et al., 2011) or support staff. The larger promise of learning analytics lies in using it “to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today” (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted, this practice has implications not only for student performance but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). There is, however, “a dearth of studies that have investigated the relationship between learning analytics and data requirements that would better assist teachers in the design and evaluation of learning and teaching practice” (Dawson et al., 2011, p. 4). Consequently, it remains uncertain exactly how best to enable teachers to apply learning analytics to inform their individual pedagogical practices and, if they do so successfully, what impacts such practices would have on the perceptions of learning, teaching and assessment held by those teachers and, ultimately, on student performance. This project seeks to provide answers to these “How?” and “With what impacts?” questions.

Project outcomes

This project’s six outcomes are summarised below and described in more detail in the sections that follow.

Note: a course (aka unit or subject) is the smallest stand-alone offering; OKB is the Online Knowledge Base.

Outcome: Course modifications
Format: Modified courses. Case studies of the changes disseminated via presentations, the web (OKB), and publications.
Audience: Students and staff in the courses; other teaching staff and institutional policy makers; researchers.
Contribution: Improvements to student learning and changes to teacher perceptions. Data about the impacts of using learning analytics to inform teaching. Concrete examples of using learning analytics that can be duplicated and modified in other contexts.

Outcome: Modifications to Moodle tools
Format: Changes made available to the broader Moodle community by contribution back to the Moodle code base.
Audience: The 17 Australian universities using Moodle and the broader Moodle community.
Contribution: Improved tools will lead to greater use of learning analytics by Moodle-using teaching staff. Concrete examples that improve understanding of the tool design guidelines.

Outcome: Tool design guidelines
Format: Design guidelines and theory disseminated via presentations, the web (OKB) and publications.
Audience: Tool developers; researchers.
Contribution: Better tools in other LMS. Foundation for further research.

Outcome: Harnessing analytics models
Format: Models for how both teaching staff and institutions can enable the use of learning analytics to inform pedagogical practice, disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; policy makers and institutional leaders; teaching support staff; researchers.
Contribution: Aid teaching staff and institutions in harnessing learning analytics, contributing to more widespread, effective use. Foundation for further research.

Outcome: Refinements to learning analytics patterns
Format: New or modified patterns disseminated via presentations, the web (OKB) and publications.
Audience: Teaching staff; learning analytics researchers.
Contribution: Improved understanding of what is happening and suggestions for possible interventions. Foundation for further research.

Outcome: Online knowledge base (OKB)
Format: A website providing a variety of learning paths through the project’s outcomes, resources, discussions, and processes.
Audience: Teaching staff; policy makers and institutional leaders; researchers.
Contribution: Widespread and varied dissemination of project outcomes.

Course modifications

At the centre of this project are two cycles of Participatory Action Research (PAR) with teaching academics at the University of Southern Queensland (USQ) and CQUniversity (CQUni). The aim of these cycles will be to work with the academics to explore how learning analytics can be used to inform their pedagogical practice – the methods they use for learning and teaching. Learning analytics tools and methods will be used to examine prior offerings of courses taught by these academics, share and explore perceptions of learning and teaching, identify potential course modifications, and examine the outcomes of those modifications. The modifications will be supported and evaluated by a range of additional means. The modifications made, the reasons for them, and their impact will inform the other project outcomes and will be made available as case studies disseminated via various means.

Modifications to Moodle tools

USQ and CQUni are two of the 17 Australian universities using Moodle as their institutional Learning Management System (LMS). Moodle provides an array of current and newly developed learning analytics tools of varying capabilities and quality. Informed by prior research, the project’s theoretical framework, and the insights gained during the project’s two PAR cycles, a range of modifications will be made to these tools with the intent of better enabling the use of learning analytics to inform and share pedagogical practice. NetSpot – an e-learning services company that hosts Moodle for 10 Australian universities – will make the necessary changes to these Moodle tools. All changes will be made available to the broader Moodle community.
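As a rough illustration of the raw data these tools build on (rather than the proposed modifications themselves), the sketch below counts activity per student for one course offering. It assumes read access to a Moodle 2.x database and its standard mdl_log table; the connection details and course id are placeholders.

```python
# Rough sketch of the raw data behind many learning analytics patterns:
# count LMS activity ("clicks") per student for one course offering.
# Assumes read access to a Moodle 2.x database and its standard mdl_log
# table; host, credentials and course id are placeholders.
import pymysql  # or whichever DB-API driver the institution uses

conn = pymysql.connect(host="localhost", user="report",
                       password="secret", database="moodle")
course_id = 1234  # hypothetical course offering

sql = """
    SELECT userid, COUNT(*) AS clicks
    FROM mdl_log
    WHERE course = %s
    GROUP BY userid
    ORDER BY clicks DESC
"""
cur = conn.cursor()
cur.execute(sql, (course_id,))
for userid, clicks in cur.fetchall():
    print(userid, clicks)
cur.close()
conn.close()
```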

Tool design guidelines

The rationale for and changes made to the Moodle-based learning analytics tools will be captured in design guidelines. The design guidelines are intended to make this knowledge available to the developers of other e-learning systems, allowing them to improve their systems to better enable the use of learning analytics to inform pedagogical practice. The design guidelines will be made available via the OKB and will also be published in peer-reviewed outlets.

Harnessing analytics models

Changes to pedagogical practice will not arise simply because of the availability of new tools. A significant body of research has found that changes in approaches to teaching are influenced by disciplinary characteristics, conceptions of teaching, situational factors and perceptions of the teaching environment (Richardson, 2005). In addition, enabling increased use of learning analytics is likely to modify these factors and their relationship. Through its PAR cycles the project will explore these changes and work with academics and institutional leaders to identify factors that constrain and enable the on-going use of learning analytics to inform pedagogical practice. The results of this work will be combined with extant literature to develop a range of design models intended to aid institutional policy makers, teaching support staff and teaching staff in deciding how best to enable the use of learning analytics to inform pedagogical practice in their context.

Refinements to learning analytics patterns

Learning analytics often involves the transformation of large amounts of usage data into useful patterns, indicators or visualisations; one example is the common correlation between increasing levels of LMS activity and increasing grades (Dawson & McWilliam, 2008, p. 2). These patterns are often used to inform decision-making. However, the patterns that are identified are directly influenced by what is being looked for (e.g. the emphasis on student retention in learning analytics work focuses attention on patterns identifying success factors) and by the contexts being explored. For example, Beer, Jones and Clark (2009) found that the increasing activity/increasing grades correlation did not hold for certain groups of students and for courses with certain characteristics.
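As an illustration, the activity/grades pattern and its context-dependence can be checked with a few lines once per-student activity counts and grades are available. The sketch below assumes a hypothetical CSV with made-up column names.

```python
# Minimal sketch of the activity/grades pattern discussed above. Assumes a
# hypothetical CSV with one row per student and made-up column names:
# student_id, mode (e.g. on-campus or distance), clicks, grade.
import pandas as pd

df = pd.read_csv("course_activity.csv")

# Pearson correlation across the whole cohort.
print("Overall r:", round(df["clicks"].corr(df["grade"]), 2))

# Beer, Jones and Clark (2009) found the pattern does not hold for some
# groups, so also report the correlation per mode of study.
for mode, group in df.groupby("mode"):
    print(mode, round(group["clicks"].corr(group["grade"]), 2))
```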

During its PAR cycles, this project will work directly with a diverse collection of teaching academics to achieve their purposes with learning analytics. This set of different perspectives will lead to the identification of new learning analytics patterns and the refinement of existing ones. These patterns will be distributed through the OKB, traditional peer-reviewed publications, and, where appropriate, incorporation into the learning analytics tools.

Online knowledge base

The Online Knowledge Base (OKB) fulfils two functions. First, it will be the primary means of communication and collaboration between the project team, reference group and project participants, both within this project and in continuing work. Second, the OKB will serve as one of the major components of the project’s dissemination strategy. Both functions will rely heavily on social media and related software (e.g. Mendeley, Diigo, blogs, Twitter etc.). The use of these tools will in turn be aggregated and curated to form an online knowledge base. The OKB will provide access to all project outcomes, both as formal reports and as a variety of learning paths designed explicitly for different stakeholders.
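One plausible mechanism for the aggregation step is sketched below using the feedparser library. The feed URLs are placeholders and the actual OKB tooling is not specified at this level of detail in the proposal.

```python
# Illustrative sketch of the OKB's aggregation step: pull recent items from
# the project's blogs and social-media feeds into one list for curation.
# Feed URLs are placeholders; the proposal does not specify the tooling.
import feedparser

feeds = [
    "https://example.org/project-blog/feed",
    "https://example.org/diigo-group/rss",
]

items = []
for url in feeds:
    parsed = feedparser.parse(url)
    for entry in parsed.entries:
        items.append((entry.get("published", ""), entry.get("title", ""), url))

# Newest first (as far as the feed dates allow), ready for manual curation
# into the OKB's learning paths.
for published, title, source in sorted(items, reverse=True)[:20]:
    print(published, "|", title, "|", source)
```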

Value and need for the project

The need for this project is well established in the learning analytics literature, including the findings and recommendations from two prior learning analytics related ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008). Samples from that literature showing the need for this project include:

  • Learning analytics is still at the early stages of implementation and experimentation (Siemens & Long, 2011).
  • The analytical tools provided by most LMS are poorly utilised (Dawson & McWilliam, 2008) in large part because the “tools and presentation format are complex and far removed from the specific learning context” (Dawson & McWilliam, 2008, p. 8).
  • Australian academics have limited understanding of what is available from these tools and how that data is related to pedagogical practice (Dawson & McWilliam, 2008).
  • There is a dearth of studies examining the use and impact of learning analytics to inform the design, delivery and future evaluations of individual teaching practices (Dawson et al., 2011; Dawson, Heathcote, & Poole, 2010).

This project builds directly upon prior work including prior ALTC grants (Dawson et al., 2011; Dawson & McWilliam, 2008), the development of Moodle learning analytics tools, and broader learning analytics research. The project is, through the project team and the members of the reference group, directly connected to the broader learning analytics communities. By working with a diverse range of teaching staff to explore the use of learning analytics to inform pedagogical practice the project will fill an identified need and produce outcomes of immediate use to a significant portion of the Australian higher education sector. Beyond immediate use the project’s outcomes create a platform for on-going work.

Approach

The project will take place over 24 months and includes four main stages, summarised below.

Stage 1: Formation and initial design (Jan 2013 - Jul 2013)
Outcomes: Project established; ethics approval granted; research assistant recruited. Initial version of OKB. Initial tool design guidelines and enhancements to LMS tools. Project evaluation plan.
Evaluation & dissemination: Two reference group meetings. Appointment of and initial meetings with project evaluator. Presentations at CQUni and USQ. Identification of PAR cycle #1 participants and institutional influencers. Promotion of OKB and broadening of connections with the learning analytics community.

Stage 2: PAR cycle #1 (Aug 2013 - Feb 2014)
Outcomes: Modifications to 4 courses. Changes to OKB. Enhancements to tool design guidelines and LMS tools. Initial draft of harnessing analytics guidelines. Refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members. Invited presentations integrated into OKB. Reference group meeting. Work with project evaluator. Contribution of tool modifications to the Moodle community.

Stage 3: PAR cycle #2 (Jan 2014 - Sep 2014)
Outcomes: Modifications to a further 8 courses. Changes to OKB. Further enhancements to tool design guidelines and LMS tools. Enhancements to harnessing analytics guidelines. Refinements to LA patterns.
Evaluation & dissemination: On-going engagement and self-evaluation by PAR participants, critical friends, influencers and other team members. Invited presentations integrated into OKB. Initial publications. Reference group meeting. Work with project evaluator. Contribution of tool modifications to the Moodle community.

Stage 4: Project finalisation (Oct 2014 - Dec 2014)
Outcomes: Final enhancements to all project outcomes. Final project report.
Evaluation & dissemination: Publications and presentations. Summative evaluation by project evaluator. Promotion of OKB with final project outcomes. Planning for future work.

Methodology and framework

The project will use a combination of Participatory Action Research (PAR) and design-based research (DBR). This combination is used in part because of the close connection between the two methods – e.g. Wang and Hannafin (2005, p. 6) suggest that DBR is “akin” to, but slightly different from, PAR – but also because the project has two somewhat different tasks. First, the project aims, through the lens of situated cognition, to engage fully in the specifics of two institutional contexts for the purpose of helping individual academics address their needs. PAR is the best fit for this purpose and is better known across disciplines than DBR. To support the use of PAR, the project will adopt the lessons learned by Fraser and Harvey (2008) in a previous ALTC-funded project, including supported reflection sessions, the provision of theoretical sparks, and the pairing of academic participants with institutional influencers. Second, the project must formulate artefacts (e.g. the knowledge base and enhanced Moodle tools) and design theory (e.g. the harnessing analytics model and the tool design guidelines) useful to the broader community. Design-based research both draws on and is conducted in order to generate design theory (Wang & Hannafin, 2005), and involves the use of “multiple research methods in real-world learning environments” (Wang & Hannafin, 2005, p. 20).

In terms of theoretical frameworks, the project’s design draws upon situated cognition, distributed cognition and the role of conceptions and reflection in changing teaching and learning. A brief explanation of each of these and its application to the project design follows.

Brown, Collins and Duguid (1989) argue that the tendency for education, training and technology design to focus on abstract representations that are detached from practice distorts the intricacies of practice. This distortion hinders how well practice can be understood, engendered, or enhanced; it hinders learning. The design and development of many e-learning systems suffer from this limited understanding of the intricacies of practice involved in modern e-learning. Dawson et al.’s (2011) observation that university administrators have been the dominant users of learning analytics in higher education indicates a similar problem with learning analytics. The use of Participatory Action Research will provide the “opportunity for codeveloping processes with people rather than for people” (McIntyre, 2008, p. xii). By situating the project within a joint, collective purpose, PAR will improve the understanding and learning gained about how learning analytics can be used to inform pedagogical practice.

The majority of e-learning systems provide direct support for the implementation of technical tasks such as posting a message to a discussion forum. The difficult cognitive task of combining these technical tasks into an effective and appropriate pedagogical design is left almost entirely to the teacher. Dawson and McWilliam (2008) identify this problem with most LMS learning analytics tools, whose presentation formats are too complex and far removed from the specific learning context.

Hollan, Hutchins and Kirsh (2000) describe how distributed cognition expands what is considered cognitive beyond an individual to encompass interactions between people, their environment and the tools therein. Boland, Tenkasi and Te’eni (1994, p. 459) define a distributed cognition system as one that “supports interpretation and dialogue among a set of inquirers by providing richer forms of self-reflection and communication”. This project will make enhancements to learning analytics tools intended to make them an effective part of such a distributed cognition system. A particular focus of the enhancements will be on reducing the difficulty of the task and offering greater support for teacher self-reflection and collaboration.

There is a significant body of literature establishing a link between the conceptions of learning and teaching held by academics and the quality of student learning outcomes (cf. Richardson, 2005). It has also been found that environmental, institutional, or other issues may impel academics to teach in a way that runs against their preferred approach (Richardson, 2005). There is a similarly widely acknowledged view that reflection on teaching contributes to changes in conceptions of teaching that lead to enhanced teaching practice and possibly improved student learning (Kreber & Castleden, 2009). Through its use of situated cognition and participatory action research this project aims to develop significant insight into the factors that constrain and enable the adoption of learning analytics. By working within a PAR process with teaching academics and their accompanying institutional influencers the project aims to respond to these factors. Lastly, the combination of PAR with distributed cognition will encourage teaching staff to engage in a range of reflective processes that could lead to changes in their conceptions of learning and teaching and, subsequently, in the quality of student learning outcomes.

References

Beer, C., Jones, D., & Clark, K. (2009). The indicators project identifying effective learning: Adoption, activity, grades and external factors. Same places, different spaces. Proceedings ascilite Auckland 2009 (pp. 60–70). Auckland, New Zealand. Retrieved from http://www.ascilite.org.au/conferences/auckland09/procs/beer.pdf

Boland, R. J., Tenkasi, R. V., & Te’eni, D. (1994). Designing information technology to support distributed cognition. Organization Science, 5(3), 456–475.

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from http://moourl.com/iax5c

Brown, J. S., Collins, A., & Duguid, P. (1989). Situated cognition and the culture of learning. Educational Researcher, 18(1), 32–42.

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). “Seeing” networks: Visualising and evaluating student learning networks. Final report 2011. Canberra. Retrieved from http://moourl.com/zwkpf

Dawson, S., Heathcote, L., & Poole, G. (2010). Harnessing ICT potential: The adoption and analysis of ICT systems for enhancing the student learning experience. International Journal of Educational Management, 24(2), 116–128. doi:10.1108/09513541011020936

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as an indicator of learning and teaching performance. Canberra: Australian Learning and Teaching Council. Retrieved from http://moourl.com/hpds8

Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Milton Keynes, UK: Knowledge Media Institute, The Open University. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

Fraser, S., & Harvey, M. (2008). Leadership and assessment: Strengthening the nexus. Final report. Strawberry Hills, NSW: Australian Learning and Teaching Council. Retrieved from http://moourl.com/02pk6

Hollan, J., Hutchins, E., & Kirsh, D. (2000). Distributed cognition: Toward a new foundation for human-computer interaction research. ACM Transactions on Computer-Human Interaction, 7(2), 174–196.

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium. Retrieved from http://www.nmc.org/publications/2012-technology-outlook-au

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium.

Kreber, C., & Castleden, H. (2009). Reflection on teaching and epistemological structure: reflective and critically reflective processes in “pure/soft” and “pure/hard” fields. Higher Education, 57(4), 509–531.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Richardson, J. (2005). Students’ approaches to learning and teachers’ approaches to teaching in higher education. Educational Psychology, 25(6), 673–680.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: An integrated & modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf

Siemens, G., & Long, P. (2011). Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, 46(5). Retrieved from http://moourl.com/j6a5d

Wang, F., & Hannafin, M. (2005). Design-Based Research and Technology-Enhanced Learning Environments. Educational Technology Research and Development, 53(4), 5–23.
