Enabling academics to apply learning analytics to individual pedagogical practice: how and with what impacts?

Thanks to @cj13 for the heads up about the EDUCAUSE analytics sprint. In the midst of moving, conferences, the end/start of term and grant writing, I had missed it. Found it interesting that the first thing that struck my eye was a link to a discussion titled "Faculty need how-to information for the data they do have". It's interesting because the grant application I'm writing is aimed directly at this general area, though we perhaps have a slightly different take on the problem.

As it happens, I'm about to reframe the "outcomes and rationale" section of the grant. So, rather than lose the existing content, I'm posting it below to share the thinking and see what interesting connections arise. Some early thoughts on the project are here, and we're aiming for OLT funding.

The project team for this application includes myself, Prof Lorelle Burton (USQ), Dr Angela Murphy (CQUni), Prof Bobby Harreveld (CQUni), Colin Beer (CQUni), Damien Clark (CQUni) and, last but by no means least, Dr Shane Dawson (UBC).

Abstract

Learning analytics is the use of tools and techniques to gather data about the learning process, and the application of that data to the design, development and evaluation of learning and teaching practice. A significant challenge for learning analytics is the complexity of transforming the data it reveals into informed pedagogical action. This project will investigate how, and with what impacts, learning analytics can be used to inform individual pedagogical practice. Using Participatory Action Research, the project will support groups of academics from two participating universities in using learning analytics to improve their learning and teaching. From this the project will generate insights into how best to use learning analytics to inform pedagogical practice, the impacts of such action, and the types of tools and organisational policies that enable this practice. These insights will be made available through an evolving, online knowledge base and appropriate design guidelines, and encapsulated in a number of supporting tools for an open source LMS.

Rationale

The Society for Learning Analytics Research (SoLAR) defines learning analytics as (Siemens et al., 2011, p. 4):

the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimizing learning and the environments in which it occurs.

Interest in learning analytics has grown rapidly over recent years, with Ferguson (2012) suggesting it is amongst the fastest-growing areas of research within technology-enhanced learning, driven by a combination of technological, pedagogical and political/economic drivers. The 2012 Horizon Report (Johnson & Cummins, 2012) argues that while learning analytics is still in its early stages of development, it is likely to see widespread adoption within higher education in the next 2-3 years. The Horizon Report's Technology Outlook for Australian Tertiary Education (Johnson, Adams, & Cummins, 2012) places learning analytics in the "one year or less" adoption horizon for the first time, suggesting that Australian universities are particularly interested in "finding new ways to measure student performance, ideally in real-time" (Johnson, Adams, & Cummins, 2012, p. 1).

The rise of learning analytics as a field of research is timely given the broadening participation agenda of the Australian higher education sector. Commonwealth government targets set in response to the Bradley review (Bradley et al., 2008) are ambitious, and are in line with the move from elite to mass higher education globally, particularly in OECD countries (Santiago et al., 2008). For 40% of Australians aged 25-34 to hold (or be progressing towards) bachelor degrees, we will need to enrol and graduate more students. Many of these students will be from non-traditional backgrounds, and would have been considered 'high-risk' students in the past. Learning analytics can help us understand the circumstances under which those students are most likely to succeed. But can learning analytics also help guide teachers to make the coalface pedagogical decisions needed to support the success of this larger and more diverse body of students?

To date, much of the work on learning analytics in higher education has centred on identifying students at risk of failure and addressing short-term issues to prevent that failure (Johnson & Cummins, 2012). The dominant use of learning analytics within higher education has largely been by management (Dawson et al., 2011) or support staff. The larger promise of learning analytics lies in using it "to more precisely understand students' learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today" (Johnson & Cummins, 2012, p. 23). If correctly applied and interpreted, this practice has implications not only for student performance, but also for the perceptions of learning, teaching and assessment held by educators (Johnson & Cummins, 2012). Correctly applying and interpreting the findings of learning analytics in practice is, however, difficult and time-consuming, requires additional support, and is worthy of further investigation (Dawson et al., 2011; Dawson & McWilliam, 2008). The challenge is being able "to readily and accurately interpret the data and translate such findings into practice" (Dawson & McWilliam, 2008, p. 12).
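To make that challenge concrete, consider the simplest kind of signal teachers are typically offered: an activity-based at-risk flag. The Python sketch below is purely illustrative, not part of the project or any existing Moodle tool; the function name, data shape and threshold are my assumptions. Even a flag this trivial raises exactly the interpretation questions above: is low activity a cause or a symptom, and what intervention should follow?

```python
# Purely illustrative sketch: flag students whose recent LMS activity
# falls well below the course median. The threshold and data shape are
# assumptions for this example, not part of any actual tooling.
from statistics import median

def flag_at_risk(activity_counts, threshold=0.25):
    """Return ids of students whose event count is below `threshold`
    times the course median. `activity_counts` maps student id to a
    count of LMS events (page views, forum posts, ...) in some window."""
    cutoff = threshold * median(activity_counts.values())
    return [sid for sid, count in activity_counts.items() if count < cutoff]

# Student "s3" sits far below the course median of 40, so they are
# flagged; what the teacher should do about it is the hard part.
counts = {"s1": 38, "s2": 55, "s3": 2, "s4": 40, "s5": 47}
print(flag_at_risk(counts))  # ['s3']
```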

Outcomes

This project seeks to address this challenge by exploring how, and with what impacts, educators can be enabled and encouraged to effectively and appropriately use learning analytics to inform individual pedagogical practice. In doing so, the project aims to produce the following outcomes:

  1. An online knowledge base that guides educators and institutions in the use of learning analytics to inform individual pedagogical practice.
  2. Enhancements to at least 12 courses across the two participating institutions through the application of learning analytics.
  3. A “harnessing analytics model” that explains how and with what impacts learning analytics can be used to inform individual pedagogical practice, including identification of enablers, barriers and critical success factors.
  4. Design guidelines explaining how to modify e-learning information systems to better enable the application of learning analytics to inform pedagogical practice.
  5. Enhanced and new learning analytics tools for the Moodle LMS based on those design guidelines and integrated with the knowledge base.
  6. Further testing and enhancement of existing, and identification of new, trends, correlations and patterns evident in usage data (a simple illustration follows this list).
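The simplest instance of such a pattern is a correlation between LMS activity and achievement. The Python sketch below, with invented data, shows the sort of calculation outcome 6 refers to; anything beyond toy scale would use established libraries and, more importantly, careful interpretation:

```python
# Illustrative only: Pearson correlation between per-student LMS event
# counts and final grades. The data is invented for this example.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

hits = [12, 45, 30, 80, 5, 60]     # LMS events per student (invented)
grades = [52, 68, 60, 85, 40, 75]  # final percentage grades (invented)
print(round(pearson(hits, grades), 2))  # 0.99 on this toy data
```

Even a strong correlation in real usage data would leave open the pedagogical question the project is concerned with: what, if anything, should the teacher change?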

How

The project aims to develop these outcomes by directly helping groups of teaching academics harness learning analytics to observe and intervene in their teaching. This direct engagement in practice will take the form of cycles of Participatory Action Research (PAR) at the University of Southern Queensland and CQUniversity. The use of PAR will provide the "opportunity for codeveloping processes with people rather than for people" (McIntyre, 2008, p. xii). The PAR cycles will be supported by, and contribute to the evolution of, the knowledge base, and will inform enhancements to learning analytics tools for Moodle, the LMS at both institutions and at over 18 Australian universities. The institution- and LMS-specific interventions developed during the PAR cycles will be reviewed, tested and abstracted into broader models and guidelines that can be used at other institutions and within other e-learning tools. This sharing of insight between context-specific outcomes and broader knowledge will be supported by the active involvement of learning analytics experts from other institutions and by the project reference group.
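For a sense of the raw material such Moodle tools would build on, here is a hedged sketch that pulls per-student weekly activity for one course from the mdl_log table Moodle of this era uses as its event log. The column names match that generation of Moodle but should be checked against the deployed version, and the SQLite connection is a stand-in for the institutional MySQL or Postgres database:

```python
# Sketch of aggregating per-student weekly activity from a Moodle
# 2.x-era `mdl_log` table (userid, course, time as a Unix timestamp).
# Assumptions: the schema matches the deployed Moodle version, and
# SQLite stands in for the real institutional database.
import sqlite3

QUERY = """
SELECT userid,
       strftime('%Y-%W', time, 'unixepoch') AS week,  -- year-week label
       COUNT(*) AS events
FROM mdl_log
WHERE course = ?
GROUP BY userid, week
ORDER BY userid, week
"""

def weekly_activity(db_path, course_id):
    """Return (userid, week, events) rows for one course."""
    with sqlite3.connect(db_path) as conn:
        return conn.execute(QUERY, (course_id,)).fetchall()
```

Feeding aggregates like these back to teachers, inside the LMS and linked to the knowledge base, is the kind of enhancement the design guidelines are intended to inform.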

References

Bradley, D., Noonan, P., Nugent, H., & Scales, B. (2008). Review of Australian Higher Education. Canberra. Retrieved from http://www.deewr.gov.au/HigherEducation/Review/Documents/PDF/Higher Education Review_one document_02.pdf

Dawson, S., Bakharia, A., Lockyer, L., & Heathcote, E. (2011). "Seeing" networks: Visualising and evaluating student learning networks. Final report. Canberra. Retrieved from http://research.uow.edu.au/content/groups/public/@web/@learnnet/documents/doc/uow115678.pdf

Dawson, S., & McWilliam, E. (2008). Investigating the application of IT generated data as indicators of learning and teaching performance. Melbourne: Australian Learning and Teaching Council.

Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Milton Keynes, UK: Knowledge Media Institute, The Open University. Retrieved from http://kmi.open.ac.uk/publications/pdf/kmi-12-01.pdf

Johnson, L., Adams, S., & Cummins, M. (2012). Technology Outlook for Australian Tertiary Education 2012-2017: An NMC Horizon Report Regional Analysis. Austin, Texas: The New Media Consortium.

Johnson, L., & Cummins, M. (2012). The NMC Horizon Report: 2012 Higher Education Edition. Austin, Texas: The New Media Consortium.

McIntyre, A. (2008). Participatory Action Research. Thousand Oaks, CA: SAGE Publications.

Santiago, P., Tremblay, K., Basri, E., & Arnal, E. (2008). Tertiary Education for the Knowledge Society. Paris: OECD Publishing.

Siemens, G., Gasevic, D., Haythornthwaite, C., Dawson, S., Shum, S. B., & Ferguson, R. (2011). Open Learning Analytics: An integrated & modularized platform. Society for Learning Analytics Research. Retrieved from http://solaresearch.org/OpenLearningAnalytics.pdf
