Supporting Action Research with Learning Analytics

The following is a summary and some thoughts on

Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge – LAK ’13 (pp. 220–229). New York, New York, USA: ACM Press. doi:10.1145/2460296.2460340

Thoughts

Bringing in reflection, action research and the idea of learning analytics enabling these reinforces one of my interests. So I’m biased toward this sort of work.

Some good quotes supporting some ideas we’re working on.

Find it interesting that the LA research work tends to talk simply about indicators, i.e. the patterns/correlations generated from analysis, rather than about helping users (teachers/learners) actually do something.

Abstract

My emphasis added.

Learning analytics tools should be useful, i.e., they should be usable and provide the functionality for reaching the goals attributed to learning analytics. This paper seeks to unite learning analytics and action research. Based on this, we investigate how the multitude of questions that arise during technology-enhanced teaching and learning systematically can be mapped to sets of indicators. We examine, which questions are not yet supported and propose concepts of indicators that have a high potential of positively influencing teachers’ didactical considerations. Our investigation shows that many questions of teachers cannot be answered with currently available research tools. Furthermore, few learning analytics studies report about measuring impact. We describe which effects learning analytics should have on teaching and discuss how this could be evaluated.

Introduction

Starts with the proposition that “teaching is a dynamic activity” where teachers should “constantly analyse, self-reflect, regulate and update their didactical methods and the learning resources they provide to their students”.

Of course, learning is also a dynamic activity. Raising the possibility that the same sort of analysis being done here might be done for learners.

Moves onto reflection, its definition and how it can foster learning if “embedded in a cyclical process of active experimentation, where concrete experience forms a basis for observation and reflection”. Action research is positioned as “a method for reflective teaching practice” ending up with learning analytics being able to “initiate and support action research” (AR).

Noting multiple definitions of learning analytics (LA) before offering what they use

learning analytics as the development and exploration of methods and tools for visual analysis and pattern recognition in educational data to permit institutions, teachers, and students to iteratively reflect on learning processes and, thus, call for the optimization of learning designs [39, 40] on the on (sic) hand and aid the improvement of learning on the other [14, 15].

Relationship between LA and AR

  • LA arises from observations made with already collected data.
  • AR starts with research questions arising from teaching practice.
  • AR often uses qualitative methods for a more holistic view; LA is mostly quantitative.

Important point – the creation of indicators from the LA work has “been controlled by and based on the data available in learning environments”. Leading to a focus on indicators arising from what’s available. AR starts with the questions first, before deciding about the methods and sources. Proposes that asking questions without thought to the available data could “improve the design of future LA tools and learning environments”.

Three questions/assumptions

  1. Indicator-question-mapping
    Which teacher questions cannot be mapped to existing indicators? Which indicators could deliver what kind of enlightenment?
  2. Teacher-data-indicators
    Current indicators don’t “explicitly relate teaching and teaching activities to student learning” (p. 220). Are there tools that do this? How should it be done?
  3. Missing impact analysis
    Current LA research “fails to prove the impact of LA tools on stakeholders’ behaviors” (p. 221). How can LA impact teaching? How could it be evaluated?

Paper structure

  • Methods – research procedure and materials
  • Categorisation of indicators
  • Analysis and discussion
  • Conclusion

Methods

  1. Results of a qualitative meta-analysis investigating what kinds of questions teachers ask while performing AR in TEL – see the table below.
  2. Collected publications on LA tools and available indicators.
  3. Two of the researchers developed a categorisation scheme for the 198 indicators.
  4. Further analysis of LA tools and indicators.
  5. Two researchers mapped teachers’ questions to sets of available indicators.

Teachers’ questions

The questions asked by teachers – summarised in the following table – are taken from

Dyckhoff, A.L. 2011. Implications for Learning Analytics Tools: A Meta-Analysis of Applied Research Questions. IJCISIM. 3, (2011), 594–601.

Must read this to learn more about how these questions came about. They strike me as fairly specific and not necessarily exhaustive. Authors note that some questions fit into more than one category.

(a) Qualitative evaluation
  • How do students like/rate/value specific learning offerings?
  • How difficult/easy is it to use the learning offering?
  • Why do students appreciate the learning offering?

(b) Quantitative measures of use/attendance
  • When and how long are students accessing specific learning offerings (during a day)?
  • How often do students use a learning environment (per week)?
  • Are there specific learning offerings that are NOT used at all?

(c) Differentiation between groups of students
  • By which properties can students be grouped?
  • Do native speakers have fewer problems with learning offerings than non-native speakers?
  • How does the acceptance of specific learning offerings differ according to user properties (e.g. previous knowledge)?

(d) Differentiation between learning offerings
  • Are students using specific learning materials (e.g. lecture recordings) in addition or as an alternative to attendance?
  • Will the access of specific learning offerings increase if lectures and exercises on the same topic are scheduled during the same week?
  • How many (percent of the) learning modules are students viewing?

(e) Data consolidation/correlation
  • Which didactical activities facilitate continuous learning?
  • How do learning offerings have to be provided and combined with support to increase usage?

(f) Effects on performance
  • How do low-achieving students profit from continuous learning with e-tests compared to those who have not yet used the e-tests?
  • Is the performance in e-tests somehow related …

Tools

Provides a list of the tools chosen for analysis. They were chosen based on their presentation in the literature as “state-of-the-art LA-tools, which can already be used by their intended target users”.

Categorisation of indicators

Categorisation scheme includes the following (a rough code sketch of how such a scheme might be represented comes after the list)

  • Five perspective categories – “point of view a user might have on the same data”
    1. individual student

      “inspire an individual student’s self-reflection” on their learning. Also support teachers in monitoring. Sophisticated systems recommend learning activities. Includes a long list of example indicators in this category with references.

    2. group

      As the name suggests, the group.

    3. course
    4. content
    5. teacher

      Only a few found in this category – including sociogram of interaction between teacher and participant.

  • Six data source categories
    1. student generated data

      Students’ presence online. Clickstreams, but also forum posts etc.

    2. context/local data
    3. academic profile

      Includes grades and demographic data.

    4. evaluation

      student responses to surveys, ratings, course evaluations.

    5. performance

      Grades etc. from the course; number of assignments submitted.

    6. course meta-data

      course goals, events etc.
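
A rough sketch of how that two-dimensional scheme (perspective × data source) might be represented when tagging indicators. The category names come from the paper; the class, field names and example entries are my own illustration, not the authors’ implementation.

```python
from dataclasses import dataclass

# The five perspectives and six data sources described in the paper.
PERSPECTIVES = {"individual student", "group", "course", "content", "teacher"}
DATA_SOURCES = {"student generated data", "context/local data", "academic profile",
                "evaluation", "performance", "course meta-data"}

@dataclass
class Indicator:
    """One LA indicator tagged along the paper's two category dimensions."""
    name: str
    perspective: str        # one of PERSPECTIVES
    data_sources: set       # subset of DATA_SOURCES

    def __post_init__(self):
        assert self.perspective in PERSPECTIVES
        assert self.data_sources <= DATA_SOURCES

# Hypothetical entries, just to show how the scheme would be applied.
catalogue = [
    Indicator("time spent online per week", "individual student", {"student generated data"}),
    Indicator("teacher/participant interaction sociogram", "teacher", {"student generated data"}),
]

# e.g. pull out everything offering the teacher perspective
teacher_view = [i for i in catalogue if i.perspective == "teacher"]
```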

Analysis and discussion

Indicator-questions-mapping

Mapped indicators from chosen tool to questions asked by teachers. Missing documentation meant mapping was at times subjective.

“Our analysis showed that current LA implementations still fail to answer several important questions of teachers” (p. 223).

Using categories from the above table

  • Category A – almost all “cannot yet be answered sufficient”. Deal with questions of student satisfaction and preferences.
  • Category B – most questions can be answered. A few cannot (e.g. use of the service via mobile or at home). Aside: while a question teachers might ask, I’m not sure it’s strongly connected to learning.
  • Category E – generally no. Most systems don’t allow the combination of data that this would require, which I would expect is in large part because of the research nature of these tools, each focused on a particular set of concerns. The paper raises the question of learner privacy issues. (See the toy sketch after this list for the kind of data join these questions require.)
  • Category F – can be difficult depending on access to this information.
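
To make concrete why the category (e) and (f) questions are hard for current tools: they need data that typically lives in separate systems (activity logs, grade book) to be joined. A toy sketch of that kind of consolidation, with invented column names and made-up numbers; nothing here comes from the reviewed tools.

```python
import pandas as pd

# Hypothetical exports from two separate systems.
activity = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "etest_attempts": [5, 0, 3, 8],   # how often each student used the e-tests
})
grades = pd.DataFrame({
    "student_id": [1, 2, 3, 4],
    "exam_score": [72, 55, 64, 81],
})

# Category (e)/(f) questions require exactly this kind of join across data
# sources, which the paper found most tools could not do.
merged = activity.merge(grades, on="student_id")

# e.g. "Is the performance in e-tests somehow related to exam performance?"
print(merged["etest_attempts"].corr(merged["exam_score"]))
```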

Teacher-data-indicators

“We did not find tools or indicators that explicitly collect and present teacher data” (p. 224). The closest are indicators related to course phases and interactions between teachers and students.

Activity logs contain some teacher data, but other data, such as information on lectures, is missing.

If teachers had indicators about their activities and online presence, they might be inspired and motivated to be more active in the online learning environment. Hence, their presence in discussions might stimulate students likewise to participate more actively and motivate them to share knowledge and ideas.

Authors brainstormed some potential indicators (a rough sketch of how the first might be computed follows the list)

  • Teacher forum participation indicator.
  • Teacher correspondence indicator

    Tracking personal correspondence, and tracking interventions and impact on student behaviour.

  • Average assignments grading time.

    Would be interesting to see the reaction to enabling this.

    The authors mention privacy issues and suggest only showing the data to the individual teacher.
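
As a thought experiment, here is what the first of those brainstormed indicators might look like if computed from raw forum logs. The record structure and field names are invented; the paper only names the indicator and does not prescribe an implementation.

```python
from datetime import datetime
from collections import Counter

# Hypothetical forum log: one record per post.
posts = [
    {"author": "teacher_1", "role": "teacher", "posted_at": datetime(2013, 3, 4, 10, 15)},
    {"author": "student_7", "role": "student", "posted_at": datetime(2013, 3, 4, 11, 2)},
    {"author": "teacher_1", "role": "teacher", "posted_at": datetime(2013, 3, 11, 9, 40)},
]

def teacher_forum_participation(posts, weeks_in_course=12):
    """Simple 'teacher forum participation' indicator: the teacher's share of
    all posts plus their posting rate per week."""
    by_role = Counter(p["role"] for p in posts)
    teacher_posts = by_role["teacher"]
    total = sum(by_role.values())
    return {
        "teacher_share_of_posts": teacher_posts / total if total else 0.0,
        "teacher_posts_per_week": teacher_posts / weeks_in_course,
    }

print(teacher_forum_participation(posts))
```

In line with the authors’ suggestion on privacy, such a value would presumably only be shown to the individual teacher concerned.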

Missing impact analysis

Provides a table comparing AR and LA.

“very few publications reporting about findings related to the behavioural reactions of teachers and students, i.e. few studies measure the impact of using learning analytics tools” (p. 225). Instead, LA research tends to focus on functionality, usability issues and the perceived usefulness of specific indicators. …“several projects have not yet published data about conducting reliable case studies or evaluation results at all”.

Proceed to offer one approach to measuring impact of LA tools – an approach that could “be described as design-based research with a focus on uncovering action research activities”.

The steps

  1. Make the tools available to users.

    A representative group of non-expert teachers and students. Researchers need to know about the course and how it operates without LA, plus a great deal of information to use as a reference point for comparison later on, including interviews/online surveys with staff and students.

  2. Identify which activities are likely to be improved by LA.

    Hypothesise about the usage and impact of LA.

  3. Interview after use.

Limitations of this approach

  • long time required.
  • significant effort from researchers and participants.
  • analysis of qualitative data prone to personal interpretation.
  • Clear conclusions may not be possible.

Limitations of this study

  • meta-analysis from which questions were drawn was limited to case studies described in the conference proceedings of a German e-learning conference.
  • identification of indicators was limited to 27 tools; there is other research, especially from EDM. “The challenge is, how to make them usable”
  • subjectivity of the questions and the indicators – addressed somewhat by using two researchers – but not an easy process.

Conclusions

Learning Analytics tools should be an integral part of TEL. The tools aim at having an impact on teachers and students. But the impact has not been evaluated. The concern we are raising is that LA tools should not only be usable, but also useful in the context of the goals we want to achieve. (p. 227)

  • present indicators focused on answering questions around usage analysis
  • “currently available research tools do not yet answer many questions of teacher”
  • questions requiring qualitative analysis or correlation between multiple data sources can’t yet be answered.
  • “causes for these shortcomings are insufficient involvement of teachers in the design and development of indicators, absence of rating data/features, non-used student academic profile data, and absence of specific student generated data (mobile, data usage from different devices), as well as missing data correlation and combination from different data sources”.
  • teacher data is not easily visible.
  • future tools will probably have rating features and that data should be used by LA tools.
  • “researchers should actively involve teachers in the design and implementation of indicators”.
  • researchers need to provide guidelines on how indicators can be used and limitations.
  • Need to create evaluation tools to measure impact of LA.

References

Dyckhoff, A. L., Lukarov, V., Muslim, A., Chatti, M. A., & Schroeder, U. (2013). Supporting action research with learning analytics. In Proceedings of the Third International Conference on Learning Analytics and Knowledge – LAK ’13 (pp. 220–229). New York, New York, USA: ACM Press. doi:10.1145/2460296.2460340
