Aligning learning analytics with learning design

The following is a summary of, and some initial responses to, this paper.

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist. doi:10.1177/0002764213479367

The abstract for the paper is

This article considers the developing field of learning analytics and argues that to move from small-scale practice to broad scale applicability, there is a need to establish a contextual framework that helps teachers interpret the information that analytics provides. The article presents learning design as a form of documentation of pedagogical intent that can provide the context for making sense of diverse sets of analytic data. We investigate one example of learning design to explore how broad categories of analytics—which we call checkpoint and process analytics—can inform the interpretation of outcomes from a learning design and facilitate pedagogical action.

I’m interested by the focus on moving from “small-scale practice to broad scale applicability”, but I wonder about how broad scale pedagogical practice can be given the inherent diversity/complexity.

Thoughts

The paper describes how learning design and learning analytics could be merged in a way that would provide better understanding of what is going on around student learning and perhaps allow demonstration of learning quality. To do this, it is assumed that the teacher or teaching team

  1. Makes explicit their pedagogical intent using an existing learning design.
  2. Then analyses this learning design to identify where analytics can provide checkpoint (i.e. have the learners performed certain necessary steps) and process (what’s going on during the learning process) insights.
  3. Uses these insights either to intervene during learning or to redesign for the next offering (a rough sketch of what such an annotated design might look like follows this list).
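
To make these three steps concrete for myself, here's a minimal sketch of what a learning design annotated with checkpoint and process analytics might look like. The structure, task names and questions are all my own invention; the paper provides the two categories, not this representation.

```python
# A hypothetical, minimal representation of a learning design annotated
# with the analytics that could inform each task. Everything here is
# invented for illustration only.
learning_design = {
    "intent": "Students collaboratively analyse a case study",
    "sequence": [
        {
            "task": "Read the case materials",
            "checkpoint": "Has each student accessed/downloaded the case files?",
        },
        {
            "task": "Discuss the case in small groups",
            "process": "SNA of forum replies: who interacts with whom, and how often?",
        },
        {
            "task": "Submit a group report",
            "checkpoint": "Has each group submitted by the due date?",
        },
    ],
}

# A teacher (or a tool) could walk the sequence and list the questions
# the analytics need to answer at each point in the design.
for step in learning_design["sequence"]:
    for kind in ("checkpoint", "process"):
        if kind in step:
            print(f"{step['task']}: {kind} -> {step[kind]}")
```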

What I particularly like about this is the recognition that the real value of learning analytics arises when it is applied within the learning process with knowledge of what is intended, as opposed to at some abstract, higher level.

But I wonder whether the following are problems.

  • How many University academics have explicit pedagogical intents?

    Let alone make use of learning designs. How many think beyond the number of weeks of semester, the number of hours of lectures/tutorials, the topics to be covered and the assessment?

    Especially in a higher ed sector increasingly reliant on casualisation?

  • How many University academics or even support staff know enough about the data stored and the available learning analytics tools to be able to identify, design and implement creative and appropriate combinations of checkpoint and process learning analytics specific to a particular learning design?

    Especially given that learning designs, at a certain level, are generic. Once I start implementing a learning design in a particular course with a particular system, there is going to be a wide diversity of possible data and tools that I might like (or, more correctly, am only allowed) to draw upon.

    Beyond the contextual diversity, there is the diversity in learning designs themselves. Both of these would require creative and informed manipulation of systems that aren’t designed for this sort of manipulation, in an environment where it is known that most people have trouble interpreting available visualisations, let alone creating new ones.

  • How many could actually get access to the data that could be used to support this?

    For example, Moodle activity completion offers some support for checkpoint analytics, but the reports are difficult to read, let alone to link with other data.

    Damien Clark, I think there’s some potential here for MAV, or at least for something based on its architectural approach.

  • Which leads to questions about the affordances of the available tools and the systems within which they are used.

    The LMS, as an enterprise system, is neither flexible nor agile. It cannot change rapidly. Consequently, most of the existing LMS learning analytics work is focused on common tasks: tasks that the majority of users of such systems might find useful. This is one of the reasons for the focus on engagement as measured by clicks, rather than better support for specific learning designs.

    A particular learning design is only going to be used by a subset of the users of the LMS, perhaps a very small subset, which makes it unlikely that the LMS will provide the necessary affordances. This raises the question of how to go about providing them.

    Do you

    1. Go down the personal route?

      Each academic (or at least the keen ones) has their own analytics tool that operates outside the LMS. This is much the approach that browser-based tools like SNAPP use, but perhaps it needs to be expanded further.

    2. Add the capability for “pedagogical skins” to existing LMS tools.

      e.g. a way to tell the discussion forum tool that my pedagogical intent with this forum is X, so provide me with the analytics and other affordances that make sense for intent X (a rough sketch of what such a declaration might look like follows this list).

    3. Or something else?
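
For what it's worth, here is a rough sketch of what option 2, the "pedagogical skins" idea, might look like if a forum tool let you declare your intent. The intents, the mapping and the function are entirely hypothetical; no LMS offers this API.

```python
# Hypothetical "pedagogical skins": a declared intent for a forum selects
# the analytics and affordances that make sense for that intent.
# None of these names correspond to a real LMS API.
PEDAGOGICAL_SKINS = {
    "q_and_a": {
        "analytics": ["unanswered questions", "median time to first reply"],
        "affordances": ["mark as answered", "escalate to teacher"],
    },
    "learner_to_learner": {
        "analytics": ["reply network density", "share of posts involving the teacher"],
        "affordances": ["highlight isolated students", "suggest peer pairings"],
    },
    "reflective_journal": {
        "analytics": ["posting regularity per student", "average post length"],
        "affordances": ["private-by-default posts", "teacher comment prompts"],
    },
}


def configure_forum(forum_name: str, intent: str) -> dict:
    """Return the analytics/affordances a forum with this declared intent should offer."""
    skin = PEDAGOGICAL_SKINS[intent]
    return {"forum": forum_name, "intent": intent, **skin}


# Example: the same forum tool behaves differently under different intents.
print(configure_forum("Week 3 case discussion", "learner_to_learner"))
```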

Some other misc thoughts follow and then a summary of the paper.

Analytics and course redesign

The paper argues that

Analytics can also help with course redesign. Traditionally, educators draw on their past experience when they teach a course or when they are designing a new course. For many teachers this may be an informal process that relies on recall, student surveys, and/or teacher notes recorded during or after the teaching session. Revisiting the learning analytics collected during the course can support teachers when they are planning to run the course again or when the learning design is being applied to a different cohort or context.

I think this would be an interesting area to explore, in part because I don’t think most academics have the time or inclination to do active redesign. It’s all tweaks.

While there are significant flaws with recall and student surveys, the data provided by learning analytics is not without fault. Especially given the fairly low level of the data that is available.

Learning design reuse

Learning design aims for reusability across educational contexts. Hence repositories of learning designs and tools that make learning designs machine readable and adaptable. But how have these been adopted? Who has adopted them? What are the barriers?

Relationship between learning design and the tools?

The paper claims that learning design can be used “as a framework for design of analytics to support faculty in their learning and teaching decisions”. But given the current nature of the tools available, just how realistic is this?

Of more personal interest is the idea that learning designs describe pedagogical intent, but not how students engage with that design during or after implementation. Perhaps this gives a focus for how learning analytics could be integrated into a tool like BIM.

The paper argues that the critical step is a marriage between learning design and learning analytics. I wonder whether a more concrete step is a marriage between learning analytics and the e-learning tools being used to implement the learning designs. If there is difficulty in understanding, then surely the tools can and should help? I wonder if the authors address this.

What if the learning tools themselves provided checkpoint and process analytics specific to the types of learning intent most common with that tool? Wouldn’t that help? After all, shouldn’t the technology be invisible?

The tool focus of course narrows applicability. A learning design might encompass a whole range of tools, and getting those to play nicely together could be interesting, but also more useful. It raises some of the same integration questions as BIM and EDC3100.

Of course, given the current nature of the tools available, I don’t think there are many people that could easily engage in the sort of creative combination of checkpoint and process analytics and learning design embodied here.

Link between learning outcomes and analytics

There are some suggestions here of a link between learning outcomes and analytics, which may connect with work I need to complete locally. Both are underpinned by the same desire to demonstrate the quality of learning and teaching processes, and both are plagued by the difficulty of developing any such thing that is meaningful or broadly applicable.

Self-reflection reading

Self-reflection requires strong metacognitive capacities that have been demonstrated to be essential for developing the skills necessary for lifelong learning (Butler & Winne, 1995).

What follows is a summary of my reading of the paper.

Introduction

Definitions

  • Learning analytics – “the collection, analysis, and reporting of data associated with student learning behavior”
  • Learning design – “the documented design and sequencing of teaching practice”

The aim is to explore how “together these may serve to improve understanding and evaluation of teaching intent and learner activity”.

Makes the point that learning analytics doesn’t suffer from the difficulties of surveys or focus groups,

which rely on participants both opting to provide feedback and accurately remembering and reporting past events

But the difficulty with LA is “interpreting the resulting data against pedagogical intent and the local context to evaluate the success or otherwise of a particular learning activity”

The claim is

learning design establishes the objectives and pedagogical plans, which can then be evaluated against the outcomes captured through learning analytics.

Overview of learning analytics

Starts with the accountability interest around “indicators of the quality of learning and teaching practices” and “the development in the education sector of standardised scalable, real-time indicators of teaching and learning outcomes”. And links to the difficulty of doing so and the need for “a thorough understanding of the pedagogical and technical context in which the data are generated”.

University QA data is generally derived from student experience surveys, measures of attrition, etc. The rise of the LMS has led to work around learning analytics.

LA research interrogates data to create predictive models of performance and attrition, as well as more complex dimensions such as dispositions and motivations, in turn to inform future L&T practice.

To date the focus has been on predictors of student attrition, sense of community, achievement, and the ROI of technologies. But by providing measures of the student learning process, learning analytics can help teachers design, implement and revise courses. Much work remains to be done here.

Overview of learning design

Apparently learning design had two main aims

  1. to promote teaching quality
  2. to facilitate the integration of technology into L&T

A development of the 2000s in response to the capability of the Internet to share examples of good practice – related work includes learning design, pedagogical patterns, learning patterns and pattern language (that last one is drawing a long bow). A learning design

  • describes the sequence of learning tasks, resources and supports constructed by a teacher
  • captures the pedagogical intent of a unit of study
  • provides a broad picture of planned pedagogical actions, rather than detailed accounts (as per a traditional lesson plan)
  • provides a model for intentions in a particular context

The proposition is that a learning design can be used “as a framework for design of analytics to support faculty in their learning and teaching decisions”

Focus on reusability, repositories (e.g. http://www.learningdesigns.uow.edu.au/ and http://193.61.44.29:42042/ODC.html).

Most based on constructivist assumptions.

Many formats exist, general or specific, very short or very long, but the essential elements are

  • Key actors
  • What they are expected to do
  • What resources are used
  • Sequence of activities

The fit with learning analytics is that learning designs describe pedagogical intent, but not how students are engaged in that design during or post-implementation.

Using learning analytics

Lots of data is provided. The LMS provides some data, but it is under-used. Why? The suggestion is

This is largely the result of the lack of conceptual frameworks and resulting common understanding of how to use and interpret such data, and models that can validly and reliably align such data with the learning and teaching intent (Ferguson, 2012; Mazza & Dimitrova, 2007). At present, the available LMS data are neither easily understood by teachers as they align with individual and group student engagement behaviors (activity patterns) nor presented in ways that provide easy interpretation.

The following starts to get interesting

It is the conceptual bridging and understanding between the technical and educational domains that remains problematic for learning analytics (Dawson, Heathcote, & Poole, 2010). This leads to questions surrounding how analytics can begin to bridge the technical–educational divide to provide just-in-time, useful, and context-sensitive feedback on how well the learning design is meeting its intended educational outcomes.

The suggestion is that the critical step is “the establishment of methods for identifying and coding learning and teaching contexts”. i.e. the marriage of learning design and learning analytics.

Learning analytics to evaluate learning design

The aim here is to explore the importance of understanding pedagogical intent (i.e. learning design) for accurately understanding exactly what learning analytics is showing (in this case, a social network diagram).

In this case, they are using a diagram that shows a facilitator-centric pattern. Pedagogical intent is shown to help in understanding the value of the pattern (a rough sketch of how the instructor’s centrality might be quantified appears below)

  • If it’s a Q&A forum on course content, then there’s alignment with the results and the intent.
  • If the instructor is absent from the network and a student is answering the questions, this might show successful delegation, or it might simply reflect an absent instructor.
  • Might show early stages of a student introducing a topic
  • If the aim was to promote learner-to-learner interaction, then doesn’t seem to fit.

Not sure that this is capturing learning design, or simply learning intent.
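
As a simplistic illustration of the facilitator-centric pattern, the following sketch quantifies how central the instructor is in a forum reply network. It assumes reply data is available as (author, replied-to) pairs, which is not something the paper provides; the data, names and use of networkx are my own assumptions.

```python
import networkx as nx

# Hypothetical forum reply data: (author, person being replied to).
# In practice this would come from the LMS/forum logs.
replies = [
    ("alice", "instructor"), ("instructor", "alice"),
    ("bob", "instructor"), ("instructor", "bob"),
    ("carol", "instructor"), ("instructor", "carol"),
    ("alice", "bob"),  # the only learner-to-learner exchange
]

G = nx.Graph()
G.add_edges_from(replies)

# Degree centrality: the fraction of other participants each node is connected to.
centrality = nx.degree_centrality(G)
instructor_share = sum(1 for a, b in replies if "instructor" in (a, b)) / len(replies)

print(f"Instructor degree centrality: {centrality['instructor']:.2f}")
print(f"Share of replies involving the instructor: {instructor_share:.2f}")

# Whether a high value is good or bad depends entirely on the pedagogical
# intent: fine for a Q&A forum, a problem if the aim was learner-to-learner
# interaction.
```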

Perhaps more interestingly

learning analytics may allow us to test those assumptions with actual student interaction data in lieu of self-report measures such as post hoc surveys. In particular, learning analytics provides us with the necessary data, methodologies, and tools to support the quality and accountability that have been called for in higher education.

Aligning learning analytics with learning design

This is where they propose two broad categories of analytics applications as a way to “align the two concepts”.

  1. Checkpoint analytics.

    Snapshot data indicating a student has met the pre-reqs for learning by accessing the relevant resources. e.g. downloading required files etc.

    The value here “lies in providing teachers with broad insight into whether or not students have accessed prerequisites for learning and/or are progressing through the planned learning sequence”.

  2. Process analytics.

    These “provide direct insight into learner information processing and knowledge application (Elias, 2011) within the tasks that the student completes as part of a learning design”. e.g. SNA of a student discussion activity gives data on level of engagement, peer relationships and therefore potential support structures. Adding content analysis offers the potential for determining the level of understanding etc.

    I’m not sure I get this: “The articulation of the nature of support available within learning designs helps to interpret process learning analytics”. A rough sketch of how the two categories might be operationalised follows this list.
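
A rough sketch of how the two categories might be operationalised against ordinary activity logs, based only on the paper's descriptions of checkpoint and process analytics. The log format, field names and actions are assumptions for illustration, not anything the paper or a particular LMS specifies.

```python
from collections import Counter

# Hypothetical activity log of (student, action, object) events. In practice
# this would come from an LMS export; the format here is invented.
log = [
    ("alice", "download", "case_study.pdf"),
    ("bob", "download", "case_study.pdf"),
    ("alice", "post", "week3_forum"),
    ("alice", "post", "week3_forum"),
    ("bob", "post", "week3_forum"),
]
enrolled = {"alice", "bob", "carol"}


def checkpoint(log, enrolled, resource):
    """Checkpoint analytics: who has, and has not, accessed a required resource."""
    accessed = {s for s, action, obj in log if action == "download" and obj == resource}
    return accessed, enrolled - accessed


def process(log, forum):
    """Process analytics (crude): how actively each student participates in a forum."""
    return Counter(s for s, action, obj in log if action == "post" and obj == forum)


accessed, missing = checkpoint(log, enrolled, "case_study.pdf")
print("Accessed the case:", sorted(accessed), "Yet to access:", sorted(missing))
print("Posts per student:", process(log, "week3_forum"))
```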

Learning design and analytics investigation

This is where a theoretical scenario is used to illustrate the potential of the above ideas. Takes a learning design from the UoW learning design repository and looks at the type of analytics that might inform the teacher about how students are learning during implementation and how the design might be adapted.

Adapts the learning design figure with the addition of checkpoint and process learning analytics. Then describes these

  • Have they accessed the cases via the website? Generate reminder alerts (teacher generated or automatic) to prompt late starters (a minimal sketch of this check appears after the list).
  • SNA to evaluate the effectiveness of each small group’s discussion.
  • Similarly for a teacher-led, whole-class discussion, with some expectation of evolution over time.
  • Similarly for small group project discussions, with an emphasis on identifying outliers etc.
  • Checkpoint and process at the end, including content analysis.
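
The first of those bullets, flagging late starters, seems the most trivially automatable. A minimal sketch, assuming you can obtain the set of enrolled students and the set who have accessed the cases; the reminder is printed rather than actually sent, and all identifiers are made up.

```python
def remind_late_starters(enrolled, accessed, send=print):
    """Flag students who haven't accessed the case materials yet.

    `enrolled` and `accessed` are sets of student identifiers; where they
    come from (LMS export, API, database) is left open. `send` could be
    swapped for an email or notification function.
    """
    late = sorted(set(enrolled) - set(accessed))
    for student in late:
        send(f"Reminder to {student}: the case materials are now available.")
    return late


# Hypothetical usage with made-up identifiers.
remind_late_starters({"alice", "bob", "carol"}, {"alice"})
```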

Conclusions

The next stages of research have several parallel directions, several of which I don’t grok at the moment.

  • Engaging teachers and students in understanding and using visual patterns of interaction to encourage learning activity

    What about improving the affordances of the tools to aid in this?

Mentions the rise of various predictive models for student performance and progression. These provide the opportunity to establish pedagogical recommendations. But there remains the need to inform these with contextual insight.

References

Lockyer, L., Heathcote, E., & Dawson, S. (2013). Informing Pedagogical Action: Aligning Learning Analytics With Learning Design. American Behavioral Scientist. doi:10.1177/0002764213479367


Comments


  Kay

    Thanks for your thoughts on this paper – your initial list of potential problems due to practicalities of teaching in HE resonates particularly strongly here:)

    I have started thinking about LA applications in real HE context and its scalability and transferability across different contexts as part of Data, Analytics and Learning MOOC (DALMOOC) and see many similar barriers to applying LA by individuals and, even more, at the institutional scale.

    I have to say that I see individual adoption of LA as more likely – as LA is just another form of data/analysis/reflection which can feed into reflective teaching practice and action research done routinely by many within HE (although, by no means, all). I am not sure if marrying LD and LA within theoretical repository realms will help with these on the ground, context specific application of LA by educators. Whatever they are – there are reasons why LD repositories remain underutilised by practicing educators (and most likely not due to educator’s ignorance either;) . Putting LA in the same basket may do little to help with its widespread adoption. On the other hand I like your idea of in-built LA functions into the existing tools. This can help their use to adjust teaching “on-the-go” in response to needs of the particular cohort – which is one of the most important abilities of a good educator. In fact, perhaps rather than aiming at high cost course, programme and institution level learning redesign based on LA we should use it for enhancing this micro-level flexibility of teaching, based on the analytical abilities and professional judgement of the educators?

    David Jones (post author)

      Good to hear that I’m not the only one pondering the problems of reality. The micro-level is really where I’m focused. In no small part because I want the functionality to use in my own teaching, but also because I do believe there’s some value to be gained in terms of the overall quality of learning/teaching by taking that approach.

      In fact, I’ve just today spent some time designing the next step in a tool I’ll be using myself next year. The latest blog post has some description of it and an early, ugly mockup.

      I’m hoping that this work – a continuation of my two 2014 publications – will explore this micro-level/on-the-go approach a bit more. Always happy to hear what others think or what they are doing.

