Interfaces for learning data visualisations – #ascilite.

Live blogging a workshop from Prof Judy Kay, a computer scientist with a background in user modelling, AIED, and pervasive computing. A focus on personalisation – putting people in control of their personal data.

Interest in open learner models.

Learning analytics seen as a form of learner/user modeling – with interfaces.

How to create interfaces in LA?

  • User-centered approaches – start with where people are, hence the need to understand the stakeholders, their mental models, and the problem
  • Core tools and principles – these start to influence mental models
  • Case studies

It’s an exciting time as we can influence the shape of the core tools and principles.

Interfaces… visualisations

Why?

Fekete, van Wijk, Stasko, & North (2008) – The value of information visualisation.

We’re hardwired for this – some visual features are preattentively processed.

How? No simple rules

Some principles

  • Individual data takes on more meaning when comparisons are supported: with others, over time, across contexts

Note: Application for MAV

Patina: Dynamic heatmaps for visualising application usage – Matejka, Grossman, & Fitzmaurice (2013)

  • Ability to show different footprints – allowing comparison (a rough sketch of the idea below)

Case study

The problem – group work is hard and important, and it creates problems. Stakeholders – the learner as an individual, team leaders, facilitators. Students in a capstone project – working in teams for a client. Also used by Masters of Education students. Using Trac.

Built a tool – Narcissus – Upton and Kay (2009).

Integrated into Trac – showing a comparison of user participation. Different colours for different types of activity. Click on a cell to see the details of that user’s participation. Since using this, they’ve never had to fail a group. They see the data; it’s visible. Works as a conversation starter. Students are told the data will never be used for assessment.

Navigating the information space in an entirely new way, based on what people are doing.

Note: Potentially useful for BIM – self-regulation – comparisons

Sequence mining – identify what individuals are doing and group them into categories: managers, developers, loafers, others (a naive sketch of the idea below).

Current problems

  • Teacher – early identification of at-risk individuals
  • Learner – decision support: Am I doing well enough? Am I doing what is expected of me?
  • Institution – effectiveness of learning and teaching.

General principles

Bull and Kay (2007) – Student models that invite the learner in: The SMILI:() open learner modelling framework

Using this work as the foundation/source of principles

OLM – any interface to data that a system keeps about the learner.

Note: this literature would have some general principles for Information in IRAC.

What is open? How is it presented? Who controls access?

Purposes

  • Improving accuracy
  • Promoting learner reflection
  • Helping learners to plan and/or monitor learning
  • Facilitating collaboration and/or competition
  • Facilitating navigation of the learning system
  • Assessment

Scrutable user models and personalised systems. Systems are deterministic.

Note: links to IRAC

Interfaces to substantial learner models – analysis of an SPOOC.

Mental models

The set of user beliefs. Kay doesn’t see enough about mental models in the learning analytics literature.

The importance here is that mental models influence what a user can “see” and “hear”, and how they interpret information. Clashes exist between user, programmer, and expert MMs.

Pervasive technologies

Mention of orchestration as part of what drives this work.

Principles

  • Skill meters
  • Game elements
  • Good match to mental models

Summary

Some pointers to interesting research from the AI/HCI fields that could help inform learning analytics and prevent a lot of reinventing the wheel. But the presenter’s own observation that there are no simple rules does mean that the promise of how to build these visualisations hasn’t been answered directly in the workshop.

Learning analytics is about learning – #ascilite

An attempt at some live blogging from #ascilite starting with the Australian Learning Analytics Summer Institute (#a-lasi).

Shirley Alexander chairing the keynote. Relating the story of the challenges of getting PCs adopted at Macquarie University – “PCs are toy computers”. The point is that it’s all about the need. Mentioning that a DVC’s key concern is productivity. Good use of data is the key.

Keynote – Associate Professor Dragan Gasevic (or perhaps here) – Learning Analytics is about Learning. Dragan’s slides are shown below.

Demand for learning is growing – linked to Alexander’s point about productivity.

Note: Raises a question about learning versus education.

Scalability is possible – drawing on Hattie’s finding that class-size has a low effect size.

Question: What are the limits on generalizability of Hattie’s finding? Across all education? Schools?

Current business is mostly content delivery, which leads to xMOOCs. Replicating content delivery – not different types of models.

That’s content. What’s the motivation for learning analytics?

Hattie and Timperley (2007) – The power of feedback. Linked to the idea that feedback loops between students and instructors are missing.

Learners are creating their own support networks – digital footprints. How do educators draw on that to generate feedback/insight?

Question: Do students want us to do that?

Comment: Feedback on my course this semester – “It was the quietest course Facebook page”

Dangers

Dangers – predict-o-mania: the same predictive models for everything and everyone.

Note: Predict-o-mania – good phrase for my presentation.

Leads to the danger of treating students like cattle in a feedlot.

Student diversity is important and a better aim for learning analytics. Points to stats about how the diversity of the student body is increasing. From the US – over 75% of students are not the traditional university student.

Comment: Focused on formal education – which is what I’ll do.

Showing a graph from an ?Australian? university showing diversity in the population – differing proportions of female, international, NESB, and non-urban students, etc.

The same 9 courses – diversity in the LMS functionality used.

Comment: Would like to drill down more on this data.

This leads to predictive power diversity – i.e. overall only 20%, but in some courses it goes up to 70%. Not sure what is being predicted here; the point may be that predictions from the use of technology only offer limited power.

Now onto retention – understandable that it’s a priority. Makes the point that retention is just an outcome. One outcome. And doesn’t tell us anything about learning. How do we enhance learning if our focus is on the outcome?

Directions

Back to the SoLAR definition of learning analytics.

Emphasis on the why – “understand and optimise learning and the environments”

Bandura (1989) “Human agency in social cognitive theory” – modern educational psychology – “Human agency is central to learning”

Talks about the challenge the idea of agency poses for computer scientists.

Brings in Winne and Hadwin’s model of self-regulated learning.

Note: Could be interesting to look into this and related models.

Task and cognitive conditions are outlined – the point being knowledge of how to study.

Note: The need to provide more explicit scaffolding/direction around how to study with blogs/reflection etc.

Why human agency – knowledge society and knowledge economy.

Challenges – metacognitive skills – Bjork, Dunlosky, & Kornell (2013) – Self-regulated learning: Beliefs, techniques, and illusions

Links this to the idea of human agency and the fact that students don’t see the opportunities afforded by using technology. Students don’t have the skills necessary to search for information, to ask questions, or to think critically.

Graesser and Olde (2003) – How does one know whether a person understands a device? The quality of the questions the person asks when the device breaks down.

Note: This connects with some needs in EDC3100.

A focus on both process and content is important for LA.

Missed the heading

Example 1

Effects of learning context. Athabasca University examples – social, teaching and cognitive presence leading to educational experience.

Example 2

Effect size of the moderator role on critical thinking – the student plays the role of moderator – an effect size of 0.66. The rest of the class is just asking questions. All this is linked to cognitive presence.

Effect size of the intervention – where students were given categories of questions to ask – 0.95 for non-moderators and 0.61 for moderators.

Control groups – no correlation between participation and grades. Some correlation in the intervention group.

Comment: Am wondering how much of this example is learning analytics? It’s illustrating the importance of external factors. But link to LA?

The external context directly influenced the value of the learning analytics. That’s the link.

But what if you don’t have assessments? Self-reflections in video annotations. Some courses are non-graded. Had a group of 4 courses where students may do a graded course prior to a non-graded course.

Non-graded courses with no prior experience saw few annotations, but students with prior graded experience produced much greater levels of annotation. The idea is that they are learning the value of a particular technology.

Knowledge of students’ prior technological/pedagogical experience is important.

Tausczik and Pennebaker (2010) – The psychological meaning of words: LIWC and computerized text analysis methods

Cognitive disequilibrium – the need to be confused prior to learning.

Example 3

Identify profiles of learners. Students receive an intervention and then make a decision – apparently their usage remained low?

Challenges

What to measure – we need more than just page access counts.

Wilson (1999) – Models in information behaviour research.

Instrumentation – about specific contexts and constructs.

  • Capturing interventions.
  • Previous learning and memory of experience.
  • Social networks – communication, cross-class
  • Interaction types – transactional distances.

Note: Giving academics a picture of the LMS features and learning designs that students in a course have had experience with.

Anderson’s transactional stuff – exploring the impacts of different types of interactions.

Looking at student motivation.

Zhou and Winne (2012) – Modeling academic achievement by self-reported versus traced goal orientation

Siadaty (2013) – Semantic web-enabled interventions to support workplace learning – PhD thesis.

Adding instrumentation is expensive.

Scaling up qualitative analysis – the workload involved is high – mentions the use of natural language processing.

Temporal processes – beyond counting – longitudinal studies.

Generating reports and nice visualisations is not enough.

Big challenge – building data-driven culture in institutions.

Privacy and ethics.

Data sharing and mobility. A focus more on students taking the data with them.

Questions

Alexander’s favourite questions for leadership positions in teaching:

  • What do your students need to do in order to learn?
  • How do you use that to design your teaching?

Note: Need to revisit this for EDC3100. Links nicely with some plans for re-design.

Thoughts

A good overview of the field and a pointer toward some useful directions. Has generated some thoughts for my own practice – both research and teaching. Thereby making it worthwhile being here.