Learning analytics is about learning – #ascilite

An attempt at some live blogging from #ascilite starting with the Australian Learning Analytics Summer Institute (#a-lasi).

Shirley Alexander, chair of the keynote. Relating the story of the challenges of getting PCs adopted at Macquarie University – PCs are toy computers. The point is it’s all about the need. Mentioning that a DVC’s key concern is productivity. Good use of data is the key.

Keynote – Associate Professor Dragan Gasevic – Learning Analytics is about Learning. Dragan’s slides are shown below.

Demand for learning is growing – linked to Alexander’s point about productivity.

Note: Raises a question about learning versus education.

Scalability is possible – drawing on Hattie’s finding that class-size has a low effect size.

Question: What are the limits on generalizability of Hattie’s finding? Across all education? Schools?
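For reference, Hattie’s effect sizes are standardised mean differences (Cohen’s d) – the difference between group means scaled by the pooled standard deviation:

```latex
d = \frac{\bar{x}_1 - \bar{x}_2}{s_{\text{pooled}}},
\qquad
s_{\text{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
```

An effect of d ≈ 0.2 is conventionally read as small, which is the range Hattie’s class-size finding falls in.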

Current business is mostly content delivery, which leads to xMOOCs – replicating content delivery, not different types of models.

That’s content. What’s the motivation for learning analytics?

Hattie and Timperley (2007) – The power of feedback. Linked to the idea that feedback loops between students and instructors are missing.

Learners are creating their own support networks – digital footprints. How do educators draw on that to generate feedback/insight?

Question: Do students want us to do that?

Comment: Feedback on my course this semester – “It was the quietest course Facebook page”

Dangers

Predict-o-mania – the same predictive models for everything and everyone.

Note: Predict-o-mania – a good phrase for my presentation.

Leads to the danger of treating students like cattle in a feedlot.

Student diversity is important and a better aim for learning analytics. Points to stats about how the diversity of the student body is increasing. From the US – over 75% of students are not the traditional university student.

Comment: Focused on formal education – which is what I’ll do.

Showing a graph from an ?Australian? university illustrating diversity in the population – differing proportions of females, international students, NESB students, non-urban students, etc.

Same 9 courses – diversity in the LMS functionality used.

Comment: Would like to drill down more on this data.

This leads to diversity in predictive power – i.e. overall only 20%, but in some courses it goes up to 70%. — Not sure what is being predicted here. The point may be that predictions from technology use alone give only limited power.
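To make the “predictive power diversity” point concrete, here is a minimal sketch of fitting one predictive model per course rather than a single institution-wide one. The data file, column names and model choice are my assumptions for illustration, not what was presented:

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Hypothetical LMS export: one row per student with activity counts and outcome.
logs = pd.read_csv("lms_activity.csv")  # assumed columns: course_id, logins, forum_posts, quiz_attempts, passed

# Fit a separate model per course to expose how much predictive power varies.
for course_id, group in logs.groupby("course_id"):
    X = group[["logins", "forum_posts", "quiz_attempts"]]
    y = group["passed"]
    scores = cross_val_score(LogisticRegression(max_iter=1000), X, y, cv=5)
    print(f"{course_id}: mean CV accuracy {scores.mean():.2f}")
```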

Now onto retention – understandable that it’s a priority. Makes the point that retention is just an outcome. One outcome. And doesn’t tell us anything about learning. How do we enhance learning if our focus is on the outcome?

Directions

Back to the SoLAR definition of learning analytics.

Emphasis on the why – “understand and optimise learning and the environments”

Bandura (1989) “Human agency in social cognitive theory” – modern educational psychology – “Human agency is central to learning”.

Talks about the challenge the idea of agency poses for computer scientists.

Brings in Winne and Hadwin’s model of self-regulated learning.

Note: Could be interesting to look into this and related models.

Task and cognitive conditions are outlined – the point about knowledge of how to study.

Note: The need to provide more explicit scaffolding/direction around how to study with blogs/reflection etc.

Why human agency? The knowledge society and knowledge economy.

Challenges – metacognitive skills – Bjork, Dunlosky, and Kornell (2013) – Self-regulated learning: Beliefs, techniques, and illusions.

Links this to the idea of human agency and the fact that students don’t see the opportunities afforded by using technology. Students don’t have the skills necessary to search for information – knowing how to ask questions and think critically.

Graesser and Olde (2003) – How does one know whether a person understands a device? The quality of the questions the person asks when the device breaks down.

Note: This connects with some needs in EDC3100.

A process and content focus is important for LA.

Missed the heading

Example 1

Effects of learning context. Athabasca University examples – social, teaching and cognitive presence leading to educational experience.

Example 2

Effect size of the moderator role on critical thinking – the student plays the role of moderator – an effect size of 0.66. The rest of the class just asks questions. All this is linked to cognitive presence.

Effect size of the intervention – providing categories of questions to ask – 0.95 for non-moderators and 0.61 for moderators.

Control groups – no correlation between participation and grades. Some correlation in the intervention group.

Comment: Am wondering how much of this example is learning analytics? It’s illustrating the importance of external factors. But link to LA?

The external context directly influenced the value of the learning analytics. That’s the link.

But what if you don’t have assessments? Self-reflections in video annotations. Some courses are non-graded. Had a group of 4 courses where students may do a graded course prior to a non-graded course.

Non-graded students with no prior experience made few annotations, but students with prior graded experience showed much greater levels of annotation. The idea is that they are learning the value of a particular technology.

Knowledge of students’ prior technological/pedagogical experience is important.

Tausczik and Pennebaker (2010) – The psychological meaning of words: LIWC and computerized text analysis methods.
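LIWC itself is a proprietary dictionary, but the basic mechanism is dictionary-based category counting. A minimal sketch in that spirit – the categories and word lists here are invented examples, not the real LIWC dictionaries:

```python
import re
from collections import Counter

# Invented example categories; real LIWC has dozens of validated dictionaries.
CATEGORIES = {
    "cognitive": {"think", "know", "because", "reason", "understand"},
    "affect": {"happy", "worried", "frustrated", "enjoy"},
}

def category_counts(text: str) -> Counter:
    words = re.findall(r"[a-z']+", text.lower())
    counts = Counter()
    for word in words:
        for category, vocab in CATEGORIES.items():
            if word in vocab:
                counts[category] += 1
    # Normalise by word count so texts of different length are comparable.
    total = len(words) or 1
    return Counter({c: n / total for c, n in counts.items()})

print(category_counts("I think I understand why this confused me"))
```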

Cognitive disequilibrium – you need to be confused prior to learning.

Example 3

Identify profiles of learners. Students receive an intervention and then make a decision – apparently their usage remained low?
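One common way to identify learner profiles is to cluster students on activity features. A hypothetical sketch – the file, features and number of clusters are invented, and this may not be the method used in the talk:

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Assumed per-student feature table: logins, posts, annotations.
features = pd.read_csv("student_features.csv")
X = StandardScaler().fit_transform(features[["logins", "posts", "annotations"]])

# Cluster into a small number of profiles and inspect what distinguishes them.
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(X)
features["profile"] = kmeans.labels_
print(features.groupby("profile").mean())
```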

Challenges

What to measure – page access counts alone are not enough.

Wilson (1999) Models in information behaviour research.

Instrumentation – about specific contexts and constructs.

  • Capturing interventions.
  • Previous learning and memory of experience.
  • Social networks – communication, cross-class
  • Interaction types – transactional distances.
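As a concrete illustration of what such instrumentation might capture, here is a minimal event-record sketch loosely modelled on xAPI-style actor/verb/object statements. The field names and values are illustrative assumptions, not a specific platform’s schema:

```python
import json
from datetime import datetime, timezone

def make_event(student_id: str, verb: str, obj: str, context: dict) -> str:
    """Build one instrumentation event as a JSON string (hypothetical schema)."""
    event = {
        "actor": student_id,
        "verb": verb,            # e.g. "viewed", "posted", "received-intervention"
        "object": obj,           # e.g. a forum thread or a feedback message
        "context": context,      # e.g. course, cross-class links, prior-experience flags
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(event)

print(make_event("s123", "received-intervention",
                 "feedback/week3", {"course": "EDC3100", "cross_class": False}))
```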

Note: Giving academics a picture of what LMS features, learning designs that students in a course have had experience with.

Anderson’s transactional stuff – exploring the impacts of different types of interactions.

Looking at student motivation.

Zhou and Winne (2012) – Modeling academic achievement by self-reported versus traced goal orientation.

Siadaty (2013) – Semantic web-enabled interventions to support workplace learning – PhD thesis.

Adding instrumentation is expensive.

Scaling up qualitative analysis – the workload involved is high – mentions the use of natural language processing.

Temporal processes – beyond counting – longitudinal studies.
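“Beyond counting” might look something like the following in practice – turning raw event logs into ordered per-student sequences ready for temporal or process analysis (the file and column names are assumed):

```python
import pandas as pd

# Assumed event log: student_id, action, timestamp.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Replace aggregate counts with ordered action sequences per student.
sequences = (
    events.sort_values("timestamp")
          .groupby("student_id")["action"]
          .apply(list)
)
# Each student now maps to a sequence like ["read", "post", "quiz", "read"],
# suitable for sequence analysis or process mining rather than mere counting.
print(sequences.head())
```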

Generating reports and nice visualisation is not enough.

Big challenge – building data-driven culture in institutions.

Privacy and ethics.

Data sharing and mobility. A focus more on students taking the data with them.

Questions

Alexander’s favourite questions for leadership positions in teaching:

  • What do your students need to do in order to learn?
  • How do you use that to design your teaching?

Note: Need to revisit this for EDC3100. Links nicely with some plans for re-design.

Thoughts

A good overview of the field and a pointer toward some useful directions. Has generated some thoughts for my own practice – both research and teaching. Thereby making it worthwhile being here.
