The institutional LMS seems to be having some problems, so I’ll post this instead.
Quite a few folks I work with have observed semester droop, i.e. attendance at lectures/tutorials dropping off as the semester progresses. @damoclarky and @beerc confirmed that the same droop is visible across most courses in the longitudinal LMS data they have access to.
So the question I wanted to explore was:
Does the course site from the S2, 2013 offering of my course show evidence of semester droop?
The quick answer is “Yes, a bit”. But it’s not huge or entirely unexpected. Might be interesting to explore more in other courses and especially find out what’s behind it.
Why do this?
I thought this would be interesting because
- I have a little tool that allows me to view usage of the site very easily. If it were harder, I probably wouldn’t have done it.
- The S2, 2013 offering is entirely online, no on-campus students so the course site is the main form of official interaction.
- Part of the final result (no more than 3%) comes from completing the sequence of weekly activities on the course site.
- I’ve tried to design these activities so that they explicitly link with the assessment and Professional Experience (the course is for pre-service teachers, who teach in schools for 3 weeks during the semester).
I created two views of the S2, 2013 EDC3100 course site using MAV:
- Clicks – shows the entire course site with the addition of a heat map indicating the number of times students clicked on each link; and,
- Students – the same image, but the heat map shows the number of students who clicked on each link.
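For anyone curious what sits behind those two views, here is a minimal sketch of the two aggregations from a raw click log. This is not MAV's actual code; the event format `(student_id, link)` and the function name `link_usage` are assumptions for illustration only.

```python
# Hypothetical sketch: derive the two MAV-style measures
# (total clicks per link, unique students per link) from a click log.
from collections import defaultdict

def link_usage(events):
    """Return {link: (total_clicks, unique_students)} for each link.

    `events` is an iterable of (student_id, link) records, one per click.
    """
    clicks = defaultdict(int)       # link -> total click count
    students = defaultdict(set)     # link -> set of distinct students
    for student_id, link in events:
        clicks[link] += 1
        students[link].add(student_id)
    return {link: (clicks[link], len(students[link])) for link in clicks}

# Example log: one student revisits the Assessment link.
log = [("s1", "Assessment"), ("s1", "Assessment"),
       ("s2", "Assessment"), ("s2", "Week 1 activity")]
usage = link_usage(log)
# usage["Assessment"] is (3, 2): 3 clicks by 2 distinct students
```

The gap between the two numbers is what distinguishes the views: clicks alone can't tell you whether one student visited ten times or ten students visited once, which is why the "students" heat map is needed to see drop off.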
- Students – there is some drop off.
91 students completed all the assessment. 9 did not.
97 is the largest number of students clicking on any single link, and only the Assessment link and a couple of links in the first week reach that. Where did the others go?
The activities in the last week ranged from 48 students clicking on a link up to 83 students.
So there is a definite drop off, with some students not completing the activities in the last few weeks.
Assessment link had the most clicks – 1559 clicks.
The “register your blog” link had 1211 clicks. This is where students registered their blogs and looked up other students’ blog addresses. The blog contributed to the final result.
Discussion forums for Q&A and Assignment 3 – 977 clicks and 949 clicks.
Activities in the first week ranged from 177 clicks up to 352, indicating that many students visited these activities more than once.
Activities in the last week ranged from 83 to 146 clicks. The 146 clicks was titled “Pragmatic assignment 3 advice”.
A definite drop off: the most popular activity in the last week received fewer clicks than the least popular activity from week 1.
@palbion made the point that students are pragmatic and do what they think they need to. It appears the EDC3100 design addresses this somewhat, in that students tend to stick with the activities for as long as they need them.
However, by the last week the students have their results from two assignments that make up 59% of their assessment. I wonder whether the small percentage attached to completing the study desk activities, combined with knowing their likely mark, leads them to make a pragmatic decision to skip the rest? That's one potential explanation for the drop off in the last week.
Another is that they are probably busy with the other assignments and courses they need to catch up on after being away on Professional Experience.
@beerc has suggested that perhaps by the end of semester the students are more confident with the value of the course site and how to use it. They’ve had the full semester to become familiar with it, hence fewer clicks spent searching around to make sure everything has been checked.
Of course, asking them would be the only way to find out.