The lack of interaction/feedback between student and teacher in large, contemporary, Australian university courses has always frustrated me. With 350+ students currently enrolled in the course I’m teaching, I’m keen to address this problem. Enter the weekly “course barometer”, a simple practice I’m hoping we can keep up for the current semester. The following is a quick summary of the results for the first week and a description of how it works.
How it works
- I ask the students to complete a Google form at the end of their learning for a week.
- Their responses get put into a Google spreadsheet.
- The responses are examined and analysed, and they inform what we do in the course over the coming weeks.
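As a rough sketch of the analysis step: the spreadsheet can be exported as CSV and the multi-select responses tallied programmatically. The column name and sample data below are hypothetical, and the real analysis happens on the private spreadsheet, but the idea is just this:

```python
import csv
import io
from collections import Counter

def tally_feelings(csv_text, column="How do you feel about EDC3100 at the moment?"):
    """Count how often each feeling word was selected across all responses.

    Assumes the Google spreadsheet has been exported as CSV and that the
    multi-select question stores selections as a comma-separated list.
    """
    counts = Counter()
    for row in csv.DictReader(io.StringIO(csv_text)):
        for word in row[column].split(","):
            word = word.strip().lower()
            if word:
                counts[word] += 1
    return counts

# Hypothetical sample export (the real data stays with the teaching staff).
sample = (
    "Timestamp,How do you feel about EDC3100 at the moment?\n"
    "1/3,\"overwhelmed, interested\"\n"
    "2/3,\"interested, excited\"\n"
)
print(tally_feelings(sample).most_common(2))
```

A weekly run of something like this gives a quick trend line without anyone reading the spreadsheet row by row.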
The questions/tasks in the Google form are (all except #3 are free response):
- Write down the two most important things you have learnt in EDC3100 this week.
- What would you most like more help with?
- How do you feel about EDC3100 at the moment? (Select all the words that apply to you)
- What is the biggest worry affecting your work in EDC3100 at the moment?
- How could we improve EDC3100?
A process similar to this has been widely used. This particular set of questions arises from the following:
The IMPACT procedure (Clarke, 1987 cited in Goos et al., 2007, p. 411) is one method for discovering the concerns and opinions of students. It involves the regular completion of the following simple questionnaire during class (for this unit during the Friday “Reality and Reflection” lessons) and the retention of responses over the period of the class. Goos et al (2007, p. 411) suggest that the success of this process “depends on respecting the confidentiality of student responses and acting on these responses where appropriate to improve students’ experiences of learning mathematics.”
taken from here
Due to the point about “confidentiality” and the novelty of this approach, I’ve decided (for now) not to open up access to the Google spreadsheet with the data to anyone except the teaching staff in the course.
First week's responses
The following images (click on them to see a bigger version) are word clouds generated by sending the raw responses for each question through Tagxedo. I still need to look more closely at the feedback, but here are some initial thoughts.
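Tagxedo does the rendering, but underneath a word cloud is just a table of word frequencies. A minimal sketch of that step (the tiny stop-word list and crude tokenising here are my own simplifications; Tagxedo applies its own, much more thorough, filtering):

```python
import re
from collections import Counter

# A tiny illustrative stop-word list; real cloud tools filter far more.
STOPWORDS = {"the", "a", "an", "i", "to", "and", "of", "in", "is", "it", "my"}

def word_frequencies(responses):
    """Turn a list of free-text responses into the word counts a cloud is built from."""
    counts = Counter()
    for response in responses:
        for word in re.findall(r"[a-z']+", response.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

# Hypothetical responses, not real student data.
responses = [
    "Setting up my blog and learning to use Diigo",
    "Time management and understanding the assignment",
]
print(word_frequencies(responses).most_common(3))
```

The cloud simply scales each word's font size by its count, which is also why the free-text questions produce noisier clouds than the fixed-vocabulary one.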
How are you feeling?
I was happy and a little surprised to see some of the more positive feelings visible. I had worried it would be all negative, since week 1 was very challenging and time consuming.
Since students choose from a fixed set of words and can add a few of their own, this is perhaps the question best suited to analysis with a word cloud. A word cloud is much less useful for the free-text questions.
Two most important things
I need to look at these responses in more detail. It is interesting, at some level, that "assignment" is not the most prominent word. Arguably, having "learning" and "understanding" as more of a focus is a good thing, but closer examination is needed.
There is also a visible "how to use the tool" presence (blog, Twitter and Diigo).
Time and workload have been the big worries, at least via other communication mechanisms, and that appears to have come through here as well.
Time and workload would also appear to be the major area for improvement. Future weeks should see this ease somewhat, but we'll also need to revisit the design of the course: as it stands, week 1 is probably too much of an ask.
However, I’m going to be interested to see how this evolves over coming weeks. Much of the work in week 1 was setting up new tools and developing some foundational insights that should really help in subsequent weeks.