Analysing some course evaluation comments

The following reports on some analysis of students' responses to open questions on the institutional, end-of-semester course evaluation survey for the course I taught in 2013. The first post gave some background to the course, the evaluation process (and its limits), links to some of the raw results (including summaries of closed responses), and an outline of the process I'm following.

This post starts with a discussion of the results, followed by a quick description of the process used.


Limitations

The results reported below are drawn only from the Semester 2, 2013 offering of the course. I haven't analysed the Semester 1, 2013 responses in detail. In part, this is due to time limitations. However, as explained in the first post, it is also because the Semester 1, 2013 offering was being developed as it was taught. This did not work well, and looking through the student comments it underpinned much of their perspective on the course. The Semester 2 offering was essentially complete from the start of semester.

The Semester 2 offering – as explained in the first post – was also (from one limited perspective) reasonably successful. It could be argued that I'm cherry-picking the results I like.

One significant limitation of using only the Semester 2 responses is that Semester 2 is an online-only offering; there are no on-campus students. In particular, the increase in apparent workload for on-campus students – arising from the apparent duplication between attending on-campus lectures and tutes and also having to complete the online activities – is not an issue for online students.

The open coding of student comments into categories (see the method section below) was done by one person – myself. Hence it's not the only, nor likely the best, categorisation.

As explained in the method section, I’ve also removed the “Teaching staff” category from the results. This category had by far the most positive comments (n=27) and no comments coded as negative or suggestions. It’s excluded because it made it difficult to get value from the chart and given that I’m already perfect there’s no need to consider those comments.


Results

The following chart (click on the image to see a larger version) provides an overview of the analysis results, and some description follows. In summary,

  • The overall course in S2, 2013 was well received, as was the Moodle course site.
  • The sample assignments and the assignment descriptions were positively received.
  • Workload remains perhaps the main issue, especially in the first few weeks, though the students still appear to have enjoyed the course.

Student comments - EDC3100 by David T Jones, on Flickr

Study desk

The most positive comments were for the Study Desk category and its related child categories (see the method section for a list of all the categories):

  • 4 comments on activity completion – the students liked being able to track where they were up to.

    I have a fear here about what this may encourage.

  • 6 comments on the learning path.
  • 2 comments on Moodle books.
  • 10 general comments on the study desk.

Course comments

Overall course comments are next. All positive comments. So not interesting in this context, but pleasing to the ego.


Content

Content has 13 positive comments, but also 1 suggestion ("course materials that link to readings and information", given in the context of what could be improved) that I'm not sure I can parse. The same student also gave a negative content comment: "there wasn't any information/textbooks off the net for information". This combination, plus the wealth of content pointed to during the course, suggests the comment may be a mistake. But I'm uncertain.

The other negative content comment was "Some times it was difficult to navigate back through all of the moodles to find info previously learn". This points to the absence of a search engine – a barrow I've been pushing.


Assessment

Assessment was the next most positive category, with 6 positive comments: one mentioned "All … were so well explained"; a couple mentioned something along the lines of "content directly linked to assessment and content"; a couple liked "the examples given they helped a lot when completing my own assignment"; and one even "Loved the assessment structure".

In terms of negatives or suggestions, mention was made of wanting more time prior to assignment 1 (thrice); the final assignment being due 3 days after Professional Experience (once); and the size of the assessment tasks (twice). One negative comment was more fundamental: "assessment tasks changed".


Workload

Workload also drew 6 positive comments, but it was the first category with more negative comments (9) than positive.

The positive comments were generally of the form “So all content was necessary but just way too much!!”.

The negative comments all indicated too much work, with a particular emphasis on the first few weeks (a known problem). For example “This course nearly broke me but I have come a long way”.


PLN

This category included both setting up a PLN (personal learning network) and using a range of tools (Diigo, Twitter, blogs) to do that. There were only 8 comments overall. A couple of positive ones mentioned the value of the PLN and the tools. The negatives included one that complained about managing multiple accounts for all the tools, one that felt the tools set up unrealistic expectations for Professional Experience, and one that raised not being confident enough to learn all those tools early on.

The remaining categories included very few comments.


Method

The analysis also served as an initial exploration of NVivo. The method used was:

  1. Import data from the institutional course evaluation website into NVivo.

    Due to the limitations of this website, there are a number of limitations in this data, including: no ability to link responses to closed questions with responses to other open questions; and only being able to include responses from 37 of the 42 survey responses.

    I’m particularly displeased about this institutional inability.

  2. Initial coding of student comments (or parts thereof) into three categories:
    • Suggestion – the comment makes a particular suggestion to change the course.
    • Negative – the comment criticises some aspect of the course.
    • Positive – the comment praises some aspect of the course.
  3. Open coding of student comments into categories based on an aspect of the course.

    A description of the resulting categories is given below. Where needed I’ve added an explanation. This is a hierarchical list. In the chart above, the child categories have been aggregated into their parent categories (i.e. you won’t see “Learning path” in the chart above, all those responses appear as part of the “Study Desk” category). The links below will take you to web pages (produced by NVivo) that show you the comments coded in each category.

    • Assessment (n=12)
    • Blogging (n=2) – any comment around the assessable requirement to write blog posts throughout the semester.
      • Blog updates – throughout the semester I sent out emails summarising students’ blogging progress against what was required for assessment.
    • Content (n=17) – the actual content of the course
    • Course (n=15) – general comments about the course e.g. “bad course”, “best course ever”
    • Misc other (n=1) – for comments that I couldn’t think of an appropriate category for
    • PLN (n=7) – about the requirement for students to engage with social media (blogs, Diigo, Twitter etc.) to build a personal learning network
      • ICTs – the use of the various ICTs (Diigo, Twitter, blogs) as part of the PLN process.
    • Professional Experience (n=3) – as part of the course students spend 3 weeks in schools teaching
    • Study Desk (n=22) – the USQ label for the Moodle course site
      • Activity Completion – a Moodle feature that will display a tick beside activities that the student has completed
      • Learning path – each week on the Study Desk was designed as a learning path: a series of online resources and activities that all students had to complete
      • Moodle Books – a Moodle feature used to structure the learning path
    • Teaching Staff (n=27)
    • Workload (n=21) – a comment about the level of work required to complete the course
      • Start of course – the workload for this course is quite heavy in the first few weeks.
  4. Use of NVivo's matrix coding capability to produce the chart above, comparing which aspects of the course the students commented on negatively, positively, or as suggestions.
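NVivo did the heavy lifting here, but the core of steps 2–4 – tallying comments per (category, sentiment) cell, with child categories rolled up into their parents – is simple enough to sketch. The following Python sketch uses made-up example codings (not the actual survey data) purely to illustrate the aggregation:

```python
from collections import defaultdict

# Hypothetical codings, each a (sentiment, category) pair. Child
# categories are written as "Parent/Child", e.g. "Study Desk/Learning path".
# In the actual analysis this coding was done by hand in NVivo.
CODED_COMMENTS = [
    ("Positive", "Study Desk/Learning path"),
    ("Positive", "Study Desk/Activity Completion"),
    ("Negative", "Workload/Start of course"),
    ("Suggestion", "Assessment"),
    ("Positive", "Course"),
]

def matrix_coding(coded_comments):
    """Count comments per (parent category, sentiment) cell,
    aggregating child categories into their parents - roughly what
    NVivo's matrix coding query produced for the chart above."""
    matrix = defaultdict(lambda: defaultdict(int))
    for sentiment, category in coded_comments:
        parent = category.split("/")[0]  # "Study Desk/Learning path" -> "Study Desk"
        matrix[parent][sentiment] += 1
    return {cat: dict(cells) for cat, cells in matrix.items()}

if __name__ == "__main__":
    for category, cells in sorted(matrix_coding(CODED_COMMENTS).items()):
        print(category, cells)
```

With the sample codings above, the "Study Desk" row would show 2 positives, while "Workload" would show 1 negative – the same shape of result the chart visualises.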
