Backup for BIM 2.0

What follows is a journal of the attempt to bring BIM 2.0’s backup functionality into line with the new approach in Moodle 2.x.

Done. Appears to be all working. Will work on restore next and do some testing.

Backup

First up is trying to understand the developer docs on the new backup process. What follows is an attempt to both summarise/understand those docs and explain what changes I’ve made to BIM 2.0. The Backup 2.0 general architecture documents are also used.

What I believe it all boils down to is the ability to convert the database structure for a BIM activity into an XML file/structure. The aim here will be to keep the XML structure produced as close to that produced by 1.9 as possible.

Steps required

  1. Preparation – knowing what to backup
    Much of this is done in the “BIM data” section below.
    1. Draw the DB schema.
    2. Identify where the user information is located in the schema.
    3. Determine correct order of backup.
    4. Identify attributes and elements.
      All “id” fields should be attributes.
    5. Identify not needed elements.
      Any field should be included, except those that just point back to a parent element.
    6. Identify the file areas used.
      Text fields and attachments appear to fit into this category. This appears to be new in Moodle 2.
    7. Annotating important bits
      e.g. the ID fields.
  2. Remove the old backup stuff.
    Basically backuplib.php in the bim directory.
  3. Tell Moodle that BIM 2.0 is supporting backups.
    Add the following to mod/bim/lib.php
    case FEATURE_BACKUP_MOODLE2:          return true;
  4. Set up the directory for the code
    create mod/bim/backup/moodle2
  5. Set up and test the backup process (which won’t work at the moment).
    The backup documentation includes a simple script that speeds up the develop/test cycle for backups. Put that in place and run it. It breaks, as expected.
  6. Start putting in the code
    1. create empty mod/bim/backup/moodle2/backup_bim_settingslib.php
    2. backup_bim_stepslib.php – another empty file
    3. backup_bim_activity_task.class.php the place the above files are used. For now just some skeleton code with empty methods.
  7. Run the backup again.
    Which runs without error as expected.
  8. Create the bim.xml file – as an empty file
    • Some empty code into backup_bim_stepslib.php
    • Call the method from the steps file from backup_bim_activity_task.class.php.
  9. Now to define each of the elements: essentially a translation of the provided code using the description of the BIM data below. This produces an empty backup file for bim.
  10. Define the tree of data following the skeleton code.
  11. Connecting it all to the database
    A fairly simple set of method calls building on the above. Tested and all seems to be working. Woo hoo!
  12. Annotating IDs
    This appears to be related to signposting user (and other) information, something I missed the first time.
    For BIM, the relevant fields to annotate are user and group.
  13. Annotating files
    Not sure about this section. Need to read some more and update.
  14. Encode references to URLs?
    Done as per example.
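
Pulling steps 6 to 12 together, the structure step in backup_bim_stepslib.php ends up looking something like the following sketch. It assumes the element and field names from the BIM data section below, and only the group allocation branch is shown; the questions, student feeds and marking branches follow the same pattern.

```php
<?php
// Sketch of mod/bim/backup/moodle2/backup_bim_stepslib.php, assuming the
// element/field names in the BIM data section below.
class backup_bim_activity_structure_step extends backup_activity_structure_step {

    protected function define_structure() {

        // Is user information being included in this backup?
        $userinfo = $this->get_setting_value('userinfo');

        // Define the elements: ids as attributes, everything else as final
        // elements, parent pointers (course, bim) left out.
        $bim = new backup_nested_element('bim', array('id'), array(
            'name', 'intro', 'introformat', 'timecreated', 'timemodified',
            'register_feed', 'mirror_feed', 'change_feed', 'grade_feed'));

        $allocations = new backup_nested_element('group_allocations');
        $allocation = new backup_nested_element('group_allocation', array('id'),
            array('groupid', 'userid'));

        // ... bim_questions, bim_student_feeds and bim_marking are defined
        // the same way ...

        // Build the tree.
        $bim->add_child($allocations);
        $allocations->add_child($allocation);

        // Connect it all to the database.
        $bim->set_source_table('bim', array('id' => backup::VAR_ACTIVITYID));
        if ($userinfo) {
            $allocation->set_source_table('bim_group_allocation',
                array('bim' => backup::VAR_PARENTID));
        }

        // Annotate the ids that point outside BIM's own tables.
        $allocation->annotate_ids('user', 'userid');
        $allocation->annotate_ids('group', 'groupid');

        // Annotate the file areas (the intro field at least).
        $bim->annotate_files('mod_bim', 'intro', null);

        // Return the root element, wrapped in the standard activity information.
        return $this->prepare_activity_structure($bim);
    }
}
```

The task class in backup_bim_activity_task.class.php then just registers this step in its define_my_steps() method.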

BIM Data

The following is based on this 2010 post documenting the development work on the backup process for BIM 1.0, with some extra work based on the preparation information from above.

The bim data hierarchy (bullet points represent table names)

  • bim
    id (attr)
    course (not needed) **** CHECK IF THIS IS INCLUDED ****
    name
    intro (????file area???)
    introformat
    timecreated
    timemodified
    register_feed
    mirror_feed
    change_feed
    grade_feed

    • bim_group_allocation
      id (attr)
      bim (not needed)
      groupid
      userid
    • bim_questions
      id (attr)
      bim (not needed)
      title (???? file area ???? )
      body (??? file area????)
      min_mark
      max_mark
    • bim_student_feeds
      id (attr)
      bim (not needed)
      userid
      numentries
      lastpost
      blogurl
      feedurl
    • bim_marking
      id (attr)
      bim (not needed)
      userid
      marker
      question (this is an id back into bim_questions)
      mark
      status
      timemarked
      timereleased
      link (???file area??)
      timepublished
      title (??file area??)
      post (?? file area?? )
      comments (?? file area??)

In BIM 1.0 the user data includes: student feeds, marking and group allocation.
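
For reference, the hierarchy above should translate into backup XML along the following lines. The values are illustrative, and the exact wrapping elements come from the Moodle 2 backup format; only the group allocation branch is spelled out.

```xml
<bim id="1">
  <name>Example BIM</name>
  <intro>...</intro>
  <introformat>1</introformat>
  <group_allocations>
    <group_allocation id="3">
      <groupid>7</groupid>
      <userid>12</userid>
    </group_allocation>
  </group_allocations>
  <questions>...</questions>
  <student_feeds>...</student_feeds>
  <markings>...</markings>
</bim>
```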

Major (Moodle) requirements for BIM 2.0

The next step in the development of BIM 2.0 is identifying the list of major (Moodle) requirements that need to be implemented. BIM is a Moodle activity module. Moodle has a range of expectations that such modules are meant to meet. The following is an attempt to identify what needs to be done.

It has resulted in a renewed effort to use the github issue list to record and manage what needs to be done. Not only have I started adding issues for BIM 2.0, I’ve also been through the old issues and decided which apply to BIM 2.0.

In short, some major work to be done to get backup/restore migrated. Some minor tweaks (it appears) to get gradebook integration working. Logging is working as is.

Summary of the requirements

A summary of what was found follows. It includes some compulsory/important type requirements:

And also some that would be nice future additions:

What has changed?

Now to find out what has changed in the requirements that have to be addressed now.

Backup and restore

This has definitely been changed. It’s listed in the migrating CONTRIB code document.

backuplib.php is now replaced with a backup directory. It also appears to be a more OO-based approach. Some major re-work to be done here. Will leave this to another post.

Gradebook

This isn’t working. Any attempt to turn on the BIM gradebook integration generates an error on line 313 of lib.php due to a problem with a database insert:

Debug info: Column ‘grademax’ cannot be null
INSERT INTO mdl_grade_items

The question will be whether this is a problem in BIM or evidence that the Gradebook API has changed significantly.

According to the Gradebook API there should be a mod/bim/grade.php file. Certainly not one in BIM 1.0. But then the forum module doesn’t have one either and yet it does use the gradebook, so it would appear to be optional.

grademax can be changed in the gradebook, but the help text located there suggests it should be set on the activity settings page. i.e. I need to add the ability to set grademax on the BIM config screen.

This has identified that the problem is because the existing BIM code does not provide a value for the gradebook’s grademax field. It appears that Moodle 2.x requires this to be not null.

Actually, the BIM 1.0 code doesn’t seem to have this set. A mystery change? Perhaps some boilerplate with a search and replace I put in place when setting up BIM 2.0? Moodle 1.9 doesn’t seem to have required a grademax value. So what does grademax imply?

Common sense would seem to imply the maximum value that can be entered into the gradebook for this component. BIM currently asks for maximums for each question, so a grademax could be calculated. The problem is that BIM only uses the maximums to generate a warning, it doesn’t enforce it. If the gradebook enforces grademax, then this could create some dissonance with BIM’s operation.

As it happens, hard coding grademax to 10 results in gradebook integration, or at least in the activity being added to the gradebook. When I try to release some results (which includes adding marks to the gradebook) I get a coding error which I’ll need to fix. Have added this to the to do list.

Will leave working on this until later.

It also suggests that in lib.php the bim_supports function should report that it has FEATURE_GRADE_HAS_GRADE. I’ll add that for now.

There are also a few examples that provide some extra code missing from BIM. Will add that as well.
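
A sketch of what the lib.php changes might look like, assuming the standard grade_update() API. The hard-coded grademax of 10 matches the experiment above; calculating it from the per-question max_mark values would be the longer-term fix, and the function name simply follows the usual modname_grade_item_update convention, so treat the details as illustrative.

```php
<?php
// In bim_supports():
//     case FEATURE_GRADE_HAS_GRADE: return true;

// Sketch of the grade item update in mod/bim/lib.php.
function bim_grade_item_update($bim, $grades = null) {
    global $CFG;
    require_once($CFG->libdir . '/gradelib.php');

    $params = array(
        'itemname'  => $bim->name,
        'gradetype' => GRADE_TYPE_VALUE,
        'grademax'  => 10,   // Moodle 2.x insists this is not null.
        'grademin'  => 0,
    );

    return grade_update('mod/bim', $bim->course, 'mod', 'bim',
        $bim->id, 0, $grades, $params);
}
```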

Logging

The logging API in Moodle is likely to be replaced in a little while as part of an increasing importance of logging, analytics etc. The new work includes some references which could be used to inform a rethinking about BIM logging. This is one of my areas of interest.

But at the moment, the current BIM logging is working. At least there are BIM entries being added into the dummy course that I’ve been testing with.

Bug fix and to do for BIM

After a short Xmas break it’s time to continue work on getting BIM 2.0 up and going. In this post I’m trying to continue the work from a week or so ago. The main aim is to fix a bug with the manage marking page.

Status: The manage marking bug has been fixed. Mostly related to further migration work from Moodle 1.9 to Moodle 2.x.

The manage marking bug

The bug is summarised nicely by the following screen shot from the last post.

Manage marking has an error

There appears to be a problem with one of the data structures that results in BIM crashing and burning. There’s some evidence of an earlier attempt to investigate this, so time to revisit prior posts on BIM development. This post identifies the location of the problem.

The problems are all related to the changes in the database API from 1.9 to 2.x.

These are fixed. get_all_marker_stats is working; however, displaying the data also needs to be fixed, by replacing the flexible table with an HTML table.

To do

  • Table of unregistered students is showing some number (student id?) that shouldn’t be there.
  • It isn’t showing the left hand column.
    A broken div

Unregistered students

A few of the pages display a table of students who have not registered their blogs. This needs to be updated to html_table.

  1. Find where it is shown.
    Done. It uses bim_setup_details_table with the last parameter being unregistered: once in the marker code and twice in the coordinator code.
  2. Identify the fix
    • replace the keyed data additions with something like
      $table->data[] = array( $row['username'], $row['name'], $row['email'], $row['register'] );
    • replace $table->print_html() with
      echo html_writer::table( $table );
  3. Fix each of those.
    Fixed.
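
The fix for the unregistered students table boils down to the following pattern, using Moodle 2’s output API. The column headings and the $unregistered variable are illustrative, not BIM’s actual names.

```php
<?php
// Replace the old keyed/flexible table with html_table + html_writer.
$table = new html_table();
$table->head = array('Username', 'Name', 'Email', 'Registered');

foreach ($unregistered as $row) {
    // Keyed additions under 1.9 become a simple ordered array.
    $table->data[] = array($row['username'], $row['name'],
                           $row['email'], $row['register']);
}

// The old $table->print_html() becomes:
echo html_writer::table($table);
```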

Help text for Manage Marking

The problem with manage marking seems to have delayed the provision of the help text. Need to add that in.

Only the one, but there does appear to be some scope to provide more detailed help messages throughout.

To do list

This post has a list of what was working and not with the coordinator interface and a later post updates some of this. Need to revisit these and start a list in basecamp.

Misc to do

  • Manage marking
    • view students with the missing status appears to be showing a student who has 1 question that has been marked. What is the meaning of the MISSING status?
  • Re-visit the use of tables and how implemented.
  • Help messages
    • Check out other help icons in coordinator views.
    • Think about providing more detailed help by sprinkling help icons throughout all of the views.
    • Look into how some of the older help text can be reused.

BIM: another restart?

The following is essentially an activity log/diary of the first steps of getting back into work on BIM. I’m hoping to have it ready to work with some course redesign I’m working on, but timelines may make that difficult.

The aim of this is to get the current version of BIM for Moodle 2.x up and running with Moodle 2.4+. The next step will be to determine what work needs to be completed on BIM and what new features might be useful.

In summary, it’s surprisingly functional as is, much more than I remembered.

Download and install Moodle 2.4

Moodle 2.4+ downloaded from here

Stick it in an m24 directory under xampp and follow the instructions.

All installed.

Installing bim2

And now to get bim2 off github. Mm, 8 months since I worked on the code. Not good.

mkdir bim
cd bim
git clone https://github.com/djplaner/BIM.git
mv BIM/* .
mv BIM/.git .
rm -rf BIM

Task: I really need to look into the naming of that folder and the use of git so there’s no need to play with the file structure.
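
One way to avoid the file shuffling: git clone accepts a target directory, so the repository can land straight in a folder named bim with no mv/rm afterwards.

```shell
# Clone straight into a directory named bim; no moving of files needed.
git clone https://github.com/djplaner/BIM.git bim
```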

Visiting the local Moodle website picks up BIM as ready to install. Oops, error:

Plugin “mod_bim” is defective or outdated, can not continue, sorry.
Debug info: Missing mandatory en language pack.
Error code: detectedbrokenplugin

That’s because I didn’t clone the bim2 branch:

sudo git clone -b bim2 https://github.com/djplaner/BIM.git

And that has updated successfully. Now does it actually work?

Testing it out

Ohh, pretty new interface for Moodle 2.4. Looks like the BIM icon will need to be updated to work with the slightly bigger and different design for the module icons. (Click on the following images to see bigger versions)

After adding the activity you need to enter the basic configuration details

Add some questions that the students will blog in response to.

What about allocating markers to mark the influx of posts?

No users allocated to the course, so nothing there. Nice to see I’d thought of this condition. Time to allocate some students and teaching staff. So, staff are enrolled in the course. Can I manage marking now?

Not yet. I need to create some groups for the course. Markers aren’t allocated individual students within BIM. They are allocated groups.

So with groups allocated, I can allocate a marker. Can I manage the markers? The coordinating teacher can see a list of all the markers and what they have (or haven’t) marked yet.

Oops, that’s the first error in the code. Will have to revisit that.

Can I see the students I have to mark as a marker? This is the overview. It shows which of my students have registered their blogs (and for which I can mark something) and which haven’t yet.

Now, let’s see if I can do some marking.

Not really because none of the posts from this single student have been allocated to one of the set questions. I’ll need to allocate one of his posts to a question using the “allocate question” screen.

Now I should be able to mark that allocated question.

Student perspective

So, does it work from the student’s perspective? Does the activity show up when they login to the course?

Can they register their blog?

Does it actually work as expected?

What’s next?

Time for a road trip. So no progress for a few days, after that it will be revisiting what outstanding tasks are left to make this truly useful. Gradebook integration is probably the top of the list. Backup/restore may be the next step.

Why Moneyball is the wrong analogy for learning analytics

Learning analytics is one of the research areas I’m interested in. Consequently, I’ve read and listened to a bit about learning analytics over recent times. In that time I’ve often heard Moneyball used as an example or analogy for learning analytics.

I can see the reason for this. It’s a good example of how data can inform decision making in a field many people (especially those in America) are familiar with. Having a best-selling book that’s turned into a Brad Pitt movie doesn’t hurt either. But I think it’s the wrong analogy for learning analytics.

Moneyball by Kei!, on Flickr (Creative Commons Attribution-Share Alike 2.0 Generic License)

Why?

As it happens I’ve been reading Nate Silver’s book The Signal and the Noise: Why most predictions fail but some don’t over recent weeks and I’ll use it to make my case. Silver has had success in applying “analytics” to make predictions in both baseball and US politics, and in the book he talks to experts from a range of fields about predictions. Through this process he concluded

I came to realize that prediction in the era of Big Data was not going very well

One of the reasons he gives is

Baseball, for instance, is an exceptional case. It happens to be an especially rich and revealing exception

Why? Well, one reason is given when talking about economics, a discipline with a poor track record when it comes to predictions.

This isn’t baseball, where the game is always played by the same rules.

If you don’t play by the rules set down in baseball you are going to get pulled up. What are the rules for learning? How can you be sure that each of the students is aware of the rules, has interpreted them the same way, and is playing by them?

A little further on in Silver’s book comes this

The third major challenge for economic forecasters is that their raw data isn’t much good

If the raw data isn’t much good, any predictions you make based on that data are going to have some flaws.

How good is the data in learning? Well, in the face-to-face classroom it’s next to non-existent. At least in the hard, quantitative, consistent form required for most learning analytics. If it’s e-learning, well the data is currently limited to usage logs from the LMS, which are at best a vague indicator of what’s going on.

Intelligent Tutoring Systems tend to solve these problems by having a fixed set of rules (a model) of learning and learners in a particular area. These rules, however, would appear to limit the adoption of the system. How many other contexts can these rules be applied to? Can you actually create such rules for all contexts?

I’m not convinced you can. Especially when broader trends are pushing for an increasingly diverse set of students, but also when learning is seen as a broader, more open and individual happening. Are there “rules” for learning that are broadly applicable?

What the economics analogy suggests for learning analytics

Is learning analytics about prediction? I’d argue that largely it is. You want to understand what is happening and make predictions about what will happen next. If the learner isn’t going to learn, you want to know that and be able to intervene. You want to make predictions that enable intervention.

The lack of success in prediction in economics suggests that the future of learning analytics may not be bright. At least if it relies on the same models and assumptions as economics. So what needs to change?

Beyond the early adopters of online instruction: Motivating the reluctant majority

The following is a summary and some reflection on Hixon et al (2012). I’m particularly interested in this topic due to my belief (based on 20 years experience and observation) that most institutional approaches to change in learning and teaching have only been successful in moving the same 10% of staff. A 10% that didn’t need a lot of help in the first place.

Thoughts and to do

Fairly disappointed with this paper. It didn’t engage at all with the perspective of Geoghegan (1994), who took the implications of the adopter categories a lot further and questioned some of the fundamental assumptions.

Misc thoughts, questions and to do

  • Is there anyone doing interesting research/thinking around the inherent diversity in academics and the inherent consistency in what passes for institutional e-learning?
    Looking at the work referencing Geoghegan would probably be a good start.
  • How/what does the increasingly universal adoption of e-learning in Oz Universities imply for the adopter categories and from there how e-learning is supported and the subsequent quality of it?
  • Can learning analytics of LMS usage identify/support the adopter categories? Or at the very least some difference between staff?
  • Look at the conceptual paper (Barczyk et al, 2010) that informed the survey

In the following, where I remember, my comments are emphasised. Other text is a summary of Hixon et al (2012).

Abstract

Now that most of the innovators and early adopters of online instruction are comfortably teaching online, many institutions are facing challenges as they prepare the next wave of online instructors. This research examines how faculty in this “next wave” (the majority of adopters) differ from the innovators and early adopters of online instruction. A specific online course development program is described and the experiences of the “majority” in the program are examined in relation to the experiences of previous participants (the innovators and early adopters).

There is probably a refinement to be made here. There are a number of universities in Australia where the majority of, if not all, courses are taught online. These institutions already have the “next wave” online. The problem though is that the quality of the online experience leaves a great deal to be desired.

Introduction

More folk have to move online. This study was designed to help inform best practices “in bringing the ‘majority’ online”.

Based on the Distance Education Mentoring program at a Midwestern university: a cohort-based mentoring program to help faculty develop an online course. Over four years of operation it’s been noted that the faculty participants are changing. The paper looks to Rogers’ Diffusion of Innovation theory to understand the changes, with a few paragraphs explaining DOI and getting to the categories of adopters, before moving onto some literature on those using it to explore technological innovations.

Interestingly, they don’t reference Geoghegan (1994) who used Moore’s extension of these to make some points along these lines. Interesting largely because Geoghegan is one of my theoretical/literature “hammers”.

Approaches to development of online courses

Posits two approaches

  1. faculty-driven approach;
    An approach that can work if faculty have the skills.
  2. collaborative approach.
    Seen as the solution to overcome the difficulties (especially pedagogical) of the faculty-driven approach.

I’d argue that the same observation (lack of skills) can be made with the collaborative approach. I’ve been in situations where the “collaborators” haven’t had the necessary skills either. What isn’t explicitly noted in the above is that both normally assume that course development is separate from teaching. A course is redesigned before/after teaching.

Struggles faced by academics from the literature include

  • learning the necessary skills;
    A contributing factor here is the really poor quality of the technical tools. Some of that difficulty arises because the tools are from another context and don’t match the local context.
  • adapting the pedagogic strategies for the online environment;
    At the same time workload formulas, room allocation, legal requirements, policies and student prejudices militate against the adoption of those pedagogical strategies.
  • conceptualising their course for the new environment;
  • finding increased time required to develop quality online courses.
    Wonder if the faculty had developed “quality” face-to-face courses? What’s the source of this difficulty online?

The assumption here is that it is course development that must be collaborative. What would it look like, how would it happen and what would be the impacts if the course delivery process was collaborative? i.e. don’t assume that faculty skill-development and course redesign only occurs before the course is taught. What happens if, as I’m teaching the course, I make changes and am able to learn more about what works? More importantly, what if the organisational e-learning systems/policies/processes can learn more?

Method

The mentoring program “is designed to educate and certify faculty members in the principles of instructional design”. Each faculty participant (protege) has a mentor from outside the discipline. It uses the Quality Matters rubric. More detail is given on how it works.

Courses are taught and then evaluated and given a pass/fail based on the rubric.

By the 3rd year of the program, it was noticed that participants “are hesitant, or even resistant, to consider new approaches and technologies, or even to teach online”. Which is argued to fit the idea that they are “the majority”. It would be interesting to dig further into this. Was this “majority” required to participate?

Program changed in fourth year. More structure. More defined structure in the online course they complete. Submission by specific deadlines. Formal meeting schedule. A contract required to be signed. Mmm, not a great fan of that. I wonder if they actually looked at what Rogers and others have said about the characteristics of the majority and if the changes to the program were based on those insights? e.g. their communication networks tend to be vertically oriented, so wouldn’t having a mentor from outside the discipline be a poor match?

Research questions

  1. In what ways do faculty members participating in the 4th offering differ from prior offerings?
  2. In what ways do the experiences/perceptions of the 4th years differ?

Survey questionnaire developed to ask: skill development, mentoring relationship, its effectiveness, perceptions of teaching as a result, program satisfaction, general beliefs about online education. Wouldn’t connecting this to DOI have been sensible?

47 of the 92 proteges completed the survey: 27% of year 1, 52% of year 2, 57% of year 3, 58% of year 4.

Responses from years 1-3 were combined for comparison. But they’ve said they noticed differences in year 3?

Results

Those in the first three years had been teaching longer than the 4th years. Similarly, the earlier group had higher ranks. Oh dear, it appears the 4th years might be “junior” academics fighting to establish themselves as researchers, forced to engage with the program.

Year 4 less likely to identify as early adopters of technology. Continuing stereotypes would have required age to have been mentioned by now, wouldn’t it? No significant difference in age.

Both groups were equally looking forward to the program.

The year 4 group reported more benefit from the online course. Well, this measures the changes in the course rather than the people. Both groups were similarly satisfied with the program.

References

Barczyk, C., Buckenmeyer, J., & Feldman, L. (2010). Mentoring professors: A model for
developing quality online instructors and courses in higher education. International
Journal on E-Learning, 9(1), 7–26.

Geoghegan, W. (1994). Whatever happened to instructional technology? In S. Bapna, A. Emdad, & J. Zaveri (Eds.), (pp. 438–447). Baltimore, MD: IBM.

Hixon, E., Buckenmeyer, J., Barczyk, C., Feldman, L., & Zamojski, H. (2012). Beyond the early adopters of online instruction: Motivating the reluctant majority. The Internet and Higher Education, 15(2), 102–107. doi:10.1016/j.iheduc.2011.11.005

Developing personal learning networks for open and social learning

The following is a summary and touch of reflection on Couros (2010) and is another step in thinking about the design/implementation of a course I’m working on.

Thoughts and to do

As expected, a good overview/rationale for the type of approach I’m interested in exploring with EDC3100. Some interesting departures to think about. For example,

  • Alec’s course had 16 registered students, mine will have 200+ (first semester), perhaps another 80+ (second semester) and possibly have to be taught by someone else in semester 3.
  • Alec’s much more effective and engaged with his PLN than I am.
  • Alec’s course is post-graduate, mine is under graduate.

to do

  • Look at Tabak’s (2004) concept of distributed scaffolding.
  • Engage in an analysis of the learning environment available for EDC3100. Is Moodle appropriate? Would a self-hosted WordPress be a better fit? Having 200, rather than 20, registered students is an argument for Moodle (perhaps).
  • Think about the question about whether to be overly explicit in terms of what students should post to the blog, or take the more open approach.
  • How many of the student blogs are still active today?
  • Over time it appears there’s been a move away from the Wiki assessment, I wonder why that is?
  • Is it still difficult/different to read social media?

Abstract

Tells the story of EC&I831, an open access, graduate level, educational technology course at the University of Regina in 2008. There were 8 non-registered participants for every official student. The experience provided insight into the potential for leveraging PLNs in open access and distance education.

Introduction

Course title – “Open, Connected Social”. Fully online course developed using FOSS and freely available services. The design demonstrates “open teaching methodologies”: educational practices inspired by the open source movement, complementary learning theory and emerging theories of knowledge. Students built PLNs to “collaboratively explore, negotiate and develop authentic and sustainable knowledge networks”. Couros (2010, p. 110) writes

It is my hope in writing this chapter that I capture and document relevant reflections and activities to provide starting points for those considering open teaching as educational innovation

That’s what I’m looking for Alec.

Three sections

  1. key theoretical foundations;
  2. the course experience;
  3. discoveries related to the role of PLNs, techniques for developing and leveraging PLNs in DE courses and the role of emerging technologies.

Theoretical foundations

  1. The open movement
    Educators participating in FOSS communities had strong tendencies toward: collaboration, sharing, openness in classroom activities and professional collaborations. Technology was a barrier, but Web 2.0 etc addressed this. Now they could easily create, share, collaborate. Added to this is the greater availability of educationally relevant content. So much so that

    The dilemma of the educator shifted quickly from a perceived lack of choice and accessibility to having to acquire the skills necessary to choose wisely from increased options.

  2. Complementary learning theories
    Influences include:

    • social cognitive theory – suggests it is the combination of behavioural, cognitive and environmental factors that influence human behaviour. People learn through observations of others. Self-efficacy is important.
    • social constructivism and
      Related to the above. Sociocultural context and social interaction are important for knowledge construction. Tabak’s (2004) concept of distributed scaffolding – an emerging approach to learning design.
    • adult learning theory.
      Adults learn differently from kids, which results in principles such as: adults being involved in planning/evaluating their instruction; experience (including mistakes) providing the basis for learning activities; interest being generated from subjects that have immediate relevance to their job/life; learning being problem-centred rather than content-oriented.
  3. Connectivism
    Heavily influenced by theories of social constructivism, network theory and chaos theory. Digital technologies become important to learning. Stresses metaskills of evaluating and managing information and the importance of pattern recognition as a learning strategy.
  4. Open teaching
    Defined by Couros (2010, p. 115):

    Open teaching is described as the facilitation of learning experiences that are open, transparent, collaborative, and social. Open teachers are advocates of a free and open knowledge society, and support their students in the critical consumption, production, connection, and synthesis of knowledge through the shared development of learning networks.

    Typical activities of open teachers include

    • Use of FOSS tools where possible and beneficial.
    • Integration of free/open content into L&T.
    • Promotion of copyleft content licences
    • Help students understand copyright law.
    • Help students build PLNs for collaborative and sustained learning.
    • Development of learning environments that are reflective, responsive, student-centered and incorporate diverse learning strategies.
    • Modelling openness, transparency, connectedness and responsible copyright etc. use.
    • advocacy for the participation and development of collaborative gift cultures in education and society.

    That last one is interesting

The course

20 registered students. Mostly practicing teachers or educational administrators. Normally there is a maximum of 16 students (I wish). Development was funded by a $30,000 government grant. Important: this funding was not spent on the design and development side, but instead on hiring learning assistants who “were hired as social connectors, and their primary responsibilities were to support students in the development of PLNs” (Couros, 2010, p. 117).

In terms of selecting a learning environment, WebCT, Moodle, and Ning were rejected; hosted Wikispaces was adopted. That site was used from 2007-2010, and the course has since moved to a WordPress site (by the looks).

Course facilitation model

  • Three major assessments guided the activities
    1. Personal blog/digital portfolio
      Students were responsible for developing a digital space to document their learning through readings and activities. For many these became showcases and acted as distributed communication portals. Most remained active beyond the end date.
    2. Collaborative development of an educational technology wiki resource
      Wiki with collaborative content.

      I’m wondering how collaborative this process was? A group or a network (a la Downes).

    3. Student-chosen major digital project.
      A range of projects (produced videos, instructional resources, social networking activities, participation in global collaborative projects, development of private social networks etc.) developing resources specific to their professional context.

    It’s changed a bit and is described somewhat on the assessments page.

  • Tools and interactions

    1. Synchronous events
      Two each week, 1.5-2 hours. The first focused on content knowledge in the form of invited presenters, using Connect/Elluminate and ustream.tv/Skype, with associated recordings. The second was a hands-on session for technical skills and pedagogical possibilities.

      A combination of Skype and ustream.tv became the preferred method for video conferencing. How this worked is explained here.

    2. Asynchronous activities
      • Reading, reviewing and critiquing course readings in blogs.
      • Sharing resources through social bookmarking.
      • Creation of screencasts, tutorials etc. for personal learning and that of others.

      And a bunch of others including reading blogs, participation in open professional development, posting content to open sites, microblogging, collaborative wiki design and collaborative design of lesson plans. Most were unplanned.

PLNs in Distance education

The first session in the course was closed and explanatory. The author’s PLN became important to supporting the model, which does raise the question of how someone without the author’s PLN might go.

Conceptualising PLN

Mentions the absence of a definition and the need to discern PLE from PLN. Offers two images (click on these to see the originals), the old and new style “PLN”. A discussion picked this up a bit more online here. In short, it appears that the PLE is the tools, processes etc. that allow management of learning. The definition used for PLN:

personal learning networks are the sum of all social capital and connections that result in the development and facilitation of a personal learning environment.

[Image: “Typical Teacher Network” by courosa, on Flickr — CC Attribution-Noncommercial-Share Alike 2.0 Generic]
[Image: “Networked Teacher Diagram - Update” by courosa, on Flickr — CC Attribution-Noncommercial-Share Alike 2.0 Generic]

PLNs for teaching and learning

Strategies deemed effective in the course:

  • Immerse yourself.
    i.e. use and understand the social media tools, how they can be used, and how the students can use them.
  • Learn to read social media.
    Social media is read much differently than traditional media. Tools aren’t great. Need to use what’s available.
  • Strengthen your PLN
    Creating content and commenting on the work of others is important.
  • Know your connections
    Being aware of the skills/backgrounds of your PLN allows you to identify who can help students.
  • PLNs central to learning
    Courses/communities hosted in the institutional LMS die. The community in this course lived on.

Final thoughts

Two questions are often asked after conference presentations on this:

  1. How did you get away with this?
    Institutional support for open teaching is essential. Colleagues are “constructively critical of technology, but strongly supportive of innovation in teaching and learning”.
  2. Where did you find the time to teach this way?
    Good teaching always requires more time.

References

Couros, A. (2010). Developing personal learning networks for open and social learning. In G. Veletsianos (Ed.), Emerging technologies in distance education (pp. 109–128). AU Press.

Tabak, I. (2004). Synergy: A complement to emerging patterns of distributed scaffolding. The Journal of the Learning Sciences, 13(3), 305–335.