Assessment, #edcmooc style

I guessed it would happen, because of our experiences on the MSc in Digital Education, but I am delighted to see what is happening around the edcmooc digital artefact assignments. There are three things in particular that stand out: the effort, the sharing, and the feedback. These observations are based only on what’s been shared in forums, Twitter and blogs – I haven’t looked at any of the formal submissions yet.

1. Effort. Digital artefact creation seems to inspire great creativity and effort. Inviting people to go ‘beyond text’ to represent academic knowledge, and to create work in public, appears to be motivating in a way that, say, writing an essay that only three people will read, generally isn’t. I think this is about audience, and about the pleasure of sharing and expressing understanding in multimodal ways. We are seeing people drawing on their own personal and professional expertise (coding, writing, film-making, music, photography, expertise with web-based tools, knowledge of interesting bodies of literature, and so on) to really grapple with course themes. I love it and will have a terrible time tearing myself away.

2. Sharing. A number of MOOC participants are rejecting (sometimes explicitly) protectiveness and secrecy in favour of sharing their ideas, drafts, processes and final artefacts. We could claim that we have reached a utopian state of trust within the MOOC (of course!!), but the networked nature of the assignment makes it possible, too. In a public, web context, sharing doesn’t take anything away from the sharer, but rather stamps ideas as theirs, drives traffic to their artefacts, and gives a reputational boost. This is digital scholarship in action. Those who are sharing in this way are providing helpful examples for others to build on, too. (The merits of the ‘exemplar’ are sometimes contested, but in general I am a fan.)

3. Feedback. The generosity of those sharing is matched by the generosity of those responding with enthusiasm, suggestions, constructive advice and audience responses. Particularly impressive have been the references to the assessment criteria that a number of people in the forum are making in their informal feedback. By bringing these criteria into play at this stage, discussing and debating what they mean, and trying to apply them in context, MOOC participants are involving themselves in the feedback process in ways that I think will be extremely helpful during the peer assessment activity, and for those still working on their artefacts.

To sum up: heaps of praise for EDCMOOC participants, and the work they are beginning to do on these final assignments.


14 thoughts on “Assessment, #edcmooc style”

  1. Pingback: Shoring the fragments of #edcmooc | Teaching 'E-learning and Digital Cultures'

  2. Pingback: Assessment, #edcmooc style | The Networked Ecosystem | Scoop.it

  3. Although this xMOOC (in my personal opinion) has incorporated more connectivist elements and, as a result, shows higher engagement and cooperation, I would still ask what percentage of participants are so actively involved. What, then, about the others?

    • Good question, Vanessa. In return, though, I would ask whether we need to know more about people’s intentions in enrolling on a MOOC before deciding whether we should be concerned about those who do not ‘complete’. Some (very large) percentage never log in even once, so we need to figure out who the potential participants ever were, and consider whether the process of ‘reverse selection’ (enrolling is the easiest part; people find out whether the course suits them, or whether they can invest the time in it, only after it begins) is working as it should. If it is – if people are finding out that the MOOC isn’t for them for a range of good reasons – then I wouldn’t be concerned, no matter what the percentage still active at the end is. If, however, there are systemic factors keeping people out who need or want to be in, that is (I think) a different matter.

  4. Pingback: On how #edcmooc did a cmooc on Coursera | Doing by Learning (and vice versa)

  5. I have a bit of an obtuse question on the artefact. Did you choose the word deliberately as a nod to “2001: A Space Odyssey”? I’m almost certain they use it to describe the monolith in the film, or Arthur C Clarke uses it in his short story. Was this deliberate, or just a coincidence I am reading too much into? 🙂

    Chris Swift

    • hmm… Sian and I have been using ‘visual artefact’ as a description for work on our MSc-level EDC course for several years… I think its origins are lost in the mists of time… 🙂

  6. I wonder what pre-course “intention” and post-course surveys might indicate. I would describe myself as a serial MOOC follower (and non-completer) who participates moderately (sometimes more, sometimes less), is hugely interested in the evolving model, often acts as a “translator” and explainer to academic and local connections, but has little interest in certificates. New models might call for new categories and vocabularies. For now, we are stuck with pre-existing terms.

    http://mfeldstein.com/the-four-student-archetypes-emerging-in-moocs/

    • “New models might call for new categories and vocabularies.” I think that is a great point, Vanessa – it’s certainly one that interests me a lot. Our colleagues in Informatics have tried specifying three different levels that MOOC participants might aim to achieve:

      “You can engage with the course at a number of levels to suit your interests and the time you have available:

      Awareness Level – gives an overview of the topic, along with introductory videos and application related features. This level is likely to require 2-3 hours of study per week.
      Foundation Level – is the core taught material on the course and gives a grounding in AI planning technology and algorithms. This level is likely to require 5-6 hours of study per week.
      Performance Level – is for those interested in carrying out additional programming assignments and engaging in creative challenges to understand the subject more deeply. This level is likely to require 8 hours or more of study per week.” https://www.coursera.org/course/aiplan

      Seems like a very useful model that more MOOCs might follow.

      • Thanks, this is a good start and useful. I like the mid-course break idea too.

        Also, we all (instructional designers, techies, content providers, instructors, participants, followers) are still working on figuring out how this new model is going to work. It is still emergent ~ or, as I have explained to complainers, in beta, and we are the testers.

        The Venture Labs (http://venture-lab.org/) DNLE course opened with a survey asking participants similar questions, including how they self-classify, how many hours a week they plan to spend on the course, and their level of involvement – with the option of changing their answers.

        Such models and more orientation (when everyone knows better what to orient to) should make a difference too. So many of these courses are on their first run. Think how long it takes to polish and refine the garden variety syllabus for a new or updated course for a class of under 100.

  7. Pingback: the accidental technologist » Blog Archive » Assessing the #edcmooc digital artefact

  8. “Think how long it takes to polish and refine the garden variety syllabus for a new or updated course for a class of under 100” – lol, yep! 🙂

  9. Pingback: Digesting #EDCMOOC feedback | Teaching 'E-learning and Digital Cultures'

  10. Pingback: Assessing the #edcmooc digital artefact | Wayne Barry
