Fig.1. Facts in an essay are like pepper in soup
How do you compare and mark a variety of Massive Open Online Courses (MOOCs)?
We need to treat them like one of those challenges they do on Top Gear, where Jeremy Clarkson, Richard Hammond and James May set off to Lapland in a Reliant Robin or some such and then get marks across six or so criteria. Hardly scientific, but it splits the pack.
So, let's say we take THREE MOOCs, what criteria should there be?
- Commitment. What percentage of participants who sign up complete the course?
- Comments. I use the word 'vibrancy' to judge the amount and nature of activity in the MOOC, so this is crudely reduced to the number of comments left.
- Likes. Another form of vibrancy, where comments left by the team and by participants are 'liked'. It has to be a measure of participation, engagement and even enjoyment.
- Correct answers. Assuming, without any means to verify this, that participants don't cheat: when tested, are they getting the answers right? This is tricky, as there ought to be a before-and-after test. Tricky too, as how one is tested should relate directly to how one is taught. However, few MOOCs, if any, are designed as rote learning.
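The Top Gear-style marking above can be sketched as a back-of-an-envelope scorecard. This is purely illustrative: the course names and figures are made up, and the rank-based marking rule (best course on each criterion gets the most points) is one assumed way of turning the four criteria into a league table, not anything the platforms themselves do.

```python
# A hypothetical 'Top Gear' scorecard for comparing MOOCs across the
# four criteria above. All names and numbers are invented for illustration.

MOOCS = {
    "Course A": {"completion": 0.12, "comments": 7000, "likes": 15000, "quiz": 0.50},
    "Course B": {"completion": 0.25, "comments": 1200, "likes": 3000,  "quiz": 0.80},
    "Course C": {"completion": 0.08, "comments": 9500, "likes": 10000, "quiz": 0.45},
}

def marks(moocs):
    """Award 1..n marks per criterion: worst course gets 1, best gets n."""
    criteria = ["completion", "comments", "likes", "quiz"]
    totals = {name: 0 for name in moocs}
    for c in criteria:
        # Rank courses from lowest to highest on this criterion.
        ranked = sorted(moocs, key=lambda name: moocs[name][c])
        for points, name in enumerate(ranked, start=1):
            totals[name] += points
    return totals

def league_table(moocs):
    """Return (course, total marks) pairs, best first."""
    totals = marks(moocs)
    return sorted(totals.items(), key=lambda kv: -kv[1])

for course, total in league_table(MOOCS):
    print(course, total)
```

Hardly scientific, as I say, but like the Lapland challenge it does split the pack: a course that completes well and tests well can beat one that merely generates the most chatter.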
You could still end up, potentially, comparing a leaflet with an encyclopaedia. Or, as the Senior Tutor on a course I have been on put it, a rhinoceros with a giraffe.
It helps to know your audience and play to a niche.
It helps to concentrate on the quality of content too, rather than more obviously pushing your faculty and university. Enthusiasm, desire to impart and share knowledge, wit, intelligence ... And followers with many points of view, ideally from around the globe, as I've found this will 'keep the kettle boiling'. There is never a quiet moment, is there?
I did badly on a quiz in a FutureLearn Free Online Course (FOC). World War 1. Paris 1919. A new world order ...
I think I got half right. I chose not to cheat, not to go back or to do a Google search; what's the point in that? I hadn't taken notes. I wanted to get a handle on how much is going in ... or not. Actually, in this context, the quiz surely isn't a test of what has been learnt, but a bit of fun. Learning facts and dates is, or used to be, what you did in formal education at 15 or 16. This course is about issues and ideas. A 'test', therefore, would be to respond to an essay title. And the only way to grade that, which I've seen successfully achieved in MOOCs, is for us lot to mark each other's work. Just thinking out loud.

In this instance the course team understandably could not, nor did they try to, respond to some 7,000 comments. They could never read, assess, grade and give feedback on a thousand 4,000-word essays. Unless, as I have experienced, you pay a fee. I did a MOOC with Oxford Brookes, paid a fee, achieved a distinction and have a certificate in 'First Steps in Teaching in Higher Education'.
As facts are like pins that secure larger chunks of knowledge, I ought to study such a FutureLearn FOC with a notepad; just a few notes on salient facts would help, so that's what I'll do next week and see how I get on. Not slavishly. I'll use a pack of old envelopes or some such. For facts to stick, rather than ideas to develop, the platform would have needed to have a lot of repetition built into it. Facts in an essay are like pepper in soup.
Armed with an entire module on research techniques for studying e-learning - H809: Practice-based research in educational technology - I ought to be able to go about this in a more academic, and less flippant, fashion.