For a decade I’ve settled into a blog post every morning.
For the last couple of weeks I’ve at least been forming a new habit. Whether 3.30am or 5.40am, I get up and work on fiction ’til breakfast. That out of the way, I can get on with the rest of the day. This leads to early nights. But am I missing much? ‘I’m a Celebrity… Get Me Out of Here’? No thanks.
A first from The OU: a TMA submitted as an audio file that came back with both written comments and an audio file of feedback.
Learning a language is a very different beast. Both in the text and the spoken word there is lots to pick up on.
Fig.1. Facts in an essay are like pepper in soup
How do you compare and mark a variety of Massive Open Online Courses (MOOCs)?
We need to treat them like one of those challenges they do on Top Gear, where Jeremy Clarkson, Richard Hammond and James May set off to Lapland in a Reliant Robin or some such and then get marks across six or so criteria. Hardly scientific, but it splits the pack.
So, let’s say we take THREE MOOCs, what criteria should there be?
- Commitment. What percentage of participants signing up complete the course?
- Comments. I use the word ‘vibrancy’ to judge the amount and nature of activity in the MOOC, so this is crudely reduced to the number of comments left.
- Likes. Another form of vibrancy, where comments left by the team and by participants are ‘liked’. It has to be a measure of participation, engagement and even enjoyment.
- Correct answers. Assuming, without any means to verify this, that participants don’t cheat: when tested, are they getting the answers right? This is tricky, as there ought to be a before and after test. Tricky too, as how one is tested should relate directly to how one is taught. However, few MOOCs, if any, are designed as rote learning.
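Out of curiosity, this Top Gear-style scoring could be sketched in a few lines of Python. Everything here is an assumption for illustration only: the equal weighting, the figures for the three imaginary MOOCs, and the 10,000-interaction ceiling used to normalise ‘vibrancy’ are all invented, not real data.

```python
# Hypothetical sketch of scoring MOOCs across the criteria above.
# All weights and figures are invented for illustration.

def mooc_score(signed_up, completed, comments, likes, quiz_correct, quiz_total):
    """Return a crude 0-100 composite score for one MOOC."""
    commitment = completed / signed_up        # completion rate (Commitment)
    vibrancy = comments + likes               # raw activity (Comments + Likes)
    accuracy = quiz_correct / quiz_total      # test performance (Correct answers)
    # Normalise vibrancy against an arbitrary ceiling so no single
    # criterion dominates; 10,000 interactions counts as "maxed out".
    vibrancy = min(vibrancy / 10_000, 1.0)
    # Equal weighting of the three normalised criteria.
    return round(100 * (commitment + vibrancy + accuracy) / 3, 1)

# Three imaginary MOOCs: hardly scientific, but it splits the pack.
moocs = {
    "A": mooc_score(12_000, 1_200, 3_500, 2_000, 70, 100),
    "B": mooc_score(8_000, 400, 9_000, 6_000, 55, 100),
    "C": mooc_score(20_000, 5_000, 1_000, 500, 85, 100),
}
ranking = sorted(moocs, key=moocs.get, reverse=True)  # ["B", "A", "C"]
```

A vibrant but leaky course (B) can outrank a well-completed one (C) under this weighting, which is exactly the kind of distortion a before-and-after test would help correct.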
You could still end up, potentially, comparing a leaflet with an encyclopaedia. Or, as a Senior Tutor on a course I took once put it, a rhinoceros with a giraffe.
It helps to know your audience and play to a niche.
It helps to concentrate on the quality of content too, rather than more obviously pushing your faculty and university. Enthusiasm, desire to impart and share knowledge, wit, intelligence … and followers with many points of view, ideally from around the globe, as I’ve found this will ‘keep the kettle boiling’. There is never a quiet moment, is there?
I did badly on a quiz in a FutureLearn Free Online Course (FOC). World War 1. Paris 1919. A new world order …
I think I got half right. I chose not to cheat, not to go back or do a Google search; what’s the point in that? I haven’t taken notes. I wanted to get a handle on how much is going in … or not. Actually, in this context, the quiz surely isn’t a test of what has been learnt, but a bit of fun. Learning facts and dates is, or used to be, what you did in formal education at 15 or 16. This course is about issues and ideas. A ‘test’, therefore, would be to respond to an essay title. And the only way to grade that, which I’ve seen successfully achieved in MOOCs, is for us lot to mark each other’s work. Just thinking out loud.
In this instance the course team understandably could not, and did not try, to respond to some 7,000 comments. They could never read, assess, grade and give feedback on a thousand 4,000-word essays. Unless, as I have experienced, you pay a fee. I did a MOOC with Oxford Brookes, paid a fee, achieved a distinction and have a certificate in ‘First Steps in Teaching in Higher Education’.
As facts are like pins that secure larger chunks of knowledge, I ought to study such a FutureLearn FOC with a notepad; just a few notes on salient facts would help, so that’s what I’ll do next week and see how I get on. Not slavishly. I’ll use a pack of old envelopes or some such. For facts to stick, rather than ideas to develop, the platform would have needed a lot of repetition built into it. Facts in an essay are like pepper in soup.
Armed with an entire module on research techniques for studying e-learning – H809: Practice-based research in educational technology – I ought to be able to go about this in a more academic, and less flippant fashion.
My tried and tested methodology, beyond the doomed ‘winging it’, is ‘concrete aggregate’. Over weeks or months I accumulate a lot of stuff, much of it in a blog like this; not quite a relational database, but the ‘stuff’ is here, tagged and of reasonable relevance. In a now defunct OU ePortfolio called ‘MyStuff’ or ‘MyOU’ – I forget – you could then shuffle and rank your gobbets of nonsense and so, discounting the volume of stuff, potentially have a treatment that could then be turned into an essay.
Such stuff, if it runs to 10,000 words, often with chunks of verbatim passages, can be a hell of a task to hack into shape. You build bold forms out of concrete and can only get it to look like a garden or park sculpture with a pneumatic drill and chisel. Sometimes it works. You get there. It is dry and workable. You’ll more than pass. It depends on the subject, the module and the specific expectations of the assignment. Where you need to tick many boxes this approach may work well.
Clay is the better way forward in most situations. Here you build up your arguments in logical steps, then refine them at the end. This, particularly in the social sciences, is where the tutor wants to see how you argue your case, drawing together arguments and facts, mostly those you’ve been exposed to in the module, though allowing for some reading beyond it. You have to express your opinion, rather than listing the views of others. Get it right and this is the only way to reach the upper grades. Get it wrong, which is the risk, and you may end up with a hollow or limp structure, with grades to match.
I have in front of me an Amateur Swimming Association (ASA) paper for the Level III Senior Club Coach certificate. There are 12 sheets, printed on the facing side only. The paper is waxed, copyrighted and stamped with the ASA logo. Having attended a day-long workshop on the topic, done some reading and drawn on my own experience, I complete this assignment and submit. It ought to be submitted as is; this is in part a test of authenticity. I have handwritten my responses. My habit and way of doing things is to have it in a Word document, so I load the text and tables, complete the required questions/tasks, print off and submit both parts. Invariably I get a note about the typed-up/printed-off version being so much better … it takes skills that even I lack to write anything in some of the minuscule boxes.
I was discussing on Monday with the ASA how to avoid plagiarism with e-assessments.
I mentioned Nottingham University medical students attending a computer-based assessment. I mentioned software that can spot plagiarism. I struggled, however, with the kind of forms the ASA uses, as these tests seem to have been written with the EXAMINER in mind … i.e. to make them easy to mark. Which also makes them easy to cheat on. The answer is the same, not open to interpretation, more or less. This isn’t strictly fair … papers are returned covered in red ink; I have redone one paper.
There has to be a sign-in process to identify the person taking the assessment.
How many people cheat? Is it such a problem?
Apparently so. Even with certificates and qualifications it appears easy to falsify documents. And often these determined people are excellent teachers/coaches who have learnt their trade as competitive swimmers and/or on the job, so they know what they are doing; they simply don’t have the piece of paper.
I also have in front of me a set of handwritten cards given to me by a colleague who has just taken her Level II Coaching certificate. She failed the written paper. She used these cards to test herself. My intention is to put these into Spaced-Ed, as an exercise, possibly to create or to begin to create a useful learning tool.
I like the way Spaced-Ed prompts you over the week, tests you on a few things, then leaves you alone. You have time to assimilate the information. Is it easy learning? It is easier learning … nothing beats a period of concerted effort and self-testing to verify whether you know something or not.
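For what it’s worth, that ‘prompt, test, leave you alone’ rhythm resembles spaced repetition. Here is a minimal Leitner-style sketch of how such scheduling can work, not Spaced-Ed’s actual algorithm; the doubling rule, the one-month cap and the day counts are all my own assumptions.

```python
# A minimal Leitner-style scheduler: the gap before a card reappears
# doubles each time you answer correctly and resets to one day when
# you get it wrong. Intervals in days are invented for illustration.

def next_interval(current_days, correct):
    """Return the number of days until this card is shown again."""
    if not correct:
        return 1                       # wrong: see it again tomorrow
    return min(current_days * 2, 30)   # right: double the gap, cap at a month

# One card over a run of self-testing: right, right, wrong, right.
interval = 1
history = []
for answer in [True, True, False, True]:
    interval = next_interval(interval, answer)
    history.append(interval)
# history == [2, 4, 1, 2]
```

The point of the lengthening gaps is exactly the one made above: you are left alone long enough to assimilate the information, and the card only returns when you are close to forgetting it.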
Whether electronic, or paper … or the spoken word, there is always a gap to bridge: a translation, as it were, between the information a person wants or needs to assimilate and the assimilation process itself.
Common to all is EFFORT.
Do you work hard at it for longer periods of time … or divide the task up into smaller chunks? Which works best? For you, or anyone? Is there a definitive answer? No. It will vary for you, as for anyone else. It will vary by motivation, inclination, time available, the nature and importance of the topic, and the degree to which the topic is covered in print or online, in workshops and in the workplace. Indeed, my contention would be that the greater the variety of ways you engage with the information, the better it will be retained and the more useful it will be when called upon in any of a myriad of ways.
I learn from writing something out by hand. I learn again when I type it up. I may not be engaging with it ‘in the workplace’, but there is engagement nonetheless through my eyes, hands and fingers. Similarly, the person who wrote out this pack of 71 cards (both sides written up) was preparing herself, after all, for a written exam. She knows her stuff poolside; her struggle (as I know is the case for many) is translating this into exam-like responses in a highly artificial setting, away from the pool and the swimmers, having to read words and respond in text, rather than reading an athlete (observation) and responding with a corrective drill or exercise.