I use multiple choice quizzes a lot in assessing my information literacy classes at DCU Business School (one example is described here). The downside of MCQ quizzes is that they’re a bit boring, they can be easy to bluff your way through, and they arguably encourage (or can only assess) surface learning (Nicol 2007, p.54).
An interesting twist on MCQ quizzes is to instead get the students to write the questions, share them among the class, and ask them to answer, rate and comment on each other’s questions. Assessments like this have been tried for a while without any technology (Denny et al. 2008), and for a few years there has been a free social platform, PeerWise, which does all of this online.
In a recent webinar, DCU’s Eamonn Costello spoke of his own success using PeerWise:
- Students enjoy it - it can get very high participation rates. Many students surpass requirements - creating and answering extra questions because they enjoy it and find it slightly addictive.
- It encourages a higher level of learning. Designing a question demands a clearer, deeper understanding of the topic, forcing students to “ma[k]e explicit their understanding of the complexities of the subject matter” (Fellenz 2004).
Here’s an example of how PeerWise might be used in an assessment. A lecturer might ask students to use it to:
1. answer 10 MCQs written by other students
2. write 3 MCQs of their own
Not all the questions created by students will be great - some will be poorly phrased or unclear, or the student who wrote one might accidentally mark a wrong answer as correct. PeerWise addresses this by also allowing students to rate and comment on each other’s questions, so in this example the class could also be asked to:
3. rate all the questions they answer (good to bad, easy to difficult)
4. comment on 5 of them (“I think this Q might have been better if you had instead written…”, “Good question - I had been unclear on that idea but answering the Q forced me to understand it. I found a good explanation on this web page…”)
Participation is anonymous within the class, but the instructor can export the data with each profile linked to an identifier (say, a student number) for marking.
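To give a flavour of how that export could feed into marking, here is a minimal sketch in Python. It assumes a simple CSV export with one row per student; the column names (`student_id`, `questions_written`, and so on) are hypothetical, not PeerWise’s actual export format, and the thresholds are just the example scheme above.

```python
import csv
import io

# Hypothetical export - real PeerWise column names may differ.
SAMPLE_EXPORT = """\
student_id,questions_written,questions_answered,ratings_given,comments_given
12345678,3,12,12,5
23456789,1,10,10,2
"""

# Participation thresholds from the example assessment scheme above
# (assumes one rating is expected per answered question).
REQUIREMENTS = {
    "questions_written": 3,
    "questions_answered": 10,
    "ratings_given": 10,
    "comments_given": 5,
}

def met_requirements(row):
    """Return True if a student's activity meets every threshold."""
    return all(int(row[col]) >= minimum for col, minimum in REQUIREMENTS.items())

results = {
    row["student_id"]: met_requirements(row)
    for row in csv.DictReader(io.StringIO(SAMPLE_EXPORT))
}
print(results)
```

Running this on the sample data flags the first student as having met the requirements and the second as falling short (only 1 question written, 2 comments).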
I'm considering using this tool to assess library orientation for a large cohort of first-year undergraduate students later this year. As far as I'm aware, PeerWise has never been used in an info. lit. library context before.
With this in mind, I’d like to invite any subject librarians / liaison librarians / IL practitioners interested in it (or anyone else, for any reason) to play around with a test PeerWise class I’ve set up. This may inspire you to try PeerWise in your own work. If enough librarians try it out, it could become a useful shared resource - a pool of test-driven MCQs to be reused elsewhere. It’ll just take you a few seconds to register. Once you’re in you can take a look around, answer a few questions and add some yourself - I have already added a few questions to get things started. Here’s a guide to registering and getting started with it.
Let me know what you think, either with comments below or on Twitter. Do you think it can work for an info. lit. assessment?
- Denny, P. et al., 2008. PeerWise. Proceedings of the Fourth International Workshop on Computing Education Research - ICER ’08, pp.51–58. Available at: http://portal.acm.org/citation.cfm?doid=1404520.1404526.
- Fellenz, M.R., 2004. Using assessment to support higher level learning: the multiple choice item development assignment. Assessment & Evaluation in Higher Education, 29(6), pp.703–719.
- Nicol, D., 2007. E‐assessment by design: using multiple‐choice tests to good effect. Journal of Further and Higher Education, 31(1), pp.53–64.
- Sykes, A., Denny, P. & Nicolson, L., 2011. PeerWise: the Marmite of veterinary student learning. Proceedings of the 10th European Conference on e-Learning, Brighton Business School, University of Brighton, UK, 10–11 November 2011, Vols 1 and 2 (S. Greener & A. Rospigliosi, eds.), pp.820–830. Available at: http://eprints.gla.ac.uk/90693/1/90693.pdf.