Here’s an innocent little grenade-with-the-pin-out question rolled into the conversation about whether TED-ED has provided us with a whole new way of engaging students by moving content out of class time: on the same day, Plashing Vole is asking whether we shouldn’t be making attendance at conventional university lectures compulsory.
It looks like exactly the kind of retro thinking that academics get accused of, given how much we hear about flipping, collaborative learning, students as producers etc. It could be dismissed as a product of the British higher education system, some kind of wistful cultural preference for discipline and proper behaviour. But as it happens I’ve recently been a bystander to the same deliberation, so it’s global. If we put so much effort into preparing lectures, if we pay a higher rate for their delivery, and if we still structure quite a bit of the discussion and assessment in our teaching around the lecture as marquee event, the logic goes, then why don’t we back ourselves up by making students attend them?
There are some messy vanities bundled up in this question. What does it say about my lecturing style or content if students vote with their feet and don’t show up? (Worse, what if they do, but spend the whole time quietly sledging on social media?) On the other hand, what happens to the students who do continue to attend, and start to feel like the last parishioners in a declining Anglican congregation? Surely they have a right to feel aggrieved?
But pride isn’t the real issue. My colleagues are genuinely worried that students who bypass lectures miss out on key content that would help them perform effectively in assessment. None are sure how far to go in providing compensatory alternatives, including lecture slides, lecture recordings, and even potted versions of key points in person afterwards. We’ll do our best, but there’s a point at which the email that says “I wasn’t at your lecture this week, if I missed anything important can you send me the notes?” does touch a tiny nerve.
So cheer up everyone: the case for the correlation between lecture attendance and grades is on our side, apparently. The data is presented very effectively in the presentation embedded in this post by Jon Tulloch. Jon is responding to Plashing Vole, and the detailed evidence he’s gathered is worth working through (it’ll take two minutes; you won’t even need a cup of tea).
There’s just the small problem—and Jon himself raises it in the final slide—that the clear correlation between attendance and grades doesn’t prove that attendance results in good grades; the causality could just as easily run the other way, with the students who were always going to do well being the ones who show up.
So is there a good reason to make students attend lectures? Should we try to manipulate attendance like a kind of loyalty program or radio competition, with prizes for showing up? Or are we looking at it like welfare reporting and parole, with penalties for missing an appointment? And how are we going to know who shows up, as class sizes increase? If you’re going to make something compulsory, you do need a standard of evidence strong enough to make the case for either incentives or penalties stick.
Obviously, Blackboard have a future vision for student end-to-end-lifecycle swipe cards at every corner of the campus, and will no doubt eventually microchip students for us, but until then we’re left with the pen and paper methods that already make the seminar roll call one of the most anachronistic and school-like of university practices. Is this really the tone we want to set, as we also try to explain the complexities of self-managed professionalism that university graduates will need in a churning employment market?
Perhaps a better question is this: if we were going to invent higher education right now, using all the tools available, and knowing what we know about how people learn, what would we include?
Dean Dad is asking if we’d include the standard length term or semester, for example; or whether we’d trial teaching broken into smaller chunks of time, given that completion rates weaken the longer it takes to complete a standard course. In the same vein, I think we can ask whether, if we were starting the whole show this week, we’d think “I know! In order to deliver the most important concepts and ideas, that I’ll want students to be able to retain accurately and review extensively over time, I’ll use spoken word. Brilliant.”
Of course we wouldn’t. In one of today’s articles about TED-ED and the capacity it offers for teachers to customise high quality content that can be used as preparation for time spent together, high school teacher Aaron Sams puts it like this:
“I asked myself, ‘What’s the most valuable thing to do with the face time I have with my students?'” he says. “And the answer was not, ‘Stand up and lecture them.'”
So that’s one thing: lecturing isn’t the best way to use people’s time together. It just isn’t.
But the big thing for me is that university education itself is post-compulsory. This is both simple legal practicality, and a principle that we should be careful not to mess with. Our governments might want more students to enrol, and they might want to hold us responsible for their retention. But we have the privilege of working with adults who have chosen to enrol in a university degree in the context of each of their lives, and it’s this hard and entirely personal choice—rather than any sparkly edtech solution or educational philosophy that we rustle up—that is the foundation of their agency as learners. That’s worth defending.