The inspectors are in. They are focusing on the progress the school is making in developing post-levels assessment at Key Stage 3. They decide to interview some subject leaders, and it’s the turn of Dawn, the Head of RE. The conversation begins:
HMI: Good to meet you, Dawn. As you know, we’re taking a close look at your KS3 assessment. We are confident that arrangements at KS4 are secure. We know you use GCSE criteria effectively to assess students and set targets, and we know the school uses data well to track student progress. So tell me: how have you been developing your practice in RE at Key Stage 3?
Dawn: It’s been a challenge. We had been wedded to levelling and sub-levelling students but dropping levels has liberated our thinking about assessment. We have a great SLT here and they listened carefully to the needs of each subject and didn’t impose a whole school model on us. SLT acknowledged that the data we submitted in the past was pretty hopeless as a way of monitoring the effectiveness of our work!
HMI: I like what I am hearing. Tell me more about your approach in RE.
Dawn: We did a lot of reading about assessment, and I think three things have driven our thinking. First, we recognised the difficulty of defining progress in RE as a linear process. We always had grave doubts about the 8 levels and found sub-levelling a nightmare. It always seemed artificial and unreliable. Frankly, we doubted whether the data it generated had much credibility. Although SLT liked it when we submitted our data, I always suspected they looked at the figures with a raised eyebrow. We were determined not to start re-inventing new levels!
HMI: I still like what I hear; that makes perfect sense, and I know that Ofsted’s own reports on RE have cast serious doubts on the use of levels in RE. What was your second reason?
Dawn: We had become really worried that our assessment practice was damaging our teaching and learning. We were distorting lessons by constantly referring to target levels. There was no real formative assessment going on. We wanted to restore opportunities to have meaningful conversations with students about the subject matter we are trying to study.
The third reason was about our own work as teachers. We were spending too much time and energy applying artificial levels to assessments rather than having real conversations with students about the ideas and insights in their work and how they might extend their thinking. We recognised that really great curriculum design is essential to effective assessment. We used some national guidance to help us re-think our curriculum design: http://reonlineorg.wpengine.com/religious-education-in-the-new-curriculum/section-2-learning/designed/
HMI: So what are you doing now?
Dawn: We focus our efforts on three things: making sure that we build real progression into the design of our curriculum; being really clear about what it would mean for different students to master the content of our curriculum; and helping students understand what each topic is about and how they can get on top of the material they are going to study. What we now do is ask questions and set assessments designed to show students what they have learned and what they still need to work on, and to identify ways to help them do this. No more mechanistic processes!
We spend a lot more time talking to students about the subject matter rather than their level. It helps us get to know our students. We know that ‘mastery’ means different things for different students and we are able to celebrate every student’s success. We also know when students are under-performing and need extra help or a kick up the backside! It’s not perfect, not when you teach 400 students a week, but it is so much better than the levelling nightmare of the past!
HMI: So what sort of data are you now providing to show student progress?
Dawn: There is no single imposed model in the school. SLT recognise there is no point in collecting ‘data’ that provides no meaningful information about genuine learning. At the end of each unit we can provide evidence about how well the students have mastered the content. We know which students have done well relative to their starting point and who needs to improve. We also know whether the topic has worked and whether it needs to be improved next time. It was clear to all of us that Ofsted wasn’t looking for a ‘one-size-fits-all’ model, and frankly we recognised that Ofsted was unlikely to ask searching questions about data on pupil progress in RE. What matters is the student experience, not the needs of the data monster. Our SLT have observed our practice and talked to the students. They know that students understand what the subject is about, what making progress means, how well they are doing and what they need to do to get better. We talk about these things all the time.
HMI: And you are right!