Fourteen Oxfordshire primary schools have collaborated in a research project that used new digital assessment technology to improve the quality of their children's writing and raise attainment levels. The trial also reduced teachers' workload and supported their Continuing Professional Development (CPD).
The trial was conducted using RM Compare, a new digital assessment solution from Abingdon-based e-assessment technology company, RM Results. The new approach to assessment and moderation has been pioneered by Steve Dew, head teacher at Church Cowley St. James School in Oxford, who brought together the group of previously unlinked schools to work together on assessing and moderating writing tasks undertaken by Year 6 children.
Steve Dew is now set to appear at a global conference, E-ATP 2019 in Madrid, later this month alongside RM Results, to share the findings of the research with the world’s assessment community.
He said: “We had previously led local partnership moderation and were looking for a valid way of assessing writing, whilst also hoping to cut down on the time it would take to get everyone involved. RM Compare and the power of Adaptive Comparative Judgement gave us the ability to work and achieve our goals in a completely different way. It’s such a simple process and had really positive feedback from all the teachers and head teachers involved. Each teacher receives a comprehensive overview of all of the children’s work, so we are collectively raising standards in a very collaborative way.”
RM Compare is designed to improve formative assessment and collaborative learning, especially in subjects where work is more open-ended, such as English or Art. The Adaptive Comparative Judgement (ACJ) technology is based on the Law of Comparative Judgement, which holds that people are more reliable at making comparative, paired judgements than absolute ones.
During the trial, Year 6 children in each school were set the same writing task, which was then uploaded to RM Compare. All of the teachers were then invited to assess the work, which was anonymised, as part of the moderation process.
RM Compare showed teachers successive pairs of work side-by-side on a screen, and the teachers judged which of the two better met the assessment criteria. By the end of each assessment, at least 20 teachers across the schools had viewed each child's writing, creating a collective professional consensus on what "good" work looks like across the 14 schools taking part.
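As a minimal illustration of the principle behind this process (not RM Compare's actual algorithm), repeated pairwise judgements can be aggregated into a rank order simply by counting how often each piece of work is preferred. The script names and quality scores below are invented for the example:

```python
# Sketch of comparative judgement: pairwise "which is better?" decisions
# aggregated into a rank order. Illustrative only -- not RM Compare's
# actual (adaptive) algorithm, which chooses pairings dynamically.
from itertools import combinations

# Hypothetical anonymised scripts with an underlying quality
# (in reality unknown; judges only ever see pairs).
scripts = ["A", "B", "C", "D"]
true_quality = {"A": 3, "B": 1, "C": 4, "D": 2}

# Each pair is judged once: the judge picks the script that better
# meets the criteria (simulated here from the hidden quality).
wins = {s: 0 for s in scripts}
for left, right in combinations(scripts, 2):
    winner = left if true_quality[left] > true_quality[right] else right
    wins[winner] += 1

# Rank scripts by number of pairwise wins.
ranking = sorted(scripts, key=lambda s: wins[s], reverse=True)
print(ranking)  # ['C', 'A', 'D', 'B']
```

In practice, ACJ systems use many judges, adaptive pairing, and a statistical model rather than raw win counts, but the core idea is the same: many simple paired judgements combine into a reliable shared ranking.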
Schools received the resulting data, which allowed them to see, for the first time, not only where children sat within their own school, but where they sat within the whole cohort of participating schools. This new method of assessment and moderation allowed teachers to understand children's performance in each writing task in fine detail, across various genres, making it more achievable to raise attainment.
Teachers could compare their own judgements with those of teachers from other schools to see whether they were in line with the professional consensus. Teachers were also able to see best practice in other schools and compare it with work from their own children. These outcomes helped to inform strategic lesson planning, whilst also aiding teacher CPD.
Steve Dew added: “RM Compare made it easy for us to connect. Whilst we undertook the trial with 14 schools that were geographically close, there is no reason why we couldn’t use this on a larger scale, nationally and even internationally.
“It’s reduced the teacher workload hugely – most of our teachers now leave school at half past four and don’t take any books home. The time that we used to spend marking, we can use more strategically – planning and delivering the highest quality of learning in the classroom.”