Do teacher training students trust marking criteria in UK higher education?

By Student Voice Analytics
marking criteria | teacher training

Mostly, no. Across National Student Survey (NSS) open-text comments, the marking criteria category skews negative, with 87.9% of comments negative and a sentiment index of −44.6. In teacher training, sentiment on criteria is similarly low at −45.6, while students place heavy emphasis on placements (16.1% of comments) and often discuss feedback, where sentiment sits at −18.8. The category aggregates sector-wide NSS comments about how criteria are presented and applied; teacher training refers to the UK subject grouping used for like-for-like comparisons. These patterns shape the experiences analysed below and point to practical fixes.

Programmes benefit when they prioritise usable, transparent criteria and foreground the student voice. Analysing structured surveys alongside open-text feedback enables staff to calibrate criteria, confirm whether they help students understand progress, and adjust assessment design where needed. Doing so aligns marking approaches with programme outcomes and the realities of school-based practice.
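
As a minimal sketch of what that triangulation might look like in practice, the snippet below groups open-text comments by theme, reports each theme's share of comments, and computes a simple sentiment index as the mean per-comment score scaled to run from −100 to +100. The records, field names and index definition are illustrative assumptions, not the NSS or Student Voice Analytics methodology.

```python
from statistics import mean

# Hypothetical comment records: a theme tag plus a per-comment sentiment score
# in [-1, 1]. Field names, scores and the index definition (mean score x 100)
# are assumptions for illustration, not the actual survey methodology.
comments = [
    {"theme": "marking criteria", "score": -0.8},
    {"theme": "marking criteria", "score": -0.4},
    {"theme": "placements", "score": 0.6},
    {"theme": "feedback", "score": -0.5},
    {"theme": "feedback", "score": 0.2},
]

def sentiment_index(records):
    """Mean per-comment sentiment, scaled to a -100..+100 index."""
    return 100 * mean(r["score"] for r in records)

# Group comments by theme, then report each theme's share and its index.
by_theme = {}
for r in comments:
    by_theme.setdefault(r["theme"], []).append(r)

for theme, records in sorted(by_theme.items()):
    share = 100 * len(records) / len(comments)
    print(f"{theme}: {share:.1f}% of comments, sentiment index {sentiment_index(records):+.1f}")
```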

What is unique about marking criteria in teacher training?

Marking criteria need to span evidence-informed theory and assessed practice. Students must integrate pedagogy, policy and subject knowledge with classroom management, lesson design and reflective practice. Traditional academic metrics alone rarely capture this duality. Programmes that translate learning outcomes into criteria that explicitly articulate what good looks like in both written work and practice-based tasks help students track progress and recognise strengths. Transparent, consistent criteria tailored to teacher education reduce mixed signals about performance and progression.

Where do expectations about criteria diverge from reality?

Students expect criteria to be relevant, fair and actionable. They report gaps between criteria published in assessment briefs and how they are applied in marking. Variation in marker interpretation undermines confidence and masks how to improve. Involving students in reviewing criteria and exemplars, and inviting questions before submission windows, surfaces points of ambiguity. Programme teams can then refine wording, align expectations across modules and clarify intentional differences where outcomes diverge.

How does feedback connect to marking criteria?

Feedback links performance to the criteria. Formative comments guide learning within modules; summative comments should reference rubric lines and explain judgements. Students say feedback that arrives late, is generic, or is not aligned to the criteria is hard to act on. Referencing the rubric directly, signposting the next step, and sequencing feed-forward opportunities before major submissions increase utility. Given that feedback sentiment in teacher training trends negative in sector data, programmes that standardise turnaround expectations and embed brief feed-forward touchpoints lift both learning and perceived fairness.

How can we improve transparency and clarity?

Students need to see criteria early, with the assessment brief, and to explore them in class or online. Checklist-style rubrics with unambiguous descriptors, weightings and common error notes reduce interpretation drift. Annotated exemplars at key grade bands demystify standards and support self-assessment. Where modules share outcomes, standardising criteria and highlighting any intentional differences up front prevents confusion. A short “how your work was judged” summary with each grade helps students connect output to judgement.
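
As one hedged illustration of how a checklist-style rubric with weightings can feed a "how your work was judged" summary, the sketch below rolls weighted criterion scores into an overall mark and prints a brief criterion-by-criterion note. The criteria, weights, marks and grade-band boundaries are assumptions for the example, not a prescribed scheme.

```python
# Illustrative weighted rubric: criteria, weights, scores and band boundaries
# are assumptions for this sketch, not a prescribed marking scheme.
rubric = {
    "subject knowledge":   {"weight": 0.30, "score": 65, "note": "secure, some gaps around misconceptions"},
    "lesson design":       {"weight": 0.30, "score": 72, "note": "clear sequencing and adaptive tasks"},
    "reflective practice": {"weight": 0.25, "score": 58, "note": "describes events more than it analyses them"},
    "academic writing":    {"weight": 0.15, "score": 60, "note": "criteria referenced but unevenly applied"},
}

# Weighted sum of criterion scores (weights sum to 1.0).
overall = sum(c["weight"] * c["score"] for c in rubric.values())

def band(mark):
    """Map a mark to a grade band; boundaries assumed for illustration."""
    if mark >= 70:
        return "distinction"
    if mark >= 60:
        return "merit"
    if mark >= 50:
        return "pass"
    return "fail"

print(f"Overall mark: {overall:.1f} ({band(overall)})\n")
print("How your work was judged:")
for criterion, c in rubric.items():
    print(f"- {criterion} (weight {c['weight']:.0%}): {c['score']} - {c['note']}")
```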

How can programmes improve consistency in marking?

Reliability improves when teams calibrate. Short calibration sessions using a bank of shared samples, with agreed notes recorded for students, align expectations. Exemplar libraries, moderation and light-touch audits identify patterning in marks and language. These steps do not constrain academic judgement; they scaffold consistent application of standards across assessors and placements.
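
A light-touch audit can be as simple as comparing each marker's average on a bank of shared samples against the team mean and flagging drift beyond an agreed tolerance, as in the sketch below. The marker names, marks and threshold are assumptions for illustration.

```python
from statistics import mean

# Hypothetical marks awarded by each assessor to the same bank of shared
# samples; names, marks and the drift tolerance are illustrative assumptions.
shared_sample_marks = {
    "marker_a": [62, 68, 55, 71],
    "marker_b": [58, 64, 52, 66],
    "marker_c": [70, 75, 63, 78],
}
TOLERANCE = 3.0  # flag markers whose mean drifts more than this from the team mean

team_mean = mean(m for marks in shared_sample_marks.values() for m in marks)

for marker, marks in shared_sample_marks.items():
    drift = mean(marks) - team_mean
    flag = "discuss at next calibration" if abs(drift) > TOLERANCE else "within tolerance"
    print(f"{marker}: mean {mean(marks):.1f}, drift {drift:+.1f} ({flag})")
```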

How do criteria shape professional development?

Criteria signal what the profession values. When they point directly to effective lesson planning, adaptive pedagogy and evidence-informed decision-making, students internalise standards they will later apply. If criteria become rigid or detached from practice, they risk misdirecting effort. Alumni reflections, mentor input from schools, and structured self-assessment can test fit with classroom realities and prompt iterative refinement.

What should programmes change next?

  • Involve students in criteria reviews and Q&A ahead of assessment windows.
  • Publish annotated exemplars for common tasks and align them to rubrics.
  • Calibrate markers and share “what we agreed” summaries with cohorts.
  • Standardise criteria where learning outcomes overlap; explain differences where they do not.
  • Provide rubric-referenced feedback within agreed timelines and include a short feed-forward action.

By implementing these steps, programmes make criteria more legible, reduce inconsistency, and strengthen readiness for school-based practice.

How Student Voice Analytics helps you

Student Voice Analytics surfaces where sentiment on marking criteria deteriorates and why, with drill-downs from provider to programme and cohort. It enables like-for-like comparisons for teacher training against the wider sector, including mode, domicile and age, so teams can target modules where tone is most negative. Exportable, anonymised summaries highlight priority fixes, track the impact of calibration and rubric changes over time, and help programme teams evidence progress to boards and external reviewers.

Request a walkthrough

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready governance packs.
  • Benchmarks and BI-ready exports for boards and Senate.
