Are we closing the feedback gap in health sciences?

By Student Voice Analytics
feedback
health sciences (non-specific)

Not yet. Across the National Student Survey (NSS), the Feedback theme remains net negative (57.3% negative; sentiment index −10.2), and comments from health sciences (non‑specific) students mirror that pattern exactly where clarity and timeliness matter most. Within this area, marking criteria attracts the lowest sentiment (−42.8), and scheduling pressures blunt the usefulness of feedback (−16.0). Placements and fieldwork remain a visible strength, featuring in 7.9% of comments, yet the gap between the work students submit and the guidance they can act on persists. The sections below set out practical moves to close that gap: calibration, exemplars and visible service standards.

What defines effective feedback in health sciences?

Feedback that maps to the assessment brief, references marking criteria and includes specific feed‑forward gives students a route to improvement. Students often describe comments as generic or insufficiently detailed; using concise rubrics with annotated exemplars reduces ambiguity and raises consistency across modules. Providers should track turnaround to an agreed service level and sample feedback quality for specificity and actionability. Text analysis helps spot patterns of vagueness or misalignment by module or marker so programme teams can intervene early.
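As a minimal sketch of that kind of text analysis, the snippet below flags comments that lean on stock phrases and reports a generic-comment rate per module, so teams can see where to sample first. The phrase list and the "module"/"text" field names are illustrative assumptions, not a reference implementation.

    from collections import defaultdict

    # Illustrative stock phrases that often signal generic feedback (assumption).
    GENERIC_PHRASES = ["good effort", "well done", "needs more detail", "see rubric"]

    def generic_rate_by_module(comments):
        """Return the share of comments per module that rely on stock
        phrases rather than criteria-specific guidance."""
        counts = defaultdict(lambda: {"total": 0, "generic": 0})
        for c in comments:  # each comment: {"module": ..., "text": ...}
            counts[c["module"]]["total"] += 1
            if any(p in c["text"].lower() for p in GENERIC_PHRASES):
                counts[c["module"]]["generic"] += 1
        return {m: v["generic"] / v["total"] for m, v in counts.items()}

    sample = [
        {"module": "HS101", "text": "Good effort overall."},
        {"module": "HS101", "text": "The method section omits your sampling rationale; add it before resubmission."},
    ]
    print(generic_rate_by_module(sample))  # {'HS101': 0.5}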

How can we reduce marking inconsistency?

Variability between tutors erodes confidence in grades and undermines learning. Run short calibration sprints where tutors co‑mark samples and reconcile standards, then document decisions in shared exemplars. Strengthen moderation with targeted spot checks on alignment to criteria and feed‑forward quality. Schedule regular, brief marker development sessions focused on borderline decisions, common errors and how to evidence judgement in comments. Communicate outcomes to students so they can see how consistency is maintained.
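Calibration is easier to sustain when its effect is measured. One simple measure, sketched below under invented data, is the mean absolute difference between co-markers' scores on a shared sample, tracked before and after a sprint; the data shape and marker names are hypothetical.

    from itertools import combinations
    from statistics import mean

    def mean_disagreement(grades):
        """grades maps each marker to the scores they gave the same
        co-marked scripts, in the same order (hypothetical structure)."""
        diffs = []
        for a, b in combinations(grades, 2):
            diffs.extend(abs(x - y) for x, y in zip(grades[a], grades[b]))
        return mean(diffs)

    before = {"tutor_a": [62, 55, 70], "tutor_b": [68, 48, 75]}
    after = {"tutor_a": [64, 52, 71], "tutor_b": [66, 50, 73]}
    print(mean_disagreement(before), mean_disagreement(after))  # 6.0 2.0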

How should course organisation support actionable feedback?

Organisation shapes whether feedback arrives in time to be used. Health sciences students balance academic work and practice learning; unclear requirements (for example, NHS ambulance service elements) or unmarked coursework stall progression and make feedback less useful. Name an owner for timetabling, keep a single source of truth for changes, and align assessment points so students can apply feedback in the next task. Where possible, sequence related assessments to enable iterative improvement.

How do we strengthen communication around assessment?

Busy staff calendars cannot be a barrier to feedback students can act on. Establish predictable channels—brief drop‑ins in teaching weeks, online Q&A windows tied to assessment deadlines, and short “how to use your feedback” guidance in each module. Share termly “you said → we did” updates on turnaround performance and format changes. Make expectations explicit in the assessment brief and invite quick clarification questions early in the cycle.

What revision support best prepares students for OSCEs?

Students benefit most from timely, specific guidance tied to observed performance. Provide OSCE‑aligned checklists, brief video exemplars and low‑stakes practice stations with rapid, criteria‑referenced feed‑forward. Use short, structured debriefs to help students prioritise the next skill to practise. Where timing is tight, triage comments to two strengths and two priorities, plus a signpost to a relevant resource or session.

How should assessment structure drive learning?

Assessment design should surface the intended learning and make progress visible. Write unambiguous assessment briefs, map comments to criteria, and avoid generic phrasing. Require markers to reference exemplars when noting threshold performance, and to provide one action the student can attempt before the next submission. In clinical skills, anchor comments in the scenario: what the student did, why it matters, and what to change next time.

How do we improve resource accessibility for feedback?

Students need straightforward access to teaching materials, exemplars and released feedback. Ensure the VLE (e.g., Learning Central) houses current rubrics, anonymised exemplars and short walkthroughs of marking criteria. Simplify navigation, standardise folder structures across modules, and support staff to use the platform consistently. Release summary patterns (common strengths, common pitfalls) so the cohort can adjust promptly.
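One possible standard layout is sketched below; the module code and folder names are assumptions rather than Learning Central defaults.

    HS101/
      assessment/
        brief.pdf
        rubric.pdf
        exemplars/            (anonymised, annotated)
      teaching-materials/     (organised week by week)
      feedback/
        cohort-summary.pdf    (common strengths, common pitfalls)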

Which support services make feedback more useful?

Personal tutors, skills teams and the Students’ Union can translate feedback into an improvement plan. Offer brief 1:1s focused on interpreting comments against criteria, and workshops on using feedback to plan revision or improve clinical technique. Train support staff in health‑sciences‑specific assessment language so advice is precise and consistent with programme standards.

How Student Voice Analytics helps you

Student Voice Analytics turns NSS and local survey comments into focused priorities for health sciences. It tracks sentiment and topics for feedback, assessment methods and scheduling, with drill‑downs to programme and module where available. You can benchmark against the wider subject area, spot where tone is weakest, and evidence change through on‑time rates and quality spot checks. Exportable summaries help module teams calibrate standards, refine assessment briefs and publish visible “you said → we did” updates.
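The platform's internals are not reproduced here, but the roll-up it performs can be illustrated: average each topic's comment sentiment into an index, then compare it with a sector benchmark to find where tone is weakest. The topics, scores and benchmark figures below are invented for illustration.

    from collections import defaultdict

    # Hypothetical labelled comments: (topic, sentiment score in [-1, 1]).
    comments = [
        ("marking criteria", -0.8), ("marking criteria", -0.2),
        ("placements", 0.6), ("scheduling", -0.3),
    ]
    sector_benchmark = {"marking criteria": -0.25, "placements": 0.4, "scheduling": -0.1}

    by_topic = defaultdict(list)
    for topic, score in comments:
        by_topic[topic].append(score)

    for topic, scores in by_topic.items():
        index = sum(scores) / len(scores)
        print(f"{topic}: index {index:+.2f}, gap to sector {index - sector_benchmark[topic]:+.2f}")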

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed to support OfS quality and standards conditions and NSS requirements.
