What UK Engineering (Non-Specific) Students Say: NSS Feedback Analysis (3,300 Comments, 2018–2025)

Key findings

  • 3,300 comments analysed across UK engineering (non-specific) programmes (2018–2025); 52% positive overall
  • Learning resources is the most-discussed topic (10.7% of comments, sentiment index 19.5)
  • Marking criteria is the biggest pain point (sentiment index -42.8, +2.9 vs sector)
  • Personal development is a clear strength (sentiment 68.0)

What students are saying

Engineering students centre their feedback on the building blocks of study: learning resources, course content, and assessment. The single largest topic is Learning resources, accounting for around one in ten comments (≈10.7%). Tone here is positive overall (index ≈+19.5), and the topic's share is far above the sector average, though sentiment sits marginally below the sector baseline (-1.9).

Course Content (type and breadth) is also prominent (≈9.5%) and net positive (≈+17.0). By contrast, Module choice/variety features less often but leans negative (≈−6.9) and sits well below sector on tone, signalling some appetite for clearer or more flexible pathways.

Assessment & Feedback is a major thread. Students talk about Feedback in 8.3% of comments with a near‑neutral balance (≈−3.1) but notably better than the wider sector on tone. However, two categories remain clear pain points: Marking criteria (≈−42.8) and Assessment methods (≈−31.1). When expectations, rubrics and exemplars are opaque, sentiment drops sharply.

Operationally, the delivery environment is comparatively steady. Remote learning is relatively common (≈6.8%) and positive (≈+8.5), substantially above sector tone. Organisation and management (≈5.2%) and Scheduling/timetabling (≈4.6%) also trend positive and sit well above sector benchmarks on sentiment, suggesting basic rhythm and predictability are broadly working for this cohort.

People‑centred support is a net strength: Teaching Staff are viewed warmly (≈+31.8) and Student support is positive (≈+14.1). Personal development stands out as a highlight (≈+68.0). Placements/fieldwork, by contrast, barely feature in Engineering compared with the sector overall.

Top categories by share (engineering vs sector)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Learning resources | Learning resources | 10.7 | 3.8 | 6.9 | 19.5 | -1.9
Type and breadth of course content | Learning opportunities | 9.5 | 6.9 | 2.6 | 17.0 | -5.6
Feedback | Assessment & feedback | 8.3 | 7.3 | 1.0 | -3.1 | 11.9
Remote learning | The teaching on my course | 6.8 | 3.5 | 3.3 | 8.5 | 17.5
Organisation & management of course | Organisation & management | 5.2 | 3.3 | 1.8 | 14.2 | 28.2
Scheduling/timetabling | Organisation & management | 4.6 | 2.9 | 1.8 | 13.5 | 30.0
Student support | Academic support | 4.1 | 6.2 | -2.1 | 14.1 | 0.9
Teaching Staff | The teaching on my course | 4.0 | 6.7 | -2.7 | 31.8 | -3.7
Personal Tutor | Academic support | 3.9 | 3.2 | 0.7 | 9.0 | -9.7
Delivery of teaching | The teaching on my course | 3.8 | 5.4 | -1.6 | 3.0 | -5.7
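The two Δ columns are simple differences between the subject figures and the sector baselines. A minimal sketch of the arithmetic (row values taken from the table above; the implied sector sentiment baseline is an inference from the stated delta):

```python
# Recompute the delta columns for a table row:
# delta_pp = subject share minus sector share (percentage points);
# delta_vs_sector = subject sentiment index minus the sector baseline.

def deltas(share, sector_share, sentiment, sector_sentiment):
    """Return (delta_pp, delta_vs_sector), rounded to one decimal place."""
    return (round(share - sector_share, 1),
            round(sentiment - sector_sentiment, 1))

# Learning resources: share 10.7% vs sector 3.8%; sentiment 19.5 with
# -1.9 vs sector implies a sector sentiment baseline of about 21.4.
print(deltas(10.7, 3.8, 19.5, 21.4))  # -> (6.9, -1.9)
```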

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Marking criteria | Assessment & feedback | 3.7 | 3.5 | 0.1 | -42.8 | 2.9
Workload | Organisation & management | 2.6 | 1.8 | 0.7 | -34.8 | 5.2
Assessment methods | Assessment & feedback | 3.7 | 3.0 | 0.7 | -31.1 | -7.4
COVID-19 | Others | 3.2 | 3.3 | -0.2 | -21.4 | 11.5
Module choice / variety | Learning opportunities | 3.3 | 4.2 | -0.9 | -6.9 | -24.3
IT Facilities | Learning resources | 2.6 | 1.2 | 1.4 | -6.2 | 7.8
Feedback | Assessment & feedback | 8.3 | 7.3 | 1.0 | -3.1 | 11.9

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Personal development | Learning community | 3.1 | 2.5 | 0.6 | 68.0 | 8.2
Student life | Learning community | 2.6 | 3.2 | -0.6 | 35.6 | 3.5
Teaching Staff | The teaching on my course | 4.0 | 6.7 | -2.7 | 31.8 | -3.7
Learning resources | Learning resources | 10.7 | 3.8 | 6.9 | 19.5 | -1.9
Type and breadth of course content | Learning opportunities | 9.5 | 6.9 | 2.6 | 17.0 | -5.6
Organisation & management of course | Organisation & management | 5.2 | 3.3 | 1.8 | 14.2 | 28.2
Student support | Academic support | 4.1 | 6.2 | -2.1 | 14.1 | 0.9

What this means in practice

  • Make assessment clarity non‑negotiable. Publish annotated exemplars, checklist‑style rubrics, and clear marking criteria for every assessment type. Calibrate expectations across markers and modules, and set a realistic service level for feedback so students can act on it.
  • Keep the operational rhythm steady. The positive tone around organisation, scheduling and remote learning suggests these foundations are working; protect them with a single source of truth for timetable changes, a weekly “what changed and why” update, and clear ownership for decisions.
  • Invest where students feel the benefit daily. Learning resources are a defining theme in Engineering. Maintain availability and reliability, reduce access friction (physical or digital), and signpost how to get the most from what’s provided.
  • Use course architecture to reconcile breadth with choice. Where “type and breadth” is valued but “module choice/variety” trends negative, map progression routes, show option trade‑offs early, and ensure outcomes are transparent.

Data at a glance (2018–2025)

  • Top topics by share: Learning resources (≈10.7%), Type & breadth of course content (≈9.5%), Feedback (≈8.3%), Remote learning (≈6.8%), Organisation & management of course (≈5.2%).
  • Cluster view:
    • Delivery & ops (placements, scheduling, organisation, comms, remote): ≈18.6% of all comments, with sentiment above sector for remote learning, organisation and scheduling.
    • People & growth (personal tutor, student support, teaching staff, delivery of teaching, personal development, student life): ≈22.8% of comments, strongly positive overall, led by Personal development.
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), then averaged at category level. Sector comparisons refer to like‑for‑like category baselines.
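The index described above can be sketched as follows. This is an illustrative formula only (the balance of positive over negative comments, scaled to -100..+100); the exact weighting used by the underlying analysis is not specified here:

```python
# Illustrative sketch of the sentiment index described above, NOT the
# exact method used in the analysis: the balance of positive over
# negative comments, scaled to the range -100..+100.

def sentiment_index(positive, negative, total):
    """Index from -100 (all negative) to +100 (all positive)."""
    return 100.0 * (positive - negative) / total

def category_index(indices):
    """Category-level figure: the mean of the constituent indices."""
    return sum(indices) / len(indices)

# 60 positive, 40 negative out of 100 comments -> +20.0
print(sentiment_index(60, 40, 100))  # -> 20.0
```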

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear priorities you can act on. It tracks topics and sentiment by year so you can see what is changing, where, and by how much—at whole‑institution level and for specific schools, departments and programmes.

It also enables like‑for‑like sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status), so you can evidence improvement relative to the right peer group. You can segment results by site/provider, cohort and year, and generate concise, anonymised summaries for programme teams and external stakeholders without trawling thousands of responses. Export‑ready outputs (web, deck, dashboard) make it straightforward to share priorities and progress across the institution.

How to use this data

This page presents sector-level student feedback analysis for engineering (non-specific), with sentiment benchmarks and topic breakdowns you can reference directly in institutional documents.

Use this for

  • Annual Programme Review (APR) — reference the top-categories table and sentiment benchmarks to contextualise your programme's results against the discipline.
  • TEF and quality enhancement — cite the sentiment index and sector delta columns as evidence of awareness of student priorities relative to the sector.
  • Professional body revalidation — draw on placement, assessment and support data for evidence of responsiveness to student feedback in your discipline.
  • Staff-Student Liaison Committees (SSLCs) — share the key findings and most-negative categories as discussion starters with student representatives.
  • New programme design — use the topic share and sentiment data to anticipate which aspects of the student experience will need proactive attention.


Recommended next steps

  1. Look for repeatability: which themes recur across years and modules?
  2. Check whether issues are structural (resources/staffing) or local (one module/team).
  3. Define what “good” looks like for the subject (examples, rubrics, assessment clarity).
  4. Track movement: do actions reduce volume/negativity for key themes next cycle?
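Step 4's tracking amounts to a year-on-year comparison of each theme's comment share and sentiment. A minimal sketch (the data structure and the 2024 figures are hypothetical; the 2025 Marking criteria figures are taken from the table above):

```python
# Year-on-year movement check for step 4. The 2024 figures are
# hypothetical; 2025 figures are from the most-negative-categories table.

cycles = {
    2024: {"Marking criteria": {"share": 3.9, "sentiment": -48.0}},
    2025: {"Marking criteria": {"share": 3.7, "sentiment": -42.8}},
}

def movement(cycles, topic, year_from, year_to):
    """Change in comment share (pp) and sentiment index between cycles."""
    a, b = cycles[year_from][topic], cycles[year_to][topic]
    return {"share_pp": round(b["share"] - a["share"], 1),
            "sentiment": round(b["sentiment"] - a["sentiment"], 1)}

print(movement(cycles, "Marking criteria", 2024, 2025))
# -> {'share_pp': -0.2, 'sentiment': 5.2}
```

A falling share plus a rising index, as here, is the pattern to look for after an intervention: fewer comments on the theme, and those that remain less negative.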

Cite this page

Student Voice AI (2025). "Engineering (non-specific) student feedback analysis (CAH10-01-01)." Student Voice AI. https://www.studentvoice.ai/cah3/engineering-(non-specific)/

© Student Voice Systems Limited. All rights reserved.