Student Voice Analytics for Creative Writing — UK student feedback 2018–2025

Scope. UK NSS open-text comments for Creative Writing (CAH19-01-05) students across academic years 2018–2025.
Volume. ~986 comments; 96.9% successfully categorised to a single primary topic.
Overall mood. Roughly 55.6% Positive, 42.4% Negative, 2.0% Neutral (positive:negative ≈ 1.31:1).

What students are saying

Creative Writing students talk most about Assessment & Feedback, Learning Resources and staff support. The single largest topic is Feedback (8.0% share), and, unusually versus the wider sector, its tone is mildly positive (index 6.2; +21.2 points vs sector). Within Assessment, the pain point is instead Marking criteria (4.4% share), which remains strongly negative (−41.4) and is echoed by concerns about Assessment methods (−12.6). The pattern is familiar: students want explicit criteria, calibrated exemplars and predictable feedback turnaround.

“People and growth” themes are a clear strength. Personal development carries a very strong positive tone (71.9), and Teaching Staff are warmly regarded (41.7), both above sector on sentiment. Student support is also positive (24.1), while Personal Tutor references are net positive (11.5) but slightly below sector expectations on tone. Students value clarity, encouragement and responsiveness.

Learning resources are prominent by volume (7.4%) and positive overall (23.6). Under the surface, however, sub-topics diverge: Library sentiment is negative in this dataset (−12.5 vs +26.7 sector), and IT Facilities are a notable friction point (−30.0, below sector). This suggests the basics—reliable systems, access and discoverability—matter as much as specialist materials.

On delivery and operations, Remote learning is still a mild drag (−9.0), though Scheduling/timetabling is near neutral (5.0) and noticeably better than sector. Organisation/management is close to neutral (−1.6) and better than sector, and course communications, while a smaller topic by volume here, are less negative than sector when they do arise. Placements/fieldwork are scarcely mentioned (0.4% vs 3.4% sector), signalling they are not a defining feature of the student experience in this discipline.

Finally, collaboration dynamics deserve attention: Opportunities to work with other students carry a slightly negative tone (−7.1), suggesting groupwork expectations, structure and assessment could be made clearer.

Top categories by share (Creative Writing vs sector)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Feedback | Assessment and feedback | 8.0 | 7.3 | 0.7 | 6.2 | 21.2
Learning resources | Learning resources | 7.4 | 3.8 | 3.7 | 23.6 | 2.2
Personal Tutor | Academic support | 6.6 | 3.2 | 3.4 | 11.5 | −7.2
Teaching Staff | The teaching on my course | 6.2 | 6.7 | −0.6 | 41.7 | 6.2
Type & breadth of course content | Learning opportunities | 6.0 | 6.9 | −1.0 | 24.3 | 1.7
Student support | Academic support | 5.7 | 6.2 | −0.6 | 24.1 | 10.9
Personal development | Learning community | 5.7 | 2.5 | 3.2 | 71.9 | 12.1
COVID-19 | Others | 5.1 | 3.3 | 1.8 | −27.1 | 5.8
Remote learning | The teaching on my course | 4.8 | 3.5 | 1.3 | −9.0 | 0.0
Module choice / variety | Learning opportunities | 4.4 | 4.2 | 0.2 | 11.0 | −6.4
Marking criteria | Assessment and feedback | 4.4 | 3.5 | 0.9 | −41.4 | 4.3

Most negative categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Costs / Value for money | Others | 2.1 | 1.6 | 0.5 | −43.8 | 9.0
Marking criteria | Assessment and feedback | 4.4 | 3.5 | 0.9 | −41.4 | 4.3
IT Facilities | Learning resources | 2.2 | 1.2 | 1.0 | −30.0 | −16.0
COVID-19 | Others | 5.1 | 3.3 | 1.8 | −27.1 | 5.8
Assessment methods | Assessment and feedback | 2.2 | 3.0 | −0.8 | −12.6 | 11.1
Remote learning | The teaching on my course | 4.8 | 3.5 | 1.3 | −9.0 | 0.0
Opportunities to work with other students | Learning community | 2.9 | 2.0 | 1.0 | −7.1 | −8.1

Most positive categories (share ≥ 2%)

Category | Section | Share % | Sector % | Δ pp | Sentiment idx | Δ vs sector
Personal development | Learning community | 5.7 | 2.5 | 3.2 | 71.9 | 12.1
Teaching Staff | The teaching on my course | 6.2 | 6.7 | −0.6 | 41.7 | 6.2
Student life | Learning community | 2.8 | 3.2 | −0.3 | 35.5 | 3.4
Type & breadth of course content | Learning opportunities | 6.0 | 6.9 | −1.0 | 24.3 | 1.7
Student support | Academic support | 5.7 | 6.2 | −0.6 | 24.1 | 10.9
Learning resources | Learning resources | 7.4 | 3.8 | 3.7 | 23.6 | 2.2
Delivery of teaching | The teaching on my course | 3.6 | 5.4 | −1.9 | 23.6 | 14.9

What this means in practice

  • Make assessment clarity non‑negotiable. Publish annotated exemplars, use checklist‑style rubrics, and set realistic service levels for feedback turnaround. Close the loop by explaining how feedback connects to marking criteria and how to use it next time.
  • Strengthen the resource foundations. Where Library and IT issues surface, prioritise access and reliability: reading list availability, e‑resource discoverability, and stable, simple digital workflows for submission, feedback and seminars.
  • Support peer learning by design. If collaboration is part of the experience, set clear expectations, roles and assessment weightings, and provide light scaffolding (e.g., short, structured workshops or peer‑review templates) to reduce friction.
  • Keep delivery predictable. Maintain a single source of truth for schedules and changes, and be explicit about when and why remote is used. Small, consistent updates lower uncertainty and improve sentiment around course operations.

Data at a glance (2018–2025)

  • Top topics by share: Feedback (8.0%), Learning resources (7.4%), Personal Tutor (6.6%), Teaching Staff (6.2%), Type & breadth of course content (6.0%).
  • Clusters: the delivery & ops cluster (placements, scheduling, organisation, comms, remote) accounts for ~11.4% of all comments; the people & growth cluster (personal tutor, student support, teaching staff, delivery of teaching, personal development, student life) accounts for ~32.5%, with strongly positive tone.
  • Under/over‑representation vs sector: Placements/fieldwork are far less discussed here (0.4% vs 3.4% sector). Library carries a notably lower sentiment than sector (−12.5 vs +26.7).
  • How to read the numbers. Each comment is assigned one primary topic; share is that topic’s proportion of all comments. Sentiment is computed per sentence and summarised as an index from −100 (more negative than positive) to +100 (more positive than negative), averaged at category level.

How Student Voice Analytics helps you

Student Voice Analytics turns open‑text survey comments into clear, prioritised actions. It tracks topics and sentiment over time (by year) for every discipline, and works at whole‑institution level as well as for specific faculties, schools and programmes. It produces concise, anonymised theme summaries and representative comments so programme teams and external partners can act without trawling thousands of responses.

Critically, it proves change on a like‑for‑like basis with sector comparisons across CAH codes and by demographics (e.g., year of study, domicile, mode of study, campus/site, commuter status). You can segment results by site/provider, cohort and year to target interventions where they will move sentiment most. Export‑ready outputs (for web, deck or dashboard) make it straightforward to share priorities and progress across the institution.

Insights into specific areas of creative writing education