Are applied psychology students positive about remote learning?

Published Apr 15, 2024 · Updated Mar 05, 2026

remote learning · applied psychology

Applied psychology students are more positive about remote learning than the sector benchmark. Their comments still highlight two make‑or‑break needs: assessment clarity and a stable online rhythm.

Across the National Student Survey (NSS) open‑text comments (see our NSS open-text analysis methodology), sentiment for the remote learning theme is mildly negative overall (index −3.4). The applied psychology cohort, however, shows a small positive signal on remote delivery (+5.5), with 51.6% of comments positive. This analysis sits within our sector-wide view of online delivery and focuses on applied psychology, a Common Aggregation Hierarchy (CAH) subject area used across UK higher education to compare experience by discipline. Students consistently praise teaching staff (+43.7) but note the need for feedback processes that feel usable; feedback accounts for 8.6% of all comments.
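To make the headline figures concrete, here is a minimal sketch of how a net sentiment index could be derived from labelled comments, assuming the index is simply the share of positive comments minus the share of negative comments on a −100 to +100 scale. The comment counts below are hypothetical and chosen only to reproduce the published 51.6% positive share and +5.5 index; the actual Student Voice Analytics scoring may differ.

```python
# Illustrative sketch only: assumes a net sentiment index of
# (% positive - % negative), scaled to -100..+100. The real
# NSS open-text scoring used in this article may weight comments differently.
from collections import Counter

def net_sentiment_index(labels):
    """Return (% positive - % negative) for a list of 'pos'/'neg'/'neu' labels."""
    counts = Counter(labels)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    pos_share = 100 * counts.get("pos", 0) / total
    neg_share = 100 * counts.get("neg", 0) / total
    return pos_share - neg_share

# Hypothetical split: 516 positive, 461 negative, 23 neutral comments
# gives 51.6% positive and an index of +5.5, matching the published figures.
labels = ["pos"] * 516 + ["neg"] * 461 + ["neu"] * 23
print(round(net_sentiment_index(labels), 1))  # -> 5.5
```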

How do applied psychology students experience remote learning?

The transition from face-to-face to online formats changes how students connect with peers and staff. Synchronous tutorials help sustain interaction, but screen fatigue and variable engagement remain. Applied psychology reads more positively than many subjects, so teams can build on that strength with a consistent weekly rhythm, shorter segments in live sessions, and clearly signposted tasks. Parity for students who study asynchronously matters. Record sessions promptly, add concise takeaway summaries, and provide a single, stable link hub per module so students always know where to go.

How are assessments and exams adapted online?

Online assessment offers flexibility, but it exposes pain points that applied psychology students raise frequently: opaque criteria, uneven marking, and unpredictable turnaround (see common challenges in psychology assessments). This echoes the pattern in which feedback attracts a sizeable share of comments and reads negatively. Publish annotated exemplars, checklist-style rubrics, and clear grade descriptors. Specify submission formats clearly and state feedback turnaround times upfront. If you use proctoring or timed windows, set out contingency routes for connectivity issues and confirm any changes to assessment briefs or marking criteria in writing.

How are university support systems working for these cohorts?

Students value access to academic staff and the encouragement staff provide during remote study. General student support can feel harder to navigate online. Consolidate signposting into a single source of truth, offer virtual office hours alongside time‑zone‑aware alternatives, and ensure rapid responses through agreed channels. Counselling, wellbeing workshops, and personal tutor check‑ins help, but they work best when integrated with teaching touchpoints so support feels like part of the programme, not an add‑on.

How well do students access learning resources?

Access to e‑books, journals, and recorded content underpins success in remote study. Students report friction when platforms are fragmented or slow. Make remote‑first materials standard, with captioned recordings, transcripts, and low‑bandwidth versions. Library access is appreciated, so extend that strength by simplifying navigation, aligning reading lists with licensing limits, and monitoring usage to address bottlenecks quickly.

What improves interaction in tutorials, seminars, and workshops?

Breakout discussions, assigned roles within groups, and concise polls or quizzes stimulate participation and give immediate feedback on understanding. Staff moderation that prompts contribution and keeps focus on learning outcomes sustains momentum. Provide a short written recap after each session so students who cannot attend live still get the key points and can contribute on discussion boards.

What technical challenges matter most?

Inconsistent connectivity, audio problems, and platform glitches undermine learning, particularly in discussion‑heavy subjects. Providers can reduce friction with a short online orientation, device‑loan schemes, reliable streaming, and a straightforward route for reporting issues. Monitor the top pain points weekly and close the loop with a brief “what we fixed” update so students see action and regain confidence.

What should providers prioritise next?

Remote learning remains part of the blend for applied psychology. The priorities are assessment clarity, operational rhythm, and asynchronous parity, alongside sustained human contact. Programme teams that protect a predictable pattern for online activity and make criteria and feedback actionable see fewer concerns and better engagement.

How Student Voice Analytics helps you

  • Track topic volume and sentiment for remote learning and applied psychology over time, with drill‑downs from provider to school to programme.
  • Slice results by mode, age, domicile/ethnicity, disability, and subject groups to compare like with like.
  • Produce concise, anonymised summaries and representative comments for programme teams and governance.
  • Export tables and charts to brief stakeholders and support continuous improvement cycles.

Explore Student Voice Analytics to track remote learning sentiment and themes across your applied psychology programmes, with interpretation notes in our sentiment analysis guide for UK universities.

Request a walkthrough

Book a free Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and reporting designed for OfS quality and NSS requirements.

  • All-comment coverage with HE-tuned taxonomy and sentiment.
  • Versioned outputs with TEF-ready reporting.
  • Benchmarks and BI-ready exports for boards and Senate.

Prefer email? info@studentvoice.ai

UK-hosted · No public LLM APIs · Same-day turnaround
