Are assessment methods working for electrical and electronic engineering students?

By Student Voice Analytics
Tags: assessment methods, electrical and electronic engineering

Broadly, yes. Across the National Student Survey (NSS), comments about assessment methods are predominantly negative, with 28.0% positive and 66.2% negative (sentiment index −18.8), so clarity, parity and flexibility matter. Within Electrical and Electronic Engineering, as grouped in the sector’s Common Aggregation Hierarchy, the overall tone is more positive (51.2% positive against 44.6% negative), but the friction points remain the design and communication of assessment, especially feedback and marking standards. These sector patterns shape the choices described below and explain why students in this discipline respond well when assessments are transparent, calibrated and paced across the programme.

Traditional vs modern assessment techniques?

The landscape of testing in Electrical and Electronic Engineering has shifted. Traditional written exams and coursework remain central, while online exams and project-based assessments now mirror professional practice and support flexible delivery. These approaches work best when teams publish a concise assessment brief per task, use checklist-style rubrics tied to outcomes, and run light marker calibration with exemplars so expectations are aligned for students and staff. Designing with mature and part-time learners in mind through predictable submission windows and asynchronous options for oral components strengthens perceived fairness for a diverse cohort.

What are the specifics of engineering assessments?

Assessments must test both theory and application. Lab work and practical tasks assess analytical judgement alongside hands-on problem-solving; group projects simulate collaborative engineering and shared responsibility; and lab reports require precise documentation and analysis. Staff improve reliability by agreeing marking criteria upfront, sampling double-marking where variation is highest, and summarising moderation decisions. A short, programme-level debrief after major tasks—flagging common strengths and pitfalls before individual marks—helps students understand standards and builds trust.

The online examination experience?

The move to online examinations introduces flexibility, including the 24-hour exam window, and better alignment with industry tools such as integrated development environments. To protect equity, universities provide robust technical support, contingency routes and clarity on allowed resources. A short orientation covering assessment formats, academic integrity and referencing conventions particularly assists students not domiciled in the UK. Accessibility built in from the start and carefully crafted question design maintain standards while reducing avoidable stress.

Course content coherence and industry relevance?

Assessment design works well when it tests transfer from classroom to workplace. Embedding real engineering case studies, using current simulation software and involving practitioners in project briefs adds authenticity. Mapping criteria to learning outcomes and stating how evidence of competence will be judged reduces ambiguity, while varied yet coordinated methods across modules avoid duplication and sustain student motivation.

University support and student–university communication?

Direct communication and visible support improve how students experience methods of assessment. Programmes strengthen delivery when they:

  • make the method unambiguous through a one-page assessment brief, checklist rubrics and indicative grade profiles
  • calibrate for consistency using a small set of anonymised exemplars and recorded moderation notes
  • reduce friction for diverse cohorts via predictable submission windows, early release of briefs, accessible formats and brief orientation for those unfamiliar with UK assessment conventions
  • coordinate at programme level with a single assessment calendar to avoid deadline pile-ups and method clashes between modules
  • close the loop with a short post-assessment debrief so students see action on common issues

How does the learning environment impact student wellbeing?

Assessment choices shape workload, predictability and confidence. Transparent criteria, timely formative comments and debriefs reduce anxiety and support retention. Group work benefits from explicit roles, fair marking approaches and early conflict resolution routes; practicals benefit from staffed lab access and clear safety and reporting expectations. Where assessment practice shows parity and consistency, wellbeing improves.

Technological tools and resources?

Simulation tools, virtual labs and digital lab books let students test designs safely and revisit work for revision. Automated text or code-checking supports formative feedback at scale when used with care. Remote elements still receive a mixed reception in this discipline, so setting expectations for format, interaction and materials, and keeping timetabling stable, helps delivery feel predictable and professional.

How Student Voice Analytics helps you

Student Voice Analytics shows where assessment methods go wrong and where they work. It segments your open-text comments by discipline and cohort to pinpoint issues in Electrical and Electronic Engineering, including feedback, marking criteria, online exams and programme coordination. You can track sentiment for assessment methods over time, compare like for like against the sector, and export concise, anonymised summaries and tables for module teams, quality reviews and boards, so actions on calibration, briefs, accessibility and assessment calendars are targeted and evidenced.
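
For readers who want a concrete picture of what this kind of segmentation involves, the sketch below shows one way labelled open-text comments could be grouped by discipline and summarised in Python with pandas. The column names, the toy data and the simple net-sentiment calculation are illustrative assumptions only, not the method or data model Student Voice Analytics actually uses.

```python
import pandas as pd

# Hypothetical comment-level data: each row is one open-text comment that has
# already been tagged with a discipline, a theme and a sentiment label.
comments = pd.DataFrame({
    "discipline": ["EEE", "EEE", "EEE", "EEE", "Maths", "Maths"],
    "theme":      ["assessment methods"] * 6,
    "sentiment":  ["positive", "negative", "negative", "positive", "positive", "neutral"],
})

# Share of comments per sentiment label within each discipline, as percentages.
summary = (
    comments.groupby("discipline")["sentiment"]
            .value_counts(normalize=True)
            .unstack(fill_value=0.0)
            .mul(100)
)

# A simple net measure: percentage positive minus percentage negative.
summary["net_sentiment"] = summary.get("positive", 0.0) - summary.get("negative", 0.0)
print(summary.round(1))
```

The same grouping logic extends naturally to cohorts or survey years by adding those fields to the groupby.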

Book a Student Voice Analytics demo

See all-comment coverage, sector benchmarks, and governance packs designed for OfS quality and standards requirements and the NSS.
