Supporting the Less Adaptive Student

By Christine Enowmbi Tambe

Updated Apr 08, 2026

1. Introduction

Students do not enter higher education with the same preparation, confidence, or study habits. In problem-based learning environments, that gap can widen quickly, especially for students who are used to teacher-centred teaching and are suddenly expected to take much more responsibility for their own learning.

This challenge is especially visible in demanding first-year quantitative modules. Students may struggle because they lack prior subject knowledge, because their preferred learning habits do not fit problem-based learning, or because both issues are present at once. Previous research [1] highlights that feedback best supports underperforming students when it arrives while they can still act on it, and Tempelaar's case study shows how learning analytics, assessment, and blended learning can work together to provide that support earlier and more effectively.

Instructors can draw feedback from either assessment results or learning analytics, but learning analytics is often the timelier source. By analysing data from e-learning systems, such as problems attempted, success rates, hint use, worked examples, and time on task, teams can identify students who may need support before final results make the problem obvious. These systems can then generate dashboards and predictions that help both staff and students monitor progress, review study tactics, and respond sooner. Tempelaar argues that dispositional learning analytics, combined with mastery learning platforms that offer assessment as, for, and of learning, can help address unequal opportunities by making support more targeted and actionable.
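As a minimal sketch of what such early identification might look like, the snippet below flags students from e-tutorial trace data of the kind the article mentions (problems attempted, success rates, and so on). The field names, thresholds, and rule are illustrative assumptions, not the case study's actual model; a real system would calibrate its criteria against historical module outcomes.

```python
from dataclasses import dataclass

@dataclass
class ActivityRecord:
    """One student's weekly e-tutorial trace data (hypothetical fields)."""
    student_id: str
    problems_attempted: int
    success_rate: float       # fraction of attempted problems solved
    hints_used: int
    worked_examples_viewed: int
    minutes_on_task: float

def flag_at_risk(records, min_attempts=10, min_success=0.5):
    """Flag students whose early engagement or success rate is low.

    Illustrative rule: low attempt counts suggest disengagement, and a
    low success rate despite engagement suggests a knowledge gap.
    """
    return [
        r.student_id
        for r in records
        if r.problems_attempted < min_attempts or r.success_rate < min_success
    ]

week1 = [
    ActivityRecord("s01", 25, 0.80, 3, 2, 140.0),
    ActivityRecord("s02", 4, 0.75, 1, 0, 20.0),    # barely engaged
    ActivityRecord("s03", 18, 0.33, 12, 9, 160.0),  # engaged but struggling
]
print(flag_at_risk(week1))  # ['s02', 's03']
```

The point of flagging in week one rather than after a quiz is exactly the timeliness argument above: the intervention can happen while it can still change the outcome.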

2. Instructional methods used in the case study

2.1. The role of dispositional learning analytics

To build a dispositional learning analytics system, institutions combine learning activity data from online environments with self-report data on students' dispositions, experiences, values, and attitudes. This matters because activity data alone shows what students did, while dispositional data helps explain why different students respond differently to the same learning design.

The result is more actionable feedback. Instead of treating all underperformance as the same problem, educators can connect patterns in engagement to specific interventions and better support students whose prior knowledge or learning approaches do not align neatly with problem-based learning.
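The sketch below illustrates the core idea of dispositional learning analytics as described above: the same engagement pattern maps to different interventions once self-reported dispositions are joined to the trace data. All names, scales, and the decision rule are hypothetical assumptions for illustration, not Tempelaar's model.

```python
# Hypothetical join of platform trace data with self-report dispositions.
activity = {
    "s01": {"success_rate": 0.35, "worked_examples_viewed": 14},
    "s02": {"success_rate": 0.38, "worked_examples_viewed": 1},
}
dispositions = {
    "s01": {"external_regulation": 0.8, "self_regulation": 0.3},
    "s02": {"external_regulation": 0.2, "self_regulation": 0.7},
}

def suggest_support(student_id):
    """Map similar underperformance to different interventions
    depending on disposition (illustrative rule only)."""
    a, d = activity[student_id], dispositions[student_id]
    if a["success_rate"] < 0.5:
        if d["external_regulation"] > d["self_regulation"]:
            # Externally regulated learners respond to imposed structure.
            return "structured practice plan with scheduled checkpoints"
        # Self-regulated learners can act on a strategy prompt.
        return "prompt to review strategy and self-set practice goals"
    return "no intervention"

for sid in ("s01", "s02"):
    print(sid, "->", suggest_support(sid))
```

Both students here have nearly identical success rates; only the dispositional data distinguishes which kind of support is likely to land.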

2.2. The role of assessment

Timeliness is central to actionable feedback. Assessment data is still valuable, especially because early assessment can be a strong predictor of final module performance, but each assessment type contributes something different. Tempelaar identifies three types of assessment that work best in combination when the goal is to support students early.

2.2.1. Assessment as learning

When learning is assessment-guided, as in digital platforms built around mastery learning, the most immediate assessment data comes from everyday learning activity. Assessment as learning happens through formative e-tutorials that present a problem, track whether the student can solve it, and then adjust the next task accordingly. If a student struggles, the system provides scaffolds such as worked examples or guidance for each step of the solution. The benefit is immediate visibility: staff and students can see patterns in progress from the opening weeks of the module, rather than waiting for a later test.
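The loop described above (present a problem, track the outcome, adjust the next task, scaffold on failure) can be sketched as a toy mastery-learning session. This is a simplified assumption of how such a platform behaves, not the actual e-tutorial logic: on success the difficulty advances, on failure the platform serves a worked example and repeats the same level.

```python
def run_mastery_session(outcomes):
    """Toy mastery loop over a fixed sequence of solve outcomes.

    On success, the next item is harder; on failure, the platform logs a
    worked-example scaffold and keeps the difficulty unchanged.
    Returns the full event log, i.e. the trace data staff could inspect.
    """
    difficulty = 1
    log = []
    for solved in outcomes:
        log.append(("item", difficulty, solved))
        if solved:
            difficulty += 1                     # advance to harder material
        else:
            log.append(("worked_example", difficulty))  # scaffold, stay put
    return log

print(run_mastery_session([True, False, True, True]))
```

Because every step is logged, the "immediate visibility" the article describes falls out for free: the event log itself is the learning-activity data that feeds the analytics.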

2.2.2. Assessment for learning

Assessment for learning can come from low-stakes summative quizzes, such as the fortnightly quizzes in Tempelaar's case study. Because these quizzes are aligned with the final examination, they can highlight students who are at risk of failing while there is still time to respond. Their limitation is speed. Quiz data is useful, but it arrives less frequently than learning activity data, so it is less effective when fast intervention is needed.

2.2.3. Assessment of learning

Assessment of learning takes the form of a final written examination. It remains important as the formal measure of achievement, but on its own it tells institutions too late which students needed support earlier.

2.3. The role of blended learning

In Tempelaar's case study, the blended learning design combined face-to-face workshops with technology-enhanced learning in an introductory mathematics and statistics module for business and economics students. Small-group problem-based learning, supervised by a subject expert, was the core in-person element, and attendance was mandatory. The online component, built around e-tutorials, was optional so that students retained responsibility for their own study decisions.

That flexibility did not mean leaving students unsupported. Students with limited prior knowledge were encouraged to use the e-tutorials more intensively because quizzes drew on the same question pool and contributed to final performance. In effect, the blended design offered extra structure without removing student autonomy.

This matters because problem-based learning only works well when students can monitor their understanding and adjust their study strategies. Digital platforms supported that process by letting students track practice performance, quiz preparation, and quiz results in both absolute and peer-relative terms. For less adaptive students, this created an additional route into the course: they could use digital feedback and assessment cues to navigate a learning environment that might otherwise feel unfamiliar or demanding.

3. Outcomes and conclusions

The study identified two broad profiles shaped by students' pre-tertiary learning approaches. One profile was more aligned with deep learning and self-regulation, which fits problem-based learning well. The other relied more on stepwise learning and external regulation, making it less adaptive in a setting where students are expected to direct more of their own learning.

Trace data from the e-tutorials showed that these groups took different routes through the online environment. Students with the less adaptive profile were more likely to rely on the external regulation built into the digital platform, particularly its assessment structure. Even so, the final module grades did not differ significantly between the groups.

That is the key takeaway for practice. When blended learning is paired with assessment-guided technology and timely learning analytics, students who begin with less adaptive learning habits can still succeed in a problem-based learning context. Tempelaar does note an important limitation: the findings are based on learning that took place inside digital environments, while self-study and face-to-face learning outside those platforms would require laboratory or field research to examine in more detail.

FAQ

Q: How does student voice factor into the development and refinement of learning analytics tools and feedback mechanisms?

A: Student voice helps ensure that learning analytics and feedback systems reflect how students actually experience a course, not just how staff assume they do. When institutions gather and use student feedback on clarity, workload, support, and usability, they can refine dashboards, feedback timing, and intervention design so the tools are more relevant to different student groups. That makes learning analytics more likely to support students who need structured guidance, rather than simply measuring them.

Q: What role does qualitative text analysis play in understanding and improving student engagement and learning outcomes in a blended learning environment?

A: Qualitative text analysis helps institutions understand the part of engagement that numbers alone can miss. By reviewing forum posts, survey comments, and other open-text feedback, educators can identify common themes, challenges, and perceptions among students. That deeper view helps teams improve blended learning design, clarify expectations, and tailor support in ways that better fit a diverse student population.

Q: How are the challenges of ensuring equity and inclusion addressed when implementing dispositional learning analytics and assessments in a diverse student body?

A: Equity and inclusion depend on using analytics and assessments in ways that recognise different starting points, rather than rewarding only one preferred learning style. That means designing tools and interventions that are sensitive to varied educational backgrounds, checking for bias, and monitoring outcomes across student groups. It also means involving diverse students in the development and review of these systems, so institutions can spot unintended barriers early and provide more appropriate support.

References:

[Source] Tempelaar D. Supporting the less-adaptive student: the role of learning analytics, formative assessment and blended learning. Assessment & Evaluation in Higher Education. 2020 May 18;45(4):579-93.
DOI: 10.1080/02602938.2019.1677855

[1] Pardo A. A feedback model for data-rich learning experiences. Assessment & Evaluation in Higher Education. 2018 Apr 3;43(3):428-38.
DOI: 10.1080/02602938.2017.1356905
