Evidence-led course design: strengthening persistence in MOOCs

Steps leading up a wall

The recent introduction of AI-powered functionality within Coursera provided an opportunity for us to explore how learners are engaging with our portfolio of MOOCs. These tools also raised important questions about learner behaviour and course design. In particular, we wanted to understand where learners were disengaging, what was limiting persistence, and how we might redesign our courses to better support success.

To address this, we undertook a structured transformation of a selection of our MOOCs, piloting AI-enabled features within Coursera and grounding our approach in learner data, academic judgement, and learning design expertise.

In this post, we share what we changed, what worked, what did not, and how these insights are now informing our degree programmes. This work was led by Catia Costa (Online Education Enhancement Manager) and me, Kseniia Vladimirova (Senior Coursera Programme Manager), in collaboration with Tim Hall (Senior Manager: Product Innovation) and Larisa Grice (Senior Learning Designer).


Tackling early disengagement: redesigning the beginning of our MOOCs

We started by auditing sixteen of our highest-enrolment MOOCs, and the results revealed a worrying pattern: a sharp drop-off immediately after the first few items in each course. In some business management courses, up to 48.3% of learners disengaged before progressing meaningfully.
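To make the shape of that drop-off concrete, here is a minimal sketch of the kind of retention-curve audit described above. The progress records and field layout are invented for illustration; they are not Coursera's actual data export format.

```python
# Illustrative drop-off audit: for each course item, what share of
# learners progressed at least that far? (Sample data is hypothetical.)
from collections import defaultdict

# Each record: (learner_id, index of the last course item completed)
progress = [
    ("l1", 0), ("l2", 1), ("l3", 7), ("l4", 0), ("l5", 12),
    ("l6", 2), ("l7", 0), ("l8", 9),
]

def retention_by_item(records, n_items):
    """Return, for each item index, the share of learners who reached it."""
    total = len(records)
    reached = defaultdict(int)
    for _, last_item in records:
        for i in range(last_item + 1):
            reached[i] += 1
    return [reached[i] / total for i in range(n_items)]

curve = retention_by_item(progress, 13)
# A steep fall across the first few indices flags early disengagement.
```

Plotting such a curve per course makes the "first few items" cliff immediately visible, which is how a pattern like the 48.3% early drop-off can be spotted at portfolio scale.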

In most cases, the issue was not academic quality. Instead, we identified several design challenges:

  • dense learning material
  • confusing navigation
  • limited scaffolding in early activities.

We realised that first impressions of a course are critical. If learners do not feel oriented, motivated and supported in the first weeks, many will not continue.

What we changed: onboarding, scaffolding and dialogue

We redesigned the beginning of the sixteen courses identified for change to create what we called a frictionless entry point. We:

  • introduced standardised onboarding materials to clearly welcome and orient learners
  • scaffolded learning from the very first item
  • embedded dialogue activities using the AI-powered Coursera Coach, encouraging learners to engage in structured, Socratic exchanges around key concepts.

These dialogue activities were not positioned as add-ons, but as integral learning moments. They prompted learners to reflect, apply ideas and test understanding early in the course.

Standardised introductory material to welcome and orientate learners

What happened: measurable gains in completion

The results of this optimisation surprised us: completion rates improved measurably across the redesigned courses.

Optimisation was about aligning pedagogy, learner behaviour and platform tools in a way that works for learners. This phase reinforced a simple lesson: thoughtful onboarding and early cognitive engagement matter more than we sometimes assume.

Applying the lessons: designing new MOOCs and progression routes

Once we had strengthened our existing portfolio, we applied the lessons to new course development through two approaches: progression routes into degree study and some new applied skills MOOCs.

Designing AI-supported progression routes into degree study

We began developing new Coursera Specializations designed to provide structured progression routes into degree study. Initially, we experimented with using Coursera’s AI tools to extract and repurpose content from existing degree modules.

This accelerated early prototyping. However, we quickly saw the limitations. AI-generated structures required substantial academic redesign to avoid repeating disengagement issues we had already identified.

In one case, we completely rebuilt a Specialization after reviewing the first iteration. In another, a second iteration required only targeted enhancement by academic subject experts, including:

  • integrated virtual labs
  • AI-supported assessments
  • role-play simulations.

This reiterated to us that AI tools are most effective when used in partnership with academic expertise, not as substitutes for it.

Progression routes into degree study: an example specialization

Designing new MOOCs around applied skills and learner experience

In parallel, we developed a new category of MOOCs focused on competencies that are often difficult to teach at scale, such as communication, professional presence and authentic self-expression.

In collaboration with The Royal Central School of Speech and Drama, we drew directly on the design principles established during our course optimisations and developed two new MOOCs: The Impactful Communicator and The Authentic Communicator. They included:

  • early and consistent scaffolding
  • reflective and practice activities that learners could apply in real professional contexts
  • structured feedback loops to reinforce progress and build confidence
  • integration of AI-supported course elements as natural components of the learning experience.

In total, we developed 12 new MOOCs in 2025. Across the board, these courses demonstrated higher engagement rates than those launched in previous cycles – validation of the design principles we have worked hard to establish.

Which of Coursera’s AI tools genuinely improved learning?

As we developed and redesigned our MOOCs, we piloted a range of AI tools integrated into Coursera and analysed over 285 pieces of learner feedback to understand their educational impact.

Some tools had a clear and measurable effect on engagement.

Dialogue activities using Coursera Coach were particularly powerful, and the learner feedback was exceptional:

  • 79% rated them as ‘very useful’.
  • 42.5% reported being challenged to think in new ways.
  • Learners engaged more deeply with key concepts, especially when dialogue was embedded early in a course.
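Headline figures like those above come from tallying categorical responses across the feedback set. A minimal sketch of that tally, using invented sample ratings rather than the real 285-response dataset:

```python
# Illustrative tally of categorical learner feedback.
# The ratings list is invented sample data, not the real responses.
from collections import Counter

ratings = ["very useful", "useful", "very useful", "not useful",
           "very useful", "useful", "very useful", "useful",
           "very useful", "very useful"]

def share(responses, label):
    """Proportion of responses that match a given label."""
    return Counter(responses)[label] / len(responses)

print(f"'very useful': {share(ratings, 'very useful'):.0%}")
```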

AI-assisted grading in peer review activities resolved a long-standing bottleneck in feedback cycles, reducing delays and preventing learners from stalling while waiting for peer responses.

Role-play simulations with AI personas, which let learners practise realistic professional scenarios, were highly effective for applied skills development, with consistently strong learner satisfaction.

Example of a role play activity used in mathematics

Other tools, such as AI dubbing and the AI course builder, required more caution: they raised data protection considerations under UK legislation and ethical questions from some academics. We therefore limited their use.

The overall lesson was clear: the question was never whether to adopt AI tools wholesale, but how to evaluate each one against learner benefit and pedagogical integrity.

From experimentation to systemic change: applying MOOC insights to degree programmes

Perhaps the most important outcome of this work has been the deep, evidence-based insights we have gained into what drives learner engagement and persistence. We learned:

  • where disengagement is most likely
  • how early scaffolding affects persistence
  • which forms of interaction sustain motivation
  • where critical transition points occur.

We now apply these insights to our degree programmes, and future work will focus on early engagement, persistence through academically complex content, and retention at key transition points.

Lessons learned and recommendations for practice

One of the most important lessons we learned is that not every experiment succeeds, and not every AI tool can be used in courses at scale.

For institutions exploring AI integration in online learning, we would highlight three practical recommendations.

1. Integrate learner feedback into AI design from the outset
If we were building courses with Coursera's AI tools again, we would invest earlier in feeding qualitative learner feedback directly into AI system prompts. This would reduce the likelihood of predictable design issues in the first iteration and limit the need for extensive manual correction.

2. Set clear structural constraints for AI-generated designs
We would establish parameters from the outset: for example, limiting courses to four units, excluding unnecessary discussion prompts, and placing dialogue activities strategically.

3. Prioritise motivation and learning outcomes over technical capability
Above all, we would continue to emphasise the defining principle of this project: AI tools are most helpful when they increase participants’ motivation to stay on a course and support them in achieving learning outcomes effectively.

Designing for persistence in an AI-enabled landscape

Our experience in 2025 showed that platform-level AI innovation does not automatically improve learning. However, when combined with intentional design, data-informed reflection and academic expertise, it can meaningfully enhance learner engagement without compromising academic integrity.

Completion rates improved. Engagement deepened. But most importantly, we developed a clearer understanding of how to design for persistence and success in fully online learning.

As AI tools continue to evolve, our focus remains constant: using them in ways that genuinely support learners and strengthen the quality of online education.


Photo by Kanhaiya Sharma on Unsplash