Participants
 

Our target users were teachers and professors with experience using curriculum planning and instructional platforms. We focused on participants who regularly engage in lesson creation and classroom resource management.

Participants were recruited through email outreach and posters placed around the RIT campus. Six individuals were selected based on their fit with the user profile: four professors and two teaching assistants, all of whom had prior experience creating and managing course content in a lesson management system called myCourses. This sample size was determined by both the project budget and the need for a sufficient dataset to draw meaningful, actionable conclusions.

Each participant was given a brief introduction to the New Path Learning platform, followed by task cards outlining specific actions to complete within the "Build-A-Lesson" feature. Participants were asked to notify us when they believed they had completed each task. They were not informed that their performance was being timed. The moderator evaluated task success based on whether the participant met the defined criteria within the benchmark time, without prompting or assistance.

Test Environment

All usability testing sessions were conducted at RIT’s Usability Testing Lab, a controlled environment designed to minimize distractions and ensure consistent testing conditions.

  • Location: RIT Usability Testing Lab

  • Device: Lenovo Legion running Windows

  • Browser: Chrome or a Chrome-based browser with a stable internet connection

  • Instructions: Delivered verbally by the moderator, with printed task cards provided for reference

  • Recording Setup: Screen recording of user activity and a front-facing camera to capture participant reactions

This setup ensured we could observe both on-screen behavior and non-verbal cues, providing richer insights during analysis.

Tasks

Task Scenario given to participants:

 

You are an 8th-grade educator who wants to create a custom lesson on chemical reactions. You want the lesson to be interactive, mixing media elements such as presentations, videos, and games.

Task 01

Start by creating a lesson on chemical reactions for your class. Give it a name and add any material you find relevant.

Task 02

You have diagrams that you drew on your computer and want to add them to your lesson. File name:

chemical-reactions.pdf

Task 03

Incorporate a video from YouTube on "Types of chemical reactions" and then rearrange the lesson elements in any order.

Task 04

Add a presentation from the gallery. You want to ensure that it aligns with the New York State curriculum for 8th-grade students.

Task 05

Go back to your dashboard and view the latest interactive element added to the lesson you built.

Previous Usability Tests

In 2022, New Path Learning conducted a user study focused on the sign-in and onboarding experience. The study also evaluated how users created classes and assigned work through the platform, using the System Usability Scale (SUS) to measure effectiveness. Based on the findings, the team recommended improvements related to task efficiency, system feedback, and interface clarity. These suggestions were implemented to varying degrees across the platform, helping to inform future iterations and laying the groundwork for continued usability enhancements.
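
For context, SUS scores follow a standard formula: each of the ten items is rated 1-5, odd-numbered (positively worded) items contribute (rating - 1), even-numbered (negatively worded) items contribute (5 - rating), and the summed contributions are multiplied by 2.5 to give a 0-100 score. The short Python sketch below is purely illustrative and was not part of the original study materials.

    def sus_score(responses):
        # Compute a System Usability Scale score from ten 1-5 Likert ratings.
        if len(responses) != 10:
            raise ValueError("SUS requires exactly 10 item responses")
        total = 0
        for item, rating in enumerate(responses, start=1):
            # Odd-numbered items are positively worded, even-numbered negatively worded.
            total += (rating - 1) if item % 2 == 1 else (5 - rating)
        return total * 2.5  # scale the 0-40 sum to a 0-100 score

    # Example: a fairly positive respondent scores 85.0
    print(sus_score([4, 2, 5, 1, 4, 2, 4, 2, 5, 1]))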

New Path Learning - 
A Lesson Management System

Usability Testing • Research Planning • Data Analysis


OVERVIEW

Industry/Market Segment:

Education/EdTech

Technology/Platform:

Web-based application, compatible with desktop and mobile devices

User Profiles:

Educators, students, and educational institutions

About

New Path Learning is a digital platform designed to support K–12 educators in crafting and sharing tailored lesson plans and curriculums. As the platform grew, instructors began voicing concerns about the "Build-A-Lesson" feature, a new tool used for planning lessons. Through usability testing and analysis, we identified pain points and proposed 8 actionable solutions to improve the overall experience for educators.

Relevance

User studies often uncover usability issues that go unnoticed during development, especially interaction problems that users struggle to articulate. For New Path Learning, past usability testing led to meaningful improvements in the platform’s functionality. Encouraged by those results, the team initiated another round of testing to continue refining the experience for both educators and students. This new phase aimed to uncover hidden friction points and ensure the platform evolves with its users' needs in mind.

CLIENT BRIEF

Understanding the Context

We began the project by meeting with the Chief Information Officer (CIO) of New Path Learning, who gave us a comprehensive walkthrough of the web application. He explained how educators interact with the platform, shared insights from previous user tests, and highlighted feedback gathered from customers. The platform is actively used by educators and students in schools across Rochester and surrounding areas.

A recurring theme in user feedback was confusion around the "Build-A-Lesson" feature, specifically difficulty creating lessons from scratch. It was unclear whether this stemmed from bugs or from poor feature intuitiveness. After a technical review, the engineering team found no major bugs that could explain the issues. This led the leadership team, including the CEO, to turn to usability testing as a strategic step to uncover deeper, experience-based issues before committing to any major redesign.


The 'Build-a-Lesson' Section

Project Goals

The primary goal of this project was to uncover usability barriers within the "Build-A-Lesson" feature that were impacting educators’ ability to efficiently create lessons. Our objective was to bridge that gap by identifying points of friction, understanding how educators navigate the platform, and proposing targeted design solutions.

 

Ultimately, the aim was to support New Path Learning in creating a more intuitive, accessible, and educator-friendly experience, ensuring the platform truly works for the people it's built to serve.

Plan

Our process began with understanding the platform and the context in which educators use it. We first conducted a heuristic evaluation to identify usability issues and gauge the overall user experience. From there, we prioritized the most critical issues that could be impacting user performance and satisfaction.

To explore these pain points further, we carried out usability testing with real users. The data collected from these sessions helped us validate and clarify specific usability problems unique to the "Build-A-Lesson" feature. Based on our findings, we developed a set of actionable recommendations to guide future design improvements.

EVALUATION AND TESTS

Understanding User Flow 
  • The user can register or log in to see 'My Dashboard', 'My Curriculum', 'Lesson Resources', and 'Class Management' in the header menu.

  • With the focus on the 'Build-A-Lesson' feature under 'My Curriculum', the user can start a new lesson and see a list of previously built lessons.

  • When starting a new lesson, the teacher can use pre-made templates from other teachers or add text and media from scratch, either by uploading files or by linking to another source on the internet.

  • Once a lesson has been created, it can be published to the teacher's class for that grade.

Heuristic Evaluation 

Following our initial meeting with the CIO, our team conducted a heuristic evaluation using Nielsen’s Ten Usability Heuristics as our primary framework, complemented by Chuck’s Heuristics for deeper categorization.

Each team member independently explored the "Build-A-Lesson" feature by attempting to create a complete lesson using all available tools. During this process, we identified and documented usability issues, assigning severity scores and categorizing them based on the heuristics.

We then regrouped to review our findings, align on issue severity, and consolidate our observations into a shared evaluation. This collaborative review ensured a well-rounded understanding of the platform’s usability challenges from multiple perspectives.

Key Findings

During our team whiteboarding session, we identified the most frequently violated heuristics that were contributing to user frustration. These included:

  • Inconsistencies in shortcuts and icon usage across the platform and in comparison to common interface patterns.

  • Lack of system feedback, such as missing previews or confirmation messages during critical steps.

  • Limited error recovery, making it difficult for users to undo actions or revert to previous versions if mistakes were made.

Each of these issues was thoroughly documented with contextual examples and actionable recommendations in the final report submitted to the client. Click here for the full report.

Plan of Action
What are we testing?

The heuristic evaluation served as a valuable foundation for planning our usability tests. It helped us pinpoint specific areas within the "Build-A-Lesson" feature that required deeper investigation with real users. Based on our findings, we identified five key focus areas for testing:

  1. Error prevention and recovery – How effectively users can avoid mistakes and recover from them within the lesson-building flow.

  2. Navigation and workflow – The overall ease and logic of moving through the "Build-A-Lesson" process.

  3. User control and freedom – The ability to manage and edit media, interactive components, and content without feeling restricted.

  4. Interface clarity and consistency – Understanding and recognition of UI elements such as buttons, icons, and tool labels.

  5. Help and documentation – How accessible and useful the support content is when users need guidance.

These areas became the foundation for our usability testing scenarios and task flows.

Discussing Findings

Quantitative

To evaluate the usability of the "Build-A-Lesson" feature, we tracked and analyzed the following key metrics during each session (a brief illustrative computation sketch follows the list):

  • Task completion time – How long participants took to complete each task

  • Task success rate – Whether tasks were completed correctly within benchmark criteria

  • Error rate – Instances of incorrect clicks or navigation down the wrong path

  • Help and confusion signals – Number of times participants asked for help or expressed that they were lost

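As an illustration of how these per-task metrics can be summarized, the sketch below uses pandas on a small set of hypothetical session data; the column names and values are invented for the example, and the actual study data and tooling may have differed.

    import pandas as pd

    # Hypothetical session log: one row per participant per task.
    sessions = pd.DataFrame({
        "participant":   ["P1", "P1", "P2", "P2", "P3", "P3"],
        "task":          ["T01", "T02", "T01", "T02", "T01", "T02"],
        "seconds":       [95, 210, 120, 185, 80, 240],        # task completion time
        "success":       [True, False, True, True, True, False],
        "errors":        [1, 4, 0, 2, 0, 5],                   # wrong clicks / wrong paths
        "help_requests": [0, 1, 0, 0, 0, 2],                   # asked for help or expressed confusion
    })

    # Aggregate the four metrics per task.
    summary = sessions.groupby("task").agg(
        mean_time_s=("seconds", "mean"),
        success_rate=("success", "mean"),
        total_errors=("errors", "sum"),
        help_requests=("help_requests", "sum"),
    )
    print(summary)

Success rate here is simply the share of participants who completed each task within the benchmark criteria.
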
Qualitative

All session recordings were transcribed using AI transcription software, then reviewed and edited by the team for accuracy. Each team member individually analyzed the transcripts and session notes to identify recurring themes, patterns in user behavior, and notable pain points. This thematic analysis provided deeper insights into users’ experiences beyond what the numbers alone could reveal.


"I assume it would be under Media"

- P5 assumed all possible types of media would be under one menu


"I feel like I added it but I don’t see it”

- P3 when they were not able to view the lesson they had created

"I thought that I could see my media in this tab, but I don’t really know what this is or if I added it by mistake."

- P2 expected their uploaded media to appear where it would in other learning management systems

"I cannot go any further like I cannot get the cursor to select this and it does not tell me I cannot."

- P1 when they were unable to upload a video and could not find a solution


"A message would have been nice." - when P3's progress was not autosaved


"Is there no add button?!"

- P4 when they were not able to add a presentation

ANALYZING RESULTS

As a team, we reviewed and aligned our individual findings to identify common usability challenges. Through discussion and synthesis, we surfaced several key themes that consistently appeared across sessions. These themes not only captured the core of the user experience issues but also reinforced trends observed in the quantitative data, helping us build a well-rounded understanding of the platform’s usability gaps.

1. Inadequate error recovery and prevention measures

2. Discoverability issues for functions and menus

3. Internal consistency issues causing confusion

4. External consistency issues affecting the user experience

5. Lack of feedback and guidance for new users

Inadequate error prevention and recovery

Participants struggled with vague or uninformative error messages that made it difficult to identify what went wrong or how to fix it. Instead of guiding users toward resolution, the messaging often left them confused and unsure of their next steps.

A common expectation among users was that their work would be automatically saved as they moved between different sections of the platform. The lack of an auto-save or draft-saving feature led to lost progress and visible frustration. One participant, after unintentionally losing their lesson, remarked that the system should have at least warned them before navigating away from the page, highlighting a critical gap in feedback and preventative design.

External consistency issues

Participants often relied on familiar interaction patterns from other platforms, but the system didn’t always meet those expectations. For example, all participants attempted to select multiple media files, such as PDFs and images, simultaneously, and several tried to drag and drop files from their local system, expecting it to work intuitively.

There were also habitual behaviors that the interface didn’t support.

 

Every participant tried to double-click on lesson names to open them, a common interaction in file-based systems. Additionally, navigation menu labels caused confusion; many participants misinterpreted their purpose based on unclear naming. For instance, the "My Media" section was expected to show previously uploaded content, but it did not behave as users anticipated, leading to repeated confusion.

Lack of guidance and feedback

Throughout the testing sessions, participants consistently encountered a lack of system guidance and feedback, which left them uncertain about their actions and progress.

  • Task 01: Unclear Lesson Creation
    When creating a new lesson, users were unsure where their lesson had been saved or how to access it later. There was no clear indication of its location within the interface.

  • Task 02: File Upload Issues
    Users received no confirmation or error messages after uploading files. Without visual or textual feedback, they were left guessing whether the uploads were successful.

  • Task 03: Media Addition Confusion
    When adding videos, participants struggled due to missing media titles and the absence of preview thumbnails. This made it difficult to select the correct file for upload.

  • Task 04: Challenges Adding Presentations
    New users were unsure how to import presentations from the NYS curriculum database. The interface lacked prompts or instructions, making the process unclear.

  • Task 05: Identifying Recent Interactables
    Participants attempted to sort their created lessons to find the most recent one, but the absence of a "sort by date" option made it difficult to determine recency.

These issues highlight a broader gap in communication between the system and its users, leaving them without the guidance or feedback needed to confidently complete tasks.

Discoverability Issues

The usability test surfaced several moments where users struggled to locate key features, revealing significant gaps in the platform’s discoverability.

  • Starting a Custom Lesson
    Half of the participants were unable to find a way to begin creating a custom lesson. They searched through various sections—such as Class Management, My Assignments, and Lesson Resources—without success, often circling back or guessing at next steps.

  • Adding a Presentation or Interactable
    When prompted to add a presentation, none of the participants succeeded in Task 04. The dropdown menus appeared confusing or vague, and users were unsure how or where to begin. The lack of clear labels or visual cues contributed to the confusion.

  • Selecting by Standard
    One participant captured the frustration well, noting: “The dropdowns were not intuitive for me to open; if one of them would have been open, I probably would have seen more.” This emphasized how hidden or collapsed interface elements prevented users from exploring available options.

  • Locating Saved Work
    Users also had difficulty identifying where their work was saved after lesson creation. In particular, they found it challenging to locate the most recently created lesson element within the My Lessons folder, leading to confusion and inefficiency when reviewing or editing content.

Overall, the platform's layout and labeling hindered users from discovering and accessing essential tools, slowing down task completion and increasing reliance on trial and error.

Recommendations presented


  • Rename the 'Build-A-Lesson' section to "Create Lesson" for a clearer call-to-action

  • Add error-prevention messages or implement auto-save functionality to minimize data loss

  • Restructure the navigation menu to place the 'Build-A-Lesson' section under Lesson Resources

  • Make the Help and Documentation menu clearer

  • Offer contextual help or an interactive wizard for new users

  • Provide specific and actionable error messages that help users understand the issue and how to resolve it

Final Deliverables

After completing our analysis, the team compiled a detailed presentation and report summarizing our findings and actionable recommendations. These were shared in a final meeting with the CIO to guide future platform improvements.

In the following quarter, the engineering team addressed key issues by implementing features such as autosave, support for double-click actions, and drag-and-drop functionality. They also enhanced the platform’s feedback system by adding confirmation and error messages where needed.

Subsequent customer feedback surveys showed a positive response from educators, who reported that the updated platform was noticeably easier to use and better aligned with their workflow needs.
