Enhancing a Digital Evaluation Tool
National Center for Construction Education & Research
March 2024
Expertise
UX Design, UI Design, User Research
Tools
Figma, Dovetail, Jira
Platforms
Web, mobile, tablet
Project Overview

The National Center for Construction Education & Research (NCCER) wants to remain a leading market contender and continue to innovate through its digital evaluation tool. This B2B product allows evaluators to observe and submit performance evaluations on the fly, foregoing the need for paper records. With the Version 1 release, users struggled to efficiently assign and submit evaluations, resulting in poor adoption and an influx of customer support tickets. I set out to uncover the core pain points and implement changes to remedy these issues.

Background

NCCER is a non-profit recognized by the construction industry as the training, assessment, and career development standard for craft professionals. They offer a wide range of products in the B2B and B2C sectors, delivering training and up-skilling through a web of connected learning platforms. By leveraging new technologies and continuously advancing in their space, NCCER aims to be the leading provider of craft credentials.

What Are Credentials?

Learners work toward earning credentials, which prove their qualification for a specified craft and allow them to work on project sites. Two segments make up most credentials:

Knowledge

Through NCCER’s LMS, learners complete assigned modules, consisting of videos, readings, and interactive activities.

Performance

Learners demonstrate their knowledge of modules through performing hands-on activities, which are observed by certified instructors and evaluators.

Responsibilities
  • User Research
  • Project Management
  • UX & UI Design
Goal

Enhance the end-to-end evaluation process for instructors and evaluators, from assignment to submission.

  • Conduct User Research
  • Prioritize Insights & Design
  • Outcome
1. Conduct User Research

I dove right into a 4-week research sprint to better understand the gaps and to turn direct feedback into actionable design insights. I reached out to instructors and evaluators at schools and contracting companies.

  • User interviews (10 participants)
  • User surveys (22 responses)
User Interview Insights
1. Lengthy Assignment

The process of assigning evaluations to learners took too many clicks and was generally seen as cumbersome.

2. Navigation Friction

Some users struggled to get started within the web-app, which we traced back to the general layout and information architecture.

3. Accessibility

A portion of our users require the ability to access evaluations offline, due to their geographical locations.

User Survey Insights
1. Roster Management

Users expressed their need for more control over their groups and rosters, with the ability to view and edit at a deeper level.

2. Auto Assignment

Users operate under the assumption that when they assign modules in NCCER's LMS, the performance web-app will automatically load the corresponding modules.

3. Evaluation Preferences

Users felt “locked” into one way of evaluating, despite the fact that they commonly need to switch back and forth depending on their situation.

2. Prioritize Insights & Design

I met with product stakeholders to review the findings from the research sprint. We deliberated on which insights were high priority so that I could put together a design roadmap for the V2 build. We settled on three actionable items to focus the redesign:

1. Implement Auto-Assign

Investigate the LMS API and consider how we can automatically load modules into the performance web-app.

2. Reduce Assignment Time

Decrease the time it takes a user to select a learner, add performance modules to their record, and finalize the assignment.

3. Improve Information Architecture

Adjust the home page and the end-to-end flow through the web-app to guide users in the right direction.

Revisiting Auto-Assign

In the user interviews, 7 out of 10 participants said they expect corresponding performance modules to automatically load into the performance app when assigned in the LMS. The current process requires users to assign knowledge and performance modules twice, in their respective environments. This is redundant and a primary cause of user frustration.

Discovery

The team overlooked the LMS API for Version 1. It turns out that we could leverage the LMS to load modules simultaneously. If implemented, this would eliminate the assignment requirement entirely within the performance web-app and could resolve the vast majority of customer frustration.

Challenges

As we looked further into this discovery, we ran into problems. Business rules require that each assignment is attached to an instructor. Unfortunately, the LMS cannot support this functionality, so the “auto-assign” solution quickly fell apart.

We transitioned back to redesigning the existing assignment flow, but the question remained: how could we best leverage the LMS API to create a more efficient assignment workflow?

Through brainstorming with the engineering team, we discovered that we could highlight corresponding performance modules for learners who have existing assignments in the LMS. I integrated this concept into the modal design, getting as close to the “auto-assign” process as possible. This solution lets users assign modules quickly with a single multi-selection. We dubbed this feature the “Learning Platform Sync.”
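The matching logic behind this idea is simple: pull a learner's existing knowledge-module assignments from the LMS API, then flag the performance modules that share an identifier so the modal can pre-highlight them for multi-selection. A minimal sketch of that logic, assuming a hypothetical `sync_highlights` helper and illustrative module IDs (none of these names reflect NCCER's actual API):

```python
def sync_highlights(lms_assignments, performance_modules):
    """Flag performance modules whose matching knowledge module is
    already assigned to the learner in the LMS, so the UI can
    highlight them for one-click multi-selection."""
    # IDs of knowledge modules the learner already has in the LMS
    assigned_ids = {a["module_id"] for a in lms_assignments}
    return [
        {**module, "highlighted": module["module_id"] in assigned_ids}
        for module in performance_modules
    ]

# Example: the learner has knowledge modules 26102 and 26103 in the LMS.
lms = [{"module_id": "26102"}, {"module_id": "26103"}]
catalog = [
    {"module_id": "26101", "title": "Electrical Safety"},
    {"module_id": "26102", "title": "Electrical Theory"},
    {"module_id": "26103", "title": "Intro to Circuits"},
]
result = sync_highlights(lms, catalog)
print([m["module_id"] for m in result if m["highlighted"]])
# → ['26102', '26103']
```

Because the business rule still requires an instructor on each assignment, the highlighted modules are only pre-selected suggestions; the user confirms the assignment in one step rather than having it created automatically.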

End-to-End Assignment Process

In addition to the Learning Platform Sync, we knew there were more pain points to address across the entire assignment flow. The two main issues uncovered in my research were:

1. Too Many Steps

Users are currently required to navigate through folders one by one, with each craft, level, and module in its own modal.

2. Single Module Select

Users are constrained to selecting one module at a time to assign to a learner. This results in a redundant and lengthy process for each module a learner is required to perform.

With these issues in mind, I put forth a new flow intended to reduce assignment time considerably.

Reworking the Information Architecture

Users expressed a feeling of “Where and how do I get started?” Version 1 used a table that hosted static information: learners could not be interacted with, and group rosters offered little control.

Improvements to the information architecture included:

  • Highlighted important actions, using a FAB component for small device sizes.
  • Added assignment interactivity, so users can view learners and see a top-down overview of their assigned modules.
  • Added a deeper layer to each assignment, showing the list of learners it contains, with roster management controls.
3. Outcome

After designing solutions for the core problems we uncovered, I handed off the designs to our engineering team, and within a month we were ready to launch Version 2. Using tools like Hotjar and Google Analytics, we were able to track the impact of the updates.