August '18


Author, Interviewer, Analyst, Moderator


A usability test for a new product feature built on the test-taking section of the website

Analytics was a new feature developed so expert subscribers at Clutch could better understand the academic areas they needed to improve. Our company had never worked on user-focused analytics projects before. To ensure we improved the interface as much as possible, we conducted usability testing.



Usability testing involves a series of steps:

  • Setting recruiting criteria
  • Scheduling interviews
  • Setting up and running tasks
  • Analyzing results
  • Documenting results

Recruitment Criteria

This testing focused on expert users. I determined the criteria with an engineer. These users had:

  • Subscribed for over two months
  • Used two or more exam reviews
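The two criteria above amount to a simple filter over subscriber records. A minimal sketch, assuming a hypothetical user-record structure (the field names and sample data are illustrative, not Clutch's actual schema):

```python
from datetime import date, timedelta

# Hypothetical subscriber records; fields are illustrative only.
users = [
    {"email": "a@example.com", "subscribed_on": date(2018, 3, 1), "exam_reviews_used": 3},
    {"email": "b@example.com", "subscribed_on": date(2018, 7, 20), "exam_reviews_used": 5},
    {"email": "c@example.com", "subscribed_on": date(2018, 2, 15), "exam_reviews_used": 1},
]

def meets_criteria(user, today=date(2018, 8, 1)):
    """Expert user: subscribed for over two months, used two or more exam reviews."""
    two_months = timedelta(days=61)  # rough two-month cutoff
    long_enough = (today - user["subscribed_on"]) > two_months
    return long_enough and user["exam_reviews_used"] >= 2

# Emails of users who qualify for the recruiting email.
recruits = [u["email"] for u in users if meets_criteria(u)]
```

With the sample data above, only the first user qualifies: the second subscribed too recently, and the third has used only one exam review.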

Scheduling Interviews

I created a simple email template to recruit these users. The incentive for users to answer our questions was a $10 Amazon gift card.

I opted to use Calendly to schedule interviews, since it automatically creates calendar invites and reminds users before their interviews.


Setting up and running tasks

I broke down my usability test script into sections:

  1. Test introduction
  2. Background questions
  3. Tasks introduction
  4. Tasks, given one at a time
  5. Follow-up questions
  6. Post-interview survey
  7. User debrief

The test introduction ensured users understood the questions and tasks ahead. If they didn't feel comfortable, they could opt out of testing.



These are the high-fidelity wireframes that were tested for usability.

You can check out the full prototype here


Analyzing the results

To analyze tests, I recorded each completed task as "pass" or "fail."

I combined the results with the debrief data and post-interview surveys to create a complete picture of usability testing. I also brainstormed findings and recommendations for solving the issues that came up.
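The pass/fail tallying can be sketched in a few lines. This is an illustrative example, not the actual analysis script; the participant IDs and task names are hypothetical:

```python
from collections import defaultdict

# Hypothetical results: one "pass"/"fail" outcome per (participant, task).
results = [
    ("p1", "open chapter analysis", "pass"),
    ("p1", "reset a section", "fail"),
    ("p2", "open chapter analysis", "pass"),
    ("p2", "reset a section", "fail"),
    ("p3", "open chapter analysis", "fail"),
    ("p3", "reset a section", "pass"),
]

def pass_rates(results):
    """Return {task: fraction of participants who completed it successfully}."""
    totals, passes = defaultdict(int), defaultdict(int)
    for _participant, task, outcome in results:
        totals[task] += 1
        passes[task] += outcome == "pass"
    return {task: passes[task] / totals[task] for task in totals}

rates = pass_rates(results)
```

Tasks with low pass rates are the ones that surface usability issues, which the debrief and survey data then help explain.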


Documenting results

I recorded each user interview so my teammates could watch at their own pace. I compiled the findings and recommendations for changing the interface into a report distributed to other teams so they could benefit.



Findings

  • The sidebar contains correct and incorrect indicators. This is helpful, as it allows students to revise problems before exams.
  • Rather than preparing for exam reviews in advance, users discovered which areas they needed to study only when attempting exam reviews, then went to the corresponding chapter videos.
  • The chapter analysis of a user's performance and learning ability was confusing. There were green and red bars but no indication of how many correct or incorrect answers they represented.
  • The instructions contained so much text that users wouldn't read them.
  • Worksheets and videos didn't always align.
  • It would be helpful for users to retake a section of a Clutch exam after studying so they could see their improvement for that topic.
  • Users wanted a place where they could ask questions about exam reviews.


Recommendations

  • Add helper text (tooltip) to indicate which questions were asked on previous exams (professor recommendations)
  • Add a scale to indicate how many answers were correct and incorrect
  • Reduce the amount of instructional text
  • Add the option to reset individual sections of exam reviews so students could retake those questions


Left: I updated the chapter analysis to insert a scale indicating how many questions were correct and incorrect. I then repositioned it so it could be clearly understood.

Right: I added an option to reset exam review questions.



The biggest challenge was completing usability testing within the allotted time. It took many meetings and presentations with different stakeholders to convey the importance of doing usability testing early: it would save us time and challenge or affirm our assumptions about how users interacted with the interface.

Stakeholders ultimately allowed me to conduct usability testing. However, I had to run the process alone. This made it difficult later for stakeholders to accept my proposed changes.


Even though I was allowed to conduct the research, stakeholders did not want to be involved. Including at least two or three stakeholders would have reduced any biases I might have had as the interviewer and designer, and would have made the results more convincing to them.

My takeaway was to insist that multiple stakeholders be involved in usability testing and research.