20% Higher Engagement via Exam Analytics
Designed analytics features for exam reviews, resulting in a 20% increase in student engagement. The team and I achieved this by adding features that support students' learning experience: better visibility of system status, more user control through reset options, and more flexibility through chapter-level analysis.
Problem Statement
New users who initially subscribed for homework questions showed low engagement with exam reviews. Prior analysis indicated that higher engagement correlates with lower churn rates, higher NPS scores, and increased lifetime value (LTV).
Making Exam Reviews Easier for New College Subscribers
Our main customers are pre-med students working through demanding subjects like biology and chemistry, and they expect the product to be easy to use.
Our research found that they weren't using our exam reviews much because it was hard to see which questions they had answered correctly or incorrectly.
My Role in the Project
As the Product Designer, I collaborated with our CTO, Customer Support, and Engineering Team to boost exam review engagement.
My tasks included analyzing user feedback, brainstorming with engineers, designing the interface, and conducting usability tests to identify improvements.
Objective: Boost Fall Semester Engagement for New Students
With the Fall semester starting and our marketing campaigns underway, we aimed to enhance the exam review experience for new subscribers. This would directly affect our semester-long revenue.
Our focus was on identifying features that could both elevate user engagement and increase their lifetime value (LTV).
Discovery Phase
I began by collaborating with the Customer Success Team to gather student feedback via Intercom chat. After analyzing the input, I drafted a product brief outlining solutions for the identified issues.
The feedback pointed to three key areas needing improvement in the exam review interface:
Sidebar: Users wanted clearer indicators for each question's status to help them decide which ones to review before exams.
Dashboard: Students sought a way to easily track their progress to better allocate study time across topics.
List Page: Users requested a feature to retake tests and reset answers for more effective exam preparation.
Team Review & Sketches
I shared the findings with our CTO, Engineering Team, and Customer Success. Together, we planned and prioritized initial sketches for these interfaces for the upcoming sprint.
Exploring Charts for Dashboard Analytics
Evaluating List Page Sketches
High-Fidelity Prototyping
After initial sketches and talks with the front-end engineer, I created detailed prototypes for early usability tests. This would help us gather more feedback and confirm we were on the right path.
The prototypes required fine-tuning the navigation to clearly show question statuses like 'correct,' 'incorrect,' or 'incomplete.' We also added system statuses for summary videos at the start of each chapter.
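To make the status handling concrete, here is a minimal sketch of how a question's status badge could be derived on the front end. This is illustrative only, assuming a TypeScript codebase; the type and field names are hypothetical, not our production code.

```ts
// Illustrative sketch only, not our production code: one way the sidebar
// could derive a status badge for each question.

type QuestionStatus = 'correct' | 'incorrect' | 'incomplete';

interface ExamQuestion {
  id: string;
  chapterId: string;
  correctAnswer: string;
  selectedAnswer?: string; // undefined until the student answers
}

// Derive the badge shown next to each question in the sidebar.
function questionStatus(q: ExamQuestion): QuestionStatus {
  if (q.selectedAnswer === undefined) return 'incomplete';
  return q.selectedAnswer === q.correctAnswer ? 'correct' : 'incorrect';
}
```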
For the dashboard, I worked with the front-end engineer to evaluate chart libraries, weighing ease of integration against our delivery timeline.
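As an illustration of the kind of chart we were evaluating (the case study doesn't name the library we ultimately shipped), a stacked per-chapter progress chart in Chart.js might look like this; the element ID and data values are placeholders:

```ts
// Hypothetical example using Chart.js, one of the libraries a team like
// ours might evaluate. The element ID and numbers are placeholders.
import Chart from 'chart.js/auto';

const canvas = document.getElementById('progress-chart') as HTMLCanvasElement;

new Chart(canvas, {
  type: 'bar',
  data: {
    labels: ['Chapter 1', 'Chapter 2', 'Chapter 3'],
    datasets: [
      { label: 'Correct', data: [12, 8, 15] },
      { label: 'Incorrect', data: [3, 6, 2] },
    ],
  },
  options: {
    // Stacking makes the correct/incorrect split per chapter easy to scan.
    scales: { x: { stacked: true }, y: { stacked: true } },
  },
});
```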
Existing Exam Reviews Dashboard
Proposed Dashboard
Usability Test
I managed the entire usability testing process, from setting recruitment criteria to sharing the final report with the team.
Planning Usability Test
The recruitment criteria focused on two main categories: length of subscription and number of exam reviews viewed. This helped us narrow down to users who had engaged with exam reviews at some point but ended up dropping out of that funnel.
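In effect, the screen amounted to a simple filter over subscriber records. The sketch below shows the idea; the field names and thresholds are hypothetical, not our actual query:

```ts
// Hypothetical sketch of the screening logic; field names and thresholds
// are illustrative, not our actual data model.
interface Subscriber {
  email: string;
  subscriptionWeeks: number;      // length of subscription
  examReviewsViewed: number;      // lifetime count
  viewedReviewRecently: boolean;  // e.g. within the last 30 days
}

// Target users who tried exam reviews early on but have since dropped off.
function isRecruitmentCandidate(s: Subscriber): boolean {
  return (
    s.subscriptionWeeks >= 4 &&   // subscribed long enough to face exams
    s.examReviewsViewed > 0 &&    // engaged with exam reviews at least once
    !s.viewedReviewRecently       // ...but has since disengaged
  );
}
```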
I set up a Calendly page with a full week of time slots and, with help from our marketing manager, sent it to students who hadn't received any other overlapping communication from us.
I conducted seven usability tests. Tasks for finding progress and switching between chapters saw 100% completion, while tasks that involved using the data for exam revision saw 70% completion. I documented the results and shared the findings with the whole company, leading to revisions of our designs.
Usability Test Results
Post Test Survey Results
The four main design recommendations based on the usability tests were:
Add helper text (a tooltip) flagging questions that appeared on previous exams, based on professor recommendations
Add a scale to indicate how many answers were correct and incorrect
Reduce the amount of instructional text
Add the option to reset individual sections of exam reviews so students could retake those questions.
Iteration and Handoff
Following the design recommendations, we refined the interface to accommodate all question statuses and allow users to reset specific exam review sections.
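Conceptually, resetting a chapter just clears the saved answers so its questions return to the 'incomplete' state. A minimal sketch, reusing the hypothetical ExamQuestion shape from the earlier example:

```ts
// Illustrative sketch only: resetting a chapter clears saved answers so
// its questions return to 'incomplete'. Names are hypothetical.
interface ExamQuestion {
  id: string;
  chapterId: string;
  correctAnswer: string;
  selectedAnswer?: string;
}

function resetChapter(questions: ExamQuestion[], chapterId: string): ExamQuestion[] {
  return questions.map((q) =>
    q.chapterId === chapterId ? { ...q, selectedAnswer: undefined } : q
  );
}
```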
I collaborated with our engineering team to clarify the design objectives and user goals. After implementation, I conducted a final design QA to address any minor UI adjustments.
Scale-Enhanced Chart Designs for Improved Readability
Design for Resetting Individual Exam Review Chapters
Onboarding and Outcomes
Post-deploy, we developed an onboarding experience to guide users through the new analytics, navigation, and reset features. A marketing email was also sent to introduce the update before their next exam.
The outcome was a 20% increase in engagement, measured by questions solved within exam reviews, leading to extended lifetime value (LTV) for users who joined later that Fall semester.