Optimizing onboarding & increasing retention at Sensible

Summary

As the first designer at Sensible.so, I was tasked with addressing key challenges around user retention and engagement. The platform, originally built by engineers, suffered from low retention and activation rates due to an inconsistent visual style, fragmented flows, and an unoptimized onboarding experience. This case study outlines how data-driven user testing, strategic design improvements, and a revamped onboarding experience increased retention and user satisfaction, transforming Sensible's user experience.

The problem

Key Issues Identified:
 

  • Low Retention and Engagement
    Users were dropping off early in their journey, particularly during the onboarding process and initial interactions with key features.

  • Trust Barrier in AI Reliability
    Users were hesitant to rely on the AI for critical document extractions and reviews, doubting its accuracy and feeling a lack of control.

  • Inconsistent Design Language
    A fragmented visual and interaction design led to a lack of trust and a disjointed user experience.

  • Non-Optimized User Flows
    Key user flows, particularly in document extraction and review, were cumbersome and lacked intuitiveness.

Company Type

Startup, AI

Role

Senior/Lead Product Designer, First Designer

Platform

Responsive Web

Team

Collaborated closely with Product Managers and Engineers

Design Process

Research

  • Conducted multiple user interviews.

  • Analyzed quantitative data from onboarding funnel metrics.

  • Benchmarked competitors’ experiences.

Ideation

Generated ideas and selected the final concept.


Prototyping


Testing

Conducted usability tests, gathered feedback, and iterated on the designs.

Read full Onboarding case study


Key Research Findings

  • The onboarding flow was too lengthy and asked for unnecessary information upfront.

  • Users were unclear about the value they would get after signing up.

  • Users doubted the AI’s accuracy and felt a lack of control, making them hesitant to rely on it for critical document extractions and reviews.

Revamping Onboarding to Improve Activation


  • Fast Path to the "Aha Moment": Streamlined the flow to quickly demonstrate the app’s core value.

  • Value Over Features: Focused on the benefits users gain, not just the features.

  • Interactive Onboarding: Introduced a Template Library for hands-on exploration without requiring user data.

  • Self-Service Flow: Built a scalable, independent onboarding option, reducing reliance on support (a simplified sketch of such a flow follows this list).
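To make the self-service flow concrete, here is a minimal sketch, assuming the flow can be modeled as an ordered list of steps that each demonstrate a piece of value, with the required path kept short so users reach the "aha moment" quickly. The step names, fields, and helper below are hypothetical illustrations, not Sensible's actual implementation.

```typescript
// Minimal sketch (hypothetical): the self-service onboarding flow as an
// ordered list of steps, each tied to the value it demonstrates.
type OnboardingStep = {
  id: string;
  label: string;
  valueDemonstrated: string; // what the user should understand after this step
  optional: boolean;         // optional steps keep the required path short
};

const selfServiceFlow: OnboardingStep[] = [
  {
    id: "pick-template",
    label: "Choose a document template from the library",
    valueDemonstrated: "Extraction works on familiar document types",
    optional: false,
  },
  {
    id: "run-sample-extraction",
    label: "Run extraction on a sample document",
    valueDemonstrated: "Structured data appears without any setup",
    optional: false,
  },
  {
    id: "upload-own-document",
    label: "Upload your own document",
    valueDemonstrated: "The same flow works on real data",
    optional: true,
  },
];

// The next thing to show is the first required step not yet completed.
function nextStep(completed: Set<string>): OnboardingStep | undefined {
  return selfServiceFlow.find((s) => !s.optional && !completed.has(s.id));
}

// Example: a user who has only picked a template sees the sample extraction next.
console.log(nextStep(new Set(["pick-template"]))?.label);
```

Keeping the required path to two steps mirrors the "fast path to the aha moment" principle: everything beyond the core value demonstration stays optional.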


Impact

The revamped, scalable onboarding process converted more sign-ups into active users, driving activation and growth.


50% Reduction in Support Requests

Self-service onboarding decreased reliance on the support team, creating a smoother experience for new users.


Building Trust to Retain Users

After launching the AI extraction feature, extensive user testing revealed a significant challenge: users were hesitant to trust the AI’s ability to extract data accurately. Many feared the results would be unreliable, creating a barrier to adoption.

What I Did to Build Trust:
 

  1. AI Transparency Mechanisms:
    Introduced confidence signals for each data extraction, clearly showing users the system's certainty level. This gave users insight into the AI's reliability and built confidence in its outputs (a simplified mapping is sketched after this list).

  2. Empowering Users Through Control:
    Added advanced settings to let users customize the behavior of the LLM, providing greater control over how data was extracted and processed. This ensured users could tailor the system to their specific needs and preferences.

  3. Human Review Features:
    Implemented a Human Review option, enabling users to interact with and correct AI outputs. This feature reduced uncertainty by blending automation with human oversight.
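The sketch below shows, in TypeScript, one way these three mechanisms could fit together: a per-field confidence score mapped to the signal shown in the UI, and an advanced setting that controls the threshold at which a field is routed to Human Review. Every name, threshold, and setting here is an assumption for illustration, not Sensible's production code.

```typescript
// Hypothetical sketch: mapping extraction confidence to UI signals and a
// Human Review queue. Names and thresholds are illustrative only.
type ConfidenceSignal = "high" | "medium" | "low" | "needs_review";

interface ExtractedField {
  name: string;
  value: string;
  confidence: number; // 0..1 score assumed to come from the extraction model
}

// Stand-in for the "advanced settings" a user could tune.
interface ExtractionSettings {
  reviewThreshold: number; // below this, a field goes to Human Review
}

// Map a raw confidence score to the signal shown next to each extracted field.
function toSignal(confidence: number, settings: ExtractionSettings): ConfidenceSignal {
  if (confidence < settings.reviewThreshold) return "needs_review";
  if (confidence >= 0.95) return "high";
  if (confidence >= 0.8) return "medium";
  return "low";
}

// Fields flagged for review are queued for the Human Review flow,
// blending automation with human oversight.
function fieldsNeedingReview(
  fields: ExtractedField[],
  settings: ExtractionSettings
): ExtractedField[] {
  return fields.filter((f) => toSignal(f.confidence, settings) === "needs_review");
}

// Example: with a 0.6 threshold, a 0.45-confidence field is routed to review.
const flagged = fieldsNeedingReview(
  [{ name: "invoice_total", value: "$1,240.00", confidence: 0.45 }],
  { reviewThreshold: 0.6 }
);
console.log(flagged.map((f) => f.name)); // ["invoice_total"]
```

Surfacing the score, rather than hiding low-confidence results, is what makes the transparency mechanism credible: users can see exactly when the system is unsure and when a human has the final say.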


Impact


  • 35% Increase in Retention Rates
    Simplified user flows and trust-building mechanisms ensured users stayed beyond the critical first 30 days.

  • 40% Increase in Feature Adoption
    Transparent and user-controlled AI features encouraged broader usage of document extraction tools.


Challenges and Learnings

Challenges:

 

  • Trust in AI: Users were initially skeptical of the AI’s accuracy, fearing unreliable outputs.

  • Balancing Automation and Control: Finding the right mix between automation and user intervention to meet diverse user needs.

  • Complex Onboarding: Simplifying the onboarding process while effectively showcasing the app’s value was challenging.

  • Scaling Self-Service: Building a scalable, self-service onboarding flow required careful design to reduce dependency on support.

 

Learnings:


  • Transparency Drives Adoption: Features like confidence signals and Human Review significantly increase trust in AI systems.

  • User Empowerment Matters: Giving users control, through advanced settings and customization options, fosters engagement and confidence.

  • Simplicity Wins: Streamlining onboarding and reducing friction accelerates user activation and retention.

  • Iterative Design is Key: Regular user testing and feedback loops are essential for uncovering pain points and driving meaningful improvements.
