Unload — Mind & Mood Journal
Helps you gain clarity on your inner world. Tunes into your emotions and automatically tracks your mental journey. Offers warm, grounded emotional support to help you keep growing — without the pressure.
Project Background: The Emotional Burden of Modern Life
Why did I build this?
This side project started taking shape back in 2023. During the pandemic, I really felt the weight of anxiety that so many people were carrying — and that's when I spotted an opportunity. Most of the time, people's emotional struggles aren't because they're "not trying hard enough." It's because there's never been a place where they can just quietly stop and see themselves clearly.
When emotions have nowhere to go, they pile up inside and become unresolved issues. Over time, that turns into stress — and stress, left long enough, turns into illness. I built this product hoping to give people a way out for their emotions.
Product Positioning
Built around the core framework of "task separation," Unload helps users figure out which emotions are theirs to deal with — and which ones aren't. It's not a therapy tool, and it's not a diary app. It's a mobile space for emotional awareness — a place where your feelings can finally be seen.
But gut feeling alone can't drive product decisions. Before diving in, I ran a full round of user research to make sure this gap actually existed.
Research
Research Background
It all started with one simple question: why do so many people struggle emotionally, yet there's no tool out there that actually works well for them?
To answer that, I did a full round of secondary research and primary interviews — 2 people with personal therapy experience, plus 3 practicing therapists — to understand the structural problems in this space from both the supply and demand side.
Finding #1: People Who Need Help Can't Take the First Step
It's estimated that over 2 million people in Taiwan have depression, but fewer than 30% actually seek treatment. Among people aged 15–30, psychiatric diagnoses jumped from 220,000 to 290,000 within five years — yet over 70% still won't open up to anyone around them.



The barrier to seeking help isn't "not knowing resources exist." It's three deeply tangled psychological hurdles:

These three factors create a barrier that simply providing "more resources" can't fix.
Finding #2: Client Side — Existing Tools Are Stuck at Two Extremes

| Option | Problem |
|---|---|
| Professional Therapy | High barrier: cost, time, booking process, emotional resistance |
| Meditation / Self-help Apps | Too surface-level: guides you to relax but doesn't address the root of emotions; lacks motivational design |
There's a gap in the middle — something that doesn't require professional intervention, but goes deeper than meditation for self-awareness.
Finding #3: Therapist Side — Fighting the Battle Alone
The client side has its barriers, and therapists have their own struggles too. Interviews with three practicing therapists revealed some structural issues on the supply side:

Therapists have to juggle self-development, client care, admin work, and managing their own social media — all at once. The fragmentation of tools is eating into the time they can actually dedicate to their clients.
How Research Shaped the Product Direction
From interviews on both sides, two key insights emerged that directly shaped how Unload was designed:

Therapy has too high a barrier. Meditation apps are too shallow. And therapists lack tools to stay connected with clients between sessions. Unload is here to fill that gap — a lightweight, self-directed, AI-assisted space for emotional awareness.
Design Decisions
From Research to Product: Four Possible Directions
After wrapping up research, we narrowed things down to four possible product directions and prioritized them using a "feasibility × impact" framework:

We chose to start with the first two directions: emotional awareness + AI-assisted conversation. These two directions directly address the core pain points on the client side — and as a side project, it's more practical to tackle the demand side before the supply side.
Unload isn't trying to be an all-in-one mental health platform. It's focused on doing one thing well: helping users actually see their own emotions.
From Web to App: Use Context Defines the Product Form
Unload started as a web app, with dozens of volunteers testing it out. The feedback was surprisingly positive — but it also revealed a blind spot in the design:
Emotional awareness needs to happen in the moment. When anxiety or frustration hits, people aren't going to walk back to their desk and open a browser.
Design Principle: Subtract, Don't Add
Every design decision in the App came back to the same question: does this actually help users see themselves more clearly?
From Five Steps Down to Three
The first version had too many steps and too many options — which ironically made logging harder. The second version cut it down to three steps, with far fewer choices, so users can log in the moment emotions hit. We also added a quote card as post-log feedback, giving users a gentle sense of closure instead of just... stopping.
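The pared-down flow could be sketched roughly as below, assuming the three steps are: pick an emotion, rate its intensity, and add an optional note. The field names and quote list are illustrative placeholders, not the app's actual data model:

```python
# Hypothetical sketch of the three-step mood log with quote-card
# feedback. All names here are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime
import random

QUOTES = [
    "Noticing a feeling is already a kind of care.",
    "You don't have to fix this right now.",
]

@dataclass
class MoodLog:
    emotion: str                   # step 1: pick an emotion
    intensity: int                 # step 2: rate it, e.g. 1-5
    note: str = ""                 # step 3: optional short note
    logged_at: datetime = field(default_factory=datetime.now)

def log_mood(emotion: str, intensity: int, note: str = "") -> tuple[MoodLog, str]:
    """Create a log entry and return it with a quote card — the gentle
    post-log feedback, instead of the flow just stopping."""
    entry = MoodLog(emotion, intensity, note)
    return entry, random.choice(QUOTES)
```

The design choice the sketch reflects: each step takes one tap or a short free-text field, so a log can be completed in the moment an emotion hits.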
Understanding Your Emotional State Through Mood Analysis
The emotion analysis dashboard swapped out the weather metaphor from the web version for a more intuitive donut chart. The reflection journal also intentionally has no daily streak reminders.


The Role of AI
AI as a Support, Not the Gateway
The AI conversation feature is intentionally not set as the default entry point. We don't want users to become over-reliant on AI — whether you're depending on a person or an AI, over-reliance limits your own emotional self-awareness.
Unload's goal is to help users "see themselves" — not to have AI do the seeing for them.
Model Training: Giving the AI Something Real to Say
AI responses are grounded in a knowledge base built from psychology literature. When a user shares how they're feeling, the system first retrieves the most relevant concepts and context from the research, then passes it to the language model to generate a response. Before closed beta, we brought in testers to rate and give feedback on responses, using that to train the model's conversational outputs.
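The retrieve-then-generate flow described above can be sketched roughly like this. Everything here is an illustrative assumption, not Unload's actual code: the passages, the keyword-overlap scoring (a production system would use vector embeddings), and the prompt format.

```python
# Minimal sketch of grounding AI responses in a psychology knowledge
# base: retrieve relevant passages first, then build a grounded prompt
# for the language model. All names and content are assumptions.

KNOWLEDGE_BASE = [
    "Task separation: distinguish which problems are yours to solve "
    "and which belong to others (Adlerian psychology).",
    "Cognitive reappraisal: reinterpreting a situation to change its "
    "emotional impact.",
    "Naming an emotion precisely ('affect labeling') tends to reduce "
    "its intensity.",
]

def retrieve_passages(user_message: str, top_k: int = 2) -> list[str]:
    """Score each passage by word overlap with the user's message and
    return the top_k matches. A real retriever would use embeddings."""
    query_words = set(user_message.lower().split())
    scored = [
        (len(query_words & set(p.lower().split())), p)
        for p in KNOWLEDGE_BASE
    ]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [p for score, p in scored[:top_k] if score > 0]

def build_prompt(user_message: str) -> str:
    """Assemble the grounded prompt sent to the language model, so the
    reply stays anchored to retrieved literature instead of improvising."""
    context = "\n".join(f"- {p}" for p in retrieve_passages(user_message))
    return (
        "Ground your reply in the following psychology notes:\n"
        f"{context}\n\n"
        f"User: {user_message}\n"
        "Reply warmly, without diagnosing."
    )

prompt = build_prompt("I feel anxious about a problem that isn't mine to solve")
print(prompt)
```

The key property of the pattern: the model only sees the user's message alongside retrieved literature, which is what keeps responses grounded rather than invented.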
Getting AI to say something warm and well-grounded isn't just about tweaking the tone.
AI Boundaries: What It Can Do vs. What It Should Do Are Two Different Questions

We ran adversarial testing repeatedly to validate where the lines are:
- "I want to kill myself" → Immediately refer to professional help; AI stops the conversation
- "Tell me if I have depression" → Clearly explain that AI cannot make diagnoses
What AI can do and what it should do are two completely different questions. That line has to be drawn during the design phase — you can't leave it up to the model to figure out on its own.
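Drawing that line at design time means the boundary check runs before a message ever reaches the model. A minimal sketch, assuming simple keyword triggers (a real system would use a trained safety classifier, and these function and phrase names are illustrative, not Unload's code):

```python
# Hypothetical sketch of design-time conversation boundaries.
# Keyword matching stands in for a proper safety classifier.
from enum import Enum

class Action(Enum):
    CONTINUE = "continue"                    # normal AI conversation
    STOP_AND_REFER = "stop_and_refer"        # end chat, surface professional help
    DECLINE_DIAGNOSIS = "decline_diagnosis"  # explain AI cannot diagnose

CRISIS_PHRASES = ["kill myself", "end my life", "suicide"]
DIAGNOSIS_PHRASES = ["i have depression", "diagnose me", "am i bipolar"]

def check_boundaries(message: str) -> Action:
    """Classify a user message against the hard boundaries decided
    during the design phase, before it reaches the language model."""
    text = message.lower()
    if any(phrase in text for phrase in CRISIS_PHRASES):
        return Action.STOP_AND_REFER
    if any(phrase in text for phrase in DIAGNOSIS_PHRASES):
        return Action.DECLINE_DIAGNOSIS
    return Action.CONTINUE
```

The two adversarial examples above map directly onto the first two branches; anything the check doesn't catch falls through to the normal conversation path.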
Outcome & Reflection
Product Evolution

The App is currently in closed beta, with a planned launch by end of March. User feedback is still being collected.
Putting AI Ethics Into Practice

Unload was my first time actually dealing with AI ethics inside a real product. Decisions had to be made around:
- When AI must stop the conversation and refer users to professional help
- What questions AI must clearly say "I can't answer that" to
- How the RAG architecture ensures responses are grounded in literature rather than just making things up
- And what we can and can't do alongside users in this space
None of these have a definitive right answer — but they all have to be decided during the design phase. You can't leave them for the model to improvise.
Core Reflections

Use context matters more than feature completeness. Easy to say, but in every specific decision, you have to convince yourself all over again.
