My Impact
01
Led Meditation UX Design:
Designed research-informed Meditation user flow and wireframes.
02
Drove Spatial Interactive Design:
Led brainstorming with a focus on spatial interactions. Designed the 3D Call-to-Action and hand-tracking effects, modeled and animated the AI agent.
03
Bridged Design & Engineering:
Implemented hand-tracking effects and audio-reactive AI agent animation with engineers. Resolved Blender/Unity integration issues.
The Problem
In our fast-paced world, stress and emotional challenges are widespread. Tight schedules and overwhelming workloads leave little time for relaxation, making it a luxury for many busy professionals to recharge.
Solution Overview
Features
01
Talk it Through
Understanding emotions is key to managing stress. Flo, our AI companion, detects and visualizes subtle emotions in user speech as bright and dark bubbles.
02
Bubble Burst
To empower users to process their emotions both physically and visually, we incorporated accurate hand gesture tracking. Each bubble, when poked, opens a portal to a virtual healing forest.
Meditate in Immersion
Experience guided mindful meditation in VR, practicing deep breathing with visual cues and interacting with the scenery through hand gestures.
Starting Point
1.1 XR Hackathon with Open Topics: Where to begin?
The Global XR Challenge has only two constraints: the experience must be in Extended Reality and playable on Meta Quest or AR Glasses. With the freedom to choose any topic, the main challenge lies in identifying the "right" problem space to address.
As the team enthusiastically brainstormed, ideas were initially scattered. To focus the discussion, I posed a critical question:
"What problems are better solved with Extended Reality than with a traditional app or web?"
However, before answering that, we had to extract and understand XR's unique strengths.
1.2 With XR's strengths in mind, we began brainstorming using the Crazy 8s technique. To refine and evaluate ideas, I outlined key questions to help the team identify the most promising problem space.
Four problem areas emerged from the brainstorm. After evaluating the "Why", "What", and "How" of each, we concluded that an interactive VR therapy space offered the fewest inherent issues and the most design opportunities.
Articulate ideas into a narrative User Flow
2.1 We chose the most promising ideas from the brainstorm to refine, and AI naturally emerged as part of the solution.
A key highlight of this project was the strong collaboration between design and development. Engineers joined all design meetings, providing early input. While refining the concept of emotion bubbles, an engineer suggested using AI to analyze user speech and categorize emotions as positive or negative.
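The engineer's suggestion can be sketched in miniature: classify a transcribed utterance as a bright or dark bubble. This is only an illustrative stand-in assuming a trivial keyword heuristic; the word lists and function name are hypothetical, and the real build would call a proper speech-to-text and sentiment model.

```python
# Minimal sketch of the emotion-bubble idea: label a spoken phrase
# (already transcribed to text) as a "bright" or "dark" bubble.
# The word lists below are illustrative placeholders, not the model
# the team actually used.

BRIGHT_WORDS = {"happy", "calm", "grateful", "excited", "hopeful"}
DARK_WORDS = {"stressed", "anxious", "tired", "sad", "overwhelmed"}

def classify_utterance(text: str) -> str:
    """Return 'bright' or 'dark' depending on which emotion words dominate."""
    words = {w.strip(".,!?").lower() for w in text.split()}
    bright = len(words & BRIGHT_WORDS)
    dark = len(words & DARK_WORDS)
    return "bright" if bright >= dark else "dark"
```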
2.2 Empower users as the bridge between AR and VR
To amplify user participation and joy, we decided to empower users as the bridge between AR and VR, enabling them to transform their room into a complete virtual world by popping emotion bubbles and focusing gaze.
After finalizing the user flow, we split tasks: I took on the UX design for Phase 3: Interactive Meditation.
3.1 Understanding Meditation traditions and sciences.
To create a scientifically-backed, user-focused design, I reviewed reliable sources like Psychology Today, Mindfulness, and the National Library of Medicine to quickly understand meditation traditions and science before designing.
While there are numerous types of meditation, they are not entirely distinct from one another; most share the same essential elements.
I shared these research findings with the team to ensure everyone had a comprehensive understanding of the subject, informing details in their respective tasks.
3.2 Identify design opportunities in Meditation traditions.
During my research, I found that meditation processes are highly visual, relying heavily on imagining peaceful energy. This presents numerous opportunities for interaction and visual design.
3.3 Combine the traditional Mindfulness meditation process with guided imagery techniques and intuitive interactions in VR.
Phase 3: Interactive Meditation User Flow
Wireframes in sequence
Pulsing animation and breath visualization to guide calm breathing
Side view diagram: spatial relations between UI elements
3.4 Push for spatial interactions in Extended Reality
To advance spatial interaction design beyond 2D screens, I envisioned a spatial Call-to-Action (CTA): a cube that users activate by placing a hand inside it, switching actions by rotating it to a different face.
To reduce user effort, the idea was refined into a multi-face CTA, with a different action assigned to each face, so users simply press a face to commit.
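As a rough sketch of how the cube could know which face the user sees (and therefore which action a press commits), each face's outward normal can be compared with the direction toward the camera. The vectors and face names here are hypothetical placeholders, not the engine code we shipped.

```python
# Sketch: find the CTA cube face currently facing the user by picking
# the face whose outward normal best aligns with the camera direction.
# Vectors are plain tuples; a real engine would use its own math types.

FACES = {
    "front": (0, 0, -1),  # toward the user at default orientation
    "back": (0, 0, 1),
    "top": (0, 1, 0),
    "bottom": (0, -1, 0),
    "left": (-1, 0, 0),
    "right": (1, 0, 0),
}

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def facing_face(to_camera):
    """Return the face whose normal best aligns with the camera direction."""
    return max(FACES, key=lambda f: dot(FACES[f], to_camera))
```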
After drafting the Meditation user experience, I identified technical requirements to sync with engineers and assess feasibility.
4.1 Technical requirement check-list
After discussions, we scaled back the eye-tracking feature, as it's only available on the Meta Quest Pro.
We also dropped the biofeedback breathing visualization due to time constraints. While research shows it’s feasible through monitoring abdominal elevation changes with the controller, we prioritized other interactions and opted for a preset animation to guide breathing instead.
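For reference, the dropped biofeedback approach could be sketched as labeling inhale and exhale phases from the vertical position of a controller resting on the abdomen. The threshold here is an illustrative guess, not a tuned value.

```python
# Sketch of the scaled-back biofeedback idea: infer inhale/exhale phases
# from successive vertical (y) samples of a controller on the abdomen.
# The movement threshold is illustrative, not a calibrated value.

def breathing_phases(y_samples, threshold=0.002):
    """Label each consecutive sample pair as 'inhale', 'exhale', or 'hold'."""
    phases = []
    for prev, cur in zip(y_samples, y_samples[1:]):
        delta = cur - prev
        if delta > threshold:
            phases.append("inhale")   # abdomen rising
        elif delta < -threshold:
            phases.append("exhale")   # abdomen falling
        else:
            phases.append("hold")     # within noise threshold
    return phases
```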
4.2 UI/UX adjustments following the technical scale-back:
User Flow Change
Add a breathing animation to the AI Agent and position it at the center of the user's view, making it a multi-purpose object: AI agent, breathing guide, and constant focal point.
Pulsing animation + Biofeedback breath visualization
AI Agent + Pulsing animation
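The preset guide that replaced biofeedback can be sketched as a fixed-cycle pulse driving the agent's scale. The 4-second inhale / 6-second exhale timing and the amplitude are illustrative choices, not the shipped values.

```python
import math

# Sketch of a preset breathing guide: the agent's scale pulses on a
# fixed inhale/exhale cycle. Timings and amplitude are illustrative.

INHALE, EXHALE = 4.0, 6.0
CYCLE = INHALE + EXHALE

def guide_scale(t, base=1.0, amplitude=0.15):
    """Scale factor at time t: grows during inhale, shrinks during exhale."""
    phase = t % CYCLE
    if phase < INHALE:
        p = phase / INHALE                  # 0 -> 1 while inhaling
    else:
        p = 1 - (phase - INHALE) / EXHALE   # 1 -> 0 while exhaling
    # cosine easing so the motion reads as breathing, not ticking
    eased = 0.5 - 0.5 * math.cos(math.pi * p)
    return base + amplitude * eased
```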
Adjust eye-tracking effects to hand-tracking effects
Right hand swipe creates green aurora
Left hand swipe creates pink aurora
Lift either hand to grow a tree
Push both hands forward to create falling leaves
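The gesture-to-effect mapping above amounts to a small dispatch table. Gesture and effect names here are illustrative labels for the recognized hand poses, not the identifiers used in the project.

```python
# The hand-tracking effects above, expressed as a dispatch table:
# a recognized (gesture, hand) pair maps to one scene effect.
# Names are illustrative labels, not project identifiers.

GESTURE_EFFECTS = {
    ("swipe", "right"): "green_aurora",
    ("swipe", "left"): "pink_aurora",
    ("lift", "either"): "grow_tree",
    ("push_forward", "both"): "falling_leaves",
}

def effect_for(gesture, hand):
    """Return the scene effect for a recognized gesture, or None."""
    return GESTURE_EFFECTS.get((gesture, hand))
```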
Finalize Visual Style
5.1 Explore different mood boards
5.2 I created two AI Agent styles, and the team chose the abstract design.
Meditation Hi-fidelity design in sequence
Final headset demo walkthrough
Take-away
Collaboration with developers can be more than just check-ins and handoffs—it can be highly creative.
We faced initial issues exporting meshes and animations from Blender and Figma to Unity, but resolved them using the Alembic (.abc) file format and a Figma-to-Unity converter.
With new tech and hardware, digital experiences are no longer limited to 2D screens. I'm proud I dove in—though intimidating at first, learning by creating is the best approach.
Thank you for reading!