
Timeline: 12 months (ongoing)
Size: 6 people
Platform: Native mobile app (iOS-first)
As the Founding Product Designer, I led product design for Closet Genie, a native iOS fashion app focused on wardrobe-first styling and affordable fashion discovery. To accelerate iteration and validate ideas quickly, I also contributed to frontend implementation, working closely with engineering to translate core UX flows into production-ready interfaces.
Closet Genie brings lightweight AI to budget-conscious young women who want to look put-together without overspending—helping them build outfits from what they already own and find affordable pieces that complement their existing wardrobe. Rather than complex color analysis or body mapping, the app focuses on practical needs: "what can I wear with what I have?" and "where can I find an affordable version of what I saw online?" Gamification through styling challenges, avatar competitions, and community contests keeps the experience engaging rather than purely utilitarian.
- Designed the experience to prioritize inspiration-led shopping flows that naturally guide users from browsing to purchase.
- Optimized recommendations around what looks compelling on the avatar and aligns with trending aesthetics, rather than minimizing overlap between items.
- Used the user's existing wardrobe as contextual input to suggest complementary additions that expand styling possibilities and encourage continued exploration.
- Streamlined interactions and navigation to support fast, scrollable discovery and frequent engagement with new products.
- Emphasized complete-look and avatar previews to clearly communicate how additional pieces elevate the overall styling outcome.
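The wardrobe-as-context idea can be sketched as a simple scoring pass. This is a minimal illustration, not the production recommender; the item fields, weights, and function names are all hypothetical.

```python
# Hypothetical sketch: rank candidate items by how well they complement
# an existing wardrobe, favoring pieces that expand styling possibilities.

def compatibility(candidate: dict, owned: dict) -> float:
    """Score one candidate against one owned item (0.0 to 2.0)."""
    score = 0.0
    if candidate["palette"] == owned["palette"]:
        score += 1.0  # shares a color story with something already owned
    if candidate["category"] != owned["category"]:
        score += 1.0  # completes an outfit rather than duplicating a slot
    return score

def rank_candidates(wardrobe: list[dict], candidates: list[dict]) -> list[dict]:
    """Order candidates by total compatibility with the whole wardrobe."""
    return sorted(
        candidates,
        key=lambda c: sum(compatibility(c, item) for item in wardrobe),
        reverse=True,
    )

wardrobe = [
    {"category": "top", "palette": "neutral"},
    {"category": "shoes", "palette": "neutral"},
]
candidates = [
    {"name": "beige trousers", "category": "bottom", "palette": "neutral"},
    {"name": "neon top", "category": "top", "palette": "bright"},
]
print(rank_candidates(wardrobe, candidates)[0]["name"])  # beige trousers
```

The key design choice mirrors the principle above: the score rewards items that pair with what the user owns, instead of penalizing overlap with it.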
The first version required users to type their styling request in their own words, and the system generated outfits based on detected keywords. In practice, this created friction because users had to "know what to say," and the lightweight language model performed best only when inputs matched a narrow set of recognizable terms, leading to hesitation, slower onboarding, and inconsistent results.



We replaced free-form input with curated CTA options so users could select an intent instead of composing a prompt. This reduced cognitive load and made the entry step feel more guided and predictable, trading expressiveness for speed and clarity while still enabling the AI to generate outfits without requiring users to write anything.
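Expressed in code, the replacement amounts to selecting from an enumerated set of intents instead of parsing text. Again, a hypothetical sketch: the option labels and query shape are illustrative.

```python
from enum import Enum

# Curated entry points replace free-form prompts: the user taps one
# option, so every request maps to a known, well-supported intent.
class StylingIntent(Enum):
    DATE_NIGHT = "Dress me for a date"
    OFFICE = "Style me for work"
    WEEKEND = "Keep it casual"

def build_outfit_query(intent: StylingIntent) -> dict:
    """Translate a tapped CTA into a structured query for the outfit engine."""
    return {"intent": intent.name.lower(), "source": "curated_cta"}

print(build_outfit_query(StylingIntent.WEEKEND))
# {'intent': 'weekend', 'source': 'curated_cta'}
```

Because the input space is closed, every path through this entry step is one the model handles well, which is the speed-for-expressiveness trade described above.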

In the initial navigation, the avatar entry was visually emphasized to signal it as the core feature, using a prominent, "raised" treatment that stood out from the other tabs. While it drew attention, the strong visual weight disrupted hierarchy and made some users interpret the avatar as the only meaningful starting point, rather than one feature within a broader shopping and styling flow.



We changed the avatar entry into a more standard navigation pattern and allowed users to use their own profile image as the avatar representation. This made the menu feel more stable and intuitive, preserving discoverability while reducing visual pressure and clarifying that the avatar supports the experience rather than dominating it.

The early landing page used multiple clickable e-commerce style content cards, enabling users to jump directly into product or outfit pages. However, the density of entry points increased first-time confusion, and in some cases the card layout visually competed with—or partially obscured—the outfit preview, extending onboarding time because users spent effort figuring out where to begin.



We simplified the landing experience to focus on a single, clear entry point with the avatar and trending content as the primary visual anchors. This reduced cognitive load and made it easier for first-time users to understand where to start.





Initial wireframes explored a gamified experience featuring customizable avatars, clothing items displayed as collectibles, and a "looks" system for saving outfit combinations. However, discovery interviews (n=20) revealed a critical misalignment: while users accepted avatars as styling inspiration, they prioritized visual previews of complete looks over collection-building mechanics. Participants wanted the app to "just work" visually—showing them styled outfits that sparked desire—rather than engaging with achievement systems or wardrobe management features. This insight prompted a pivot from gamification toward an AI-assisted recommendation flow that keeps the technology invisible while delivering the aspirational, full-look compositions users actually respond to.
Closet Genie launched in early January 2026, with core user flows validated through initial testing. The virtual try-on experience, wardrobe organization system, and outfit recommendation engine are live and in users' hands. Early feedback has shaped key interactions—particularly around the avatar customization flow and the balance between AI suggestions and user control.
Each Friday, the design team gathers user feedback and triages it into bug fixes, feature requests, UI suggestions, and pain points.