SPATIAL AND PRODUCT
Product, spatial, and interactive work across hardware concepts, AR, imaging pipelines, and CGI.
Case study · sample layout
Meta Ring
Personal AI & Health Ring
2025
Meta Ring explores what happens when artificial intelligence becomes invisible: integrated not as a screen, but as an object you forget you're wearing.
Designed as a seamless wearable, the ring acts as a quiet interface between the body and intelligent systems, sensing, interpreting, and responding without demanding attention.
Design intent
The goal was a form where advanced technology disappears into simplicity. Rather than adding features, the design removes visual noise — reducing the object to a continuous, balanced band that feels natural on the hand while quietly housing sensors, light, and intelligence beneath the surface. Comfort, subtlety, and trust come first.
Structure
Form, perfected
Engineered as a single, unified structure — exterior and interior read as one continuous object, not a stack of parts.
No visible seams. No mechanical breaks. The shell flows uninterrupted around the finger while the inner surface is sculpted for long-term comfort. Proportions stay minimal so the ring reads closer to jewelry than gadgetry.
Light is used sparingly — not as decoration, but as communication.
Interaction
Human-centered signals
Instead of screens or alerts, the ring communicates through soft illumination and calm presence — readable at a glance without pulling you out of the moment.
- Glanceable: The light band can echo system states, health signals, or AI activity — still legible without a display.
- Quiet: When nothing is active, the object visually settles down so technology stays present, not noisy.
- Continuous: Subtle, ambient feedback fits long wear: calm cues you notice only when they matter.
Wear
Engineered for everyday wear
A slim, durable outer shell pairs with a skin-friendly inner surface — tuned for comfort you can live in, not just try on.
The design assumes continuous use across work, rest, and routine. The exterior protects internal components; the interior balances contact, moisture, and pressure so the ring feels stable without hot spots or fatigue.
This is not something you constantly remove — it's meant to settle into daily life.
AR experience
HappyAR
Discover Happy & Interactive AR App
2018
Concept, Design, Development
2018 build: Unity, Vuforia SDK, Xcode, Apple Developer Program, Maya, Photoshop
2025 build: Unity 6, AR Foundation, C#, Android (ARCore)
Overview
Discover Happy (2018) was an experimental AR experience for a social event called Rubber Duck — inviting people to find and share joy through playful phone-based AR.
The 2025 build is a modern extension: rebuilt in Unity 6 and AR Foundation with emphasis on real-time object placement and manipulation, cleaner interaction design, stable gesture controls, and modular AR features — a complement to the original marker-driven story.
Discover Happy · 2018
Concept & testing
Early cubes and printed targets were designed to stress-test image recognition and surface detection. Each pattern was tuned so Vuforia could lock on reliably across lighting and viewing angles.
Experience flow
- A bright, minimal UI greets users with DISCOVER HAPPY — tap to launch the camera.
- Pointing at printed targets (including a large colorful rug) triggers a giant 3D rubber duck anchored in real space.
- After the moment plays, SHARE THIS HAPPY invites capture and posting to social platforms like Instagram.
Technical approach
Built in Unity with the Vuforia SDK: 3D assets, materials, and animation triggers mapped to image targets, plus video-triggered beats for mixed-media AR. Archive shots include Vuforia project keys (2018) as proof of the original build.
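The trigger wiring described above can be sketched as a small Unity script using the Vuforia API of that era (the `ITrackableEventHandler` interface). This is an illustrative reconstruction, not the original project code; the class, animator trigger, and UI field names are assumptions.

```csharp
using UnityEngine;
using Vuforia;

// Hypothetical sketch: plays the duck animation when an image target is
// found, and toggles the "SHARE THIS HAPPY" prompt with tracking state.
// Names here are illustrative, not from the original 2018 project.
public class DuckTrigger : MonoBehaviour, ITrackableEventHandler
{
    public Animator duckAnimator;   // duck rig with an "Appear" trigger state
    public GameObject sharePrompt;  // share UI shown while the moment plays

    private TrackableBehaviour trackable;

    void Start()
    {
        trackable = GetComponent<TrackableBehaviour>();
        if (trackable != null)
            trackable.RegisterTrackableEventHandler(this);
    }

    public void OnTrackableStateChanged(
        TrackableBehaviour.Status previousStatus,
        TrackableBehaviour.Status newStatus)
    {
        bool found = newStatus == TrackableBehaviour.Status.DETECTED ||
                     newStatus == TrackableBehaviour.Status.TRACKED ||
                     newStatus == TrackableBehaviour.Status.EXTENDED_TRACKED;

        if (found)
        {
            duckAnimator.SetTrigger("Appear");  // anchor the duck in real space
            sharePrompt.SetActive(true);
        }
        else
        {
            sharePrompt.SetActive(false);       // hide UI when the target is lost
        }
    }
}
```

In this pattern the script sits on the image target's GameObject, so each printed target (including the rug) can carry its own animation and share beat.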
Reflection
My first deep dive into interactive AR storytelling — merging design, technology, and emotion. It showed me how immersive formats can carry narrative; that curiosity still drives my AR/VR work today.
Interactive App · 2025
A focused prototype exploring stable, modular AR interactions — hand tracking and scene placement over a desk, built for clarity and repeat testing rather than a single event storyline.
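The placement interaction described above can be sketched with AR Foundation's raycast API: tap the screen, raycast against detected planes, and spawn or move a prefab at the hit pose. A minimal sketch under stated assumptions; the class name, fields, and single-object behavior are illustrative, not the prototype's actual code.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical tap-to-place: raycasts the touch point against detected
// planes and drops (or repositions) a prefab at the hit pose.
public class TapToPlace : MonoBehaviour
{
    public ARRaycastManager raycastManager;  // scene's AR raycast manager
    public GameObject placedPrefab;          // object to place on the desk

    private GameObject spawned;
    private static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (Input.touchCount == 0) return;
        Touch touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began) return;

        // Raycast only against plane interiors for stable placement.
        if (raycastManager.Raycast(touch.position, hits,
                                   TrackableType.PlaneWithinPolygon))
        {
            Pose pose = hits[0].pose;  // closest plane hit
            if (spawned == null)
                spawned = Instantiate(placedPrefab, pose.position, pose.rotation);
            else
                spawned.transform.SetPositionAndRotation(pose.position,
                                                         pose.rotation);
        }
    }
}
```

Keeping placement, gestures, and content as separate components like this is what makes the features modular and easy to retest in isolation.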
Process & documentation
Six stills from the build: marker tests, targets, in-app UI, the outdoor duck beat, and project configuration.
Case study · AI reconstruction
Reframe
AI image reconstruction for production art
2024
Reframe is a reconstruction workflow for damaged, tiny, or uneven references—scans, old prints, and phone grabs that are hard to draw from. The goal is a stable likeness and readable planes before color, costume, or motion enter the conversation.
Outputs are tuned for sketch artists and art directors: high-resolution stills you can zoom, plus optional motion studies when a single frame is not enough to sell weight and expression.
Overview
The sheet reads left to right: what the archive gave us, what structure we recover, and how we want the final read to feel on the page. AI steps are directed—not a single click—so edges, texture, and identity stay consistent across the set.
Motion · expression sheet (GIF)
A single animated sheet carries the full loop—still progression up top and frame-by-frame motion below for artists who need to read weight and timing, not just a posed still. Click to open it full size in the lightbox (use arrows to switch between the hero sheet and this motion strip).
Case study · AI → 3D
Formlift
AI-assisted sculpt, look-dev, and real-time delivery
2024
Formlift is a production workflow from 2D direction to sculpt-ready 3D: AI-assisted passes, DCC sculpt and materials, then a clear in-engine read—on-brand and art-directable.
Paint-over and ComfyUI feed ZBrush sculpt and Substance Painter; Unreal Engine locks lighting and scale for production.
Overview
The goal is not a one-click mesh, but a repeatable bridge: keep design intent from the 2D beat, use AI where it saves iteration time, then commit edges and silhouette in sculpt. Materials stay procedural where possible, with hand-authored breaks so the mask feels crafted, not filtered.
Pipeline
Photoshop for concept lock and paint-over · ComfyUI for directed image passes and utilities · ZBrush for primary and secondary forms · Substance Painter for texel density and wear · Unreal Engine for context, light, and export.
Process & frames
Six process frames: painted direction, AI 3D blockout, Photoshop–ComfyUI live link, ComfyUI graph, Substance texture, and an Unreal viewport check. Open any tile—lightbox arrows walk the full sequence with the hero above.