Dozy
HR platform designed for tomorrow's screen
The Problem
Dense data, two dimensions.
HR platforms are conceptually dense. The challenge isn't just readability—it's scalability across dimensions: from a flat screen today to a spatial environment tomorrow.
Most tools are locked into one or the other. Dozy is the bridge.
"I spend half my day just trying to find where the data I need is buried."
The initial core workspace, designed for a seamless transition between flat and spatial environments.
The Mission
Optimize: Design a dashboard that handles high-stakes data without cognitive overload.
Systematize: Develop a glassmorphic language (depth, layering) that scales into XR.
Validate: Build a hands-on spatial prototype to prove the interaction model.
Research
Decoding Spatial Logic
I benchmarked the friction points: flat HR tools like Workday bury data in clutter, while early spatial UIs (visionOS) use depth to communicate relationships that flat UI can't replicate.
Thesis: If we build with glassmorphic depth now, the spatial transition becomes a context change, not a redesign.
Execution
Building the Infrastructure
The first challenge was structural. Dozy spans six modules for high-stakes decision makers. I prioritized Information Architecture to ensure the data grouping remained stable when moved into 3D space.
High-Fidelity Wireframes
The wireframe pass introduced the card-based data grouping system and the persistent AI layer. The spatial logic — depth, blur, layering — was baked in at this stage, not added later.
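One way to picture "baking in" depth, blur, and layering is as a set of shared design tokens, where each layer carries both its flat-screen treatment and a spatial offset. This is a hypothetical sketch, not Dozy's actual token system; all names and values are illustrative.

```python
# Hypothetical glassmorphic depth scale encoded as design tokens.
# Each layer defines its flat-screen treatment (blur, opacity) and a
# spatial offset for XR, so one definition serves both contexts.
# Layer names and values are illustrative, not Dozy's real system.

DEPTH_TOKENS = {
    # layer:      (blur_px, opacity, z_offset_m)  z_offset is read only in XR
    "background": (24,      0.60,    0.00),
    "card":       (16,      0.75,    0.05),
    "overlay":    (8,       0.85,    0.10),
    "alert":      (0,       1.00,    0.18),  # highest-priority layer sits closest
}

def css_for(layer: str) -> str:
    """Render a token as a flat-screen CSS snippet (the 'today' context)."""
    blur, opacity, _ = DEPTH_TOKENS[layer]
    return f"backdrop-filter: blur({blur}px); opacity: {opacity};"

def xr_offset(layer: str) -> float:
    """Read the same token in the spatial context: metres toward the user."""
    return DEPTH_TOKENS[layer][2]
```

The point of the sketch is that hierarchy lives in one place: the flat UI and the spatial UI read different fields of the same token, so moving between them is a context change, not a redesign.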
Insights
Depth as hierarchy
Glassmorphism isn't a trend; it's a spatial language. Layered surfaces communicate priority in XR. I baked these depth relationships into the flat UI from day one.
Proactive AI layers: Instead of a search bar, the AI proactively flags absenteeism spikes or recruitment bottlenecks. It answers the question before the manager asks it.
Modular card systems: Data is broken into cards. On screen, they reduce clutter. In XR, they become physical panels the user can arrange in their own space.
Testing via spatial prototype: I built a window manager using hand tracking (MediaPipe). Key discovery: Gesture hold durations must be longer than screen clicks to avoid false triggers in physical space. Wireframes could never have surfaced this friction.
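The hold-duration insight above can be sketched as a dwell timer: a gesture only "commits" after being sustained longer than a typical screen click, filtering out false triggers from hand jitter. This is a minimal illustration of the idea, not the prototype's actual code; the class name and thresholds are assumptions, and the tracking input (a per-frame pinch boolean, e.g. derived from MediaPipe landmarks) is left abstract.

```python
import time

# Illustrative thresholds, not measured values from the prototype.
CLICK_MS = 120         # a typical on-screen click press duration
GESTURE_HOLD_MS = 450  # longer hold required in physical space

class DwellGesture:
    """Commits a pinch/grab only after a sustained hold (hypothetical sketch)."""

    def __init__(self, hold_ms: int = GESTURE_HOLD_MS, clock=time.monotonic):
        self.hold_s = hold_ms / 1000.0
        self.clock = clock          # injectable for testing
        self._start = None          # when the current hold began
        self._fired = False         # ensures the gesture commits only once

    def update(self, is_pinching: bool) -> bool:
        """Feed one frame of tracking state; returns True exactly once,
        when the gesture has been held long enough to count as intentional."""
        now = self.clock()
        if not is_pinching:
            self._start = None
            self._fired = False
            return False
        if self._start is None:
            self._start = now
        if not self._fired and now - self._start >= self.hold_s:
            self._fired = True
            return True
        return False
```

Fed per-frame, a brief click-length pinch never commits, while a deliberate hold does; releasing resets the timer. That deliberate delay is the "friction as a feature" described above.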
Final Designs
The final screens bring the glassmorphic system to life across the six modules — each with its own data density, but all speaking the same spatial language.
Home — team status, pipeline, and attendance at a glance
Recruitment — pipeline view with AI anomaly flags
Onboarding — employee lifecycle entry point
Person — individual profile with layered data cards
Job Creation — structured form with spatial depth
Schedule Hub — time and resource management
Reflection
Systemic thinking
Glassmorphism succeeded because it was a structural decision, not a visual one. By justifying every component spatially, the system scales into XR without a single code change — just a change in physical context.
Dynamic research
While benchmarking provided the logic, I'd want to observe real HR managers. The data they check first is often different from what a designer assumes. Next time: more observation, less assumption.
Validation through code
The gesture duration discovery was only possible through a coded prototype. It proved that in spatial design, 'friction' is sometimes a feature to prevent accidental interaction.