
Role
Product Designer
Timeline
2022–2023
Tools
Figma
Description
A desktop-based application that allows developers to demo their experiences without a VR headset.
Context
Meta XR Simulator is a lightweight OpenXR runtime that enables developers to test VR applications without wearing a headset. It streamlines development by simulating Meta Quest behavior, supporting automation, and integrating with Unity and Unreal.
Challenges in efficient VR application development
Before XR Simulator, developers relied on headset-only testing, which slowed iteration and made spatial debugging harder.
❌
Testing requires a headset: Real-time testing in Unity or Blender often demands a physical Quest device, slowing iteration.
❌
Scene changes require re-export: Minor tweaks in environments trigger full rebuilds and redeployment, increasing dev cycle friction.
❌
No web-based parity for spatial scenes: Engineers lacked a way to simulate spatial logic or transitions in a browser-based tool.
❌
Inconsistent tooling across teams: Blender and Unity integrations had different simulation workflows with no shared interface.
Cross-platform simulation, consistently designed
We introduced a scalable design system that prioritized code parity and real-time responsiveness.
✅
Built a flexible design system from scratch: Inspired by ImGui, we created a component set aligned with Unity’s inspector layout.
✅
Responsive, resizable panels: Mimicked familiar editor UIs while adapting to browser constraints and varied screen sizes.
✅
Seamless engine integration: Addressed the prior lack of integration with popular development platforms by plugging directly into Unity and Unreal Engine workflows.
✅
Cross-tool compatibility: Enabled consistent experiences for both Unity- and Blender-authored simulations.
XR Simulator uses a lightweight UI framework modeled after ImGui, ensuring consistency with Unity Editor interactions while remaining fully browser-based; a sketch of this immediate-mode pattern follows the list below.
✅
Immediate visual feedback when editing scenes
✅
Data syncs in real time with Blender and Unity exports
✅
Side-by-side comparison for before/after view states
✅
Gamepad-enabled navigation mimics the in-VR experience
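To make the ImGui comparison concrete, here is a minimal sketch of the immediate-mode pattern the framework is modeled on, written for the browser. Every name in it (the UIState shape, checkbox, and so on) is illustrative, not the simulator's actual API: widgets are plain function calls that draw themselves and report interaction in the same frame.

```typescript
// Minimal immediate-mode UI loop (ImGui-style) on an HTML canvas.
// All names here are illustrative stand-ins, not XR Simulator's real API.

interface UIState {
  mouseX: number;
  mouseY: number;
  mouseDown: boolean;
  cursorY: number; // vertical position for the next widget
}

const canvas = document.querySelector("canvas")!;
const ctx = canvas.getContext("2d")!;
const ui: UIState = { mouseX: 0, mouseY: 0, mouseDown: false, cursorY: 0 };

canvas.addEventListener("mousemove", (e) => {
  ui.mouseX = e.offsetX;
  ui.mouseY = e.offsetY;
});
canvas.addEventListener("mousedown", () => (ui.mouseDown = true));
canvas.addEventListener("mouseup", () => (ui.mouseDown = false));

// A widget is a function call: it draws itself and returns its new state.
function checkbox(label: string, checked: boolean): boolean {
  const x = 10;
  const y = (ui.cursorY += 28);
  const hit =
    ui.mouseX >= x && ui.mouseX <= x + 16 &&
    ui.mouseY >= y && ui.mouseY <= y + 16;
  if (hit && ui.mouseDown) {
    ui.mouseDown = false; // consume the click
    checked = !checked;
  }
  ctx.strokeRect(x, y, 16, 16);
  if (checked) ctx.fillRect(x + 3, y + 3, 10, 10);
  ctx.fillText(label, x + 24, y + 13);
  return checked;
}

let leftEye = true;
let rightEye = true;

function frame() {
  ctx.clearRect(0, 0, canvas.width, canvas.height);
  ui.cursorY = 0;
  // The panel is rebuilt every frame: no retained widget tree to sync.
  leftEye = checkbox("Left camera", leftEye);
  rightEye = checkbox("Right camera", rightEye);
  requestAnimationFrame(frame);
}
requestAnimationFrame(frame);
```

The design-relevant property is that every panel is redrawn from current simulation state each frame, so the UI can never drift out of sync with the scene, which is what makes the immediate visual feedback above possible.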

Users can toggle which camera views (left, right, or both) are displayed on the stage.

The configuration module can be dragged and docked to any side of the screen.

With both cameras toggled on and a gamepad paired, developers can simulate their experience as if they were in a headset.
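As a rough illustration of how that pairing can work in a browser, the sketch below polls the standard Gamepad API each frame and offsets the two camera views by an interpupillary distance. The rig shape, speeds, and function names are assumptions for the example, not the simulator's internals.

```typescript
// Polling a gamepad each frame to drive a simulated headset rig.
// Uses the standard browser Gamepad API; the Rig type and constants
// are illustrative stand-ins for XR Simulator's internals.

interface Rig {
  x: number;
  z: number;
  yaw: number;
}

const rig: Rig = { x: 0, z: 0, yaw: 0 };
const DEADZONE = 0.15;

function applyDeadzone(v: number): number {
  return Math.abs(v) < DEADZONE ? 0 : v;
}

function pollGamepad(dt: number): void {
  const pad = navigator.getGamepads()[0];
  if (!pad) return;

  // Left stick translates the rig; right stick X turns it.
  const moveX = applyDeadzone(pad.axes[0]);
  const moveZ = applyDeadzone(pad.axes[1]);
  const turn = applyDeadzone(pad.axes[2]);

  rig.x += moveX * dt * 2.0; // illustrative speed, m/s
  rig.z += moveZ * dt * 2.0;
  rig.yaw += turn * dt * 1.5;
}

// With both eye views enabled, the stage is rendered twice per frame
// from horizontally offset positions, approximating the in-headset view.
const EYE_OFFSET = 0.032; // half of a ~64 mm interpupillary distance

function eyePositions(): { left: number; right: number } {
  return { left: rig.x - EYE_OFFSET, right: rig.x + EYE_OFFSET };
}

let last = performance.now();
function tick(now: number): void {
  pollGamepad((now - last) / 1000);
  last = now;
  // A renderer would consume eyePositions() here to draw both views.
  requestAnimationFrame(tick);
}
requestAnimationFrame(tick);
```

Polling once per animation frame keeps input latency bounded by the render loop, which is why stick-driven navigation feels close to moving in a headset even without any VR hardware attached.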
Speeding up spatial development across teams
The launch of XR Simulator led to faster iteration, wider team participation, and clearer cross-functional collaboration. By removing the dependency on headsets and unifying simulation workflows, the tool significantly boosted productivity across Unity and Blender teams.
⚡
decrease in time-to-first preview
🪲
decrease in QA reported bugs
🔍
increase in usability testing coverage
Building XR Simulator reinforced the value of accessible, code-aligned internal tooling. By focusing on headset-free simulation, we created a faster, more inclusive development pipeline. The project also highlighted the importance of early alignment with engineering and real-world testing environments.
✅
Created a reusable design system for internal simulation tools
✅
Built a UI framework from scratch that scaled across Unity and Blender projects
✅
Introduced gamepad simulation, mirroring VR interactions without hardware setup
✅
Enabled headset-free collaboration for design, QA, and PMs
⚠️
Add snapshot state-saving to make simulations persist across sessions
⚠️
Explore component-level UI for scene-specific settings
⚠️
Explore user permissions and scene versioning for shared dev environments