
Company
Meta
Role
Product Designer (IC5)
Year
2022–2024
Problem
Slow, fragmented VR development workflows
Before XR Simulator, developers relied on headset-only testing, which slowed iteration and made spatial debugging harder.
❌
Testing requires a headset: Real-time testing in Unity or Blender often demands a physical Quest device, slowing iteration.
❌
Scene changes require re-export: Minor tweaks in environments trigger full rebuilds and redeployment, increasing dev cycle friction.
❌
No web-based parity for spatial scenes: Engineers lacked a way to simulate spatial logic or transitions in a browser-based tool.
❌
Inconsistent tooling across teams: Blender and Unity integrations had different simulation workflows with no shared interface.
Approach
Cross-platform simulation, consistently designed
We introduced a scalable design system that prioritized code parity and real-time responsiveness.
✅
Built a flexible design system from scratch: Inspired by ImGui, we created a component set aligned with Unity’s inspector layout.
✅
Responsive, resizable panels: Mimicked familiar editor UIs while adapting to browser constraints and varied screen sizes.
✅
Seamless editor integration: Connected directly to Unity and Blender export pipelines, so scene changes appear without a full rebuild or redeploy.
✅
Cross-tool compatibility: Enabled consistent experiences for both Unity- and Blender-authored simulations.
Solution
XR Simulator uses a lightweight UI framework modeled after ImGui, ensuring consistency with Unity Editor interactions while remaining fully browser-based.
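As a minimal TypeScript sketch of that immediate-mode pattern (the `ImmediateUI` interface and `drawInspector` function are illustrative names, not the tool's actual API): each frame redeclares the panel from current state, so there is no retained widget tree to keep in sync with the scene.

```typescript
// Illustrative interface, not the tool's real API.
interface ImmediateUI {
  beginPanel(title: string): void;
  checkbox(label: string, value: boolean): boolean; // returns the new value
  slider(label: string, value: number, min: number, max: number): number;
  endPanel(): void;
}

interface SimState {
  leftCamera: boolean;
  rightCamera: boolean;
  fieldOfView: number;
}

// Called once per frame: the panel is redeclared from current state,
// the way ImGui and Unity's inspector redraw, so edits show up instantly.
function drawInspector(ui: ImmediateUI, state: SimState): void {
  ui.beginPanel("Camera Rig");
  state.leftCamera = ui.checkbox("Left eye", state.leftCamera);
  state.rightCamera = ui.checkbox("Right eye", state.rightCamera);
  state.fieldOfView = ui.slider("Field of view", state.fieldOfView, 60, 120);
  ui.endPanel();
}
```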
✅
Immediate visual feedback when editing scenes
✅
Data syncs in real time with Blender and Unity exports (see the sketch after this list)
✅
Side-by-side comparison for before/after view states
✅
Gamepad-enabled navigation mimics the in-VR experience
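For the real-time sync called out above, here is a hedged sketch of one plausible browser-side shape: the simulator subscribes to scene deltas pushed by the editor plugins over a WebSocket. The endpoint, `SceneUpdate` message format, and `applySceneUpdate` callback are illustrative assumptions, not the shipped protocol.

```typescript
// Hypothetical message shape; the shipped protocol may differ.
type SceneUpdate = {
  source: "unity" | "blender";
  objects: Array<{ id: string; transform: number[] }>; // flattened 4x4 matrices
};

function connectSceneSync(
  url: string,
  applySceneUpdate: (update: SceneUpdate) => void
): WebSocket {
  const socket = new WebSocket(url);
  socket.onmessage = (event: MessageEvent<string>) => {
    // Each save in Unity or Blender pushes a JSON delta; the stage
    // re-renders immediately, with no re-export or rebuild step.
    applySceneUpdate(JSON.parse(event.data) as SceneUpdate);
  };
  socket.onclose = () => {
    // Reconnect so a restarted editor plugin picks back up automatically.
    setTimeout(() => connectSceneSync(url, applySceneUpdate), 1000);
  };
  return socket;
}
```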
Users can toggle which camera views (left, right, or both) are displayed on the stage.
The configuration module can be dragged and docked to any side of the screen.
With both cameras toggled on and a gamepad paired, developers can simulate the experience as if they were in a headset.
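The gamepad loop itself can be sketched with the standard browser Gamepad API; `navigator.getGamepads()` and `requestAnimationFrame` are real browser calls, while the `moveRig` callback and the stick mapping are assumptions for illustration.

```typescript
// Poll the connected controller each frame and translate stick input
// into camera-rig motion, approximating in-headset locomotion.
function pollGamepad(
  moveRig: (strafe: number, walk: number, turn: number) => void
): void {
  const pad = navigator.getGamepads().find((p): p is Gamepad => p !== null);
  if (pad) {
    const deadzone = (v: number) => (Math.abs(v) > 0.15 ? v : 0);
    const [lx, ly, rx] = pad.axes; // left stick moves, right stick turns
    moveRig(deadzone(lx), deadzone(ly), deadzone(rx));
  }
  requestAnimationFrame(() => pollGamepad(moveRig));
}
```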
Implementation
I worked closely with engineers to ship the MVP in under 12 weeks, just in time for Meta Connect. The tool was demoed to internal teams as part of our broader mixed reality strategy.
Design Process
Results
Speeding up spatial development across teams
The launch of XR Simulator led to faster iteration, wider team participation, and clearer cross-functional collaboration. By removing the dependency on headsets and unifying simulation workflows, the tool significantly boosted productivity across Unity and Blender teams.
⚡
decrease in time-to-first preview
🪲
decrease in QA reported bugs
🔍
increase in usability testing coverage
Takeaways
Building XR Simulator reinforced the value of accessible, code-aligned internal tooling. By focusing on headset-free simulation, we created a faster, more inclusive development pipeline. The project also highlighted the importance of early alignment with engineering and real-world testing environments.
✅
Created a reusable design system for internal simulation tools
✅
Built a UI framework from scratch that scaled across Unity and Blender projects
✅
Introduced gamepad simulation, mirroring VR interactions without hardware setup
✅
Enabled headset-free collaboration for design, QA, and PMs
⚠️
Add snapshot state-saving to make simulations persist across sessions
⚠️
Explore component-level UI for scene-specific settings
⚠️
Explore user permissions and scene versioning for shared dev environments
Get in touch
Let’s talk. Whether you’re building something new or improving what’s already working, I’d love to hear about it.
Sean Finn
Product Designer
All rights reserved.
©2025