Developing a headset-free VR demo experience

Meta XR Simulator is a lightweight OpenXR runtime that enables developers to test VR applications without wearing a headset. It streamlines development by simulating Meta Quest behavior, supporting automation, and integrating with Unity and Unreal.


I led the product design of XR Simulator, shaping its interface, building its scalable design system, and partnering across product, engineering, and research to bring it to life. The tool was showcased during Meta Connect as part of our mixed reality development toolkit.

Company
Meta

Role
Product Designer (IC5)

Year
2022–2024

Problem

Challenges in efficient VR application development

Before XR Simulator, developers relied on headset-only testing, which slowed iteration and made spatial debugging harder.

Testing requires a headset: Real-time testing in Unity or Blender often demands a physical Quest device, slowing iteration.

Scene changes require re-export: Minor tweaks in environments trigger full rebuilds and redeployment, increasing dev cycle friction.

No web-based parity for spatial scenes: Engineers lacked a way to simulate spatial logic or transitions in a browser-based tool.

Inconsistent tooling across teams: Blender and Unity integrations had different simulation workflows with no shared interface.

Approach

Cross-platform simulation, consistently designed

We introduced a scalable design system that prioritized code parity and real-time responsiveness.

Built a flexible design system from scratch: Inspired by ImGui, we created a component set aligned with Unity’s inspector layout (see the sketch after this list).

Responsive, resizable panels: Mimicked familiar editor UIs while adapting to browser constraints and varied screen sizes.

Seamless engine integration: Addressed the previous lack of integration with popular development platforms like Unity and Unreal Engine.

Cross-tool compatibility: Enabled consistent experiences for both Unity- and Blender-authored simulations.
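
To make the inspector alignment concrete, here is a minimal sketch of the row pattern in React, the project’s UI stack. Component names, props, and styling values are hypothetical stand-ins, not the production code:

    // Hypothetical sketch of an inspector-style row: a fixed-width label
    // on the left and a flexible field on the right, echoing Unity's
    // Inspector layout. Names and values are illustrative only.
    import React from "react";

    type InspectorRowProps = {
      label: string;             // left-hand label column
      children: React.ReactNode; // right-hand field (toggle, slider, input)
    };

    export function InspectorRow({ label, children }: InspectorRowProps) {
      return (
        <div style={{ display: "flex", alignItems: "center", gap: 8 }}>
          <span style={{ flex: "0 0 140px" }}>{label}</span>
          <div style={{ flex: 1 }}>{children}</div>
        </div>
      );
    }

    // Rows compose into panels, so every tool gets the same layout for free.
    export function CameraPanel() {
      return (
        <section>
          <InspectorRow label="Left eye">
            <input type="checkbox" defaultChecked />
          </InspectorRow>
          <InspectorRow label="Right eye">
            <input type="checkbox" defaultChecked />
          </InspectorRow>
        </section>
      );
    }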

Solution

Clean, contextual, and code-aligned

XR Simulator uses a lightweight UI framework modeled after ImGui, ensuring consistency with Unity Editor interactions while remaining fully browser-based.
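
As a rough illustration of what “modeled after ImGui” means in practice (all names here are hypothetical, not the shipped implementation): widgets are plain function calls that re-run every frame, so the panel is rebuilt from current state rather than kept in a retained widget tree.

    // Immediate-mode sketch: the UI is re-declared on every frame, so it
    // can never drift out of sync with application state. Illustrative only.
    const input = { clickedId: null as string | null }; // per-frame input stub

    const state = { showLeftEye: true, showRightEye: true };

    // A "widget" is just a function: hit-test this frame's input, "draw",
    // and return the (possibly updated) value.
    function checkbox(id: string, label: string, value: boolean): boolean {
      const next = input.clickedId === id ? !value : value;
      console.log(`[${next ? "x" : " "}] ${label}`); // stand-in for real drawing
      return next;
    }

    // One frame: rebuild the whole panel from scratch.
    function frame(): void {
      state.showLeftEye = checkbox("left", "Left eye", state.showLeftEye);
      state.showRightEye = checkbox("right", "Right eye", state.showRightEye);
      input.clickedId = null; // the click has been consumed
    }

    frame(); // in the browser this would be driven by requestAnimationFrame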

Immediate visual feedback when editing scenes

Data syncs in real time with Blender and Unity exports

Side-by-side comparison for before/after view states

Gamepad-enabled navigation mimics the in-VR experience

Users can toggle which camera views (left, right, or both) are displayed on the stage.

The configuration module can be dragged and docked to any side of the screen.

With both cameras toggled on and a gamepad paired, developers can simulate their experience as if they were in a headset.
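
A minimal sketch of how that gamepad navigation can be driven in a browser, using the standard Gamepad API. The camera model and speed constants are assumptions for illustration, not the simulator’s actual values:

    // Polling sketch using the standard browser Gamepad API. The camera
    // object and tuning constants are illustrative assumptions.
    const camera = { x: 0, z: 0, yaw: 0 }; // simulated head pose

    const MOVE_SPEED = 2.0; // meters/second, assumed tuning value
    const TURN_SPEED = 1.5; // radians/second, assumed tuning value

    let last = performance.now();

    function pollGamepad(now: number): void {
      const dt = (now - last) / 1000;
      last = now;

      const pad = navigator.getGamepads()[0]; // first connected gamepad
      if (pad) {
        const forward = -(pad.axes[1] ?? 0); // left stick Y: forward is negative
        const turn = pad.axes[2] ?? 0;       // right stick X turns the view
        camera.yaw += turn * TURN_SPEED * dt;
        camera.x += Math.sin(camera.yaw) * forward * MOVE_SPEED * dt;
        camera.z -= Math.cos(camera.yaw) * forward * MOVE_SPEED * dt;
      }
      requestAnimationFrame(pollGamepad);
    }

    requestAnimationFrame(pollGamepad);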

Implementation

From concept to Connect in 3 months

I worked closely with engineers to ship the MVP in under 12 weeks, just in time for Meta Connect. The tool was demoed to internal teams as part of our broader mixed reality strategy.

Design Process

Wireframes

Miro

Design & Prototyping

Figma

Coding Stack

React

ImGui

Code Repository

Phabricator

Results

Speeding up spatial development across teams

The launch of XR Simulator led to faster iteration, wider team participation, and clearer cross-functional collaboration. By removing the dependency on headsets and unifying simulation workflows, the tool significantly boosted productivity across Unity and Blender teams.

63%
decrease in time to first preview

28%
decrease in QA-reported bugs

54%
increase in usability testing coverage

Takeaways

Designing tools that reduce friction across VR workflows

Building XR Simulator reinforced the value of accessible, code-aligned internal tooling. By focusing on headset-free simulation, we created a faster, more inclusive development pipeline. The project also highlighted the importance of early alignment with engineering and real-world testing environments.

What worked

Created a reusable design system for internal simulation tools

Built a UI framework from scratch that scaled across Unity and Blender projects

Introduced gamepad simulation, mirroring VR interactions without hardware setup

Enabled headset-free collaboration for design, QA, and PMs

What I'd improve

Add snapshot state-saving to make simulations persist across sessions

Explore component-level UI for scene-specific settings

Explore user permissions and scene versioning for shared dev environments

Get in touch

Let’s talk. Whether you’re building something new or improving what’s already working, I’d love to hear about it.

Sean Finn
Product Designer

©2025. All rights reserved.