Three.js From Zero · Article s7-06


VR UI Patterns

HTML doesn't composite into the HMD: in VR, UI must be built from 3D meshes driven by controller or hand interaction. This article covers ray-pointer, pinch-and-drag, gaze-plus-tap, and worldspace panels, plus the comfort rules that break everything if you ignore them.

1. The golden rules of VR UI

  1. No HUDs stuck to the face. Fixed-to-camera UI causes nausea fast.
  2. Big, far, thin. Panels at 1-2m distance, 0.3-1.5m wide, thin for depth perception.
  3. Always-face-camera billboards for small text. Otherwise readability drops.
  4. Feedback for every interaction. Hover highlight + controller haptic + sound.
  5. Never trap the user in a menu. Always provide an exit.
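Rule 3 can be done by copying the camera's quaternion each frame, but yaw-only billboarding is often nicer: the panel turns to face you without tilting, so text stays upright. A sketch of the math (function and parameter names are illustrative, not a Three.js API):

```javascript
// Yaw-only billboarding: rotate a panel about the vertical axis so it
// faces the viewer while its text stays upright.
function billboardYaw(panelPos, cameraPos) {
  // Angle from the panel toward the camera, measured in the XZ plane.
  return Math.atan2(cameraPos.x - panelPos.x, cameraPos.z - panelPos.z);
}

// Camera directly in front of the panel (+Z): no rotation needed.
console.log(billboardYaw({ x: 0, z: 0 }, { x: 0, z: 2 })); // 0
// Camera to the panel's right (+X): quarter turn.
console.log(billboardYaw({ x: 0, z: 0 }, { x: 2, z: 0 })); // Math.PI / 2
```

In Three.js this would be applied each frame as `panel.rotation.y = billboardYaw(panel.position, camera.position)`.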

2. Ray-pointer (controllers)

Extend a line from controller into space. It hits a panel. Button under the ray highlights. Trigger press = click.

// In Three.js XR
const controller = renderer.xr.getController(0);
controller.addEventListener('select', onSelect);

const raycaster = new THREE.Raycaster();
const tempMatrix = new THREE.Matrix4();

function onSelect(e) {
  const ctrl = e.target;
  // Use the world matrix: ctrl.quaternion is local, and wrong
  // if the controller has a transformed parent.
  tempMatrix.identity().extractRotation(ctrl.matrixWorld);
  raycaster.ray.origin.setFromMatrixPosition(ctrl.matrixWorld);
  raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);

  const hits = raycaster.intersectObjects(panel.children, true);
  if (hits.length) handleClick(hits[0].object);
}
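Clicking is only half of it: rule 4 wants hover feedback too, which means casting the same ray every frame and diffing the hit against the previous frame to get enter/leave events. A minimal sketch of that bookkeeping (pure logic, identifiers are illustrative):

```javascript
// Diff the currently hit element id against last frame's to emit
// enter/leave events: highlight on 'enter', un-highlight on 'leave'.
function hoverEvents(prevId, currentId) {
  const events = [];
  if (prevId !== currentId) {
    if (prevId !== null) events.push({ type: 'leave', id: prevId });
    if (currentId !== null) events.push({ type: 'enter', id: currentId });
  }
  return events;
}
```

Call it each frame with `hits.length ? hits[0].object.uuid : null`; on 'enter', swap the button material and fire a haptic pulse.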

3. Pinch (hand tracking)

Quest/Vision Pro hand tracking: thumb + index pinch = "click". The joint positions are in XR session's input sources.

const handController = renderer.xr.getHand(0);
// 25 joints exposed as Object3Ds once hand tracking starts

function isPinching() {
  const thumb = handController.joints['thumb-tip'];
  const index = handController.joints['index-finger-tip'];
  if (!thumb || !index) return false; // joints exist only while tracked
  return thumb.position.distanceTo(index.position) < 0.02; // 2 cm
}

4. Poke / direct touch

Finger directly pokes a 3D button. Intuitive but needs hit volumes you don't miss.

const tmp = new THREE.Vector3();
// each frame
const indexTip = handController.joints['index-finger-tip'];
if (indexTip) {
  button.getWorldPosition(tmp);
  if (tmp.distanceTo(indexTip.position) < 0.05) {
    button.userData.press();
  }
}
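As written, that check fires every frame while the finger sits inside the volume. Edge-trigger it: fire press once on entry, then re-arm only after the finger leaves. A sketch (the 5 cm radius matches the check above; names are illustrative):

```javascript
// Edge-triggered poke button: fires once when the fingertip enters
// the hit volume, re-arms only after it leaves.
class PokeButton {
  constructor(radius = 0.05) {
    this.radius = radius; // hit-volume radius in metres
    this.armed = true;
  }
  // dist: fingertip-to-button distance. Returns true on press.
  update(dist) {
    if (this.armed && dist < this.radius) {
      this.armed = false;
      return true;
    }
    if (!this.armed && dist > this.radius) this.armed = true;
    return false;
  }
}
```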

5. Gaze selection (fallback)

On devices or modes without controllers or tracked hands (Cardboard-class viewers, accessibility modes), the user stares at a button; after a dwell time, it triggers. Dwell selection is slow and inaccessible to some users, so always offer an alternative.
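A dwell trigger is just a timer that resets whenever the gaze target changes and fires once when it fills. A minimal sketch (the 800 ms dwell and the names are assumptions):

```javascript
// Gaze-dwell selection: accumulate time while the same target stays
// under the gaze ray; reset when the target changes.
class DwellSelector {
  constructor(dwellMs = 800) {
    this.dwellMs = dwellMs;
    this.targetId = null;
    this.elapsed = 0;
  }
  // Call each frame with the frame time and the gazed target (or null).
  // Returns the target id exactly once when the dwell completes.
  update(dtMs, targetId) {
    if (targetId !== this.targetId) {
      this.targetId = targetId;
      this.elapsed = 0;
      return null;
    }
    if (targetId === null) return null;
    this.elapsed += dtMs;
    if (this.elapsed >= this.dwellMs) {
      this.elapsed = -Infinity; // fire once until gaze moves away
      return targetId;
    }
    return null;
  }
}
```

Pair it with a shrinking ring or fill indicator so the user can see the dwell progressing and look away to cancel.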

6. Layout patterns

  • Floating wristwatch menu: tool selector, always available
  • Table-top panel: settings, app-like UI
  • Radial menu: quick actions, context
  • Floating billboard: notifications, ephemeral info
  • Walking menu: discrete choices, "go to room X"
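The radial menu's layout is just evenly spaced points on a circle in the panel's local plane. A sketch of the placement math (illustrative, not a library API):

```javascript
// Place n radial-menu items evenly on a circle of radius r,
// starting at 12 o'clock and going clockwise in local XY.
function radialLayout(n, r) {
  const items = [];
  for (let i = 0; i < n; i++) {
    const angle = (i / n) * 2 * Math.PI;
    items.push({ x: r * Math.sin(angle), y: r * Math.cos(angle) });
  }
  return items;
}
```

Each `{x, y}` becomes an item mesh position on the menu panel, e.g. `item.position.set(p.x, p.y, 0.005)` with a slight z offset so items float above the backing disc.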

7. Readability

  • Text size: 18-24px-equivalent at 1.5m. Test in HMD.
  • Contrast: minimum 4.5:1. HMD optics reduce contrast.
  • Avoid pure white on pure black — pixel ghosting.
  • MSDF text (S7-08) for crisp text at any distance.
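"Px-equivalent" sizes translate into world units via the angular size you want the text to subtend: a glyph covering θ degrees at distance d needs a height of 2·d·tan(θ/2). A sketch of the conversion (the one-degree figure is a rule of thumb I'm assuming, not from this article; always verify in the HMD):

```javascript
// World-space height for text that should subtend `deg` degrees
// of the viewer's field of view at distance `d` metres.
function worldHeightFor(deg, d) {
  const rad = (deg * Math.PI) / 180;
  return 2 * d * Math.tan(rad / 2);
}

// Roughly 1 degree at 1.5 m comes out near 2.6 cm of cap height.
```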

8. Comfort

  • Don't move the UI with the head. Fixed worldspace only.
  • Panels above eye line = neck strain. Slightly below.
  • Transitions: fade, not teleport. Abrupt worldspace changes = sim sickness.
  • Haptic feedback on hover/click. Free, massively reassuring.
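Those haptic pulses go through the Gamepad haptics API exposed on the XR input source. Support varies by browser and device, so guard every step of the chain; a hedged sketch (the helper name and default values are mine):

```javascript
// Fire a short haptic pulse on an XR input source, if the device
// supports it. intensity is 0..1, durationMs in milliseconds.
function pulse(inputSource, intensity = 0.5, durationMs = 30) {
  const actuator = inputSource?.gamepad?.hapticActuators?.[0];
  if (actuator && typeof actuator.pulse === 'function') {
    actuator.pulse(Math.min(Math.max(intensity, 0), 1), durationMs);
    return true;
  }
  return false; // no haptics: lean harder on visual and audio feedback
}
```

In Three.js the input source arrives on controller events, e.g. `controller.addEventListener('connected', (e) => { source = e.data; })`; call `pulse(source)` from your hover and click handlers.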

9. Three.js tools

  • renderer.xr.getController(n), getHand(n)
  • XRControllerModelFactory for controller models
  • XRHandModelFactory for hand meshes
  • three-mesh-ui lib: Flexbox-like layout for VR panels
  • HTMLMesh: DOM → 3D mesh for menus

Run this demo's code on a Quest to see the actual VR UI. Desktop browsers won't show the HMD overlay; they only simulate it with a preview.

10. Takeaways

  • HTML HUDs don't work in VR. 3D meshes + controller/hand interaction instead.
  • Ray-pointer for controllers. Pinch/poke for hands.
  • Panels at 1-2m, large, fixed in worldspace.
  • Always hover+haptic+sound feedback.
  • three-mesh-ui + XRControllerModelFactory get you 90% there.

VR-specific article — concept-heavy. Full VR demo requires a Quest or Vision Pro; it won't work on desktop.