Three.js From Zero · Article s7-06
S7-06 VR UI Patterns
HTML doesn't composite into the HMD. In VR, UI is 3D meshes with controller or hand interaction. Ray-pointer, pinch-and-drag, gaze-plus-tap, worldspace panels. Comfort rules that break everything if you ignore them.
1. The golden rules of VR UI
- No HUDs stuck to the face. Fixed-to-camera UI causes nausea fast.
- Big, far, thin. Panels at 1-2m distance, 0.3-1.5m wide, thin for depth perception.
- Always-face-camera billboards for small text. Otherwise readability drops.
- Feedback for every interaction. Hover highlight + controller haptic + sound.
- Never trap user in menu. Always an exit.
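The billboard rule reduces to one rotation per frame. In Three.js, full billboarding is simply `panel.lookAt(camera.position)`; if you only want yaw, so panels stay upright instead of tilting, the angle is plain trigonometry. A minimal sketch — `billboardYaw` is a name chosen here, not a Three.js API:

```js
// Yaw-only billboarding: rotate the panel around Y so its front
// (+Z, the PlaneGeometry convention) points at the camera, without
// ever tilting. Positions are world-space; only x and z matter.
function billboardYaw(panelPos, camPos) {
  return Math.atan2(camPos.x - panelPos.x, camPos.z - panelPos.z);
}

// Per-frame usage (hypothetical names):
// panel.rotation.y = billboardYaw(panel.position, camera.position);
// Full billboarding, tilt included: panel.lookAt(camera.position);
```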
2. Ray-pointer (controllers)
Extend a line from controller into space. It hits a panel. Button under the ray highlights. Trigger press = click.
```js
// In Three.js XR: raycast along the controller's pointing direction.
const controller = renderer.xr.getController(0);
controller.addEventListener('select', onSelect);

const raycaster = new THREE.Raycaster();
const tempMatrix = new THREE.Matrix4();

function onSelect(e) {
  const ctrl = e.target;
  // Use the world matrix, not the local quaternion — the controller
  // may be nested under a group (e.g. for locomotion offsets).
  tempMatrix.identity().extractRotation(ctrl.matrixWorld);
  raycaster.ray.origin.setFromMatrixPosition(ctrl.matrixWorld);
  raycaster.ray.direction.set(0, 0, -1).applyMatrix4(tempMatrix);
  const hits = raycaster.intersectObjects(panel.children);
  if (hits.length) handleClick(hits[0].object);
}
```
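The feedback rule pairs naturally with the ray code above: fire a short haptic pulse on hover or click. WebXR exposes vibration through the input source's `gamepad.hapticActuators`; support varies by device, so guard every step. A defensive sketch (`pulseHaptic` is a name chosen here):

```js
// Fire a short vibration on an XR input source's gamepad, if available.
// intensity is clamped to [0, 1]; durationMs is in milliseconds.
// Returns true if a pulse was actually dispatched.
function pulseHaptic(gamepad, intensity, durationMs) {
  const actuator = gamepad && gamepad.hapticActuators && gamepad.hapticActuators[0];
  if (!actuator || typeof actuator.pulse !== 'function') return false;
  actuator.pulse(Math.min(1, Math.max(0, intensity)), durationMs);
  return true;
}

// The gamepad lives on the XR input source, delivered by the
// controller's 'connected' event:
// controller.addEventListener('connected', (e) => {
//   controller.userData.gamepad = e.data.gamepad;
// });
// Inside onSelect: pulseHaptic(controller.userData.gamepad, 0.5, 30);
```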
3. Pinch (hand tracking)
Quest/Vision Pro hand tracking: thumb + index pinch = "click". Three.js exposes the tracked joints as Object3Ds on the hand object, fed by the XR session's input sources.
```js
const handController = renderer.xr.getHand(0);
// 25 joints exposed as Object3Ds, keyed by WebXR joint name.
function isPinching() {
  const thumb = handController.joints['thumb-tip'];
  const index = handController.joints['index-finger-tip'];
  if (!thumb || !index) return false; // joints populate once tracking starts
  return thumb.position.distanceTo(index.position) < 0.02; // 2 cm
}
```
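A single 2 cm threshold flickers when the fingers hover right at the boundary. The usual fix is hysteresis: the pinch starts below one distance and only releases above a larger one. A pure-logic sketch; the thresholds are tuning guesses, not platform constants:

```js
// Pinch detection with hysteresis. Distances are in meters.
// Start the pinch below 2 cm, release only above 3 cm, so tracking
// jitter at the boundary doesn't fire spurious press/release pairs.
const PINCH_START = 0.02;
const PINCH_END = 0.03;

function updatePinch(wasPinching, distance) {
  return wasPinching ? distance < PINCH_END : distance < PINCH_START;
}

// Per-frame usage with the joints from the snippet above:
// pinching = updatePinch(pinching, thumb.position.distanceTo(index.position));
```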
4. Poke / direct touch
Finger directly pokes a 3D button. Intuitive but needs hit volumes you don't miss.
```js
const indexTip = handController.joints['index-finger-tip'];
const tmp = new THREE.Vector3();
const tipWorld = new THREE.Vector3();

// each frame — compare world positions, since a joint's .position
// is local to the hand object
button.getWorldPosition(tmp);
indexTip.getWorldPosition(tipWorld);
if (tmp.distanceTo(tipWorld) < 0.05) { // 5 cm hit volume
  button.userData.press();
}
```
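The distance check alone fires `press()` every frame the finger stays inside the volume. Track the previous state and fire only on the enter edge, releasing on exit. A sketch:

```js
// Edge-triggered poke: pressed is true only on the frame the finger
// enters the hit volume; released only on the frame it leaves.
const POKE_RADIUS = 0.05; // meters

function updatePoke(state, distance) {
  const inside = distance < POKE_RADIUS;
  return {
    inside,
    pressed: inside && !state.inside,   // enter edge
    released: !inside && state.inside,  // exit edge
  };
}

// Per frame (with the world-space distance from the snippet above):
// pokeState = updatePoke(pokeState, tmp.distanceTo(tipWorld));
// if (pokeState.pressed) button.userData.press();
```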
5. Gaze selection (fallback)
On devices without controllers or hand tracking (e.g. Cardboard-class viewers), the user stares at a button and, after a dwell time, it triggers. Dwell selection is slow, tiring, and inaccessible to some users, so always offer an alternative input path.
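Dwell selection is just an accumulator: while the gaze ray keeps hitting the same target, add frame time; reset when the target changes; trigger at the threshold. The 0-to-1 progress value also drives the usual filling-ring feedback. A sketch with hypothetical names; 1.5 s is a common choice, not a standard:

```js
// Gaze-dwell accumulator. Call once per frame with the currently gazed
// target (or null) and the frame delta in seconds. Returns progress in
// [0, 1]; the caller triggers the action when progress reaches 1.
const DWELL_SECONDS = 1.5;

function createDwell() {
  let target = null;
  let elapsed = 0;
  return function update(gazedTarget, dt) {
    if (gazedTarget !== target) { target = gazedTarget; elapsed = 0; }
    if (target === null) return 0;
    elapsed += dt;
    return Math.min(1, elapsed / DWELL_SECONDS);
  };
}

// const dwell = createDwell();
// const progress = dwell(hits[0] ? hits[0].object : null, delta);
// ring.scale.setScalar(progress);
// if (progress >= 1) activate(hits[0].object);
```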
6. Layout patterns
| Pattern | Use |
|---|---|
| Floating wristwatch menu | Tool selector, always-available |
| Table-top panel | Settings, app-like UI |
| Radial menu | Quick actions, context |
| Floating billboard | Notifications, ephemeral info |
| Walking menu | Discrete choices, "go to room X" |
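Of the patterns above, the radial menu is the one that needs math: N items evenly spaced on a circle around the anchor point. A sketch (`radialLayout` is a name chosen here; item 0 sits at 12 o'clock):

```js
// Positions for n menu items on a circle of the given radius (meters),
// in the menu's local XY plane. Item 0 sits at the top (12 o'clock),
// the rest proceed clockwise.
function radialLayout(n, radius) {
  const items = [];
  for (let i = 0; i < n; i++) {
    const angle = Math.PI / 2 - (i * 2 * Math.PI) / n; // top, then clockwise
    items.push({ x: radius * Math.cos(angle), y: radius * Math.sin(angle) });
  }
  return items;
}

// radialLayout(4, 0.1) gives top/right/bottom/left at 10 cm; then
// menu.children[i].position.set(p.x, p.y, 0) for each computed point.
```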
7. Readability
- Text size: 18-24px-equivalent at 1.5m. Test in HMD.
- Contrast: minimum 4.5:1. HMD optics reduce contrast.
- Avoid pure white on pure black — pixel ghosting.
- MSDF text (S7-08) for crisp text at any distance.
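The "px-equivalent at 1.5m" rule comes from angular size: a glyph that subtends the same visual angle as N CSS pixels on a typical monitor reads about the same. Given a target angle, the world-space height is h = 2·d·tan(θ/2). A sketch; the ~1° per line figure is a comfortable-reading assumption to verify in the HMD, not a spec value:

```js
// World-space height (meters) a text line needs in order to subtend a
// given visual angle at distance d. Comfortable VR body text is roughly
// 1-1.5 degrees per line height (an assumption; always test in the HMD).
function textHeightForAngle(distanceM, angleDeg) {
  const rad = (angleDeg * Math.PI) / 180;
  return 2 * distanceM * Math.tan(rad / 2);
}

// textHeightForAngle(1.5, 1) ≈ 0.026 — about 2.6 cm per line at 1.5 m.
```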
8. Comfort
- Don't move the UI with the head. Fixed worldspace only.
- Panels above eye line = neck strain. Slightly below.
- Transitions: fade, not teleport. Abrupt worldspace changes = sim sickness.
- Haptic feedback on hover/click. Free, massively reassuring.
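The comfort rules above combine into one placement helper: spawn the panel along the user's horizontal forward direction at a fixed distance, drop it slightly below eye height, then leave it fixed in world space. A pure-math sketch; the 1.5 m / 15 cm offsets are reasonable defaults, not standards:

```js
// World-space spawn position for a panel: `distance` meters along the
// head's horizontal forward direction, `drop` meters below eye height.
// The panel stays put afterwards (never head-locked).
function panelSpawnPosition(headPos, forward, distance = 1.5, drop = 0.15) {
  // Flatten forward onto the horizontal plane so head pitch
  // doesn't push the panel into the floor or ceiling.
  const len = Math.hypot(forward.x, forward.z) || 1;
  return {
    x: headPos.x + (forward.x / len) * distance,
    y: headPos.y - drop,
    z: headPos.z + (forward.z / len) * distance,
  };
}

// In Three.js: camera.getWorldDirection(dir); then
// panel.position.copy(panelSpawnPosition(camera.position, dir));
// panel.lookAt(camera.position); // face the user once, at spawn time
```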
9. Three.js tools
- `renderer.xr.getController(n)` and `renderer.xr.getHand(n)`: input sources as Object3Ds
- `XRControllerModelFactory`: controller models
- `XRHandModelFactory`: hand meshes
- `three-mesh-ui` lib: Flexbox-like layout for VR panels
- `HTMLMesh`: DOM → 3D mesh for menus
10. Takeaways
- HTML HUDs don't work in VR. 3D meshes + controller/hand interaction instead.
- Ray-pointer for controllers. Pinch/poke for hands.
- Panels at 1-2m, large, fixed in worldspace.
- Always hover+haptic+sound feedback.
- three-mesh-ui + XRControllerModelFactory get you 90% there.
VR-specific, concept-heavy article. The full demo requires an HMD such as a Quest or Vision Pro; on desktop, a WebXR emulator browser extension can approximate it.