Three.js From Zero · Article s9-06
WebRTC Video + Screen Share
Webcam video on a 3D avatar. Screen share on an in-game terminal. MediaStream → HTMLVideoElement → VideoTexture → mesh.
1. Getting video
// Webcam
const camStream = await navigator.mediaDevices.getUserMedia({ video: true });
// Screen share
const scrStream = await navigator.mediaDevices.getDisplayMedia({ video: true, audio: true });
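Both calls reject with a DOMException when the user denies the prompt, no device exists, or the camera is already claimed. A minimal sketch (the helper name is mine) maps the common error names to user-facing messages:

```javascript
// Map a getUserMedia/getDisplayMedia rejection to a user-facing message.
// The error names come from the MediaDevices spec.
function describeMediaError(err) {
  switch (err.name) {
    case 'NotAllowedError':  return 'Permission denied — check browser settings.';
    case 'NotFoundError':    return 'No camera or screen source found.';
    case 'NotReadableError': return 'Device is already in use by another app.';
    default:                 return `Could not start capture: ${err.name}`;
  }
}

// Browser usage:
// try {
//   const stream = await navigator.mediaDevices.getUserMedia({ video: true });
// } catch (err) {
//   console.warn(describeMediaError(err));
// }
```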
2. Stream → HTMLVideoElement
const video = document.createElement('video');
video.srcObject = stream;
video.muted = true;       // set BEFORE play() — unmuted autoplay is blocked
video.playsInline = true; // keep iOS Safari from going fullscreen
await video.play();
3. Video → Three.js texture
const tex = new THREE.VideoTexture(video); // re-uploads the current frame each render
tex.colorSpace = THREE.SRGBColorSpace;
const mesh = new THREE.Mesh(
new THREE.PlaneGeometry(4, 2.25), // 16:9
new THREE.MeshBasicMaterial({ map: tex })
);
scene.add(mesh);
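The 4 × 2.25 plane assumes a 16:9 feed, but the camera reports its real dimensions once metadata loads. A small helper (the name is hypothetical) sizes the plane to match whatever aspect ratio arrives:

```javascript
// Compute plane dimensions matching the video's aspect ratio.
// `width` is the desired plane width in world units.
function planeSizeForVideo(videoWidth, videoHeight, width = 4) {
  const aspect = videoWidth / videoHeight;
  return { width, height: width / aspect };
}

// Browser usage (after video metadata loads):
// video.addEventListener('loadedmetadata', () => {
//   const { width, height } = planeSizeForVideo(video.videoWidth, video.videoHeight);
//   mesh.geometry.dispose();
//   mesh.geometry = new THREE.PlaneGeometry(width, height);
// });
```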
4. Live demo — webcam on a plane
(Interactive demo — click "Start webcam" to see your camera feed mapped onto the plane.)
5. In-game screen share terminal
- Player clicks in-game "monitor" mesh → browser prompts for screen share.
- Stream goes to VideoTexture on that mesh.
- Other players receive the share over the same WebRTC connection used for voice.
Used in VR productivity apps (Horizon Workrooms, Immersed).
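The flow above can be sketched as one click handler, assuming `monitorMesh` is a mesh with a `MeshBasicMaterial` and `THREE` is in scope (the function name is mine, not from the article):

```javascript
// On click of the in-game monitor: prompt for a screen share and
// map the resulting stream onto the monitor's material.
async function shareToMonitor(monitorMesh) {
  const stream = await navigator.mediaDevices.getDisplayMedia({ video: true });
  const video = document.createElement('video');
  video.srcObject = stream;
  video.muted = true;
  video.playsInline = true;
  await video.play();

  const tex = new THREE.VideoTexture(video);
  tex.colorSpace = THREE.SRGBColorSpace;
  monitorMesh.material.map = tex;
  monitorMesh.material.needsUpdate = true; // material switched from unmapped to mapped

  // Blank the monitor when the user stops sharing from the browser UI.
  stream.getVideoTracks()[0].addEventListener('ended', () => {
    monitorMesh.material.map = null;
    monitorMesh.material.needsUpdate = true;
  });
}
```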
6. Sending via WebRTC
// Add all tracks (audio + video)
stream.getTracks().forEach(t => pc.addTrack(t, stream));
// Remote receives via ontrack — same as voice
Same RTCPeerConnection pattern as S9-05, just more tracks.
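On the receiving side, the `ontrack` handler can route an incoming video track straight onto a mesh; `pc` is the RTCPeerConnection from S9-05 and `screenMesh` is assumed from the surrounding scene setup:

```javascript
// Remote side: when a video track arrives, wire it to the mesh's texture.
function attachRemoteVideo(pc, screenMesh) {
  pc.ontrack = (event) => {
    if (event.track.kind !== 'video') return;
    const video = document.createElement('video');
    video.srcObject = event.streams[0];
    video.muted = true; // remote audio is typically routed separately
    video.playsInline = true;
    video.play();

    const tex = new THREE.VideoTexture(video);
    tex.colorSpace = THREE.SRGBColorSpace;
    screenMesh.material.map = tex;
    screenMesh.material.needsUpdate = true;
  };
}
```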
7. Constraints
const stream = await navigator.mediaDevices.getUserMedia({
video: {
width: { ideal: 1280 },
height: { ideal: 720 },
frameRate: { max: 30 },
facingMode: 'user', // or 'environment' for the back camera
},
});
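If the constraints can't be satisfied (e.g. `{ exact: … }` values), getUserMedia rejects with OverconstrainedError. One pattern, sketched here with hypothetical names, is a fallback ladder of progressively looser constraints:

```javascript
// Progressively looser video constraints, tried in order.
const CONSTRAINT_TIERS = [
  { width: { ideal: 1280 }, height: { ideal: 720 }, frameRate: { max: 30 } },
  { width: { ideal: 640 }, height: { ideal: 360 } },
  true, // any camera at all
];

async function getCameraWithFallback() {
  for (const video of CONSTRAINT_TIERS) {
    try {
      return await navigator.mediaDevices.getUserMedia({ video });
    } catch (err) {
      if (err.name !== 'OverconstrainedError') throw err; // real failure — surface it
    }
  }
  throw new Error('No usable camera configuration');
}
```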
8. Bandwidth control
const sender = pc.getSenders().find(s => s.track?.kind === 'video');
const params = sender.getParameters();
if (!params.encodings.length) params.encodings = [{}]; // some browsers start empty
params.encodings[0].maxBitrate = 500_000; // 500 kbps — modify in place, don't replace
await sender.setParameters(params);
9. Takeaways
- getUserMedia for cam, getDisplayMedia for screen.
- HTMLVideoElement as intermediate.
- VideoTexture maps it onto a mesh.
- Same WebRTC plumbing as voice, just add video tracks.
- Constraints: resolution, frameRate, facingMode.