Three.js From Zero · Article S2-08

Authoritative Multiplayer — Prediction & Reconciliation


S2-07 had everyone shouting positions at each other. Fine for cursors. Disaster for gameplay — two players shoot each other simultaneously, whose hit counts? In authoritative multiplayer the server decides and clients catch up. This article covers the three techniques that make server-authority feel responsive: client prediction, server reconciliation, and snapshot interpolation.

The demo simulates a server + two clients in one page. The left panel is the authoritative server view. The right is a client — you drive it with arrow keys, see its predicted movement, and watch reconciliation correct small drifts. Toggle the artificial latency / packet loss to see the techniques cope.

Drive with ← ↑ → ↓ (click the right panel first). Crank latency to 300ms — the client moves instantly because of prediction while the server catches up. Crank packet loss to 20% — reconciliation smooths the gaps. This is what every modern multiplayer shooter does.

Why authoritative at all?

In P2P / fully-trusting setups (like the S2-07 cursor demo), each client just announces "I'm here, my state is X". No cheating prevention, no conflict resolution. Fine for collaboration. Bad for anything where players compete, because any player can lie about their position or score.

An authoritative server owns the canonical state. Clients send inputs ("I pressed forward"), the server simulates the world, broadcasts the result. Cheats become server-side bugs instead of client-side bugs.
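Sketched as code — names like `applyInput`, `players`, and the one-dimensional "physics" are placeholders for illustration, not the demo's actual implementation — the server loop looks something like this:

```javascript
// Hypothetical authoritative server loop: drain each player's queued
// inputs, simulate, then broadcast the result with the last-processed seq.
const players = new Map();               // id -> { x, seq, queue: [] }

function applyInput(state, input, dt) {
  state.x += input.dx * dt;              // trivial 1D movement for the sketch
}

function serverTick() {
  for (const [, p] of players) {
    for (const { seq, input, dt } of p.queue) {
      applyInput(p, input, dt);
      p.seq = seq;                       // remember the last input we applied
    }
    p.queue.length = 0;
  }
  // The authoritative snapshot every client receives
  return [...players].map(([id, p]) =>
    ({ id, x: p.x, lastProcessedSeq: p.seq }));
}
```

Echoing `lastProcessedSeq` back is what lets clients discard confirmed inputs during reconciliation later on.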

                     P2P (S2-07)                          Authoritative (S2-08)
State owner          Each client                          Server
What clients send    Their state                          Their inputs
Cheat resistance     None                                 Good
Used for             Collab tools, cursors, whiteboards   Games, trading, anything with adversaries

Client prediction — the responsiveness trick

Naïve authoritative: press W → send to server → wait 100ms → receive new position → render. The result: input feels like it's lagging 100ms behind.

The fix: apply your own input immediately on the client, and send it to the server. The server's echo confirms (usually) what you already predicted. If the server disagrees, you snap/correct (reconciliation).

// Local tick — runs every frame
let nextSeq = 0;
const pendingInputs = [];   // inputs sent but not yet confirmed by the server

function clientTick(dt) {
  const input = readKeys();
  const seq = nextSeq++;

  // 1. Apply input locally (prediction)
  applyInput(localState, input, dt);
  pendingInputs.push({ seq, input, dt });

  // 2. Send to server
  network.send({ type: 'input', seq, input, dt });
}

The client's render always shows its predicted state. By the time the server's confirmation arrives, you've already seen the result and moved on.

Server reconciliation — when prediction is wrong

Occasionally the server disagrees (packet lost, other player collided with you, physics edge case). The server sends its authoritative state at regular intervals. The client compares:

network.onMessage = (msg) => {
  if (msg.type === 'state') {
    // Rewind to server's state
    localState = msg.state;
    // Drop inputs the server has already processed
    pendingInputs = pendingInputs.filter((i) => i.seq > msg.lastProcessedSeq);
    // Replay pending inputs on top to catch up
    for (const { input, dt } of pendingInputs) {
      applyInput(localState, input, dt);
    }
  }
};

This is the "reconciliation" step. If the server agrees with the client's prediction, replaying the inputs produces the same state the client already has — user sees nothing. If the server disagreed (collision, shot, etc.), replaying produces a slightly different state — user sees a tiny snap that can be smoothed with a lerp.
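One way to do that smoothing — a sketch, with an illustrative 10%-per-frame decay rate and hypothetical function names:

```javascript
// Instead of teleporting the mesh to the corrected state, keep the visual
// error separate from the simulation and bleed it off over a few frames.
let visualError = { x: 0, y: 0, z: 0 };

function onReconciled(correctedPos, renderedPos) {
  // How far the rendered position was from the corrected simulation state
  visualError.x = renderedPos.x - correctedPos.x;
  visualError.y = renderedPos.y - correctedPos.y;
  visualError.z = renderedPos.z - correctedPos.z;
}

function smoothedRenderPos(simPos) {
  // Decay the error ~10% per frame; the mesh glides instead of snapping
  visualError.x *= 0.9; visualError.y *= 0.9; visualError.z *= 0.9;
  return {
    x: simPos.x + visualError.x,
    y: simPos.y + visualError.y,
    z: simPos.z + visualError.z,
  };
}
```

The simulation state stays exact (the server's truth plus replayed inputs); only the rendered position lags it briefly.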

Snapshot interpolation — for remote players

Your own character uses prediction. Other players don't — you can't predict their inputs. Instead, you receive their positions as server snapshots and interpolate between the last two:

// State buffer for each remote player
const snapshots = [];   // [{ time, state }, ...]

network.onMessage = (msg) => {
  if (msg.type === 'snapshot') {
    snapshots.push({ time: msg.time, state: msg.state });
    // Trim to last ~1 second
    while (snapshots.length > 20) snapshots.shift();
  }
};

// Render time: 100ms in the past to have two snapshots to interp between
function renderRemote() {
  const renderTime = performance.now() - INTERP_DELAY_MS;
  const [a, b] = snapshotsAroundTime(renderTime);
  if (!a || !b) return;
  const alpha = (renderTime - a.time) / (b.time - a.time);
  remoteMesh.position.lerpVectors(a.state.pos, b.state.pos, alpha);
}
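`snapshotsAroundTime` is left undefined above. A plausible implementation — taking the buffer as an explicit argument here for self-containment, whereas the snippet above closes over it — scans for the pair bracketing the render time:

```javascript
// Find the two buffered snapshots bracketing renderTime, if any.
// Assumes `snapshots` is sorted ascending by time (push order guarantees it).
function snapshotsAroundTime(snapshots, renderTime) {
  for (let i = 0; i < snapshots.length - 1; i++) {
    if (snapshots[i].time <= renderTime && renderTime <= snapshots[i + 1].time) {
      return [snapshots[i], snapshots[i + 1]];
    }
  }
  return [null, null];   // render time outside the buffer — caller skips a frame
}
```

The linear scan is fine at ~20 entries; a binary search only pays off with much larger buffers.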

An interpolation delay on the order of 100ms is typical of competitive shooters in the Counter-Strike / Valorant mold. Remote players render slightly in the past, but consistently — everyone agrees on when things happened.

The three clocks

Authoritative multiplayer has three conceptual times:

Clock            What it is                  Use for
Server time      Server's wall clock         Timestamping events, lag compensation
Client present   Now, on the client          Rendering your own predicted character
Client past      ~100ms before present       Rendering remote players (interpolation)

A well-structured client keeps these three times explicit and converts between them consciously. Mixing them up is how you get the classic "I shot them and they didn't die" bug.
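A common way to keep server time explicit is to estimate the clock offset from a ping exchange. A sketch, assuming a hypothetical pong message that echoes our send timestamp alongside the server's clock (the half-RTT one-way estimate is an approximation, not a guarantee):

```javascript
// Estimate (serverTime - clientTime) so the client can convert between clocks.
let clockOffsetMs = 0;

function onPong(clientSendTime, serverTime, clientRecvTime) {
  const rtt = clientRecvTime - clientSendTime;
  // Assume the one-way trip is half the round trip — a common approximation
  clockOffsetMs = serverTime + rtt / 2 - clientRecvTime;
}

function toServerTime(clientTime) {
  return clientTime + clockOffsetMs;         // "server time" clock
}

function remoteRenderTime(clientTime, interpDelayMs) {
  return toServerTime(clientTime) - interpDelayMs;   // "client past" clock
}
```

In practice you'd smooth `clockOffsetMs` over several pings rather than trust a single sample.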

Lag compensation — hitscan over time

Player A shoots player B. B was at position X when A fired, but by the time A's packet arrives, B has moved to Y. Whose timing wins?

Answer: rewind the server's state to the time A pressed fire, check if the shot hits B at that historical position, apply the hit. This is lag compensation and every competitive shooter does it:

// Server keeps a short history of snapshots
const history = new RingBuffer(64);   // ~1s of history at a 60Hz tick

function onClientShot(msg) {
  const pastState = history.at(msg.clientTime);
  const hit = raycastAgainst(pastState, msg.ray);
  if (hit) applyDamage(hit.player);
}

The window you rewind should be bounded (say, max 200ms) so cheaters can't claim they fired 5 seconds ago.
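A sketch of that bound — the 200ms cap and the function name are illustrative:

```javascript
// Clamp how far back the server will rewind for lag compensation.
const MAX_REWIND_MS = 200;

function rewindTime(serverNow, claimedClientTime) {
  // Never rewind into the future, and never further back than the cap —
  // a cheater claiming to have fired 5 seconds ago gets clamped to 200ms.
  return Math.max(serverNow - MAX_REWIND_MS,
                  Math.min(claimedClientTime, serverNow));
}
```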

Interest management — don't send everything to everyone

A 100-player game broadcasting every player's position to every other player is O(n²) bandwidth. With 100 players at ~100 bytes each, the full snapshot is ~10kB; sending it to 100 clients at 60Hz is 10kB × 100 × 60 = 60MB/s. That kills your server.

Fix: area of interest. Each player only receives updates for entities they can see — nearby players, visible objects, same room. Reduces bandwidth by 90-99%.

Technique                      How
Distance-based                 Only send entities within radius R
Grid sectors                   Divide world into cells, send only your cell + neighbors
Octree                         Spatial index, query per-client per-tick
PVS (potentially visible set)  Precomputed which areas can see which — FPS-style
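The distance-based variant is only a few lines. A sketch, with illustrative names and a 2D (x/z) distance check:

```javascript
// Per tick, each player only receives entities within radius R of them.
const R = 50;

function visibleTo(player, entities) {
  const r2 = R * R;
  return entities.filter((e) => {
    const dx = e.x - player.x;
    const dz = e.z - player.z;
    return dx * dx + dz * dz <= r2;   // squared distance avoids a sqrt per entity
  });
}
```

The filter runs per client per tick, so for large worlds you'd back it with one of the spatial structures above instead of a flat array scan.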

Wire format — serialize tight

JSON is fine for presence. For game state, it's 5-10× bigger than necessary. Production games use binary:

// Dumb but effective: pack position into a DataView as Int16 (±32767)
function encodePos(x, y, z, buf, offset) {
  buf.setInt16(offset + 0, Math.round(x * 100));   // 1cm precision, ±327m range
  buf.setInt16(offset + 2, Math.round(y * 100));
  buf.setInt16(offset + 4, Math.round(z * 100));
}

// Delta compression: only send fields that changed
// Bitfield up front signals which fields are present
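A minimal sketch of that bitfield scheme, reusing the same Int16 packing — `encodeDelta` and the flag names are illustrative, not a real wire format:

```javascript
// Delta compression sketch: a 1-byte bitfield says which fields follow.
const F_X = 1, F_Y = 2, F_Z = 4;       // one bit per field

function encodeDelta(prev, curr, view, offset) {
  let flags = 0;
  let pos = offset + 1;                // byte 0 is reserved for the bitfield
  if (curr.x !== prev.x) { flags |= F_X; view.setInt16(pos, Math.round(curr.x * 100)); pos += 2; }
  if (curr.y !== prev.y) { flags |= F_Y; view.setInt16(pos, Math.round(curr.y * 100)); pos += 2; }
  if (curr.z !== prev.z) { flags |= F_Z; view.setInt16(pos, Math.round(curr.z * 100)); pos += 2; }
  view.setUint8(offset, flags);
  return pos - offset;                 // bytes written: 1 + 2 per changed field
}
```

An unchanged position costs 1 byte instead of 6; the decoder reads the bitfield first and pulls only the fields it names.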

Libraries that do this well: geckos.io (UDP over WebRTC), netcode.io, Colyseus's @colyseus/schema.

Platform pick

Platform               When it wins
Colyseus               Rooms + schemas + matchmaking baked in. Great for session-based games.
PartyKit               Edge WebSockets on Cloudflare. Cheap, global.
Hathora                Managed game servers, persistent rooms, instance management.
geckos.io              UDP-over-WebRTC for low-latency action games.
Socket.IO + bare Node  Full control, more code to write.

Common first-time pitfalls

  • Rubber-banding on the slightest disagreement. Don't snap to server state — lerp. Compute the error, interpolate it away over a few frames.
  • Predicting collisions. Clients can predict their own motion but not arbitrary physics interactions. The server resolves, clients correct.
  • Lost-packet spiral. If an input message drops, the server's state diverges until the next confirmed input arrives. Include a seq number so the server knows what's missing.
  • Tick rate mismatch. The client ticks at 60Hz, the server at 30Hz. Interpolate — don't try to map them 1:1.
  • Sending full state. At 60Hz, even small scenes saturate uplinks. Delta compress, and/or lower the broadcast rate to 20Hz.
  • Trusting the client. "It's just my game, cheaters don't matter" — until you add a leaderboard or a trade. Plan for server authority from day 1.

Exercises

  1. Ping display: measure round-trip time on each input → confirmation cycle. Draw a ping number in the stats panel.
  2. Snapshot compression: switch the demo from full-state snapshots to delta snapshots (only changed fields).
  3. Simple lag compensation: add a click-to-shoot behavior, server rewinds 100ms of position history to validate hits.

What's next

Article S2-09 — GPGPU with TSL Compute Nodes. 100,000 boids, all simulated on the GPU. Plus the WebGPU landscape, compute kernels, and benchmark vs CPU.