Three.js From Zero · Article s5-02
Path Tracing in the Browser
The ground-truth algorithm. Shoot rays, bounce in random directions, accumulate. Over many samples it converges to a physically correct render. Slow, but it is how every modern film renderer works.
1. The algorithm, one page
for each pixel:
    color = 0
    for each sample:
        throughput = 1
        ray = camera → pixel
        for each bounce (up to N):
            hit = trace(ray)
            if !hit: color += throughput * sky; break
            color += throughput * emissive(hit)
            if russian_roulette_terminate(throughput): break
            new_dir = sample_hemisphere_cosine_weighted(hit.normal)
            throughput *= brdf(hit) * dot(new_dir, hit.normal) / pdf(new_dir)
            ray = Ray(hit.pos + eps * hit.normal, new_dir)
    final[pixel] = color / samples
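The loop is small enough to run end to end on the CPU. Below is a hypothetical single-pixel tracer in plain JavaScript — a sketch, not the demo shader — with an assumed scene of one diffuse sphere (albedo 0.5) under a constant sky of 1.0. With cosine-weighted sampling, brdf·cos/pdf collapses to the albedo, and a ray leaving a convex sphere never re-hits it, so every camera sample that hits the sphere evaluates to exactly albedo × sky = 0.5:

```javascript
const SPHERE = { center: [0, 0, -3], r: 1, albedo: 0.5 };
const SKY = 1.0, EPS = 1e-4, MAX_BOUNCES = 4;

const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const axpy = (a, s, b) => [a[0] + s * b[0], a[1] + s * b[1], a[2] + s * b[2]];
const cross = (a, b) => [a[1] * b[2] - a[2] * b[1],
                         a[2] * b[0] - a[0] * b[2],
                         a[0] * b[1] - a[1] * b[0]];
const normalize = v => axpy([0, 0, 0], 1 / Math.hypot(...v), v);

function hitSphere(orig, dir) {
  const oc = axpy(orig, -1, SPHERE.center);
  const b = dot(oc, dir), c = dot(oc, oc) - SPHERE.r * SPHERE.r;
  const disc = b * b - c;
  if (disc < 0) return -1;                    // ray misses
  const t = -b - Math.sqrt(disc);             // near root
  return t > 0 ? t : -b + Math.sqrt(disc);    // else far root (inside sphere)
}

function cosineHemisphere(n) {
  const r = Math.sqrt(Math.random()), phi = 2 * Math.PI * Math.random();
  const up = Math.abs(n[2]) < 0.999 ? [0, 0, 1] : [1, 0, 0];
  const t = normalize(cross(up, n)), bn = cross(n, t);
  const z = Math.sqrt(1 - r * r);
  return axpy(axpy(axpy([0, 0, 0], r * Math.cos(phi), t),
                   r * Math.sin(phi), bn), z, n);
}

function radiance(orig, dir) {
  let throughput = 1;
  for (let bounce = 0; bounce < MAX_BOUNCES; bounce++) {
    const t = hitSphere(orig, dir);
    if (t <= 0) return throughput * SKY;      // escaped: sky light
    const p = axpy(orig, t, dir);
    const n = normalize(axpy(p, -1, SPHERE.center));
    throughput *= SPHERE.albedo;              // brdf * cos / pdf = albedo
    orig = axpy(p, EPS, n);                   // offset to avoid self-hit
    dir = cosineHemisphere(n);
  }
  return 0;                                   // path too long: drop it
}

let pixel = 0;
const SAMPLES = 64;
for (let s = 0; s < SAMPLES; s++) {
  pixel += radiance([0, 0, 0], [0, 0, -1]) / SAMPLES;
}
```

The fragment-shader version in section 4 is the same code with `vec3` in place of arrays and a per-pixel random seed.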
2. Why is it the ground truth?
Every surface, at every point, scatters light in infinitely many directions. Direct lighting approximates this with "only the light direction." Probes approximate it with spherical harmonics. Path tracing evaluates the real integral by Monte Carlo: shoot enough random rays and average.
By the law of large numbers, that average converges to the exact solution of the rendering equation. Given infinite samples, you get truth.
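Written out (standard rendering-equation notation, nothing specific to this article's code), the integral and the Monte Carlo estimator the pseudocode computes are:

```latex
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\, (n \cdot \omega_i)\, \mathrm{d}\omega_i

L_o \approx L_e + \frac{1}{N} \sum_{k=1}^{N}
  \frac{f_r(x, \omega_k, \omega_o)\, L_i(x, \omega_k)\, (n \cdot \omega_k)}{p(\omega_k)}
```

Each bounce in the loop evaluates one term of this sum; `throughput` carries the accumulated fraction f_r·cos/p across bounces.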
3. Building blocks
Ray-sphere intersection
// assumes ray.dir is normalized, so the quadratic's a = 1 and b is the half-b
float hit = -1.0;
vec3 oc = ray.origin - sphere.center;
float b = dot(oc, ray.dir);
float c = dot(oc, oc) - sphere.r * sphere.r;
float disc = b * b - c;
if (disc > 0.0) {
    hit = -b - sqrt(disc);                  // near root
    if (hit < 0.0) hit = -b + sqrt(disc);   // origin inside the sphere: far root
}
Cosine-weighted hemisphere sample
vec3 cosineHemisphere(vec2 xi, vec3 n) {
    float r = sqrt(xi.x);
    float phi = 6.28318530718 * xi.y;  // 2 * pi
    // build an orthonormal basis around n
    vec3 up = abs(n.z) < 0.999 ? vec3(0.0, 0.0, 1.0) : vec3(1.0, 0.0, 0.0);
    vec3 tangent = normalize(cross(up, n));
    vec3 binorm = cross(n, tangent);
    return tangent * (r * cos(phi)) + binorm * (r * sin(phi)) + n * sqrt(1.0 - xi.x);
}
xi is a pair of uniform random numbers in [0, 1). Output: a unit direction in the hemisphere around n, distributed so that higher cos(θ) is sampled more often (pdf = cos(θ)/π) — matching the cosine term in the Lambertian BRDF's contribution, so brdf · cos / pdf collapses to the albedo.
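You can verify the distribution numerically. Under the pdf cos(θ)/π, the expected value of cos(θ) is exactly 2/3; a JavaScript port of the sampler (helpers written inline, names my own) should reproduce that:

```javascript
const cross = (a, b) => [a[1] * b[2] - a[2] * b[1],
                         a[2] * b[0] - a[0] * b[2],
                         a[0] * b[1] - a[1] * b[0]];
const normalize = v => {
  const l = Math.hypot(v[0], v[1], v[2]);
  return [v[0] / l, v[1] / l, v[2] / l];
};

function cosineHemisphere(xi, n) {
  const r = Math.sqrt(xi[0]);
  const phi = 2 * Math.PI * xi[1];
  const up = Math.abs(n[2]) < 0.999 ? [0, 0, 1] : [1, 0, 0];
  const t = normalize(cross(up, n));
  const bn = cross(n, t);
  const z = Math.sqrt(1 - xi[0]);
  return [
    t[0] * r * Math.cos(phi) + bn[0] * r * Math.sin(phi) + n[0] * z,
    t[1] * r * Math.cos(phi) + bn[1] * r * Math.sin(phi) + n[1] * z,
    t[2] * r * Math.cos(phi) + bn[2] * r * Math.sin(phi) + n[2] * z,
  ];
}

// average cos(θ) over many samples around an arbitrary normal
const n = normalize([1, 2, 3]);
let sum = 0;
const N = 200000;
for (let i = 0; i < N; i++) {
  const d = cosineHemisphere([Math.random(), Math.random()], n);
  sum += d[0] * n[0] + d[1] * n[1] + d[2] * n[2]; // cos(θ) = n · d
}
const meanCos = sum / N;
```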
Russian roulette
float p = min(max(throughput.r, max(throughput.g, throughput.b)), 1.0);
if (rand() > p) break;   // kill the path with probability 1 - p
throughput /= p;         // boost survivors by 1 / p
Probabilistically kill dim paths. Still unbiased: paths stop early, but survivors are divided by the survival probability, so the expected contribution is unchanged.
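The unbiasedness claim is easy to check with a toy model (not part of the shader): a path with throughput t survives with probability p = t and then contributes t / p, else contributes 0. The mean over many trials recovers t:

```javascript
// Russian roulette on a scalar throughput: E[estimate] should equal t.
function rouletteEstimate(t, trials) {
  const p = Math.min(1, t);                 // survival probability
  let sum = 0;
  for (let i = 0; i < trials; i++) {
    if (Math.random() <= p) sum += t / p;   // survivor, boosted by 1 / p
    // else: path killed early, contributes 0
  }
  return sum / trials;
}

const est = rouletteEstimate(0.3, 200000);
```

Most samples cost nothing (the path dies), yet the average is still correct — that is the whole trick.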
4. Live demo — progressive path tracer (Cornell)
A tiny path tracer, written as a fragment shader. Each frame adds one sample per pixel. Watch noise → smooth photo as samples accumulate.
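The progressive part is just a running average. In a typical Three.js setup this is done with two ping-pong render targets blended by a factor of 1/frameCount; the sketch below (plain numbers, assumed helper names) shows that the incremental mean equals the batch mean:

```javascript
// What the accumulation pass computes per pixel each frame:
// lerp(prevMean, newSample, 1 / frameCount)
function accumulate(prevMean, newSample, frameCount) {
  return prevMean + (newSample - prevMean) / frameCount;
}

const samples = [0.8, 0.2, 0.5, 0.9, 0.1]; // one new sample per frame
let mean = 0;
samples.forEach((s, i) => { mean = accumulate(mean, s, i + 1); });

const batch = samples.reduce((a, b) => a + b, 0) / samples.length;
// mean equals batch, up to float rounding
```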
5. The speed-ups (AAA stack)
- Next event estimation (NEE): at every bounce, also cast a shadow ray directly toward a light. Typically cuts variance by an order of magnitude.
- BVH acceleration: a bounding volume hierarchy makes ray-scene intersection O(log n) instead of O(n).
- MIS (multiple importance sampling): weight BRDF samples against light samples (e.g. with the power heuristic) so no light path is double-counted.
- ReSTIR: resampled importance sampling plus spatiotemporal reuse. State of the art for many-light scenes.
- Denoise: run a learned denoiser (NVIDIA's OptiX denoiser, Intel Open Image Denoise) on a 1-sample-per-pixel input. Near-reference quality at game budgets.
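The MIS weights are two one-liners (the forms from Veach's thesis; nf/ng are sample counts, fPdf/gPdf the two strategies' pdfs for the same direction):

```javascript
// Balance heuristic: for any direction, the two weights sum to 1,
// so BRDF sampling and light sampling together count each path once.
function balanceHeuristic(nf, fPdf, ng, gPdf) {
  return (nf * fPdf) / (nf * fPdf + ng * gPdf);
}

// Power heuristic (exponent 2): sharper, usually lower variance.
function powerHeuristic(nf, fPdf, ng, gPdf) {
  const f = nf * fPdf, g = ng * gPdf;
  return (f * f) / (f * f + g * g);
}

// Same direction seen from both strategies: weights complement each other.
const wBrdf  = balanceHeuristic(1, 0.4, 1, 0.9);
const wLight = balanceHeuristic(1, 0.9, 1, 0.4);
const wPow   = powerHeuristic(1, 0.4, 1, 0.9);
```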
6. What ships in the browser
- three-gpu-pathtracer (Garrett Johnson) — a full path tracer built on top of Three.js: BVH, MIS, progressive accumulation, MeshPhysicalMaterial support. The de facto path tracer of the Three.js ecosystem.
- WebGPU is fast enough for interactive path tracing in the browser today.
7. When to use path tracing
- Offline rendering / cinematics: always.
- Lightmap baking: yes (runs once).
- Product viewers, NFT galleries: yes (user sits still, accumulates 30+ spp in a second).
- Real-time games: hybrid only (primary rays raster, reflections/GI via PT).
8. Takeaways
- Path tracing = shoot rays, bounce in random directions, average samples. Ground truth.
- Cosine-weighted hemisphere sampling matches the Lambert BRDF: the cos/pdf terms cancel, so diffuse throughput is just the albedo — minimal variance per bounce.
- BVH + NEE + MIS + denoise = how modern PT runs in real time.
- three-gpu-pathtracer is the Three.js ecosystem's answer. Production quality.