Three.js From Zero · Article 02
Geometry & the Mesh
Every visible thing in Three.js is a Mesh — and a Mesh is two
things bolted together: a Geometry and a Material. This article
is about the geometry half. We're going to cover every built-in primitive, understand
how geometry is actually stored under the hood, build a custom shape from scratch, and then
look at how meshes relate to each other in the scene graph.
Change the dropdown in the demo to swap between primitives live — that's the same
Mesh, same material, with just the geometry property replaced.
Every concept in this article is visible right there.
What a Mesh actually is
From Mesh.js in the Three.js source, stripped to its essence:
```javascript
class Mesh extends Object3D {
  constructor(geometry = new BufferGeometry(), material = new MeshBasicMaterial()) {
    super();
    this.geometry = geometry;
    this.material = material;
  }
  // ...raycast, clone, etc.
}
```
That's the whole idea. A Mesh owns a geometry and a material. The scene graph owns the
mesh's transform (position, rotation, scale) via the Object3D it inherits from.
Material is the subject of Article 03. Geometry is
the subject of this article.
The primitives tour
Three.js ships with a catalog of primitive geometries. Nine of them cover 90% of what you'll ever need:
| Class | Args | When you reach for it |
|---|---|---|
| BoxGeometry | w, h, d, ws, hs, ds | Walls, crates, UI cards, debug proxies. |
| SphereGeometry | r, widthSeg, heightSeg | Planets, bubbles, orbs, env probes. |
| PlaneGeometry | w, h, ws, hs | Floors, banners, billboards, shader canvases. |
| CircleGeometry | r, segments | Discs, radial UI, flat buttons in 3D. |
| ConeGeometry | r, h, rSeg, hSeg | Arrows, party hats, spotlights. |
| CylinderGeometry | rTop, rBot, h, rSeg | Pillars, pipes, barrels, shafts. |
| TorusGeometry | r, tube, radSeg, tubSeg | Rings, halos, life savers. |
| TorusKnotGeometry | r, tube, tubSeg, radSeg, p, q | Flashy demos (like this series). |
| IcosahedronGeometry | r, detail | Low-poly rocks, asteroids, d20s. |
There are a handful more — DodecahedronGeometry, OctahedronGeometry,
TetrahedronGeometry, RingGeometry, LatheGeometry,
TubeGeometry, ExtrudeGeometry, ShapeGeometry,
EdgesGeometry, WireframeGeometry, CapsuleGeometry. When
you need them you'll find them, but don't memorize the list.
Try the dropdown above and watch the vertex / triangle count in the bottom-left badge change. That count is what actually matters for performance — not the geometry class.
Under the hood: BufferGeometry
Every primitive above is, ultimately, a BufferGeometry. This is the only
geometry class Three.js ships in modern versions (the old Geometry class was
removed in r125). A BufferGeometry is a bag of typed arrays — one array per
attribute — plus an optional index.
Attributes
An attribute is per-vertex data. The common ones:
- position — vec3 — where the vertex is in local space. Required.
- normal — vec3 — the surface direction at that vertex. Required for lighting.
- uv — vec2 — texture coordinate (0..1) for sampling textures.
- color — vec3/4 — per-vertex color, used when material.vertexColors = true.
- tangent — vec4 — needed for correct normal mapping. Computed on demand with BufferGeometryUtils.computeTangents().

You can also add your own: aWindStrength, aRandomSeed, whatever your shader needs.
The index
A triangle has 3 vertices, but two triangles that share an edge share 2 vertices. If you stored those shared vertices twice, you'd waste memory and GPU cache. So you store each unique vertex once in the attributes, and use an index array of integers to say "triangle 1 uses vertices [0, 1, 2], triangle 2 uses [1, 2, 3]."
Indexing unlocks roughly a 3× memory win on most geometry, which is why all the built-in primitives are indexed.
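The saving is easy to make concrete without any Three.js at all. Here's a plain-JS sketch of a quad stored both ways — six full vertices versus four unique vertices plus six small indices:

```javascript
// A unit quad as two triangles. Non-indexed: every triangle carries all
// three of its vertices, so the two vertices on the shared edge are stored twice.
const nonIndexed = new Float32Array([
  0, 0, 0,  1, 0, 0,  1, 1, 0,   // triangle A
  0, 0, 0,  1, 1, 0,  0, 1, 0,   // triangle B (repeats two vertices)
]);

// Indexed: four unique vertices, plus six integers saying which
// vertices each triangle uses.
const positions = new Float32Array([
  0, 0, 0,  1, 0, 0,  1, 1, 0,  0, 1, 0,
]);
const index = new Uint16Array([0, 1, 2, 0, 2, 3]);

const nonIndexedBytes = nonIndexed.byteLength;                // 6 verts * 3 floats * 4 bytes = 72
const indexedBytes = positions.byteLength + index.byteLength; // 48 + 12 = 60
console.log(nonIndexedBytes, indexedBytes); // 72 60
```

On a real mesh the gap is much wider than this toy example suggests: every duplicated vertex duplicates its normal and uv too, and the per-index cost (2 or 4 bytes) is tiny next to a full vertex.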
Building a BufferGeometry by hand
Enough theory. Here's a single triangle:
```javascript
const geometry = new THREE.BufferGeometry();
const positions = new Float32Array([
  // x, y, z
  -1, 0, 0,
   1, 0, 0,
   0, 1, 0,
]);
geometry.setAttribute('position', new THREE.BufferAttribute(positions, 3));
geometry.computeVertexNormals();
```
That's it. Add it to a mesh, add the mesh to the scene, and you have a triangle.
computeVertexNormals() fills in the normal attribute for you based
on the triangle winding — critical if your material responds to light.
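Under the hood this is just vector math. A minimal plain-JS sketch (no Three.js needed) of the face-normal step for the triangle above — where shared vertices exist, computeVertexNormals() then averages the adjacent face normals per vertex:

```javascript
// A face normal is the normalized cross product of two triangle edges.
// Winding order decides which way it points — counter-clockwise
// vertices (as seen from the camera) face +Z here.
const a = [-1, 0, 0], b = [1, 0, 0], c = [0, 1, 0]; // the triangle from above
const e1 = [b[0] - a[0], b[1] - a[1], b[2] - a[2]]; // edge a -> b
const e2 = [c[0] - a[0], c[1] - a[1], c[2] - a[2]]; // edge a -> c
const n = [
  e1[1] * e2[2] - e1[2] * e2[1],
  e1[2] * e2[0] - e1[0] * e2[2],
  e1[0] * e2[1] - e1[1] * e2[0],
];
const len = Math.hypot(n[0], n[1], n[2]);
const normal = n.map((v) => v / len);
console.log(normal); // [0, 0, 1] — the triangle faces the camera
```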
The custom wavy plane in the demo
Select Custom wavy plane in the dropdown. That geometry is built like this:
```javascript
function makeWavyPlane(size = 2, segs = 64, amp = 0.15) {
  const g = new THREE.PlaneGeometry(size, size, segs, segs);
  const pos = g.attributes.position;
  for (let i = 0; i < pos.count; i++) {
    const x = pos.getX(i);
    const y = pos.getY(i);
    const z = Math.sin(x * 3) * Math.cos(y * 3) * amp;
    pos.setZ(i, z);
  }
  pos.needsUpdate = true;
  g.computeVertexNormals(); // recompute so lighting still works
  return g;
}
```
Three steps every custom geometry goes through:
- Start with a well-subdivided base (a PlaneGeometry here — you could also start with an empty BufferGeometry and provide everything yourself).
- Modify the position attribute in a loop.
- Mark needsUpdate and recompute normals.
Useful BufferGeometry methods
- computeVertexNormals() — recomputes normals from positions + indices. Use after any CPU-side vertex displacement.
- computeBoundingBox() / computeBoundingSphere() — used for frustum culling and raycasting. Computed lazily, but you can force them.
- center() — translates the geometry so its bounding box is centered on (0,0,0). Useful when models load off-center.
- translate(x, y, z), rotateX(a), scale(x, y, z) — bake a transform into the vertices. Different from transforming the mesh: this changes the geometry data itself.
- dispose() — frees the GPU buffers. Critical when you replace a geometry, like the demo does on every dropdown change.
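To see why baking differs from transforming the mesh, here's a plain-JS sketch of the idea behind translate(). (The real method builds a Matrix4 and runs it through applyMatrix4(), so it also handles rotation and scale; this sketch covers translation only.)

```javascript
// Baking a translation: walk the position array and add the offset to
// every vertex. The vertex data itself changes — unlike mesh.position,
// which is only a transform applied at draw time.
function bakeTranslate(positions, tx, ty, tz) {
  for (let i = 0; i < positions.length; i += 3) {
    positions[i] += tx;
    positions[i + 1] += ty;
    positions[i + 2] += tz;
  }
  return positions;
}

const tri = new Float32Array([-1, 0, 0, 1, 0, 0, 0, 1, 0]);
bakeTranslate(tri, 0, 2, 0); // lift the whole triangle up by 2
console.log(Array.from(tri)); // [-1, 2, 0, 1, 2, 0, 0, 3, 0]
```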
Wireframe, points, lines — same geometry, different draw mode
Flip the render as dropdown above. What actually changes:
- Solid — new THREE.Mesh(geom, material). Triangle list.
- Wireframe — same mesh, but material.wireframe = true. Much simpler than swapping to LineSegments when you want to keep the lighting setup.
- Points — new THREE.Points(geom, new THREE.PointsMaterial({ size: 0.02 })). Draws a sprite per vertex.
For a crisper wireframe overlay without the filled faces showing through, reach for
EdgesGeometry + LineSegments:
```javascript
const edges = new THREE.EdgesGeometry(geom, 15); // 15° threshold
const lines = new THREE.LineSegments(
  edges,
  new THREE.LineBasicMaterial({ color: 'white' })
);
```
The scene graph — how meshes compose
Every Object3D (including every Mesh) has:
- position — Vector3.
- rotation — Euler (order matters: default is 'XYZ').
- quaternion — alternate rotation representation. Set either; the other updates.
- scale — Vector3.
- children — array of objects whose transforms are relative to this one.
This is the scene graph. When you parent.add(child), the child's transform
becomes local to the parent. Move the parent, everything under it moves with it. This is how
you build a solar system, a skeleton, an instanced structure.
```javascript
const earth = new THREE.Group();
scene.add(earth);

const planet = new THREE.Mesh(sphereGeom, earthMat);
earth.add(planet);

const moon = new THREE.Mesh(smallSphereGeom, moonMat);
moon.position.set(2, 0, 0);
earth.add(moon); // moon orbits earth's origin

renderer.setAnimationLoop((t) => {
  earth.rotation.y = t * 0.001; // rotates planet + moon together
  moon.rotation.y = t * 0.005;  // moon also spins locally
  renderer.render(scene, camera);
});
```
add() vs attach(): add reparents preserving the local transform — the child can visually jump. attach reparents preserving the world transform — the child stays put visually. Use attach when you pick up an object in 3D space and drop it into another container.
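A translation-only sketch of the difference — the real methods work with full world matrices and their inverses, but the arithmetic below shows the idea:

```javascript
// Translation-only model of add() vs attach(). Assumes no rotation or
// scale anywhere; Three.js does this with 4x4 matrices.
const parentWorld = { x: 10 };
const child = { local: { x: 3 } }; // currently a root object, so local == world

// add(): the local transform is kept, so the world position jumps.
const worldAfterAdd = parentWorld.x + child.local.x; // 13 — visually jumped

// attach(): the world transform is kept, so the local position is
// rewritten to parentWorldInverse * childWorld (here just a subtraction).
const localAfterAttach = child.local.x - parentWorld.x;   // -7
const worldAfterAttach = parentWorld.x + localAfterAttach; // 3 — stayed put

console.log(worldAfterAdd, worldAfterAttach); // 13 3
```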
World vs local transforms
You'll eventually need the world position of a deeply nested mesh — for example, to aim a camera at it or to raycast between two children of different parents:
```javascript
const world = new THREE.Vector3();
moon.getWorldPosition(world);
```
Equivalents exist for quaternion (getWorldQuaternion), scale
(getWorldScale), and direction (getWorldDirection). They walk up the
parent chain and multiply transforms — cheap, but not free in a hot loop.
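For translation-only transforms, that walk reduces to the plain-JS sketch below — the real method multiplies full world matrices so rotation and scale compose too:

```javascript
// What getWorldPosition() conceptually does, assuming every node only
// translates: walk up the parent chain and accumulate local positions.
function getWorldPos(node) {
  const out = { x: 0, y: 0, z: 0 };
  for (let n = node; n; n = n.parent) {
    out.x += n.position.x;
    out.y += n.position.y;
    out.z += n.position.z;
  }
  return out;
}

const sun = { position: { x: 0, y: 0, z: 0 }, parent: null };
const earth = { position: { x: 5, y: 0, z: 0 }, parent: sun };
const moon = { position: { x: 2, y: 0, z: 0 }, parent: earth };

console.log(getWorldPos(moon)); // { x: 7, y: 0, z: 0 }
```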
Merging geometries
Each mesh is a draw call. If you have 500 static rocks, drawing 500 meshes separately will kill your frame rate. Merge them into one geometry and draw it as a single mesh:
```javascript
import { mergeGeometries } from 'three/addons/utils/BufferGeometryUtils.js';

const merged = mergeGeometries([geomA, geomB, geomC]);
const mesh = new THREE.Mesh(merged, material);
```
Caveats: everything merged must share one material, and per-object transforms are gone afterward — you bake each object's transform into its vertices (e.g. with geometry.applyMatrix4(mesh.matrixWorld)) before merging. For animated crowds, reach for InstancedMesh instead — we cover that in Article 09.
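The core of the merge is simple enough to sketch in plain JS: concatenate the attribute arrays, and shift each geometry's indices by the number of vertices that came before it. (The real helper also merges normals, uvs, and any other attributes, and validates that every input has the same attribute set.)

```javascript
// Minimal sketch of merging indexed geometries: positions only.
function mergePositionsAndIndices(geoms) {
  const positions = [];
  const index = [];
  let vertexOffset = 0;
  for (const g of geoms) {
    positions.push(...g.positions);
    for (const i of g.index) index.push(i + vertexOffset); // shift into merged space
    vertexOffset += g.positions.length / 3; // 3 floats per vertex
  }
  return {
    positions: new Float32Array(positions),
    index: new Uint32Array(index),
  };
}

const triA = { positions: [-1, 0, 0, 1, 0, 0, 0, 1, 0], index: [0, 1, 2] };
const triB = { positions: [2, 0, 0, 4, 0, 0, 3, 1, 0], index: [0, 1, 2] };
const merged = mergePositionsAndIndices([triA, triB]);
console.log(Array.from(merged.index)); // [0, 1, 2, 3, 4, 5]
```

One draw call, one buffer — which is exactly why the 500-rocks case gets so much faster.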
Common first-time pitfalls
- Lit material looks flat or self-shadows oddly. You forgot computeVertexNormals() after modifying positions.
- Raycaster can't find your mesh. The auto-computed bounding sphere is stale after CPU-side vertex edits. Call geometry.computeBoundingSphere().
- Memory leak when replacing geometries. The old geometry's dispose() wasn't called. The demo above disposes on every dropdown change.
- Weird triangle artifacts. Indices point outside your position array — a count mismatch. console.log(geom.attributes.position.count, geom.index.count).
Exercises
- Build a heart geometry from a parametric curve and extrude it. Hint: ExtrudeGeometry(shape, { depth: 0.3, bevelEnabled: true }).
- Take the custom wavy plane and animate the wave inside setAnimationLoop. Don't rebuild the geometry — mutate the position attribute every frame. Watch your FPS.
- Build a mini solar system: sun (root), earth (orbits sun), moon (orbits earth). Use Groups for orbit pivots instead of math.
What's next
Geometry done. Next article: materials, lights, and shadows. That's where your scene stops looking like a tech demo and starts feeling like a render.