Three.js From Zero · Article 08

Custom Shaders: GLSL + TSL

Everything the built-in materials can't do, you do with shaders. A shader is a small program that runs on the GPU once per vertex (vertex shader) or once per pixel (fragment shader). When you write one yourself, you get direct control over position, color, and timing — for effects that aren't physically possible in PBR: energy fields, dissolve effects, liquid surfaces, hologram scanlines, procedural terrain, portal effects.

The demo runs five different shader materials on the same sphere. Scroll through the dropdown. Every preset is about 20 lines of shader code — you'll see all five in this article.

The four pieces of every shader

Before we write anything, memorize these four kinds of data. They're the entire mental model.

[Diagram: shader pipeline. Geometry supplies per-vertex attributes and JavaScript supplies per-draw uniforms to the vertex shader, which outputs gl_Position and varyings; the rasterizer interpolates the varyings across each triangle and passes them per-pixel to the fragment shader, which outputs the pixel color.]
Name | What it is | Where it comes from
attribute | Per-vertex data | Your BufferGeometry (position, normal, uv…)
uniform | Per-draw-call data | JavaScript → GPU; same value for all vertices/pixels this frame
varying | Vertex → fragment interpolated data | Set in the vertex shader, read in the fragment shader; interpolated by the GPU between vertices
constants / main() | The shader code itself | You write it as a GLSL string

The pipeline is: attributes flow into the vertex shader (plus uniforms); it outputs a clip-space position and any varyings; the GPU rasterizes, interpolating your varyings across each triangle; then the fragment shader runs per pixel, reading varyings and uniforms, outputting a color.
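To make "interpolated by the GPU" concrete, here is the rasterizer's job sketched in plain JavaScript — not a Three.js API, just the math; the function name and weights are illustrative:

```javascript
// What the rasterizer does with a varying: barycentric interpolation.
// v0, v1, v2 are the varying's values at the triangle's three vertices;
// w0, w1, w2 are the pixel's barycentric weights inside the triangle (they sum to 1).
function interpolateVarying(v0, v1, v2, w0, w1, w2) {
  return v0 * w0 + v1 * w1 + v2 * w2;
}

// A pixel at the triangle's center gets the plain average of the three values:
interpolateVarying(0, 3, 6, 1 / 3, 1 / 3, 1 / 3); // ≈ 3
```

This is why a varying like vUv changes smoothly across a face even though you only set it three times per triangle.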

ShaderMaterial — the blank canvas

const mat = new THREE.ShaderMaterial({
  uniforms: {
    uTime: { value: 0 },
    uColor: { value: new THREE.Color('#38bdf8') },
  },
  vertexShader: /*glsl*/`
    varying vec2 vUv;
    void main() {
      vUv = uv;
      gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
    }
  `,
  fragmentShader: /*glsl*/`
    uniform float uTime;
    uniform vec3  uColor;
    varying vec2  vUv;

    void main() {
      float stripes = sin(vUv.y * 40.0 + uTime * 3.0) * 0.5 + 0.5;
      gl_FragColor = vec4(uColor * stripes, 1.0);
    }
  `,
});

Four details Three.js handles for you automatically on a ShaderMaterial:

  • position, normal, uv are auto-declared as attributes.
  • projectionMatrix, modelViewMatrix, viewMatrix, modelMatrix are auto-declared as uniforms.
  • You don't declare a precision line — Three.js prepends one for you (highp by default, per the renderer's precision setting).
  • The renderer targets WebGL2 by default; set glslVersion: THREE.GLSL3 if you want to write GLSL ES 3.00 syntax yourself.

If you want none of that — total control — use RawShaderMaterial. For 99% of cases ShaderMaterial is correct.

In your animation loop, you step the uniform forward:

renderer.setAnimationLoop((t) => {
  mat.uniforms.uTime.value = t * 0.001;  // seconds
  renderer.render(scene, camera);
});

Five shader recipes — all in the demo

1. Ripple — vertex displacement

Displacement happens in the vertex shader: modify position before it goes through the matrices. If the material is lit you must also recompute normals or lighting breaks — but for unlit Basic/Normal-style materials you can skip it.

vertexShader: /*glsl*/`
  uniform float uTime;
  uniform float uAmp;
  uniform float uFreq;
  varying vec3 vNormal;
  varying vec2 vUv;

  void main() {
    vUv = uv;
    vNormal = normal;

    vec3 p = position;
    // radial ripple along y, animated by uTime
    float w = sin(length(position.xy) * uFreq - uTime * 3.0);
    p += normal * w * uAmp * 0.1;

    gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
  }
`

Critical: uAmp * 0.1 keeps displacement small so the geometry doesn't explode. Push the slider to feel it.
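The displacement formula is easy to sanity-check on the CPU. Here's a JavaScript port of the same math (the function name is ours, not part of the demo):

```javascript
// CPU port of the ripple displacement from the vertex shader above.
// Returns the scalar offset applied along the vertex normal.
function rippleOffset(x, y, time, freq, amp) {
  const w = Math.sin(Math.hypot(x, y) * freq - time * 3.0); // length(position.xy)
  return w * amp * 0.1;                                     // the safety factor
}
```

Since |sin| ≤ 1, the offset can never exceed amp * 0.1 — that's the whole safety argument in one line.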

2. Plasma — pure fragment math

fragmentShader: /*glsl*/`
  uniform float uTime;
  uniform float uFreq;
  varying vec2 vUv;

  void main() {
    float t = uTime;
    vec2  uv = vUv * uFreq;
    float n =
      sin(uv.x + t) +
      sin(uv.y + t * 1.1) +
      sin((uv.x + uv.y + t) * 0.7) +
      sin(length(uv - 0.5) * 3.0 - t);
    n *= 0.25;
    vec3 col = 0.5 + 0.5 * cos(6.28 * (n + vec3(0.0, 0.33, 0.67)));
    gl_FragColor = vec4(col, 1.0);
  }
`

Four overlapping sine waves, mapped through a color palette function. Classic demoscene effect. No geometry involved; every pixel computes its color independently from UV + time.
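The palette line is worth isolating — it's a cosine color ramp, the classic Iñigo Quílez trick. The same mapping sketched on the CPU (function name is ours):

```javascript
// n (roughly [-1, 1]) → an RGB triple; each channel is a phase-shifted cosine.
function palette(n) {
  return [0.0, 0.33, 0.67].map(
    (phase) => 0.5 + 0.5 * Math.cos(6.28 * (n + phase))
  );
}

palette(0)[0]; // ≈ 1.0 — the red channel at its peak
```

Because each channel is 0.5 + 0.5 * cos(…), the output always stays inside [0, 1] — no clamping needed.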

3. Hologram — scanlines + fresnel

fragmentShader: /*glsl*/`
  uniform float uTime;
  uniform vec3  uColor;
  varying vec3 vNormal;
  varying vec3 vWorldPos;

  void main() {
    vec3 V = normalize(cameraPosition - vWorldPos);
    float fresnel = pow(1.0 - max(dot(normalize(vNormal), V), 0.0), 2.0);

    float scan = sin(vWorldPos.y * 60.0 + uTime * 8.0) * 0.5 + 0.5;
    float flicker = 0.85 + 0.15 * sin(uTime * 30.0);

    vec3 col = uColor * (scan * 0.5 + 0.5) * fresnel * flicker;
    gl_FragColor = vec4(col, 1.0);
  }
`

Three tricks in one shader:

  • Fresnel — pow(1.0 - dot(N, V), n). The edges of the sphere are brighter than the center. It's the "glowing rim" of every hologram.
  • Scanlines — sin(y * 60.0) over world-space Y.
  • Flicker — a tiny sine on time, multiplied in. Sells the CRT look.
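The fresnel term is simple enough to verify by hand. A CPU sketch of it (our function name, illustrative):

```javascript
// Fresnel rim factor. nDotV is dot(N, V) for unit normal N and unit view vector V.
function fresnel(nDotV, power) {
  return Math.pow(1.0 - Math.max(nDotV, 0.0), power);
}

fresnel(1.0, 2.0); // 0 — surface facing the camera head-on: no rim
fresnel(0.0, 2.0); // 1 — grazing angle at the silhouette: full rim glow
```

Raising the power tightens the rim: higher exponents push the glow further out toward the silhouette.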

4. Dissolve — discard

discard is a fragment shader keyword that throws away the pixel entirely. Combined with procedural noise you get "burn away" effects.

fragmentShader: /*glsl*/`
  uniform float uTime;
  varying vec3 vWorldPos;

  // Classic hash-based noise
  float hash(vec3 p) {
    return fract(sin(dot(p, vec3(12.9898, 78.233, 45.164))) * 43758.5453);
  }

  void main() {
    float n = hash(floor(vWorldPos * 20.0));
    float threshold = 0.5 + 0.5 * sin(uTime * 0.6);
    if (n < threshold) discard;          // burn holes

    // Edge glow — pixels close to the threshold still render but orange
    float edge = smoothstep(threshold, threshold + 0.08, n);
    vec3 col = mix(vec3(1.0, 0.3, 0.0), vec3(1.0), edge);
    gl_FragColor = vec4(col, 1.0);
  }
`
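That hash ports directly to JavaScript, which is handy for reasoning about where holes will appear — though GPU float32 rounding means the exact values won't match the shader, only the distribution:

```javascript
// CPU port of the classic sin-dot hash. Returns a pseudo-random value in [0, 1).
function hash(x, y, z) {
  const s = Math.sin(x * 12.9898 + y * 78.233 + z * 45.164) * 43758.5453;
  return s - Math.floor(s); // GLSL fract()
}
```

Because the input is floor(vWorldPos * 20.0), every 1/20-unit cell of world space gets one stable random value — that's what makes the holes chunky instead of per-pixel noise.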

5. Liquid metal — noise + envmap fake

Displace vertices with 3D-ish noise so the surface flows, and shade with a procedural sheen:

vertexShader: /*glsl*/`
  uniform float uTime;
  uniform float uAmp;
  varying vec3 vNormal;

  float hash31(vec3 p) {
    return fract(sin(dot(p, vec3(12.9898, 78.233, 45.164))) * 43758.5453);
  }

  void main() {
    vec3 p = position;
    float n = hash31(floor(position * 3.0 + uTime * 0.3));
    p += normal * (n - 0.5) * uAmp * 0.08;
    vNormal = normal;
    gl_Position = projectionMatrix * modelViewMatrix * vec4(p, 1.0);
  }
`,
fragmentShader: /*glsl*/`
  varying vec3 vNormal;
  void main() {
    vec3 N = normalize(vNormal);
    vec3 col = mix(vec3(0.2, 0.4, 0.9), vec3(1.0, 0.8, 0.4), N.y * 0.5 + 0.5);
    gl_FragColor = vec4(col, 1.0);
  }
`
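The fragment half is just GLSL's mix() driven by the normal's Y. The same ramp in JavaScript (names are ours, for illustration):

```javascript
// GLSL mix(): linear interpolation between a and b.
const lerp = (a, b, t) => a + (b - a) * t;

// Two-tone sheen: blend blue (bottom) → gold (top) by the normal's Y component.
function sheen(ny) {
  const t = ny * 0.5 + 0.5; // remap [-1, 1] → [0, 1]
  return [lerp(0.2, 1.0, t), lerp(0.4, 0.8, t), lerp(0.9, 0.4, t)];
}

sheen(1);  // ≈ [1.0, 0.8, 0.4] — gold at the top of the sphere
sheen(-1); // ≈ [0.2, 0.4, 0.9] — blue at the bottom
```

It isn't a real environment map, but because the blend follows the (displaced) normal it reads as reflected sky and ground — the "envmap fake" of the preset's name.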

onBeforeCompile — shaders without giving up PBR

Writing a full ShaderMaterial means re-implementing lighting yourself. If you want to modify a PBR material — inject noise into its base color, displace vertices but keep shadows — you use onBeforeCompile:

const mat = new THREE.MeshStandardMaterial({ color: '#38bdf8' });

mat.userData.uTime = { value: 0 };

mat.onBeforeCompile = (shader) => {
  shader.uniforms.uTime = mat.userData.uTime;

  // Inject a uniform declaration into the vertex shader.
  shader.vertexShader = shader.vertexShader.replace(
    '#include <common>',
    `#include <common>
     uniform float uTime;`
  );

  // Inject vertex displacement right where positions are being transformed.
  shader.vertexShader = shader.vertexShader.replace(
    '#include <begin_vertex>',
    `#include <begin_vertex>
     transformed += normal * sin(position.y * 8.0 + uTime * 2.0) * 0.05;`
  );
};

// Step the uniform every frame:
mat.userData.uTime.value = t * 0.001;

Three.js's built-in materials are modular GLSL programs glued together from #include chunks. You patch them by replacing an include with the include plus your extra code. The chunk names (common, begin_vertex, color_fragment, etc.) are mostly stable, but they can change between releases — pin your Three.js version and re-test patches when you upgrade. The full list lives in src/renderers/shaders/ShaderChunk.js.
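The patching mechanism itself is nothing magic — plain string replacement. Demonstrated here on a hypothetical stub shader, not the real Three.js source:

```javascript
// onBeforeCompile patches are string surgery: replace an include marker with
// the marker plus your injected code. Shown on an illustrative stub shader.
const stubVertexShader = 'void main() {\n  #include <begin_vertex>\n}';

const patched = stubVertexShader.replace(
  '#include <begin_vertex>',
  '#include <begin_vertex>\n  transformed += normal * 0.05;'
);
// `patched` keeps the original include (so the stock chunk still expands)
// and appends the custom displacement line right after it.
```

Keeping the original #include in the replacement is the important part — drop it and you lose the stock chunk's code, which usually breaks compilation.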

Rule of thumb: write a fresh ShaderMaterial for unlit art effects (plasma, hologram, dissolve). Use onBeforeCompile when you want PBR shading + a modification (vertex wave + real shadows, procedural base color + envmap reflections).

TSL — Three Shading Language

New in r162+, stable in r170+. TSL is a JavaScript-based node graph for writing shaders — no GLSL strings, no backend-specific syntax, and it compiles to both WebGL and WebGPU automatically. Long-term this is where Three.js is heading.

The same "ripple" effect in TSL

import { MeshBasicNodeMaterial } from 'three/webgpu';
import { positionLocal, normalLocal, uniform, time, sin, length } from 'three/tsl';

const uAmp  = uniform(1.0);
const uFreq = uniform(6.0);

const mat = new MeshBasicNodeMaterial();
mat.positionNode = positionLocal.add(
  normalLocal.mul(
    sin(length(positionLocal.xy).mul(uFreq).sub(time.mul(3.0))).mul(uAmp).mul(0.1)
  )
);

Same math, expressed as JS function calls instead of GLSL text. Benefits:

  • Type-safe — the IDE knows positionLocal is a vec3 and will flag .rgba as wrong.
  • Portable — one graph compiles to GLSL for WebGLRenderer and WGSL for WebGPURenderer.
  • Composable — you pass nodes around, swap pieces, build abstractions the same way you would in regular TypeScript.
  • Node-tree editor friendly — Three's editor and other tools can serialize/edit TSL graphs visually.

TSL is the future but GLSL isn't going anywhere — every existing tutorial, shader library, and Shadertoy snippet is GLSL. Learn GLSL first (you'll need it to read others' code), graduate to TSL when it makes sense for your project.

TSL lighting-model integration

Where TSL really wins is combining custom math with PBR:

import { MeshStandardNodeMaterial } from 'three/webgpu';
import { sin, time, positionLocal, vec3 } from 'three/tsl';

const mat = new MeshStandardNodeMaterial({ color: '#38bdf8', roughness: 0.3 });

mat.colorNode = vec3(
  0.5,
  sin(positionLocal.y.mul(8.0).add(time.mul(2.0))).mul(0.5).add(0.5),
  0.8,
);

Full PBR — shadows, env reflection, tone mapping — with a procedural base color. No onBeforeCompile, no include-string surgery. This is the TSL sweet spot.

Debugging shaders

  1. Render a value as color. Stuck? gl_FragColor = vec4(vUv, 0.0, 1.0); to see UVs. vec4(normalize(vNormal) * 0.5 + 0.5, 1.0) to see normals. Colors don't lie.
  2. Watch for NaN. Black spots that flicker = NaN. Guard against 1.0 / 0, sqrt(negative), log(0).
  3. Compile errors show in console. Three.js prefixes them with THREE.WebGLProgram:. The line numbers refer to the compiled shader (your text + prefix chunks). Subtract ~50 lines to find your code.
  4. Use precision highp float; for math involving large ranges. Three.js requests highp by default, but some GPUs — mostly mobile — silently fall back to mediump, which loses precision on very large or very small values.
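Tip 2's guards are one-liners. In GLSL you'd clamp the inputs with max(); here is the same idea in JavaScript (helper names are ours, and safeDiv assumes a non-negative denominator):

```javascript
// Guarding the classic NaN/Inf sources from tip 2.
// GLSL equivalents: a / max(b, 1e-6) and sqrt(max(x, 0.0)).
const safeDiv  = (a, b) => a / Math.max(b, 1e-6);     // avoids 1.0 / 0.0
const safeSqrt = (x)    => Math.sqrt(Math.max(x, 0)); // avoids sqrt(negative)
```

One NaN pixel is enough to poison any blur or bloom pass downstream, so it's worth clamping defensively anywhere a denominator or sqrt argument could hit zero.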

Common first-time pitfalls

  • Black mesh. Vertex shader doesn't write to gl_Position, or fragment doesn't write to gl_FragColor. Or an include path broke.
  • Weird shading / seams. You displaced vertices but didn't recompute normals (matters for lit materials). Do it on the CPU (geometry.computeVertexNormals()) or analytically in the shader (the neighbor-offset / derivative trick).
  • Uniforms don't update. You reassigned mat.uniforms.uTime instead of mat.uniforms.uTime.value. It's a boxed value by design.
  • ShaderMaterial transparent objects sort wrong. Set material.transparent = true AND depthWrite = false on small particles; sort manually for big transparents.
  • "cannot find identifier `uv`". You used ShaderMaterial auto-declarations, but you're on an old Three.js version or a setup that doesn't include them. Verify you're not on RawShaderMaterial.
  • On mobile the shader looks different. The GPU may have fallen back to mediump precision. Request precision highp float; and guard with #ifdef GL_FRAGMENT_PRECISION_HIGH for devices that can't honor it.

Exercises

  1. Port the plasma shader to run on a full-screen ShaderPass (Article 07) — so it becomes a post effect instead of a surface material.
  2. Add normalMap-like detail to the dissolve shader: sample a noise texture for the cutoff threshold instead of computing it per-pixel.
  3. Rebuild one of the five presets in TSL. Your choice. Start with plasma or ripple — they're the simplest graphs.

What's next

Article 09 — Performance: Instancing, LOD, Draw Calls. We've written fancy shaders; next we make sure the scene runs at 120 FPS on a Chromebook. Instanced meshes, batched meshes, LOD nodes, frustum culling, and the devtools profiling workflow.