2025-11-12 · Bruno Fernandes

WebGPU in Next.js Part 3: Uniforms and Animation

In the previous post, we set up a TypeGPU-powered WebGPU provider and created a reusable render pipeline component. Now it's time to enhance our shaders with uniforms and introduce animation to create dynamic visual effects.

Let's create a shader that uses uniforms to animate colors over time and follow the mouse position.

1. Accessors

TypeGPU provides a convenient way to define and manage uniform buffers through its accessor system. Accessors allow you to create strongly-typed uniform buffers that can be easily updated from JavaScript and accessed in your shaders.

Start by defining an accessor for our uniform data, which will include time, mouse position, and canvas resolution:

src/shaders/accessors.ts

import tgpu from "typegpu";
import * as d from "typegpu/data";

export const timeAccess = tgpu["~unstable"].accessor(d.f32);
export const resolutionAccess = tgpu["~unstable"].accessor(d.vec2f);
export const mousePosAccess = tgpu["~unstable"].accessor(d.vec2f);

This code defines three accessors: timeAccess, resolutionAccess, and mousePosAccess, each corresponding to a specific data type:

  • timeAccess: A single float representing elapsed time.
  • resolutionAccess: A 2D vector representing the canvas resolution (width and height).
  • mousePosAccess: A 2D vector representing the mouse position (x and y coordinates).

2. Writing the animated fragment shader

Next, let's create a fragment shader that utilizes these accessors to produce an animated effect based on time and mouse position.

src/shaders/example-fragment2.ts

import tgpu from "typegpu";
import * as d from "typegpu/data";

import {
  mousePosAccess,
  resolutionAccess,
  timeAccess,
} from "@/shaders/accessors";

export const exampleFragment2 = tgpu["~unstable"].fragmentFn({
  in: { uv: d.vec2f },
  out: d.vec4f,
}) /* wgsl */ `{
    let aspect = resolution.x / resolution.y;
    let correctedUv = vec2<f32>(in.uv.x * aspect, in.uv.y);
    let correctedMousePos = vec2<f32>(
        mousePos.x / resolution.x,
        1.0 - (mousePos.y / resolution.y)
    );

    let dist = distance(correctedUv, correctedMousePos);
    let speed = 2.0;
    let pulse = (sin(time * speed) + 1.0) * 0.5;
    let radiusMin = 0.03;
    let radiusMax = 0.08;
    let radius = mix(radiusMin, radiusMax, pulse);
    let background = vec4<f32>(0.2, 0.5, 1.0, 1.0);
    let circleColor = vec4<f32>(1.0, 0.5, 0.2, 1.0);

    let aaWidth = fwidth(dist);
    let t = smoothstep(radius - aaWidth, radius + aaWidth, dist);
    let finalColor = mix(circleColor, background, t);

    return finalColor;
}
`.$uses({
  time: timeAccess,
  resolution: resolutionAccess,
  mousePos: mousePosAccess,
});

The $uses method binds the shader's uniform variables to the accessors we defined earlier: each key in the object passed to $uses becomes the variable name available inside the WGSL code. Compared to raw WebGPU, where you would declare bind group layouts, create bind groups, and keep binding indices in sync by hand, this is a much simpler way to manage uniforms.

This shader does the following:

  1. It calculates the aspect ratio of the canvas and corrects the UV coordinates accordingly. This is needed to ensure that the effect maintains its proportions regardless of the canvas size. Right now our canvas is square, but this will be useful later if we change its dimensions.
  2. It computes the distance between the current fragment's UV coordinates and the mouse position.
  3. Using a sine wave based on the elapsed time, it creates a pulsing effect that modulates the radius of a circle.
  4. It uses smoothstep and fwidth to create anti-aliased edges for the circle, blending it smoothly with the background color.
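To build intuition for steps 3 and 4, here is the same circle math sketched as plain TypeScript on the CPU. The function names are ours, and since fwidth is a GPU screen-space derivative with no CPU equivalent, a fixed aaWidth stands in for it:

```typescript
// CPU sketch of the fragment shader's circle math (illustrative only).

const mix = (a: number, b: number, t: number) => a + (b - a) * t;

// WGSL-style smoothstep: clamped Hermite interpolation between edge0 and edge1.
function smoothstep(edge0: number, edge1: number, x: number): number {
  const t = Math.min(Math.max((x - edge0) / (edge1 - edge0), 0), 1);
  return t * t * (3 - 2 * t);
}

// Pulsing radius: sin(time * speed) remapped from [-1, 1] to [0, 1],
// then used to blend between the min and max radius.
function pulseRadius(
  time: number,
  speed = 2.0,
  radiusMin = 0.03,
  radiusMax = 0.08,
): number {
  const pulse = (Math.sin(time * speed) + 1) * 0.5;
  return mix(radiusMin, radiusMax, pulse);
}

// 0 inside the circle, 1 outside, with a soft band of width 2 * aaWidth.
// On the GPU, aaWidth comes from fwidth(dist); here it is a fixed constant.
function circleMask(dist: number, radius: number, aaWidth = 0.005): number {
  return smoothstep(radius - aaWidth, radius + aaWidth, dist);
}
```

At time 0 the sine is 0, so the pulse sits at 0.5 and the radius at the midpoint 0.055; a fragment at the mouse position (dist = 0) gets mask 0 (fully circle color) and a distant fragment gets mask 1 (fully background).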

Note: The 1.0 - (mousePos.y / resolution.y) calculation inverts the Y coordinate of the mouse position because mouse coordinates have their origin at the top-left corner (Y grows downward), while our UV space has its origin at the bottom-left (Y grows upward).
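The same normalization-and-flip can be written as a small pure function (mouseToUv is our name, introduced here just to mirror the shader's correctedMousePos math):

```typescript
// Maps pixel coordinates (origin top-left, Y down) into the shader's
// UV space (origin bottom-left, Y up), normalized to [0, 1].
function mouseToUv(
  mouseX: number,
  mouseY: number,
  width: number,
  height: number,
): { x: number; y: number } {
  return {
    x: mouseX / width,
    y: 1.0 - mouseY / height,
  };
}
```

For a 400×400 canvas, the top-left pixel (0, 0) maps to UV (0, 1) and the bottom-right pixel (400, 400) maps to UV (1, 0).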

Note: The fwidth function computes the rate of change of the distance value across neighboring pixels, which helps in achieving smooth transitions at the edges of the circle. For more information on fwidth, smoothstep and other built-in WGSL functions, refer to the WGSL specification.

3. Updating the RenderPipeline component

To make use of the new shader and its uniforms, we need to update our RenderPipeline component.

First, let's modify the component props to accept an array of accessor-uniform pairs so we can bind multiple uniforms easily:

src/components/render-pipeline.tsx

// ...
export type Binding<T extends d.AnyWgslData> = [
  TgpuAccessor<T>,
  TgpuUniform<T> | undefined,
];

export type RenderPipelineProps<TBindingData extends Binding<d.AnyWgslData>[]> =
  {
    vertexShader: TgpuVertexFn<
      Record<never, never>,
      {
        uv: d.Vec2f;
      }
    >;

    fragmentShader: TgpuFragmentFn<
      {
        uv: d.Vec2f;
      },
      d.Vec4f
    >;

    bindings?: TBindingData;

    onFrame?: (elapsedTime: number) => void;

    canvasRef?: React.RefObject<HTMLCanvasElement | null>;
  };

// ...

We now have a new bindings prop, which is an array of tuples. Each tuple contains an accessor and its corresponding uniform instance. This allows us to bind multiple type-safe uniforms to the pipeline.

Let's adjust the pipeline creation logic to include the new bindings:

// ...
// Create pipeline
useEffect(() => {
  if (!root) return;

  let boundRoot: WithBinding = root["~unstable"];

  if (bindings) {
    for (const [accessor, uniform] of bindings) {
      if (!uniform) continue;

      boundRoot = boundRoot.with(accessor, uniform);
    }
  }

  pipelineRef.current = boundRoot
    .withVertex(vertexShader, {})
    .withFragment(fragmentShader, {
      format: navigator.gpu.getPreferredCanvasFormat(),
    })
    .createPipeline();
}, [fragmentShader, redraw, root, vertexShader, bindings]);
// ...

Here, we iterate over the bindings array and use the with method to bind each accessor to its corresponding uniform before creating the pipeline.
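This "fold the bindings into a chainable root" pattern can be illustrated without a GPU using a minimal mock (the Chainable type and makeRoot below are made up for the example; TypeGPU's real with likewise returns a new bound root each time):

```typescript
// Minimal mock of folding bindings into an immutable, chainable root.
type Chainable = {
  pairs: Array<[string, number]>;
  with(key: string, value: number): Chainable;
};

function makeRoot(): Chainable {
  const make = (pairs: Array<[string, number]>): Chainable => ({
    pairs,
    with(key, value) {
      // Returns a NEW object, leaving the previous root untouched.
      return make([...pairs, [key, value]]);
    },
  });
  return make([]);
}

// Mirrors the effect body: skip undefined uniforms, accumulate the rest.
function applyBindings(
  root: Chainable,
  bindings: Array<[string, number | undefined]>,
): Chainable {
  let bound = root;
  for (const [key, value] of bindings) {
    if (value === undefined) continue;
    bound = bound.with(key, value);
  }
  return bound;
}
```

Skipping undefined uniforms matters in our component because the uniforms are created asynchronously in React state and may not exist on the first render.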

We removed the immediate redraw call after pipeline creation. Instead, we will handle drawing in an animation loop to support dynamic updates.

You might have noticed the new optional onFrame prop. It's a callback that will be invoked on each animation frame, allowing us to update uniform values dynamically from outside the component.

Let's create a new effect to do this:

// ...
// Animation loop
useEffect(() => {
  const startTime = performance.now();

  let animationFrameId: number;

  const render = () => {
    const currentTime = performance.now();
    const elapsedTime = (currentTime - startTime) / 1000;
    onFrame?.(elapsedTime);

    redraw();

    animationFrameId = requestAnimationFrame(render);
  };

  animationFrameId = requestAnimationFrame(render);

  return () => {
    cancelAnimationFrame(animationFrameId);
  };
}, [onFrame, redraw]);
// ...

Here, we set up an animation loop using the requestAnimationFrame API. On each frame, we calculate the elapsed time since the start and invoke the onFrame callback, allowing external code to update uniform values. After that, we call redraw to render the updated frame.
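The loop's timekeeping can be exercised without a browser by injecting the scheduler and the clock. The startLoop and Scheduler names below are illustrative, not part of the component; in the component the scheduler is requestAnimationFrame/cancelAnimationFrame and the clock is performance.now:

```typescript
// Dependency-injected sketch of the animation loop's elapsed-time logic.
type Scheduler = {
  request(cb: () => void): number;
  cancel(id: number): void;
};

function startLoop(
  onFrame: (elapsedSec: number) => void,
  sched: Scheduler,
  now: () => number,
): () => void {
  const startTime = now();
  let id: number;

  const render = () => {
    // Milliseconds since the loop started, converted to seconds.
    onFrame((now() - startTime) / 1000);
    id = sched.request(render);
  };

  id = sched.request(render);
  // The returned function cancels the pending frame, mirroring the
  // effect's cleanup with cancelAnimationFrame.
  return () => sched.cancel(id);
}
```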

We also added an optional canvasRef prop to allow parent components to access the canvas element if needed. We need to update the canvas element to use this ref in addition to the internal one:

// ...
    return (
        <canvas
            {...rest}
            ref={(el) => {
                canvasRef.current = el;
                if (externalCanvasRef) {
                    externalCanvasRef.current = el;
                }
            }}
        />
    );
// ...
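The inline ref callback above generalizes to a small helper, a common React pattern for fanning one element out to several refs (mergeRefs is our name, not a React API):

```typescript
// Assigns one element to several refs. Supports both callback refs and
// RefObject-style { current } refs; undefined entries are skipped.
type AnyRef<T> = ((el: T | null) => void) | { current: T | null } | undefined;

function mergeRefs<T>(...refs: AnyRef<T>[]): (el: T | null) => void {
  return (el) => {
    for (const ref of refs) {
      if (!ref) continue;
      if (typeof ref === "function") ref(el);
      else ref.current = el;
    }
  };
}
```

With it, the canvas could be written as `<canvas {...rest} ref={mergeRefs(canvasRef, externalCanvasRef)} />`.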

4. Using the updated RenderPipeline component

Now we can use the enhanced RenderPipeline component in a page to create our animated effect. We'll instantiate the necessary uniforms and pass them along with the shaders.

src/app/typegpu-render-example2/page.tsx

"use client";

import { useCallback, useEffect, useRef, useState } from "react";
import { TgpuUniform } from "typegpu";
import * as d from "typegpu/data";

import { RenderPipeline } from "@/components/render-pipeline";
import { useGpu } from "@/providers/gpu-provider";
import {
  mousePosAccess,
  resolutionAccess,
  timeAccess,
} from "@/shaders/accessors";
import { exampleFragment2 } from "@/shaders/example-fragment2";
import { mainVertex } from "@/shaders/main-vertex";

export default function TypegpuRenderExample2Page() {
  const { root } = useGpu();
  const canvasRef = useRef<HTMLCanvasElement>(null);
  const [timeUniform, setTimeUniform] = useState<TgpuUniform<d.F32>>();
  const [mouseUniform, setMouseUniform] = useState<TgpuUniform<d.Vec2f>>();
  const [resolutionUniform, setResolutionUniform] =
    useState<TgpuUniform<d.Vec2f>>();

  useEffect(() => {
    if (!root) return;

    setTimeUniform(root.createUniform(d.f32, 0));
    setMouseUniform(root.createUniform(d.vec2f, d.vec2f(0, 0)));
    setResolutionUniform(root.createUniform(d.vec2f, d.vec2f(0, 0)));
  }, [root]);

  useEffect(() => {
    function onResize() {
      const canvas = canvasRef.current;
      if (!canvas) return;

      const dpr = window.devicePixelRatio || 1;
      const pixelWidth = Math.max(1, Math.floor(canvas.clientWidth * dpr));
      const pixelHeight = Math.max(1, Math.floor(canvas.clientHeight * dpr));
      resolutionUniform?.write(d.vec2f(pixelWidth, pixelHeight));
    }

    function onMouseMove(e: MouseEvent) {
      const canvas = canvasRef.current;
      if (!canvas) return;

      const rect = canvas.getBoundingClientRect();
      const dpr = window.devicePixelRatio || 1;
      const x = e.clientX - rect.left;
      const y = e.clientY - rect.top;
      mouseUniform?.write(d.vec2f(x * dpr, y * dpr));
    }

    window.addEventListener("mousemove", onMouseMove);
    window.addEventListener("resize", onResize);

    onResize();

    return () => {
      window.removeEventListener("mousemove", onMouseMove);
      window.removeEventListener("resize", onResize);
    };
  }, [mouseUniform, resolutionUniform]);

  const handleFrame = useCallback(
    (elapsedTime: number) => {
      timeUniform?.write(elapsedTime);
    },
    [timeUniform],
  );

  return (
    <div>
      <div className="my-4">
        <RenderPipeline
          className="block aspect-square w-[420px] rounded border border-gray-300"
          vertexShader={mainVertex}
          fragmentShader={exampleFragment2}
          onFrame={handleFrame}
          bindings={[
            [timeAccess, timeUniform],
            [mousePosAccess, mouseUniform],
            [resolutionAccess, resolutionUniform],
          ]}
          canvasRef={canvasRef}
        />
      </div>
    </div>
  );
}

In this page component:

  1. We create state variables to hold the uniform instances for time, mouse position, and resolution.
  2. When the root becomes available, we instantiate the uniforms using createUniform.
  3. We set up event listeners to handle mouse movement and window resizing. The mouse position is updated in the mouseUniform, and the canvas resolution is updated in the resolutionUniform.
  4. The handleFrame callback updates the timeUniform with the elapsed time on each animation frame.
  5. Finally, we render the RenderPipeline component, passing in the shaders, bindings, and the onFrame callback.
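The coordinate conversion in step 3 can be isolated into a pure function for clarity (clientToCanvasPixels is our name; it mirrors the onMouseMove handler's math):

```typescript
// Converts a mouse event position (CSS pixels, page coordinates) into
// device-pixel coordinates relative to the canvas's top-left corner,
// matching the units the resolution uniform is written in.
function clientToCanvasPixels(
  clientX: number,
  clientY: number,
  rect: { left: number; top: number },
  dpr: number,
): { x: number; y: number } {
  return {
    x: (clientX - rect.left) * dpr,
    y: (clientY - rect.top) * dpr,
  };
}
```

Multiplying by devicePixelRatio keeps the mouse position and the resolution uniform in the same unit (device pixels), which is what the shader's mousePos.x / resolution.x normalization assumes.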

Visiting this page should now display an animated circle that pulses over time and follows the mouse cursor. Feel free to experiment with the shader code and uniform values to create different effects. In the next part of the series, we'll explore compute shaders and how to leverage TypeGPU for more complex GPU computations. See you there!