WebGPU Rendering: Part 7 Video Texture

Matthew MacFarquhar
Dec 5, 2024


Introduction

I have been reading through this online book on WebGPU. In this series of articles, I will work through the book and implement its lessons in a more structured TypeScript class approach; eventually, we will build three types of WebGPU renderers: Gaussian Splatting, Ray Tracing, and Rasterization.

In this article we will talk about how to use a video as our texture instead of a static image. This really boils down to continuously updating the GPU texture with the current video frame and re-running the render pipeline using requestAnimationFrame.
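As a rough sketch of that loop (the two helpers below are hypothetical placeholders for the real code we will write later in this article):

// Hypothetical helpers standing in for the real code shown later in this article.
declare function copyCurrentVideoFrameToTexture(): void; // keep the GPU texture in sync with the video
declare function encodeAndSubmitRenderPass(): void;      // draw using that texture

const frame = () => {
  copyCurrentVideoFrameToTexture();
  encodeAndSubmitRenderPass();
  requestAnimationFrame(frame); // schedule the next frame
};
requestAnimationFrame(frame);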

The following link is the commit in my GitHub repo that matches the code we will go over.

Video Loader

First, we will create a VideoLoader class which encapsulates everything we need to do with our video URL to load it into an HTMLVideoElement.

class VideoLoader {
  private _videoElement: HTMLVideoElement;

  public static async create(videoUrl: string): Promise<VideoLoader> {
    const videoElement = document.createElement("video");
    videoElement.playsInline = true;
    videoElement.muted = true;
    videoElement.loop = true;

    const videoReadyPromise = new Promise<void>((resolve) => {
      let playing = false;
      let timeUpdated = false;
      videoElement.addEventListener("playing", () => {
        playing = true;
        if (playing && timeUpdated) {
          resolve();
        }
      });
      videoElement.addEventListener("timeupdate", () => {
        timeUpdated = true;
        if (playing && timeUpdated) {
          resolve();
        }
      });
    });

    videoElement.src = videoUrl;
    videoElement.play();
    await videoReadyPromise;
    return new VideoLoader(videoElement);
  }

  constructor(videoElement: HTMLVideoElement) {
    this._videoElement = videoElement;
  }

  get videoElement(): HTMLVideoElement {
    return this._videoElement;
  }
}

Since initializing a video element is an async process, we take a similar approach to how our WebGPUContext class is set up: a static async create function.

We just need to wait until our videoElement has fired both the playing event and its first timeupdate event; after that, it is ready to be used as a texture source.
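Here is a quick usage sketch (the URL is just a placeholder, and the snippet assumes it runs inside an async function):

// Example usage of VideoLoader; "/assets/sample.mp4" is a placeholder URL.
const videoLoader = await VideoLoader.create("/assets/sample.mp4");
const video = videoLoader.videoElement;

// By this point the video is playing and has produced at least one frame,
// so its dimensions are known and it can be used as a texture source.
console.log(video.videoWidth, video.videoHeight);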

Render Function

This render function will look very similar to the one we have for static texture loading, except that now we need to update the texture as the video plays, and we need to run the pipeline multiple times to generate subsequent frames.

public async render_video_texture(shaderCode: string, vertexCount: number, instanceCount: number, vertices: Float32Array, texCoords: Float32Array,
  transformationMatrix: Float32Array, projectionMatrix: Float32Array, videoUrl: string) {
  const videoLoader = await VideoLoader.create(videoUrl);
  const videoTexture = this._createTexture(videoLoader.videoElement.videoWidth, videoLoader.videoElement.videoHeight);

  // Copy the current video frame into the texture whenever the video's time updates.
  videoLoader.videoElement.ontimeupdate = async () => {
    const imageData = await createImageBitmap(videoLoader.videoElement);
    this._device.queue.copyExternalImageToTexture({ source: imageData }, { texture: videoTexture }, { width: imageData.width, height: imageData.height });
  }

  const transformationMatrixBuffer = this._createGPUBuffer(transformationMatrix, GPUBufferUsage.UNIFORM);
  const projectionMatrixBuffer = this._createGPUBuffer(projectionMatrix, GPUBufferUsage.UNIFORM);
  const sampler = this._createSampler();

  const transformationMatrixBindGroupInput: IBindGroupInput = {
    type: "buffer",
    visibility: GPUShaderStage.VERTEX,
    buffer: transformationMatrixBuffer,
  }
  const projectionMatrixBindGroupInput: IBindGroupInput = {
    type: "buffer",
    visibility: GPUShaderStage.VERTEX,
    buffer: projectionMatrixBuffer,
  }
  const textureBindGroupInput: IBindGroupInput = {
    type: "texture",
    visibility: GPUShaderStage.FRAGMENT,
    texture: videoTexture,
  }
  const samplerBindGroupInput: IBindGroupInput = {
    type: "sampler",
    visibility: GPUShaderStage.FRAGMENT,
    sampler: sampler,
  }
  const { bindGroupLayout: uniformBindGroupLayout, bindGroup: uniformBindGroup } = this._createUniformBindGroup([transformationMatrixBindGroupInput, projectionMatrixBindGroupInput, textureBindGroupInput, samplerBindGroupInput]);

  // CREATE VERTEX BUFFERS
  const { buffer: positionBuffer, layout: positionBufferLayout } = this._createSingleAttributeVertexBuffer(vertices, { format: "float32x3", offset: 0, shaderLocation: 0 }, 3 * Float32Array.BYTES_PER_ELEMENT);
  const { buffer: texCoordBuffer, layout: texCoordBufferLayout } = this._createSingleAttributeVertexBuffer(texCoords, { format: "float32x2", offset: 0, shaderLocation: 1 }, 2 * Float32Array.BYTES_PER_ELEMENT);

  // CREATE COMMAND ENCODER
  const render = () => {
    const commandEncoder = this._device.createCommandEncoder();

    const passEncoder = commandEncoder.beginRenderPass(this._createRenderTarget(this._context.getCurrentTexture(), { r: 1.0, g: 0.0, b: 0.0, a: 1.0 }, this._msaa));
    passEncoder.setViewport(0, 0, this._canvas.width, this._canvas.height, 0, 1);
    passEncoder.setPipeline(this._createPipeline(this._createShaderModule(shaderCode), [positionBufferLayout, texCoordBufferLayout], [uniformBindGroupLayout], "bgra8unorm"));
    passEncoder.setVertexBuffer(0, positionBuffer);
    passEncoder.setVertexBuffer(1, texCoordBuffer);
    passEncoder.setBindGroup(0, uniformBindGroup);
    passEncoder.draw(vertexCount, instanceCount);
    passEncoder.end();

    this._device.queue.submit([commandEncoder.finish()]);

    // Schedule the next frame so the pipeline keeps re-running with the latest texture contents.
    requestAnimationFrame(render);
  }

  requestAnimationFrame(render);
}

We keep the texture up to date with the video by adding a handler for videoElement.ontimeupdate which copies the current video frame into our GPU texture.
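Creating an ImageBitmap is one way to do this; in browsers where copyExternalImageToTexture accepts an HTMLVideoElement directly as a source (the WebGPU spec allows this), the handler could be slimmed down. A sketch, assuming the same videoLoader, videoTexture, and this._device from the method above:

// Leaner variant of the ontimeupdate handler: skip the ImageBitmap step and
// copy straight from the video element (where the browser supports it).
videoLoader.videoElement.ontimeupdate = () => {
  const video = videoLoader.videoElement;
  this._device.queue.copyExternalImageToTexture(
    { source: video },
    { texture: videoTexture },
    { width: video.videoWidth, height: video.videoHeight }
  );
};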

To allow our pipeline to re-run and pick up the updated texture contents, we wrap it in a render function which calls requestAnimationFrame on itself at the bottom, and then we kick off this recursive frame rendering loop with an initial requestAnimationFrame(render) call.
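One thing to be aware of: the timeupdate event typically fires only a few times per second, so the texture copy can lag behind the frame the video is actually showing. An alternative (just a sketch, reusing the same variables as in render_video_texture) is to copy the current frame at the top of each animation frame instead:

// Copy the current video frame once per animation frame rather than in ontimeupdate.
const render = async () => {
  const frameBitmap = await createImageBitmap(videoLoader.videoElement);
  this._device.queue.copyExternalImageToTexture(
    { source: frameBitmap },
    { texture: videoTexture },
    { width: frameBitmap.width, height: frameBitmap.height }
  );

  // ...encode and submit the render pass exactly as before...

  requestAnimationFrame(render);
};
requestAnimationFrame(render);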

This process is so similar to our image texturing one that we even use the exact same shader code!

Conclusion

In this article we learned how to render a video as a texture. The core takeaway was that we have to continuously update the GPU texture with the current video frame and run the render pipeline in a recursive, never-ending loop instead of just once.
