How to save HTML canvas animation as a video

Julien de Charentenay
4 min read · Nov 13, 2021


This story shows how to add ‘save to video’ capabilities to a webpage containing a canvas animation. The implementation extends the work described in this story with the ability to record the canvas rendering as a video.

The source code is available in this GitHub repository.

Saving canvas to video — extract of demonstration video https://youtu.be/dF5i0gAI7hA

The views/opinions expressed in this story are my own. This story relates my personal experience and choices, and is provided for information in the hope that it will be useful but without any warranty.

I have experimented with the HTML canvas element in two projects: www.cfd-webassembly.com and www.video-mash.com. It is very versatile, with the ability to render both 2D and 3D content — see this article for a sample of links to canvas examples.

In the context of my project www.video-mash.com, I wanted to add the ability to save videos so that users can record their experience and share it. I investigated implementing this without relying on screen capture — and found that the canvas element provides functionality that makes this nearly a walk in the park. I also add some comments on handling the audio track.

Demonstration video

The video shown here is a montage made from recordings of the webcam feed as normal, after being treated to remove the background, and after being treated to remove the foreground — a video incognito mode? If interested, you can run the same experiment at https://www.video-mash.com/demo.html.

Canvas Capturing

Generating a video from a canvas uses two web API components:

  • Canvas’ captureStream() method — see here: it converts the canvas into a media stream;
  • MediaRecorder — see here: this interface handles the media stream to generate the video.

The following code extract shows how the HTML canvas element — obtained using the document.getElementById function — is converted to a media stream at a rate of 30 frames per second. The media stream is then assigned to a media recorder that outputs the webm video format. The video data is recorded into the chunks array as it becomes available — in slices of 1 second in the example below. This code extract is run when the recording starts.

var chunks = [];
var canvas_stream = canvas.captureStream(30); // fps
// Create media recorder from canvas stream
this.media_recorder = new MediaRecorder(canvas_stream, { mimeType: "video/webm; codecs=vp9" });
// Record data in chunks array when data is available
this.media_recorder.ondataavailable = (evt) => { chunks.push(evt.data); };
// Provide recorded data when recording stops
this.media_recorder.onstop = () => {this.on_media_recorder_stop(chunks);}
// Start recording using a 1s timeslice (i.e. data is made available every 1s)
this.media_recorder.start(1000);
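
The vp9 codec is not supported by every browser. As a side note (an assumption on my part rather than something the repository does), the creation of the media recorder could be guarded with MediaRecorder.isTypeSupported and fall back to another codec:

var mime_type = "video/webm; codecs=vp9";
if (!MediaRecorder.isTypeSupported(mime_type)) {
  // Fall back to vp8, or to plain webm, when vp9 is not available
  mime_type = MediaRecorder.isTypeSupported("video/webm; codecs=vp8")
    ? "video/webm; codecs=vp8"
    : "video/webm";
}
this.media_recorder = new MediaRecorder(canvas_stream, { mimeType: mime_type });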

The recording is stopped by calling the stop method on the media recorder.

this.media_recorder.stop();
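
For instance, the start and stop calls can be wired to buttons on the page; the element ids below are hypothetical:

// Hypothetical buttons driving the recording
document.getElementById("start-button").onclick = () => { this.media_recorder.start(1000); };
document.getElementById("stop-button").onclick = () => { this.media_recorder.stop(); };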

Calling stop triggers the media recorder onstop callback. In this implementation, the callback calls the function on_media_recorder_stop with the video data chunks as argument. The function gathers the video data into a blob that is automatically downloaded under the filename video.webm:

// Gather chunks of video data into a blob and create an object URL
var blob = new Blob(chunks, {type: "video/webm" });
const recording_url = URL.createObjectURL(blob);
// Attach the object URL to an <a> element, setting the download file name
const a = document.createElement('a');
a.style = "display: none;";
a.href = recording_url;
a.download = "video.webm";
document.body.appendChild(a);
// Trigger the file download
a.click();
setTimeout(() => {
  // Clean up - see https://stackoverflow.com/a/48968694 for why it is in a timeout
  URL.revokeObjectURL(recording_url);
  document.body.removeChild(a);
}, 0);

The GitHub demo provides four canvases highlighting the information made available by the MediaPipe SelfieSegmentation model. The implementation allows each canvas to be recorded individually by calling the record function with the id of the canvas to be recorded, as sketched below.
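
A minimal sketch of what such a record function could look like, assuming the names used in the extracts above (the actual implementation is in the GitHub repository):

record(canvas_id) {
  // Look up the canvas to record and turn it into a media stream
  const canvas = document.getElementById(canvas_id);
  const canvas_stream = canvas.captureStream(30); // fps
  // Record the stream into chunks, as in the extract above
  const chunks = [];
  this.media_recorder = new MediaRecorder(canvas_stream, { mimeType: "video/webm; codecs=vp9" });
  this.media_recorder.ondataavailable = (evt) => { chunks.push(evt.data); };
  this.media_recorder.onstop = () => { this.on_media_recorder_stop(chunks); };
  this.media_recorder.start(1000); // 1s timeslice
}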

And Audio Track

So far the implementation only records video. An audio feed can be included in the recording by adding it as an audio track to the canvas media stream.

The following is required to add the webcam audio to the recording:

  • The webcam audio feed needs to be requested alongside the video feed:
navigator.mediaDevices
.getUserMedia({ audio: true, video: { facingMode: "user" } })
  • When the webcam media stream is assigned to a video element, the video element should be muted to avoid audio feedback;
  • The webcam audio track is extracted for later use:
// Request webcam with audio and user facing camera
navigator.mediaDevices
  .getUserMedia({ audio: true, video: { facingMode: "user" } })
  .then((media_stream) => {
    // Retrieve audio track
    this.audio_track = media_stream.getAudioTracks()[0];
    // Assign media stream to video element - with audio muted
    this.webcam_video = document.createElement("video");
    this.webcam_video.srcObject = media_stream;
    this.webcam_video.muted = true;
    this.webcam_video.style.display = "none";
    document.body.appendChild(this.webcam_video);
    this.webcam_video.onplay = this.playing; // And start playing
    this.webcam_video.play();
  })

The audio track is added to the media stream derived from the canvas before creating the media recorder, resulting in the following revision of the code presented in the previous section:

var chunks = [];
var canvas_stream = canvas.captureStream(30); // fps
// Add audio track
if (this.audio_track) {canvas_stream.addTrack(this.audio_track);}
// Create media recorder from canvas stream
this.media_recorder = new MediaRecorder(canvas_stream, { mimeType: "video/webm; codecs=vp9" });
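
When the recording stops, one may also want to release the webcam and microphone. A minimal sketch, assuming the webcam media stream is still attached to the video element created earlier:

// Stop the recording; this triggers onstop and the download shown earlier
this.media_recorder.stop();
// Optionally release the webcam and microphone by stopping all their tracks
this.webcam_video.srcObject.getTracks().forEach((track) => track.stop());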

Extra

My project www.video-mash.com also allows the recording of the audio track of the video being played. I investigated using the audioTracks property as described here — but the feature is currently in draft form and does not seem to be widely supported. Instead, a media stream is created from the video element using the captureStream method — similar to the approach used for the canvas — and the audio track is extracted from it as shown below, with this.video representing a video element created using document.createElement('video'):

this.video = document.createElement('video');
//... assign url and start playing
this.media_stream = this.video.captureStream();
this.audio_track = this.media_stream.getAudioTracks()[0];
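
Note that support for captureStream on media elements varies; in Firefox it is exposed with the moz prefix. A defensive variation of the call above (an assumption to verify against current browser support):

// Fall back to the prefixed mozCaptureStream where captureStream is missing
this.media_stream = this.video.captureStream
  ? this.video.captureStream()
  : this.video.mozCaptureStream();
this.audio_track = this.media_stream.getAudioTracks()[0];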

Thanks for reading

If you like this story, please clap. You can also read my other stories here and see what I have been up to at www.charentenay.me.

