Recording Videos
Video or it Didn't Happen 🤨
Recording videos is the most fundamental feature of VideoKit (if it wasn't already obvious from the name). In this guide, we will explore how to record videos with and without code.
Recording videos requires an active VideoKit Core plan.
Using the Recorder Component
The first step is to create a VideoKitRecorder component:
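If you prefer to set this up from code instead of the Inspector, a minimal sketch might look like the following. Note that the `VideoKit` namespace is an assumption here; check where `VideoKitRecorder` lives in your installed package version.

```csharp
using UnityEngine;
using VideoKit; // assumption: VideoKitRecorder lives in the VideoKit namespace

public class RecorderSetup : MonoBehaviour {

    void Awake () {
        // Add the recorder component to this game object at runtime
        var recorder = gameObject.AddComponent<VideoKitRecorder>();
    }
}
```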
On WebGL, audio cannot be recorded from an AudioSource or AudioListener because Unity does not support capturing engine audio on WebGL.
Starting a Recording
Next, we need to invoke VideoKitRecorder.StartRecording when we want to begin a recording session. For this, we can use a UI button:
Feel free to resize the button and change the text to something more descriptive.
Stopping a Recording
Before we take our setup for a spin, we should add another button to stop the recording.
We will add a UI button to call VideoKitRecorder.StopRecording:
Try it Out
Enter play mode, click the 'Start Recording' button, wait a few seconds, then click the 'Stop Recording' button. Navigate to your project directory and there should be a new directory called VideoKit, and within it, you should find a video of your empty scene!
In the editor, VideoKit always saves videos to the recordings directory in your project folder.
In the player, VideoKit saves videos to your app's private documents directory.
Using the VideoKit API
Video recording is exposed with the MediaRecorder class.
Creating a Recorder
First, create a recorder specifying the desired format and accompanying settings:
// Create a recorder
var recorder = await MediaRecorder.Create(
    format: MediaRecorder.Format.MP4,
    width: 1280,
    height: 720,
    frameRate: 30
);
See MediaRecorder for information on which settings are required for different formats.
Appending Sample Buffers
The MediaRecorder is designed with a push architecture: pixel buffers and audio buffers must be explicitly sent to the recorder using the MediaRecorder::Append method. To simplify this process, VideoKit provides Sources, which handle pulling pixel buffers and audio buffers from the screen, one or more game cameras, audio listeners, and other sources. Let's use a CameraSource to record video from the game camera:
// Create a clock to generate recording timestamps
var clock = new RealtimeClock();
// Create a camera source
var cameraSource = new CameraSource(
    recorder,    // the recorder accepts and records pixel buffers from the source
    clock,       // the clock is used to generate recording timestamps
    Camera.main  // the camera source captures pixel buffers from one or more game cameras
);
Alternatively, you can append raw pixel buffers to the recorder on demand:
// Create a pixel buffer
Texture2D image = ...;
using var pixelBuffer = new PixelBuffer(
    image,          // texture to get pixel data from
    clock.timestamp // pixel buffer timestamp to use for recording
);
// Append
recorder.Append(pixelBuffer);
You can create a PixelBuffer from a Texture2D, byte[], NativeArray<byte>, or unmanaged byte*.
The MediaRecorder only supports appending pixel buffers with the RGBA8888 format.
The process is similar for appending audio buffers:
// Create an audio buffer
float[] audioData = ...;
using var audioBuffer = new AudioBuffer(
    44_100,         // sample rate
    2,              // channel count
    audioData,      // linear PCM audio data, interleaved by channel
    clock.timestamp // audio buffer timestamp to use for recording
);
// Append
recorder.Append(audioBuffer);
You can create an AudioBuffer from a float[], NativeArray<float>, or unmanaged float*.
The audio data contained in the audio buffer must be linear PCM and interleaved by channel.
Finishing the Recording
To finish a recording session, stop appending sample buffers then finish writing:
// Dispose any media sources
cameraSource.Dispose();
// Finish writing
MediaAsset asset = await recorder.FinishWriting();
The recorder returns a MediaAsset which refers to a media file.
Once MediaRecorder::FinishWriting is called, you must not call any method or access any property on the recorder.
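Putting the pieces together, a complete recording session with the API might look like the following sketch. It assumes a realtime clock and the game camera as the only video source, and omits error handling; the `asset.path` property used in the final log statement is an assumption and may differ in your VideoKit version.

```csharp
// Create a recorder
var recorder = await MediaRecorder.Create(
    format: MediaRecorder.Format.MP4,
    width: 1280,
    height: 720,
    frameRate: 30
);
// Create a clock and a camera source to feed the recorder
var clock = new RealtimeClock();
var cameraSource = new CameraSource(recorder, clock, Camera.main);
// Record for five seconds
await Task.Delay(5_000);
// Stop the source, then finish writing
cameraSource.Dispose();
MediaAsset asset = await recorder.FinishWriting();
// Log where the recording was written (assumed `path` property)
Debug.Log($"Recorded video to {asset.path}");
```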
Next Steps
Now that you are familiar with the basics of video recording, continue with the following: