Streaming from Cameras
Adding the User to the Mix 👩🏾
VideoKit makes it very easy to discover available camera devices and stream the camera preview. Below, we will explore doing so with and without code.
Streaming the camera requires an active VideoKit Core plan.
Using the Camera Manager Component
VideoKit provides the VideoKitCameraManager component to simplify working with camera devices in Unity scenes.
Adding the Component
The first step is to add a VideoKitCameraManager component to your scene.
Displaying the Preview
To display the camera preview, first create a UI RawImage and add a VideoKitCameraView component:
Now press play, and that's it!
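If you prefer to wire this up from a script instead of the Inspector, the sketch below shows one possible equivalent using Unity's standard AddComponent API. The VideoKit namespace and the components' default settings are assumptions here; the Inspector workflow above is the documented path.

using UnityEngine;
using UnityEngine.UI;
using VideoKit; // assumed namespace; check the VideoKit package for the exact one

public class CameraPreviewSetup : MonoBehaviour {

    // A RawImage under a Canvas, assigned in the Inspector
    [SerializeField] RawImage previewImage;

    void Awake () {
        // Add the camera manager that discovers and streams the camera device
        gameObject.AddComponent<VideoKitCameraManager>();
        // Add the camera view to the RawImage's game object so it can display the preview
        previewImage.gameObject.AddComponent<VideoKitCameraView>();
    }
}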
Removing the User Background
VideoKit supports removing the user's background from the camera preview. First, enable the Human Texture capability on the VideoKitCameraManager component:
Next, change the View Mode on the VideoKitCameraView component to Human Texture:
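For reference, the equivalent script-side configuration might look like the sketch below. The Capabilities and ViewMode member names simply mirror the Inspector labels and are assumptions; consult the VideoKit API reference for the exact identifiers.

using UnityEngine;
using VideoKit; // assumed namespace

public class BackgroundRemovalSetup : MonoBehaviour {

    [SerializeField] VideoKitCameraManager cameraManager;
    [SerializeField] VideoKitCameraView cameraView;

    void Awake () {
        // Enable the human texture capability (member name assumed from the Inspector label)
        cameraManager.capabilities |= VideoKitCameraManager.Capabilities.HumanTexture;
        // Display the human texture instead of the raw camera preview (member name assumed)
        cameraView.viewMode = VideoKitCameraView.ViewMode.HumanTexture;
    }
}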
Using the VideoKit API
Camera streaming is exposed with the CameraDevice class.
Discovering Camera Devices
Because camera devices access highly sensitive user data, the very first step is to request camera permissions from the user. We provide the CameraDevice.CheckPermissions method for this:
// Request camera permissions from the user
PermissionStatus status = await CameraDevice.CheckPermissions(request: true);
// Check that the user granted permissions
if (status != PermissionStatus.Granted)
    return;
Once camera permissions have been granted, we can discover available cameras using the CameraDevice.Discover method:
using System.Linq;
// Discover available camera devices
CameraDevice[] cameras = await CameraDevice.Discover();
// Use the first front-facing camera
CameraDevice camera = cameras.FirstOrDefault(cam => cam.frontFacing);
Configuring the Camera Stream
Camera devices expose an extensive API for configuring the preview resolution, photo resolution, focus mode, exposure mode, and much more. At a minimum, we recommend configuring the preview resolution:
// Set the camera preview resolution
camera.previewResolution = (1280, 720);
Regardless of your app's orientation, camera resolutions are always specified in landscape, so the width should always be the larger of the two values.
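For example, even if your app runs in portrait, a full HD preview is requested as 1920×1080, not 1080×1920:

// Full HD preview, specified in landscape even for a portrait app
camera.previewResolution = (1920, 1080);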
Streaming Pixel Buffers
To begin streaming pixel buffers from the camera, use the CameraDevice.StartRunning method:
// Start streaming pixel buffers from the camera
camera.StartRunning(OnPixelBuffer);
You must provide a callback that will be invoked with pixel buffers as they become available:
void OnPixelBuffer (PixelBuffer cameraBuffer) {
    // ...
}
The CameraDevice will always invoke this callback on a dedicated camera thread, not the Unity main thread. As a result, do not call Unity methods from this callback.
Displaying Pixel Buffers
The PixelBuffer instances provided to the camera callback contain raw pixel data. This data is usually optimized for the camera system, which means that it is often in a planar format (e.g. YUV) and has no geometric transformations applied (i.e. orientation, mirroring).
Before we can display the pixel data, we must first convert it to the RGBA8888 pixel format and apply any relevant geometric transformations. First, allocate a NativeArray to hold the transformed pixel data and a Texture2D to actually display the pixel data:
using Unity.Collections;

private NativeArray<byte> rgbaData;
private Texture2D texture;

void StartCamera () {
    // Get the current preview resolution from the camera
    var (width, height) = camera.previewResolution;
    // Create preview data
    rgbaData = new NativeArray<byte>(width * height * 4, Allocator.Persistent);
    // Create the preview texture
    texture = new Texture2D(width, height, TextureFormat.RGBA32, false);
    // Start streaming pixel buffers from the camera
    camera.StartRunning(OnPixelBuffer);
}
When we receive a new pixel buffer in OnPixelBuffer, we can use the PixelBuffer.CopyTo method to copy the pixel data to our rgbaData while applying relevant pixel format and geometric transformations:
void OnPixelBuffer (PixelBuffer cameraBuffer) {
    lock (texture) {
        // Create a destination `PixelBuffer` backed by our preview data
        using var previewBuffer = new PixelBuffer(
            cameraBuffer.width,
            cameraBuffer.height,
            PixelBuffer.Format.RGBA8888,
            rgbaData
        );
        // Copy the pixel data from the camera buffer to our preview buffer
        cameraBuffer.CopyTo(previewBuffer, rotation: PixelBuffer.Rotation._180);
    }
}
If your rotation is Rotation._90 or Rotation._270, make sure to swap the width and height when creating the previewBuffer pixel buffer.
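For example, a sketch of the 90° case, using only the APIs shown above, swaps the camera dimensions when constructing the destination buffer:

// Destination buffer for a 90° rotation: width and height are swapped
using var previewBuffer = new PixelBuffer(
    cameraBuffer.height,
    cameraBuffer.width,
    PixelBuffer.Format.RGBA8888,
    rgbaData
);
// Copy with the 90° rotation applied
cameraBuffer.CopyTo(previewBuffer, rotation: PixelBuffer.Rotation._90);

In that case, the preview Texture2D should also be created with the swapped dimensions so that it matches the rotated data.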
Finally, use your script's Update method to upload the preview data into the Texture2D for viewing:
void Update () {
    // Update the preview texture with the latest preview data
    lock (texture)
        texture.GetRawTextureData<byte>().CopyFrom(rgbaData);
    // Upload the texture data to the GPU for display
    texture.Apply();
}