Workflow_FrameSource_ExternalImageStream

Attention

This page is outdated; updates will be online soon.

Demonstrates how to create an EasyAR app with a non-system-API-based camera.

You can create a customized CameraDevice by referencing this sample. The sample uses the system camera API for demonstration and does not support Android USB cameras out of the box. For an Android USB camera, libuvc integration is needed. You can reference this.

Notice: do not use this sample for Windows USB cameras, as they have built-in support in EasyAR.

Notice: using a video stream as frame input is also supported; you can reference the API documents to modify this sample.

How to Use

../../_images/image_20.png

How It Works

Use a custom FrameSource

To use a customized camera device in the scene, replace the VideoCamera with a customized FrameSource. Put it under the ARSession node; the session will use it automatically.

../../_images/image_s15_1.png
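The same wiring can also be sketched in a setup script. In the sketch below, CustomCameraSource is a hypothetical name for a FrameSource subclass like the one this sample implements; it is not part of EasyAR, so adjust it to your actual component.

```csharp
using UnityEngine;
using easyar;

// A minimal sketch: parent a customized FrameSource under the ARSession node at
// runtime. "CustomCameraSource" is a hypothetical FrameSource subclass name.
public class CustomSourceSetup : MonoBehaviour
{
    void Awake()
    {
        var session = FindObjectOfType<ARSession>();
        // During assembly the session picks up frame sources among its children,
        // so placing the component under the session node is sufficient.
        var sourceObject = new GameObject("CustomCameraSource");
        sourceObject.transform.SetParent(session.transform, false);
        sourceObject.AddComponent<CustomCameraSource>();
    }
}
```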

Inherit FrameSource

This sample inherits FrameSource to implement camera features. Override Type, IsAvailable, OnEnable, OnDisable, and OnAssemble to make the camera run.

public override Optional<InputFrameSourceType> Type { get => InputFrameSourceType.General; }

public override Optional<bool> IsAvailable { get => Application.platform == RuntimePlatform.Android || Application.platform == RuntimePlatform.IPhonePlayer; }

protected override void OnEnable()
{
    base.OnEnable();
#if !UNITY_EDITOR && UNITY_ANDROID
    if (externalCamera != null)
        externalCamera.Call<bool>("start", cameraCallback);
#elif !UNITY_EDITOR && UNITY_IOS
    if (externalCamera != null)
        externalCamera.start(cameraCallback);
#endif
}

protected override void OnDisable()
{
    base.OnDisable();
#if !UNITY_EDITOR && UNITY_ANDROID
    if (externalCamera != null)
        externalCamera.Call<bool>("stop");
#elif !UNITY_EDITOR && UNITY_IOS
    if (externalCamera != null)
        externalCamera.stop();
#endif
}

protected virtual void OnDestroy()
{
    Close();
}

public override void OnAssemble(ARSession session)
{
    base.OnAssemble(session);
    Open();
}

public void Open()
{
    // Platform-specific camera creation and startup (implementation elided in this excerpt).
}

public void Close()
{
    // Platform-specific camera shutdown and release (implementation elided in this excerpt).
}

Native camera implementation

This sample uses native camera source code to provide camera features. The source code for the different platforms is in different places.

The Android implementation is in the Gradle project in the AndroidProject~ folder, and its output is pre-compiled into custom-camera.jar in the Runtime/Android folder. AndroidProject~ is hidden in the Unity Editor; you can access it directly from the native filesystem.

The iOS implementation is in the MyCamera files in the Runtime/iOS folder.

Implementing the camera device in native code provides the most flexibility in using the system camera. You can change the source code to enable camera features that are not integrated in EasyAR's VideoCameraDevice.

Expose native camera interface into C#

A small amount of code is used to wrap the native interfaces so they are accessible from C#; the wrappers directly call the Java or C interfaces.

On Android:

private AndroidJavaObject externalCamera;
private CameraCallback cameraCallback;

private class CameraCallback : AndroidJavaProxy
{
}
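The CameraCallback class above is an AndroidJavaProxy, which lets Java call back into C#. A minimal sketch of what its body might look like follows, assuming a hypothetical Java interface com.sample.customcamera.CameraCallback with an onFrame method; both names are assumptions and must match the interface actually defined in AndroidProject~.

```csharp
// Sketch only: the Java interface name and method signature below are assumptions
// for illustration; they must match the interface defined in AndroidProject~.
private class CameraCallback : AndroidJavaProxy
{
    private readonly Action<byte[], int, int, long> handler;

    public CameraCallback(Action<byte[], int, int, long> handler)
        : base("com.sample.customcamera.CameraCallback")
    {
        this.handler = handler;
    }

    // Called from Java via reflection when a camera frame arrives; the method
    // name and parameters must match the Java interface method exactly.
    public void onFrame(byte[] data, int width, int height, long timestamp)
    {
        handler(data, width, height, timestamp);
    }
}
```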

On iOS:

private ExternalCamera externalCamera;
private Action<IntPtr, int> cameraCallback;

private class ExternalCamera : IDisposable
{
}
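On iOS there is no Java layer; the ExternalCamera wrapper can bridge to the C interface of the MyCamera native files with P/Invoke. A hedged sketch follows, in which every native entry-point name (myCameraCreate and so on) is an assumption and must be matched to the actual exports in Runtime/iOS.

```csharp
// Sketch only: the native entry points below are assumed names; replace them with
// the real exports from the MyCamera implementation in Runtime/iOS.
private class ExternalCamera : IDisposable
{
    [DllImport("__Internal")] private static extern IntPtr myCameraCreate();
    [DllImport("__Internal")] private static extern void myCameraStart(IntPtr camera, IntPtr callback);
    [DllImport("__Internal")] private static extern void myCameraStop(IntPtr camera);
    [DllImport("__Internal")] private static extern void myCameraDestroy(IntPtr camera);

    private IntPtr handle = myCameraCreate();
    private Action<IntPtr, int> callback; // kept referenced so the GC does not collect it

    public void start(Action<IntPtr, int> callback)
    {
        this.callback = callback;
        // Pass the managed callback down as a native function pointer. Note that
        // under IL2CPP the delegate must wrap a static method marked with
        // [MonoPInvokeCallback] for the reverse call to work.
        myCameraStart(handle, Marshal.GetFunctionPointerForDelegate(callback));
    }

    public void stop() => myCameraStop(handle);

    public void Dispose()
    {
        if (handle == IntPtr.Zero) { return; }
        myCameraDestroy(handle);
        handle = IntPtr.Zero;
    }
}
```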

Handle InputFrame in camera callback

Both the Android and iOS implementations in this sample expose the camera callback to C#. In the camera callback, InputFrameSink.handle is used to handle an InputFrame created from the raw image data.

private void HandleSink(Buffer imageBuffer, PixelFormat format, Vector2 imageSize, int orientation, int cameraType, double timestamp)
{
    using (var cameraParams = CameraParameters.createWithDefaultIntrinsics(new Vec2I((int)imageSize.x, (int)imageSize.y), (CameraDeviceType)cameraType, orientation))
    using (var image = new Image(imageBuffer, format, (int)imageSize.x, (int)imageSize.y))
    using (var frame = InputFrame.createWithImageAndCameraParametersAndTemporal(image, cameraParams, timestamp))
    {
        if (sink != null)
            sink.handle(frame);
    }
    imageBuffer.Dispose();
}