How to Create EasyAR HMD Extension

This article explains how to add EasyAR support to a device that is not yet supported. If you are looking for how to use EasyAR on an already supported device, please reference EasyAR HMD Extensions.

At the time this article was written (2023), there was not yet a unified interface standard in the AR/VR/MR/XR industry. OpenXR is a good candidate, but spec evolution and industry implementation take time. So it is usually not easy to run EasyAR on devices on the market; there is a large probability that required data or interfaces are missing. If you are an app or content developer, please contact your hardware vendor or an EasyAR sales agent. Apple opened some APIs for Vision Pro in 2024, and they are sufficient; we suggest referencing them if you are a hardware vendor.

So, this article is written for hardware vendors instead of consumers. Generally, this article offers some kind of standard or rules, but implementation details are not restricted. Any implementation or interface definition is welcome for discussion; please contact us through business channels.

The hardware covered by this article is required to have motion tracking (VIO) capability. We assume EasyAR features normally run on top of a good device motion tracking ability. It is not suggested to optimize device tracking using EasyAR, which would lead to a dependency loop (from a top-level architecture view, it cannot be ruled out that errors would be amplified by positive feedback and the system would tend to be unstable). If the device itself does not have motion tracking capability, the support solution is not covered by this article, and communication can be carried out through commercial channels if necessary.

Purpose and Reasonable Expectations

It is extremely important not to misunderstand the purpose of what is to be done. Have a reasonable expectation of the resources that will be invested and the results that will be achieved in the end.

What is EasyAR HMD Extension

The extension you are writing is,

  • A bunch of code which grabs data from your device API and sends it into EasyAR Sense using the EasyAR Sense custom camera interface.

  • In Unity, the HMD extension will make use of the inheritable frame source API and an EasyAR Sense workflow defined by EasyAR Sense Unity Plugin to simplify EasyAR Sense custom camera development.

  • In Unity, the HMD extension will be a Unity package with runtime scripts, editor scripts and samples of the extension, and you or EasyAR can distribute it to downstream users.

When bringing up the HMD extension, you may,

  • Change your SDK interfaces and internal implementations.

  • Discuss with your team the solution for getting and using the required data.

  • Spend a lot of time validating data correctness instead of writing code.

After finishing the extension, you will have,

  • Most EasyAR Sense features (like image tracking, dense spatial map) working on your device using your device’s motion tracking (VIO) capability.

  • EasyAR cloud services which are already supported within EasyAR Sense working on your device.

  • All limitations of the EasyAR license when using a custom camera apply in the same way.

  • Only the xr license can be used. Personal, professional or classic licenses cannot be used.

What is NOT EasyAR HMD Extension

The HMD extension will not work without EasyAR Sense,

  • The HMD extension will not work alone; EasyAR Sense is required as a dependency, and EasyAR Sense Unity Plugin must be used in Unity.

  • There is no direct way to call EasyAR cloud service APIs like EasyAR Mega localization; EasyAR Sense does the job.

  • In Unity, there are no direct calls to the image tracking API of EasyAR; EasyAR Sense Unity Plugin does the job.

  • In Unity, the HMD extension will not handle object or target transforms in the scene; EasyAR Sense Unity Plugin does the job.

The HMD extension will not work without your device SDK,

  • In Unity, neither the HMD extension nor EasyAR Sense Unity Plugin will handle the camera transform in the scene; your device SDK must do the job.

Some of the EasyAR features will not work even with the HMD extension,

  • Surface tracking will not work.

  • EasyAR motion tracking will not work.

  • Plane detection (which is a part of EasyAR motion tracking) will not work.

Generally, you must not limit your resources to Unity development only,

  • Due to the lack of a standard, the extension normally cannot be written purely on top of the 3D engine. So, we suggest letting your low-level engineers, such as system engineers and SDK engineers, participate in the work from the first day.

So how do I use Mega on my device?

Bring up EasyAR Sense on your device, and EasyAR Mega will be supported naturally without any other work. Make sure not to use the Mega sample directly during EasyAR bring-up on your device; you would probably fail.

Background Knowledge

Building AR/VR devices requires some domain knowledge, and similarly, bringing up EasyAR Sense on the device will require you or your team to be experts in the following areas,

If you are working in Unity, you will also need to know,

A little more knowledge in the following areas would help you understand the system better, especially how to send correct data to EasyAR,

Data requirements

To make EasyAR work on your device, the trickiest and most important work is to ensure data correctness.

EasyAR Sense generally requires two parts of data, which we name as follows according to call timing and data characteristics,

  1. Camera Frame data

  2. Rendering frame data

Camera Frame data

Required data,

  1. Timestamp

  2. Tracking status

  3. Device pose

  4. Intrinsics (including image size, focal length and principal point. Distortion model and coefficients are also required if there are any distortions.)

  5. Extrinsics (T_head_camera)

  6. Raw camera image data

Data timing,

  • Exposure mid-time

Data usages,

  • API calling timing: can be changed according to your design. A common way used by most devices is to query in each rendering update of the 3D engine and do the data processing based on the timestamp in the frame

  • API calling thread: 3D engine game thread or any other thread if all your APIs are thread safe

API calling Demo in Unity,

void TryInputCameraFrameData()
{
    double timestamp; // NOTE: from your device SDK, exposure mid-time

    if (timestamp == curTimestamp) { return; } // NOTE: skip if there is no new camera frame
    curTimestamp = timestamp;

    PixelFormat format;   // NOTE: pixel format of the raw image data
    Vector2Int size;      // NOTE: image size
    Vector2Int pixelSize; // NOTE: pixel size, see the Image.create API document
    int bufferSize;       // NOTE: raw image data size in bytes

    var bufferO = TryAcquireBuffer(bufferSize);
    if (bufferO.OnNone) { return; }
    var buffer = bufferO.Value;

    IntPtr imageData; // NOTE: raw image data pointer from your device SDK
    buffer.tryCopyFrom(imageData, 0, 0, bufferSize);

    Pose devicePose;                     // NOTE: device pose at exposure mid-time
    MotionTrackingStatus trackingStatus; // NOTE: tracking status from your device SDK

    using (buffer)
    using (var image = Image.create(buffer, format, size.x, size.y, pixelSize.x, pixelSize.y))
    {
        HandleCameraFrameData(timestamp, image, cameraParameters, extrinsics, devicePose, trackingStatus);
    }
}

The above code will not compile; it is just a simplified API calling demo in Unity. Please find workable code in the com.easyar.sense.ext.hmdtemplate template.

Rendering frame data

Required data,

  1. Timestamp

  2. Tracking status

  3. Device pose

Data timing,

  • On screen. Timewarp is not counted. The device pose from the same timing is used to set the 3D camera transform to render the current frame.

Data usages,

  • API calling timing: each 3D engine rendering frame

  • API calling thread: 3D engine game thread

API calling Demo in Unity,

void InputRenderingFrameData()
{
    double timestamp;                    // NOTE: from your device SDK, on-screen timing
    Pose devicePose;                     // NOTE: pose used to render the current frame
    MotionTrackingStatus trackingStatus; // NOTE: tracking status from your device SDK

    HandleRenderFrameData(timestamp, devicePose, trackingStatus);
}

The above code will not compile; it is just a simplified API calling demo in Unity. Please find workable code in the com.easyar.sense.ext.hmdtemplate template.

Additional details

Camera Image Data,

  • Image coordinates: the data should be horizontal when captured in the sensor's horizontal orientation. The data should start at the upper left corner, in row-major order. The image should not be flipped or inverted.

  • Image FPS: normal 30/60 fps data is acceptable. The minimum acceptable fps to reach reasonable algorithm performance is 2, for cases where high fps has a special impact on your system. We suggest using a higher fps than 2, and using the raw data fps in the normal case.

  • Image size: to reach a better result, the longest side should be 960 or larger. A time-consuming resize is not encouraged in the data chain; it is suggested to use raw data directly, except when a full-size memory copy takes unacceptable time.

  • Pixel Format: generally, the format priority is YUV > RGB > RGBA > Gray (Y-component from YUV), prioritizing tracking quality while comprehensively considering performance. When using YUV data, the complete format definition is required, including data packing and padding details. Compared to a single-channel image, EasyAR Mega will gain improvement from a color image input, which is different from other EasyAR features. (A buffer size sketch for common formats follows this list.)

  • Data access: a data pointer or equivalent. It is better to remove all unnecessary copies in the data chain. EasyAR will use the data asynchronously after copying. And be careful with data ownership.
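
The required buffer size follows from the pixel format definition above. A minimal sketch, assuming tightly packed data with no row padding (use a stride-aware size otherwise); the PixelFormat members used here are assumed from the EasyAR Sense API, so check the enum in your EasyAR Sense version:

static int GetBufferSize(PixelFormat format, int width, int height)
{
    switch (format)
    {
        case PixelFormat.Gray: return width * height;              // 1 byte per pixel
        case PixelFormat.YUV_NV21:
        case PixelFormat.YUV_NV12:
        case PixelFormat.YUV_I420:
        case PixelFormat.YUV_YV12: return width * height * 3 / 2;  // Y plane + half-resolution chroma
        case PixelFormat.RGB888:
        case PixelFormat.BGR888: return width * height * 3;
        case PixelFormat.RGBA8888:
        case PixelFormat.BGRA8888: return width * height * 4;
        default: throw new ArgumentOutOfRangeException(nameof(format));
    }
}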

Timestamp,

  • All timestamps should be clock synchronized; hardware synchronization is preferred. EasyAR consumes timestamps as seconds in a double, as sketched below.
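
A minimal sketch, assuming the device clock reports nanoseconds (as the Qiyu samples later in this article do, converting with * 1e-9); the monotonicity check is an optional sanity aid:

double lastTimestamp = double.NegativeInfinity;

double ToEasyARTimestamp(long deviceTimestampNs)
{
    var seconds = deviceTimestampNs * 1e-9; // EasyAR expects seconds as double
    if (seconds <= lastTimestamp)
    {
        Debug.LogWarning("Non-monotonic timestamp; check clock synchronization.");
    }
    lastTimestamp = seconds;
    return seconds;
}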

Tracking status,

  • The tracking status should be defined by the device and include at least a tracking-lost state (when VIO is not usable); more levels are preferred.

Device pose,

  • All poses (including the camera transform in the 3D engine) should have the same origin.

  • In Unity, the pose data should be given in Unity coordinate conventions. If the HMD extension is implemented by EasyAR and poses use another coordinate convention, you should provide a clear coordinate definition or a method to convert poses to Unity coordinates (a common conversion is sketched below).

  • In Unity, only Device mode compatibility is required when using the Unity XR framework.
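
If your device SDK reports poses in a right-handed, Y-up, -Z-forward (OpenGL-style) convention, a minimal conversion sketch follows; this convention is an assumption for illustration, so verify your SDK's actual definition:

using UnityEngine;

static Pose ToUnityPose(Vector3 devicePosition, Quaternion deviceRotation)
{
    // Mirror across the XY plane: negate Z of the position, and negate the
    // X and Y components of the quaternion (axis mirrored, angle negated).
    var position = new Vector3(devicePosition.x, devicePosition.y, -devicePosition.z);
    var rotation = new Quaternion(-deviceRotation.x, -deviceRotation.y, deviceRotation.z, deviceRotation.w);
    return new Pose(position, rotation);
}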

Intrinsics,

  • All values should match the camera image data. Scale the intrinsics before sending them to EasyAR if the image is resized (see the sketch after this list).

  • If the HMD extension is implemented by EasyAR, you should state whether the intrinsics change each frame (the difference is whether the API should be called only once or every frame).
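
A minimal sketch of scaling pinhole intrinsics when the image sent to EasyAR is resized from the calibrated resolution (the parameter names are illustrative):

static void ScaleIntrinsics(
    ref float fx, ref float fy, ref float cx, ref float cy,
    Vector2Int calibratedSize, Vector2Int imageSize)
{
    var sx = (float)imageSize.x / calibratedSize.x;
    var sy = (float)imageSize.y / calibratedSize.y;
    fx *= sx; cx *= sx; // focal length and principal point scale with the resize
    fy *= sy; cy *= sy;
}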

Extrinsics,

  • It is required. It is a calibrated matrix T_head_camera that transforms from head coordinates to camera coordinates. If your device pose is equal to the camera pose, it should be the identity matrix.

  • The Apple Vision Pro API for this is CameraFrame.Sample.Parameters.extrinsics.

Performance,

  • Data should be provided in the most performant way possible. The API calls happen in the rendering procedure in most implementations, so it is suggested not to block an API call with time-consuming tasks, or to use such APIs in a reasonable way (a worker-thread hand-off is sketched after this list).

  • If the HMD extension is implemented by EasyAR, you should provide an explanation of every API call with a performance impact.
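
If your data-acquisition API can block, one option is to move the blocking call off the game thread and hand results over through a queue. A hedged sketch; all types and names here are illustrative, not part of any SDK:

using System;
using System.Collections.Concurrent;
using System.Threading;

struct RawFrame { public double Timestamp; public byte[] Data; } // illustrative payload

class CameraFramePump
{
    readonly ConcurrentQueue<RawFrame> queue = new ConcurrentQueue<RawFrame>();
    readonly Func<RawFrame?> acquire; // hypothetical blocking device call
    Thread worker;
    volatile bool running;

    public CameraFramePump(Func<RawFrame?> acquire) { this.acquire = acquire; }

    public void Start()
    {
        running = true;
        worker = new Thread(() =>
        {
            while (running)
            {
                var frame = acquire(); // may block; that is acceptable on this thread
                if (frame.HasValue) { queue.Enqueue(frame.Value); }
            }
        });
        worker.IsBackground = true;
        worker.Start();
    }

    // Call from the game thread, e.g. in Update, then feed EasyAR.
    public bool TryDequeue(out RawFrame frame) => queue.TryDequeue(out frame);

    public void Stop() { running = false; worker?.Join(); }
}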

Multiple cameras,

  • At least one camera's data is required. The camera can be any of the RGB cameras, VST cameras, localization cameras, etc. If only one camera's data is provided, we generally recommend using an RGB or VST camera at the center or near the eyes.

  • EasyAR algorithm performance can be improved with multiple cameras. Camera Frame data should be provided from all available cameras at the same timing.

Multiple cameras are not fully supported yet; please contact us for more details.

Preparation

Before starting to write the HMD extension, there is certain work that needs to be done.

Prepare your device for AR/MR

  • Prepare your SLAM/VIO system

    Make sure the device tracking error is under control. Some EasyAR features like Mega can reduce the device's accumulated error in some manner, but large local error will make EasyAR algorithms unstable as well. Usually, we would expect VIO drift to be below 1‰. It is wrong to use EasyAR to reduce VIO errors on purpose.

  • Prepare your display system

    Make sure that when a virtual object with the same size and contour as a certain real-world object is placed at a pose matching the real-world transform from your device to the object, the virtual object can be overlaid on top of the real-world object, and that movement of the device does not break the effect.

    We suggest referencing Vision Pro.

  • Prepare your device SDK

    Make sure you have APIs to provide the data mentioned in the Data requirements section. Normally, it is a good idea to retrieve the two sets of data through one or two APIs, because the data should be generated from two and only two time points in your system, and more APIs may lead to misaligned data by design. A hypothetical API shape is sketched below.
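
A hypothetical shape for such an API, with one call per data set so that each set carries fields generated at a single time point; all names here are illustrative, not a required interface:

public struct DeviceCameraFrame
{
    public double Timestamp;            // exposure mid-time, clock synchronized
    public int TrackingStatus;          // device-defined, includes a lost state
    public UnityEngine.Pose HeadPose;   // head pose at exposure mid-time
    public System.IntPtr ImageData;     // raw image data, ownership documented
    public int ImageDataSize;
}

public struct DeviceRenderFrame
{
    public double Timestamp;            // timing of the pose driving this render
    public int TrackingStatus;
    public UnityEngine.Pose HeadPose;   // the pose used to render the current frame
}

public interface IDeviceTrackingApi
{
    bool TryGetCameraFrame(out DeviceCameraFrame frame); // latest camera frame
    bool TryGetRenderFrame(out DeviceRenderFrame frame); // pose for the current render
}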

Learn EasyAR Sense as an App Developer

It is important to learn EasyAR Sense first, and also EasyAR Sense Unity Plugin if you are using Unity. You must be able to run the samples on Android: there are some basic configurations required by EasyAR, and you will learn to use the EasyAR Sense license key.

It is suggested to run these samples in Unity on your mobile phone first and learn how these features behave. There will be no fundamental differences when they run on your headset,

Notice: make sure to run the Unity samples even if you are working with another 3D engine.

Prepare for Package Development in Unity

You can import EasyAR Sense Unity Plugin (the package com.easyar.sense) using the Package Manager window, installing the plugin from a local tarball file or in any other way Unity allows.

You should extract the HMD Extension template (the package com.easyar.sense.ext.hmdtemplate) to a place where you can develop the package, or create a new package according to Unity's creating custom packages guide.

Notice: if your device SDK is not organized as a Unity package, you should extract the HMD Extension template into the Unity Assets folder and remove package.json and any file with the .asmdef extension in the extracted folder. Please note that in this case, users of both SDKs will have no reasonable way to manage version dependencies.

Get Familiar with HMD Template

The package com.easyar.sense.ext.hmdtemplate is a sample and a template for you. It is an SDK implementation, with samples for your users. You should get familiar with all the files first. Do not rush to make changes before you know what those files are, especially the scripts. You should take full control of the package.

The HMD extension is an SDK: its upstream is EasyAR Sense (or EasyAR Sense Unity Plugin in Unity) and your device SDK, and its downstream is the user's app. To develop an SDK, you need to look at the package not only as an app developer, but also as an SDK provider. So, you need to learn more about EasyAR than normal app developers do.

The package structure follows Unity's recommended package layout,

.
├── CHANGELOG.md
├── Documentation~
├── Editor
├── LICENSE.md
├── package.json
├── Runtime
└── Samples~
    └── Combination_BasedOn_HMD

Make sure to learn package development from Unity documents. We list a few important items below.

  • Runtime: runtime platform-specific assets folder. This is the most important folder in the template; you should mainly change the scripts in it.

  • Samples~: folder to store any samples included in the package. It contains samples for downstream users, and you can also use it as a demo to test the extension. To develop the samples in place, make sure to change the folder name to Samples. The Unity function Client.Pack will rename it to Samples~ automatically when you package a new release.

  • Editor: editor platform-specific assets folder. Scripts in this folder help to create menu items; usually you only need to change a few words to represent your device.

  • package.json: The package manifest, make sure to change it before release.

Dive into EasyAR Sense Unity Plugin: ARSession Work Flow

Please reference Session Validation Tool and Workflow_ARSession sample.

The camera transform is not set when using an HMD.

Please read API documents or source code for ARSession to learn more details.

Dive into EasyAR Sense Unity Plugin: Frame Source

The most important part of writing an HMD extension is writing a new custom camera device (an external frame source in the Unity Plugin).

Frame source is a component designed in EasyAR Sense Unity Plugin which abstracts all camera devices and other image (and pose) providers. You can find CameraDeviceFrameSource, MotionTrackerFrameSource, ARKitFrameSource and ARCoreFrameSource in EasyAR Sense Unity Plugin representing frame sources. Likewise, components like CameraDevice, MotionTrackerCameraDevice, ARKitCameraDevice and ARCoreCameraDevice represent camera devices in EasyAR Sense. Make sure to have a glance at the EasyAR Sense API overview to learn the EasyAR Sense data flow and custom camera.

In Unity, there are a few pre-defined abstract frame sources that you can use as base classes in your HMD extension.

../_images/image_h2_2.png

The above image shows the relationship of these frame sources and a few HMD device frame sources provided by EasyAR.

There are some important details you need to know to write a new frame source. Please read the API docs linked below.

FrameSource:

ExternalFrameSource:

ExternalDeviceFrameSource:

ExternalDeviceMotionFrameSource:

“External” device motion means SLAM is provided by a non-EasyAR component, and the 3D camera transform is already controlled by your device SDK.

ExternalDeviceRotationFrameSource:

“External” device rotation means 3DOF rotation tracking (instead of 6DOF tracking) is provided by a non-EasyAR component, and the 3D camera transform is already controlled by your device SDK.

Please read API documents or source code for each frame source to learn more details.

When you need to define Unity messages like Awake or OnEnable, make sure to check whether your base class has already used them, and make sure to call the base method in your implementation, as sketched below.
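
For example (assuming the base class declares the message as virtual; check the actual base class first):

protected override void OnEnable()
{
    base.OnEnable(); // keep the base frame source behavior intact
    // ... your device specific setup ...
}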

Write HMD Extension for EasyAR Sense Unity Plugin

In this section, we will use the package com.easyar.sense.ext.hmdtemplate for demonstration, but we may not cover every detail of the template. Please make sure to read every detail in the package when writing your HMD extension. There are already explanations of all interfaces and data requirements in the source code.

Note: Details may change across different versions.

Write Frame Source: Choose Base Class

Override IsHMD and set it to true.

public override bool IsHMD { get => true; }

Override Display and set it to a reasonable value.

protected override IDisplay Display => easyar.Display.DefaultHMDDisplay;

Write Frame Source: Availability

All EasyAR features, including devices, define an availability interface to let users know whether a feature is usable on the device in a certain session status at runtime. The availability interface is used during ARSession.Assemble; an unavailable component will not be selected, and its methods will not be called when the ARSession is running.

You need to override the property IsAvailable and the method CheckAvailability. If IsAvailable has a value before session start, CheckAvailability will not be called. CheckAvailability is a coroutine; sometimes you may need to wait for the device to be ready or for a data update before the availability can be determined. Return null if you do not need to wait.

protected override Optional<bool> IsAvailable => throw new NotImplementedException("Please finish this method using your device SDK API");

Sample implementation in NrealFrameSource,

protected override Optional<bool> IsAvailable => Application.platform == RuntimePlatform.Android && SetupCameraRig();

Write Frame Source: Rendering Camera

The rendering camera defined in the frame source is used to display messages in front of your eyes in the ARSession.

You need to override the property Camera. It will be used during ARSession.Assemble. It does not need to be overridden when ExternalDeviceFrameSource.OriginType is DeviceOriginType.XROrigin; in that case, the camera defined by the Unity XR Framework will be used.

protected override Camera Camera
{
    get
    {
        if (OriginType == DeviceOriginType.XROrigin) { return base.Camera; }

        // NOTE: Return the rendering camera. It is used to display messages in front of your eyes.
        //       If OriginType is XROrigin, just remove the following line.
        throw new NotImplementedException("Please finish this method using your device SDK API");
    }
}

Sample implementation in NrealFrameSource,

protected override Camera Camera => SetupCameraRig() ? CameraRigCandidate.centerCamera : null;

Write Frame Source: Session Origin

You will gain more flexibility when an origin is defined. Otherwise, you will lose flexibility, especially in how objects move, and fewer center modes will be supported. App developers have to be careful about where their virtual objects are, because EasyAR objects will always move when using this class. Objects put directly under Unity world coordinates will never show in the right place in any configuration.

You need to override the properties Origin and OriginType to return the origin defined by your device SDK. They will be used during ARSession.Assemble. Origin does not need to be overridden when ExternalDeviceFrameSource.OriginType is not DeviceOriginType.Custom; in that case, XR.CoreUtils.XROrigin will be used automatically.

protected override DeviceOriginType OriginType => throw new NotImplementedException("Please set your origin type");

protected override GameObject Origin
{
    get
    {
        if (OriginType != DeviceOriginType.Custom) { return base.Origin; }

        // NOTE: The Origin is used to setup transform base in SessionOrigin center mode and to transform camera-origin pair together in other center modes.
        //       If OriginType is not Custom, just remove the following line.
        throw new NotImplementedException("Please finish this method using your device SDK API");
    }
}

Write Frame Source: Session Start/Stop

OnSessionStart will be called on each EasyAR component during session assembling. OnSessionStart in the frame source will be called only after ARSession.Assemble finishes component picking and your frame source is selected. OnSessionStart is designed for lazy initialization.

You need to override the method OnSessionStart and do AR-specific initialization work in it. Make sure to call base.OnSessionStart first.

protected override void OnSessionStart(ARSession session)
{
    base.OnSessionStart(session);
    StartCoroutine(InitializeCamera()); // NOTE: Start to do initialization for acquiring camera data, and / or wait for device ready.
}

This is a good place to open device cameras (RGB cameras, VST cameras, etc.) if they are not designed to be always open, and also a good place to retrieve calibration data that will not change during the whole lifetime. Sometimes you may need to wait for the device to be ready or for a data update before that data can be retrieved.

private IEnumerator InitializeCamera()
{
    yield return new WaitUntil(() => false); // NOTE: Wait until device initialized, so don't forget to change this endless waiting sample.

    var cameraModel = (CameraModelType)(-1); // NOTE: Replace with device calibration data. Use CameraModelType.Pinhole if camera model is pinhole.
    var imageSize = new Vec2I(); // NOTE: Replace with device calibration data.
    var cameraParamList = new List<float>(); // NOTE: Replace with device calibration data. When using Mega, CLS v3 or later is required to support non-pinhole.
    var cameraDeviceType = CameraDeviceType.Back; // NOTE: No need to change in most cases.
    var cameraOrientation = 0; // NOTE: Replace with device calibration data. Acceptable value: 0, 90, 180, 270.
    var parameters = CameraParameters.tryCreateWithCustomIntrinsics(imageSize, cameraParamList, cameraModel, cameraDeviceType, cameraOrientation); // NOTE: If online optimize exists, generate in TryInputCameraFrameData instead.
    if (parameters.OnNone)
    {
        throw new InvalidOperationException("Invalid intrinsics in CameraParameters.tryCreateWithCustomIntrinsics");
    }
    cameraParameters = parameters.Value;
    extrinsics = new Extrinsics(new Pose(), Extrinsics.CoordinateSystem.Unity); // NOTE: Replace with device calibration data.

    StartCoroutine(InputDeviceData()); // NOTE: Start to input data into EasyAR.

    throw new NotImplementedException("Please finish this method using your device SDK API");
}

And it is a good place to start the data input loop. You can also write this loop in the Unity script Update or other methods, especially when your data needs to be retrieved at a certain timing within the Unity execution order. Do not input data before the session is ready.

private IEnumerator InputDeviceData()
{
    while (true)
    {
        InputRenderFrameMotionData();
        TryInputCameraFrameData();
        yield return null;
    }
}

You can ignore the startup procedure and do a data check in each update if you wish; it is up to you. A minimal sketch follows.
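
A minimal sketch of that alternative; sessionReady and deviceReady are hypothetical flags you would set in OnSessionStart and your own initialization:

void Update()
{
    if (!sessionReady || !deviceReady) { return; } // do nothing until both sides are up
    InputRenderFrameMotionData();
    TryInputCameraFrameData();
}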

Sample implementation in QiyuFrameSource,

private IEnumerator InitializeCamera()
{
    QiyuARCore.InitAR(500);
    if (ControlSeeThrough)
    {
        QiyuARCore.EnableSeeThrough(true);
    }

    var frameData = new QiyuARPlugin.VstFrameDataNative();
    QiyuARPlugin.QVR_GetVstFrame(ref frameData);
    while (frameData.headTimeStamp <= 0 || frameData.dataLength <= 0)
    {
        QiyuARPlugin.QVR_GetVstFrame(ref frameData);
        yield return null;
    }

    QiyuARPlugin.QVR_GetVstFrame(ref frameData);
    var radialDisortion = new float[4];
    System.Runtime.InteropServices.Marshal.Copy(frameData.radialDisortion, radialDisortion, 0, radialDisortion.Length);

    var cameraParamList = new List<float> { frameData.focal.x, frameData.focal.y, frameData.center.x, frameData.center.y }.Concat(radialDisortion).ToList();
    cameraParameters = CameraParameters.tryCreateWithCustomIntrinsics(new Vec2I((int)frameData.size.x, (int)frameData.size.y), cameraParamList, CameraModelType.OpenCV_Fisheye, CameraDeviceType.Back, 0).Value;

    extrinsics = new Extrinsics(frameData.cameraOffset.ToPose(), Extrinsics.CoordinateSystem.Unity);
}

Write Frame Source: Camera Frame

This is where you send Camera Frame data into EasyAR Sense. Please reference Data requirements for details.

There is no need to call it every frame; the acceptable minimum frame rate is 2. It can be called from any thread if all your APIs are thread safe. The data should match the camera sensor exposure. It is recommended to input color data into EasyAR Sense as long as it can be obtained, which is helpful for the effect of EasyAR Mega. To get better performance, you can design the whole data chain to let raw YUV data pass through directly from shared memory and pass the data pointer directly into EasyAR Sense. Be careful with data ownership.

private void TryInputCameraFrameData()
{
    ...
    if (timestamp == curTimestamp) { return; } // NOTE: Do not waste time sending the same data again. And if possible, do not copy memory or do any time-consuming tasks in your own API getting camera data.
    curTimestamp = timestamp;

    ...
    // NOTE: Make sure dispose is called. There will be serious memory leak otherwise.
    using (buffer)
    using (var image = Image.create(buffer, format, size.x, size.y, pixelSize.x, pixelSize.y))
    {
        HandleCameraFrameData(timestamp, image, cameraParameters, extrinsics, historicalHeadPose, trackingStatus);
    }

    throw new NotImplementedException("Please finish this method using your device SDK API");
}

Important: Don’t forget to dispose EasyAR Sense data after usage. Otherwise, there will be a serious memory leak, or the buffer pool will fail to acquire buffers.

Sample implementation in QiyuFrameSource,

void Update()
{
    if (extrinsics.OnNone) { return; }
    var frameData = new QiyuARPlugin.VstFrameDataNative();
    QiyuARPlugin.QVR_GetVstFrame(ref frameData);

    ...
    OnCameraFrameReceived(frameData);
}

private void OnCameraFrameReceived(QiyuARPlugin.VstFrameDataNative frameData)
{
    if (frameData.cameraTimeStamp == curTimestamp) { return; }

    curTimestamp = frameData.cameraTimeStamp;
    var size = new Vector2Int((int)frameData.size.x, (int)frameData.size.y);
    var pixelSize = size;
    var yLen = pixelSize.x * pixelSize.y;
    var bufferBlockSize = yLen;

    var bufferO = TryAcquireBuffer(bufferBlockSize);
    if (bufferO.OnNone) { return; }

    var buffer = bufferO.Value;
    buffer.tryCopyFrom(frameData.data, 0, 0, bufferBlockSize);

    var pose = frameData.cameraPose.ToPose();
    // NOTE: Qiyu did not give a reasonable tracking status. Request submitted. We are waiting for an update...
    var trackingStatus = MotionTrackingStatus.Tracking;

    using (buffer)
    using (var image = Image.create(buffer, PixelFormat.Gray, size.x, size.y, pixelSize.x, pixelSize.y))
    {
        HandleCameraFrameData(frameData.cameraTimeStamp * 1e-9, image, cameraParameters, extrinsics.Value, pose, trackingStatus);
    }
}

Write Frame Source: Rendering Frame

This is where you send Rendering frame data into EasyAR Sense. Please reference Data requirements for details.

Make sure to call it every frame after the device data is ready; do not skip frames. The data should match the data driving the current Unity rendering camera in the same frame.

private void InputRenderFrameMotionData()
{
    ...
    HandleRenderFrameData(timestamp, headPose, trackingStatus);

    throw new NotImplementedException("Please finish this method using your device SDK API");
}

Sample implementation in QiyuFrameSource,

void Update()
{
    if (extrinsics.OnNone) { return; }
    var frameData = new QiyuARPlugin.VstFrameDataNative();
    QiyuARPlugin.QVR_GetVstFrame(ref frameData);

    if (frameData.headTimeStamp <= 0) { return; }
    // NOTE: Qiyu did not give a reasonable tracking status. Request submitted. We are waiting for an update...
    HandleRenderFrameData(frameData.headTimeStamp * 1e-9, frameData.headPose.ToPose(), MotionTrackingStatus.Tracking);
    ...
}

HMD Extension Bring-up

Prepare the Sample for Bring-up

The sample is located in Samples~/Combination_BasedOn_HMD. To develop the sample in place, make sure to change the folder name from Samples~ to Samples. There is no code in the sample, to keep it simple. All features are configured in the scene.

  1. Add your device support objects to the scene.

You can also do this in the reverse way, using a scene which can run on your device, and add EasyAR components and other objects in the sample scene to your scene.

  2. If there is any origin defined, move “Cube” and “UI” to be children of the origin.

../_images/image_h2_3.png

The cube provides a reference of device SLAM behavior, which will help to determine the cause of unstable tracking.

  3. Set the constraint source of “UI” to your rendering camera, so that the “HUD” button can work as expected.

../_images/image_h2_4.png
  4. Take care of “Canvas” under the “UI” node; make sure raycast works so that all buttons and toggles work as expected.

../_images/image_h2_5.png
  5. If you are not using EasyAR Mega at the moment, make sure to disable the Mega Tracker object under ARSession; there will be errors otherwise.

../_images/image_h2_17.png

To learn sample scene details and how this sample was created, please reference Finish The Package: Sample.

Build and Run the Sample

Make sure to learn how to use the samples from How to Use Samples. For Android-specific configurations, read Android Project Configuration.

You can reference EasyAR HMD Extensions Sample for how to use the sample.

To run image tracking, you need to print namecard.jpg on A4 paper, and make sure the image fits the width of the paper.

When you bring up EasyAR on your device for the first time, make sure to run these features one by one; in particular, do not rush to run Mega, because EasyAR Mega has some tolerance to data errors, which may make them hard to discover when running for a short period or in a single real-world scene.

  • Read the dump message in front of your eyes, make sure there is no surprise, and make sure the frame count is growing.

  • Run Image, the EasyAR Planar Image Tracking feature, compare it with mobile phone.

  • Run Dense, the EasyAR Dense Spatial Map feature, compare it with mobile phone.

Attention

HMD support is implemented as an EasyAR Sense custom camera. You can use EasyAR for 100 seconds per run if a personal edition EasyAR Sense license or the trial version of the EasyAR Mega service is being used. There is no limitation when using a paid license for EasyAR Sense and a paid EasyAR Mega service. The Unity Plugin will actively crash by default after 100 seconds; you can change the “crash” behavior through the SenseError and SessionError options in Message Display and Error Fence.

Problem Break Down during Bring-up

To make EasyAR work on your device, the trickiest and most important work is to ensure data correctness. Over 90% of the problems during EasyAR bring-up on a new device are caused by incorrect data. It is strongly encouraged to validate data on your side without EasyAR. We provide a few empirical methods below to validate your data using EasyAR features; they can help you understand the Data requirements, but they are not the best way to ensure data correctness in such a heavily coupled system.

If Image (tracking and target overlay) and Dense (mesh position, generation speed and quality) both behave exactly the same as, or better than, those features on a mobile phone (iPhone preferred), then most features of EasyAR will work well on your device, and you can try Mega if that is your goal. Please note that EasyAR Dense Spatial Map may not run on some Android devices, and mesh quality may vary across devices.

If you cannot reproduce the same result as when running on a mobile phone, below is a detailed break-down procedure you can reference to find the root cause.

Make sure to always keep an eye on logs from adb. There are known issues in dense spatial map; they may cause error logs about mesh components after a while. This does not impact the displayed mesh quality and will be fixed in later versions.

  1. Your system error

    Remember the SLAM and display requirements described in the preparation section?

    SLAM/VIO error can always make EasyAR algorithms unstable, in different ways. Always keep this in mind.

    Display system errors may lead to virtual objects and real-world objects failing to overlap perfectly. In some cases, with large errors, the virtual objects could look like they float above or below the real-world object and appear to keep drifting. You can observe this phenomenon on Pico 4E even with its own VST alone.

  2. Dump Session Display

    Required healthy feature or data:

    • Frame source Availability

    • Frame source Rendering Camera

    If you cannot see it, try changing the option to Log and read the session status and the frame source name being used.

    Try to print logs in ARSession and find out what happened. You can also try removing all other frame sources under the ARSession node in the scene and see if anything changes.

  3. Camera Frame Count Received

    Required healthy feature or data:

    • Frame source Camera Frame data path in EasyAR Sense Unity Plugin (not including data correctness or data path to EasyAR Sense)

    The value should increase over time; otherwise popup warnings may appear within a few seconds.

    If you do not see the value increasing, you should debug to find out the reason.

  4. Record EIF on Device, Playback on Unity Editor

    Required healthy feature or data:

    • Frame source Camera Frame data path to EasyAR Sense (not including data correctness)

    Click EIF to start recording, and click again to stop. You must stop recording to get a usable EIF file. It is better to create a pure EasyAR scene or use the EasyAR samples when running EIF in the Unity Editor, to avoid an incorrect setup in the scene.

    There are a lot of things you can do using EIF; you can run both EasyAR Planar Image Tracking and EasyAR Dense Spatial Map using EIF in the Unity Editor. But keep in mind that the display result may be different when running on device.

    EasyAR uses distortion coefficients in calculation but does not undistort images. So when you play back EIF files in Unity, you may observe distorted images if you input such data, and this meets expectations.

    Required healthy feature or data:

    • Raw camera image data from Camera Frame data

    • Timestamp from Camera Frame data (not including timing and data synchronization)

    You can see the camera frame playback in the Unity Editor. The image data is not byte-equal; there is lossy encoding and decoding in the whole process. Change the aspect ratio of the Unity game window to fit the data input, otherwise the data will be cropped.

    If the data plays fast or slow, you should check your Timestamp input.

  5. EasyAR Planar Image Tracking using EIF

    Required healthy feature or data:

    • Raw camera image data from Camera Frame data

    • Intrinsics from Camera Frame data (data correctness cannot be fully ensured because there are some tolerances.)

    Run EIF in the ImageTracking_Targets sample in the Unity Editor; you should record the EIF in such a way that the image can be tracked.

    Please note that EasyAR Planar Image Tracking requires the tracked target to occupy a certain percentage of the whole image. If you cannot track the image, try to move your head closer to the image.

    If the tracking keeps failing or the virtual object displays away from the target in the image, it is quite possible that the Intrinsics are wrong.

    If your image data has distortion, you may see the virtual object not overlapping the target on the image; this meets expectations.

  6. EasyAR Planar Image Tracking on device

    Required healthy feature or data:

    • Your display system

    • Raw camera image data from Camera Frame data

    • Intrinsics from Camera Frame data (data correctness cannot be fully ensured because there are some tolerances.)

    • Extrinsics from Camera Frame data

    • Coordinates consistency of Device pose from Camera Frame data and Rendering frame data

    • Time duration of Device pose from Camera Frame data and Rendering frame data

    Please note that EasyAR Planar Image Tracking requires the tracked target to occupy a certain percentage of the whole image. If you cannot track the image, try to move your head closer to the image.

    Please note that EasyAR Planar Image Tracking requires the target scale to be the real-world size. The sample requires you to track an image filling A4 paper in width, so do not open the image on a computer screen unless you use a ruler to adjust the image size to A4.

    If the image can be tracked perfectly using EIF but not on device, fix that first before continuing. Solving problems in later steps is much more difficult.

    If the virtual object floats somewhere away from the real-world image even when you do not move, it is possible that the Intrinsics or Extrinsics are not correct, that the Device pose from Camera Frame data and Rendering frame data are not in the same coordinates, or that your display system is producing the error.

    If the virtual objects keep moving when you move your head and there appears to be a lag, it is quite possible the Device pose is not as healthy as described above. This usually happens when the Device pose is not from the same timing as the Raw camera image data, or the same pose is used in Camera Frame data and Rendering frame data.

  7. EasyAR Dense Spatial Map using EIF or on device

    Required healthy feature or data:

    • Your display system

    • Raw camera image data from Camera Frame data

    • Intrinsics from Camera Frame data (data correctness cannot be fully ensured because there are some tolerances.)

    • Extrinsics from Camera Frame data

    • Device pose from Camera Frame data

    If the mesh generation speed is very slow and/or the reconstructed ground is full of bumps and hollows, it is quite possible that the Device pose is not correct. It is also possible that the pose coordinates are wrong, or the pose timing is wrong.

    It is not that easy to distinguish precise mesh locations, so your display system error may not be observed when using EasyAR Dense Spatial Map.

EasyAR Mega

Preparation

You should read the EasyAR Mega Getting Started Guide before using EasyAR Mega. Then, follow the EasyAR Mega Unity Development Guide to learn Unity development with EasyAR Mega.

Make sure to run Mega on mobile phone first.

Do not forget to enable the Mega Tracker object under ARSession if you disabled it before.

../_images/image_h2_18.png

Problem Break Down during Bring-up

Required healthy feature or data:

  • Your display system

  • All Camera Frame data and rendering frame data

If you have followed this article and brought up both EasyAR Planar Image Tracking and EasyAR Dense Spatial Map successfully, then EasyAR Mega is already supported. If the behavior is obviously worse than running on a mobile phone, please pay close attention to the pose data and timestamps in both Camera Frame data and Rendering frame data, and pay attention to the output of your SLAM/VIO system. The cube under Origin is a reference for that.

If you are testing EasyAR Mega without having tried EasyAR Planar Image Tracking and EasyAR Dense Spatial Map, please be patient and go back to follow the break-down guide step by step.

Package and Distribution

Finish The Package: Editor

Change the “HMD Template” string in the MenuItems class to represent your device. You can also add other scripts if you need other custom editor behaviors.

Finish The Package: Sample

The sample is located in Samples~/Combination_BasedOn_HMD. To develop the sample in place, make sure to change the folder name from Samples~ to Samples. The Unity function Client.Pack will rename it to Samples~ automatically when you package a new release.

The sample in the template is provided for two main purposes: validating your device during bring-up, and providing a reference for downstream users. The sample needs to be finished and adjusted before release to app developers.

First, let’s take a look at how this sample was created before we released the template.

  1. Create EasyAR ARSession

You can reference Start from Zero for how to create EasyAR components in the scene.

Create ARSession,

../_images/image_h2_6.png

Add a few filters into the session. The image below shows how to add an image tracker into the session,

../_images/image_h2_7.png
  2. Create the image target and sparse spatial map to be used in the sample

For example, to create image target, use this menu,

../_images/image_h2_8.png

Setup the image target,

../_images/image_h2_9.png

Please note that the image displayed in the Unity Scene window after setup is a gizmo; the sample uses a quad to show the same image as a virtual object.

Add virtual objects to be displayed on top of the target,

../_images/image_h2_10.png
  3. Add a Cube for SLAM reference

This cube is important for you, us, and downstream users to decouple device SLAM and EasyAR algorithms.

../_images/image_h2_11.png
  4. Add UI for feature selection

../_images/image_h2_12.png
  5. Disable EasyAR features at startup and use UI toggles to enable them

For example, the image tracking feature can be disabled at startup by setting the component's enabled property to false,

../_images/image_h2_13.png

And add handler in the UI toggle,

../_images/image_h2_14.png

You need to finish this sample using your device SDK. You have done this in the HMD Extension Bring-up section, so we do not list the details here. Make sure to add your device support to the scene and take care of the “Cube”, the “UI” and the “Canvas” under “UI”.

Do not forget to enable the Mega Tracker object under ARSession if you disabled it before.

../_images/image_h2_18.png

Don’t forget to clean up some strings in the scene, like “(Move into Origin if there is any, set constraint source to your rendering camera)” or “Cube (Move into Origin if there is any)”; these strings are written for you, not for app developers.

Finish The Package: Package Define

The package is defined in package.json; create a new package or change this file according to Unity's creating custom packages guide. Make sure to change your package “name” and “displayName”, otherwise they will conflict with the template itself and maybe with other extension providers.

Finish The Package: Regenerate Meta

Please regenerate the .meta files for all files in your package, otherwise they will conflict with the template itself and maybe with other extension providers.

Distribution

You may also want to change some other files in the package; make sure to review the whole package before distribution.

Check version compatibility with your device SDK and EasyAR Sense Unity Plugin. Please note that EasyAR does not follow the semantic versioning required by Unity packages. The major difference in general is that a minor version change may introduce breaking changes, but not always. And if you are using a pre-release for Mega, API changes may happen in every version.

We suggest packaging your files as a Unity package, but you can also choose to release an asset package if your device SDK is not package-ready.

You need to remind your users that EasyAR license key limitations (especially limitations on custom camera usage) apply to your package.

Write HMD Extension for EasyAR Sense

To do this, you need a 3D engine to replace Unity. This is not easy, so normally you should use Unity and proceed as described in the sections above. Below are some highlights that should be taken care of when you are trying to bring up EasyAR Sense on such platforms.

We are working closely with XR-Frame on Wechat, Paladin on Alipay and GritWorld. If you are trying to make your device work on these platforms, make sure to contact us first for the latest progress.

Extra Background Knowledge and Preparation

  1. Be a master of the 3D engine you use.

  2. Know how to handle C++/Java/JNI development and other cross-language development, according to the scripting language your 3D engine provides.

  3. Treat EasyAR Sense Unity Plugin as a sample of EasyAR Sense; you need to learn the details using the source code of EasyAR Sense Unity Plugin.

Key Points and Differences with Mobile Phone

  • EasyAR Sense Initialization

  • Get sync data

    • Pay special attention to MegaTracker.setResultAsyncMode and MegaTracker.getSyncResult. Make sure to call setResultAsyncMode(false) after creation and replace the data from OutputFrame with getSyncResult called each rendering frame. You will still need the whole OutputFrame path for other procedures, so do not remove it. Please take the IsSyncResultRequired part of the MegaTrackerFrameFilter.OnResult source code as a reference. The same applies to other components.

  • Special code path for EasyAR Mega

    • Do not connect the Accelerometer output with MegaTracker.accelerometerResultSink (IsHMD controls this in EasyAR Sense Unity Plugin).

    • If one CLS has multiple blocks, organize them using a single parent node and move the blocks together using the parent node.

  • Special code path for EasyAR Planar Image Tracking and most other features

    • Transform the pose according to the motion diff between the data peeked from OutputFrame (camera frame data) and the rendering frame data. Please take the motionWorkaround part of the ImageTrackerFrameFilter.OnResult source code as a reference.

  • 3D camera (IsCameraUnderControl control in EasyAR Sense Unity Plugin)

    • Do not render camera images using data from EasyAR Sense.

    • Do not set 3D camera projection matrix using data from EasyAR Sense.

    • Do not set camera transform using data from EasyAR Sense.

  • Center mode design, origin design, and node/camera control

    • Design center mode similar to ARSession.ARCenterMode if possible.

    • We strongly suggest designing the origin similar to XR.CoreUtils.XROrigin.

    • The best way is to design like the Unity XR Framework using XR.CoreUtils.XROrigin: set the 3D camera's local transform with data from the device, and move the origin using data from EasyAR Sense (see the sketch after this list). If there is no origin design, set the 3D camera's world transform with data from the device, and move the EasyAR Sense target or blocks using data from EasyAR Sense; this is the SessionOrigin center mode in EasyAR Sense Unity Plugin.

    • Never set the 3D camera transform using a target pose calculated from EasyAR Sense; you will not get the correct display position by doing so.

    • You can always reference the EasyAR Sense Unity Plugin source code for pose calculation; you can start from ARSession.OnFrameUpdate.
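
A conceptual sketch of the XROrigin-style design above, using Unity types for notation only (this is not EasyAR API): the camera's world pose is origin * local, so when EasyAR localization reports where the camera should be in world/map coordinates, solve for the origin.

using UnityEngine;

// cameraLocal: camera pose from the device SDK (local to the origin).
// easyarCameraWorld: camera pose in map/world coordinates from EasyAR localization.
// Returns the origin world pose satisfying easyarCameraWorld = origin * cameraLocal.
static Pose SolveOrigin(Pose easyarCameraWorld, Pose cameraLocal)
{
    var invRotation = Quaternion.Inverse(cameraLocal.rotation);
    var inverseLocal = new Pose(invRotation * -cameraLocal.position, invRotation);
    // Compose easyarCameraWorld with the inverse of the local pose.
    return new Pose(
        easyarCameraWorld.position + easyarCameraWorld.rotation * inverseLocal.position,
        easyarCameraWorld.rotation * inverseLocal.rotation);
}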

Extra Problem Break Down Methods

  • Compare with EasyAR Sense Unity Plugin and find the differences

  • For Mega, you can also compare with Mega Toolbox

  • Put boxes or other objects under the Origin to decouple device SLAM and EasyAR algorithms

  • Record EIF and run on Unity Editor (or your 3D editor)

  • Record rendering frame data for analysis (rendering frame data is not included in EIF files)

  • Use similar break down steps described in the HMD Extension Bring-up section in Unity