EasyAR Motion Fusion¶
Motion Tracking¶
Before reading the following sections about motion fusion, please read Motion Tracking and EasyAR Features to understand the relationship between EasyAR and motion tracking.
What is EasyAR Motion Fusion¶
EasyAR motion fusion is a technique that optimizes image and object tracking when a motion tracking device is present.
If a device (a mobile phone, eyewear, or headset) has VIO capability through any hardware or software solution, we call it a motion tracking device. Such devices include, but are not limited to, devices that can run EasyAR Motion Tracking, ARCore, ARKit, or Huawei AR Engine, and VIO-capable eyewear like Vision Pro.
EasyAR motion fusion makes image and object tracking stable and jitter-free, and targets can still be tracked even when the image or object moves out of the camera's view.
How to Use EasyAR Motion Fusion¶
A scene with motion fusion differs from one without it in both the EasyAR AR Session and the target.
Create AR Session suitable for Motion Fusion¶
To run on a mobile phone, you can create a motion fusion AR Session using the presets whose names contain Motion Fusion under Image Tracking and Object Tracking.
To run on a mobile phone, you can also add the Motion Tracking Frame Source Group of Builtin Frame Source to an existing AR Session. To use motion fusion, you need a FrameSource that represents a motion tracking device, so you will usually need different frame sources on different devices. Here we select AR Session (EasyAR) and create a group of frame sources via EasyAR Sense > Motion Tracking > Frame Source : *; the frame source used by the session is selected at runtime. You can add different frame sources to the session according to your needs.
If you want tracking to fall back to normal tracking when a motion tracking device is not available, you can simply add a new CameraDeviceFrameSource to the end of the frame source group using EasyAR Sense > Image Tracking > Frame Source : Camera Device (Object Sensing).
To run on headsets, there is no need to make special changes to the AR Session, because frame sources of headsets usually support motion tracking already and motion fusion can be used directly.
Make sure to turn on motion fusion for the tracker¶
For example, when ImageTrackerFrameFilter is being used, Enable Motion Fusion needs to be set to true on the ImageTrackerFrameFilter.
This option can be turned on or off at any time at runtime, and the tracker behaviour changes immediately.
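As a minimal sketch of toggling the option from script, assuming a scene already set up for motion fusion and a reference to the session's ImageTrackerFrameFilter (the component and property names follow the EasyAR Sense Unity Plugin; MotionFusionToggle itself is a hypothetical helper):

```csharp
using easyar;
using UnityEngine;

// Hypothetical helper component: switches motion fusion on or off
// for the ImageTrackerFrameFilter referenced in the Inspector.
public class MotionFusionToggle : MonoBehaviour
{
    public ImageTrackerFrameFilter Tracker; // assign in the Inspector

    // Can be called at any time at runtime; the tracker
    // behaviour changes immediately.
    public void SetMotionFusion(bool enable)
    {
        if (Tracker != null)
        {
            Tracker.EnableMotionFusion = enable;
        }
    }
}
```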
Target Limitation¶
There are two points to note when using motion fusion:
The target scale (ImageTarget.scale and ObjectTarget.scale, or the controller inputs ImageTargetController.ImageFileSourceData.Scale and ObjectTargetController.ObjFileSourceData.Scale) must match the object's scale in the real world.
The target image or object cannot move in real world.
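For example, the scale can be set in script before the target is loaded. This is a sketch only: it assumes a 15 cm wide printed image, and the exact accessor for the file source data (written here as ImageFileSource, of type ImageFileSourceData) may differ between plugin versions.

```csharp
using easyar;
using UnityEngine;

// Hypothetical setup component: makes the target scale match the
// real-world size of the printed image, as motion fusion requires.
public class TargetScaleSetup : MonoBehaviour
{
    public ImageTargetController TargetController; // assign in the Inspector

    void Awake()
    {
        // Scale is the real-world width of the image in meters;
        // 0.15f here assumes a 15 cm wide printed target.
        TargetController.ImageFileSource.Scale = 0.15f;
    }
}
```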
See the ImageTracking_MotionFusion sample for more details.