EasyAR Surface Tracking

Overview

Surface Tracking provides cross-platform tracking of environmental surfaces. It can be used in AR games, AR short-video shooting, and AR product display.

Surface Tracking implements simplified six degree-of-freedom tracking, which tracks the relative pose between the device and a reference feature point. Compared to EasyAR Motion Tracking, EasyAR Surface Tracking supports more devices and requires no initialization.

The world and camera coordinate systems used in Surface Tracking follow the right-handed convention: the y-axis points upward, the z-axis points toward the viewer, and the x-axis points toward the viewer's right.
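When bringing Surface Tracking poses into an engine that uses a left-handed convention (for example Unity, where the z-axis points away from the viewer), the handedness must be flipped. A minimal sketch of the point conversion, as a generic illustration rather than part of the EasyAR API:

```python
# Convert a point from EasyAR's right-handed convention
# (y up, z toward the viewer, x to the viewer's right) to a
# left-handed convention where z points away from the viewer.
# Generic illustration; not an EasyAR API call.

def right_to_left_handed(point):
    """Negate z to flip handedness."""
    x, y, z = point
    return (x, y, -z)

print(right_to_left_handed((1.0, 2.0, 3.0)))  # (1.0, 2.0, -3.0)
```

Rotations need the corresponding basis change as well; negating z alone is only valid for positions and directions.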

How Surface Tracking Works

To create the correspondence between real and virtual spaces, Surface Tracking uses both camera and IMU information. It recognizes significant features in the camera image and tracks differences in the positions of those features across consecutive frames. The virtual object is always placed near a selected feature point. On start-up, the virtual object is placed on the surface of feature points near the middle of the screen. As the device moves, the positions of the features are continuously updated and the virtual object is fit to the surface of the feature points. The result is a six degree-of-freedom pose between the device and the feature point. When the feature point being tracked becomes occluded, another feature point is automatically selected as the new reference point. Note that this may cause the tracked pose to drift.
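The selection and re-selection behavior described above can be sketched as a toy loop. All names and data structures here are hypothetical stand-ins that only illustrate the logic, not the EasyAR implementation:

```python
import math

def pick_reference(features, screen_center):
    """Choose the feature point closest to the screen center
    (hypothetical stand-in for the tracker's internal selection)."""
    return min(features, key=lambda f: math.dist(f["screen_xy"], screen_center))

def update_reference(features, reference, screen_center):
    """Keep the current reference while it is still tracked; when it
    becomes occluded, fall back to a newly selected feature point.
    Switching reference points is what can make the pose drift."""
    visible = [f for f in features if not f["occluded"]]
    if reference in visible:
        return reference
    return pick_reference(visible, screen_center)

features = [
    {"screen_xy": (0.51, 0.48), "occluded": False},
    {"screen_xy": (0.10, 0.90), "occluded": False},
]
ref = pick_reference(features, (0.5, 0.5))            # near screen center
features[0]["occluded"] = True                        # reference occluded
ref = update_reference(features, ref, (0.5, 0.5))     # new reference chosen
print(ref["screen_xy"])  # (0.1, 0.9)
```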

API Reference

Best Practices

  • Only one virtual object is supported. The bottom of the virtual object should be placed at the origin of the world coordinate system.

  • The device should have a camera, a gyroscope, and an accelerometer.

  • CPU computing power should meet or exceed that of a Snapdragon 410.
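The first point above, placing the bottom of the virtual object at the world origin, amounts to translating the model so that the lowest point of its bounding box sits at y = 0. A minimal sketch with made-up vertex data (not from EasyAR):

```python
def place_bottom_at_origin(vertices):
    """Translate a mesh so the bottom of its bounding box rests at
    y = 0, matching the placement Surface Tracking expects.
    Vertex data is illustrative only."""
    min_y = min(v[1] for v in vertices)
    return [(x, y - min_y, z) for (x, y, z) in vertices]

cube = [(-1, 2, -1), (1, 2, -1), (-1, 4, 1), (1, 4, 1)]
print(place_bottom_at_origin(cube))
# [(-1, 0, -1), (1, 0, -1), (-1, 2, 1), (1, 2, 1)]
```

Depending on the model, you may also want to center it in x and z so it sits directly over the tracked point.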