I need to troubleshoot Transform Drift in ARKit

Hi all,

I'm currently developing a real-time object reconstruction app with ARKit. The goal is to scan large objects using the per-frame depth and camera transform data, and fuse the frames into a single point cloud.
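
For context, here's a simplified version of how I unproject each depth pixel into world space. This is just a sketch: the function name is mine, I assume the intrinsics have already been rescaled to the depth map's resolution, and the Y/Z flip follows the convention used in Apple's point-cloud sample code.

```swift
import simd

// Sketch of my per-pixel unprojection. Assumes `depth` is in metres at
// depth-map pixel (x, y), `intrinsicsInverse` is the inverse of
// camera.intrinsics rescaled to the depth map's resolution, and
// `cameraToWorld` is frame.camera.transform.
func unprojectToWorld(x: Float, y: Float, depth: Float,
                      intrinsicsInverse: simd_float3x3,
                      cameraToWorld: simd_float4x4) -> simd_float3 {
    // Back-project the pixel through the inverse intrinsics.
    let local = depth * (intrinsicsInverse * simd_float3(x, y, 1))
    // Flip Y and Z to move from the image-aligned frame into ARKit's
    // camera space (+x right, +y up, +z toward the viewer).
    let camPoint = simd_float4(local.x, -local.y, -local.z, 1)
    // camera.transform maps camera space into world space.
    let world = cameraToWorld * camPoint
    return simd_float3(world.x, world.y, world.z)
}
```

Every world point therefore inherits whatever error is in camera.transform for that frame.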

However, I'm facing a major challenge: transform drift / world alignment issues.

- The localToWorld transform provided by ARKit frequently drifts or becomes unstable across frames.
- This results in misaligned point clouds even when the device is moved slowly or held relatively still.
- In some cases, a static surface scanned over a few seconds produces clearly misaligned fragments.

This makes it difficult to accurately stitch a multi-frame point cloud. I have experimented with various lighting conditions and object textures, but the issue persists in all cases.

At times the relative error between frames reaches up to 20 cm; in other cases the per-frame error is small, but the drift accumulates over time and the reconstructed object ends up enlarged overall. I have attached images of both cases here.
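
To quantify the jumps, I log the frame-to-frame camera translation: when the device is nearly still, a large delta between consecutive frames points to a tracking discontinuity rather than real motion. A minimal sketch (the threshold value is just a guess on my part):

```swift
import ARKit
import simd

// Flags suspiciously large camera jumps between consecutive frames.
final class DriftMonitor {
    private var lastTransform: simd_float4x4?
    // Metres per frame; purely a guess, tune to your scan speed.
    var jumpThreshold: Float = 0.02

    // Returns true if the camera translation since the last frame
    // exceeds the threshold.
    func didJump(in frame: ARFrame) -> Bool {
        let current = frame.camera.transform
        defer { lastTransform = current }
        guard let last = lastTransform else { return false }
        let a = last.columns.3, b = current.columns.3
        let delta = simd_length(simd_float3(b.x - a.x, b.y - a.y, b.z - a.z))
        return delta > jumpThreshold
    }
}
```

This catches the large jumps, but not the slow accumulation that enlarges the model.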

Questions:

1. Are there specific conditions under which ARKit's world transform is expected to drift?
2. Is there a way to detect or recover from this drift at runtime?
3. Any best practices for maintaining consistent tracking during scanning or measurement sessions?
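
For reference, this is roughly how I watch tracking quality today. Resetting the session does recover alignment, but it throws away the partial scan, which is why I'm asking whether there's a better pattern (sketch; delegate wiring omitted):

```swift
import ARKit

final class ScanSessionDelegate: NSObject, ARSessionDelegate {
    // The .limited states (excessiveMotion, insufficientFeatures, ...)
    // are where I see the worst drift; I pause point accumulation there.
    func session(_ session: ARSession,
                 cameraDidChangeTrackingState camera: ARCamera) {
        switch camera.trackingState {
        case .normal:
            print("tracking normal")
        case .limited(let reason):
            print("tracking limited: \(reason)")
        case .notAvailable:
            print("tracking not available")
        }
    }

    // Heavy-handed recovery: restart world tracking from scratch.
    // This fixes alignment but discards everything scanned so far.
    func resetTracking(on session: ARSession) {
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .sceneDepth
        session.run(config, options: [.resetTracking, .removeExistingAnchors])
    }
}
```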
