
Depth matrix accuracy with the iPhone 14 Pro and LiDAR

Hello Community,

I’m currently working with the sample code “CapturingDepthUsingTheLiDARCamera” and using it to capture the depth map of an image taken with the iPhone 14 Pro. From this depth map, I generate a point cloud using the intrinsic camera parameters.
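For reference, here is a minimal sketch of the back-projection I'm doing, assuming a standard pinhole model and intrinsics already scaled from their reference dimensions down to the depth map's resolution (pointCloud(from:intrinsics:) is my own helper name):

```swift
import CoreVideo
import simd

// Back-projects a kCVPixelFormatType_DepthFloat32 map into camera-space
// points with the pinhole model:
//   X = (u - cx) * d / fx,  Y = (v - cy) * d / fy,  Z = d
// Assumes `intrinsics` is already scaled to the depth map's resolution.
func pointCloud(from depthMap: CVPixelBuffer,
                intrinsics: simd_float3x3) -> [SIMD3<Float>] {
    CVPixelBufferLockBaseAddress(depthMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

    let width = CVPixelBufferGetWidth(depthMap)
    let height = CVPixelBufferGetHeight(depthMap)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthMap)
    guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return [] }

    // simd matrices are column-major, so m[column][row].
    let fx = intrinsics[0][0], fy = intrinsics[1][1]
    let cx = intrinsics[2][0], cy = intrinsics[2][1]

    var points: [SIMD3<Float>] = []
    points.reserveCapacity(width * height)

    for v in 0..<height {
        let row = (base + v * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for u in 0..<width {
            let d = row[u]                        // depth in meters
            guard d.isFinite, d > 0 else { continue }
            points.append(SIMD3<Float>((Float(u) - cx) * d / fx,
                                       (Float(v) - cy) * d / fy,
                                       d))
        }
    }
    return points
}
```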

I've noticed that objects not facing the camera directly appear distorted in the resulting point cloud. For example, an object with two surfaces that are perpendicular to each other appears with a sharper angle in the point cloud: around 60° instead of 90°.

My question is: is this due to the general accuracy limitations of the LiDAR sensor, or could it be related to the sample code? To obtain the depth map, I'm using:

AVCapturePhoto.depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
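In case the surrounding setup matters, this is roughly how I extract the depth map and intrinsics in the capture delegate (a sketch with error handling trimmed; it assumes calibration data delivery was enabled on the photo settings, and pointCloud(from:intrinsics:) is the helper sketched above):

```swift
import AVFoundation

func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    // cameraCalibrationData is only present when
    // isCameraCalibrationDataDeliveryEnabled was set on the photo settings.
    guard let depthData = photo.depthData?
            .converting(toDepthDataType: kCVPixelFormatType_DepthFloat32),
          let calibration = depthData.cameraCalibrationData else { return }

    let depthMap = depthData.depthDataMap        // Float32 depth in meters
    var K = calibration.intrinsicMatrix          // given at reference dimensions
    let ref = calibration.intrinsicMatrixReferenceDimensions

    // Scale fx, fy, cx, cy from the reference dimensions to the depth map size.
    let scale = Float(CVPixelBufferGetWidth(depthMap)) / Float(ref.width)
    K[0][0] *= scale; K[1][1] *= scale
    K[2][0] *= scale; K[2][1] *= scale

    let points = pointCloud(from: depthMap, intrinsics: K)
    print("Generated \(points.count) points")
}
```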

Thanks in advance for your help!
