Can iOS and visionOS Devices Share the Same Spatial World in a Multiuser AR Session?

Hey everyone,

I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.

I've looked into ARKit's Shared ARWorldMap and MultipeerConnectivity, but I'm not sure if this extends seamlessly to visionOS or if Apple has an official way to sync spatial data between visionOS and iOS devices.

Has anyone tried sharing a spatial world between visionOS and iOS? Are there any built-in frameworks that allow for a shared multiuser AR session across these devices? If not, what would be the best way to sync object positions between them? Would love to hear if anyone has insights or experience with this! 🚀

Thanks!

Answered by Vision Pro Engineer in 827586022

Hi @bbthegreat

While there's no built-in abstraction for automatically synchronizing content location across iPhone and Apple Vision Pro, it's achievable. Here's how:

  • Establish a common origin by using an image as a shared reference point. Both visionOS and iOS apps can locate this image using image anchors: ImageAnchor in visionOS and ARImageAnchor in iOS. This allows content to be placed relative to the detected image in a consistent, shared spatial context.
  • Synchronize the content's position across devices using either MultipeerConnectivity or the Network framework, both of which are compatible with visionOS and iOS.
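
The steps above can be sketched with a small wire format: each object's pose is expressed relative to the shared image anchor, encoded, and sent to peers (for example from an MCSessionDelegate callback). This is a minimal sketch; the type and property names are mine, not part of any Apple framework.

```swift
import Foundation
import simd

// Hypothetical wire format: an object's pose expressed relative to the
// shared image anchor, so every peer can reconstruct it locally.
struct ObjectPoseMessage: Codable {
    let objectID: UUID
    // Column-major 4x4 transform relative to the image anchor,
    // flattened to 16 floats for JSON encoding.
    let transform: [Float]

    init(objectID: UUID, transform: simd_float4x4) {
        self.objectID = objectID
        self.transform = (0..<4).flatMap { col in
            (0..<4).map { row in transform[col][row] }
        }
    }

    // Rebuild the simd matrix on the receiving side.
    var matrix: simd_float4x4 {
        var m = simd_float4x4()
        for col in 0..<4 {
            for row in 0..<4 {
                m[col][row] = transform[col * 4 + row]
            }
        }
        return m
    }
}

// Encode on the sender; decode in the receiver's data callback
// (e.g. MCSessionDelegate's session(_:didReceive:fromPeer:)).
let message = ObjectPoseMessage(objectID: UUID(), transform: matrix_identity_float4x4)
let data = try! JSONEncoder().encode(message)
let decoded = try! JSONDecoder().decode(ObjectPoseMessage.self, from: data)
```

JSON keeps the payload debuggable; for frequent pose updates you'd likely switch to a binary encoding and MCSession's unreliable send mode.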

@bbthegreat

No worries.

When I tried this, I made the image anchor the root node in the scene. Players scanned the anchor image before the game started to get things lined up. After that, I turned off anchor detection. Drift could be a problem, but I had decent results. Keeping anchor detection on might be worth a shot, especially if everyone can always see the anchor. But I'm not sure it'll be any better, since the anchor might seem to move for some players and not others (for example, if it's too far away or updates arrive at different times).
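
The "anchor as root" approach boils down to a pair of matrix conversions: express a pose relative to the locally detected anchor before sending, and re-apply it under the receiving peer's own anchor. A minimal sketch (the helper names are illustrative, not ARKit API):

```swift
import simd

// Express a world-space transform relative to the shared image anchor,
// so the same relative pose means the same spot for every peer.
func poseRelative(to anchor: simd_float4x4, world: simd_float4x4) -> simd_float4x4 {
    anchor.inverse * world
}

// On a receiving peer, place the object back into that peer's world
// space using its locally detected anchor transform.
func worldPose(from relative: simd_float4x4, anchor: simd_float4x4) -> simd_float4x4 {
    anchor * relative
}
```

Because each device uses its own anchor transform on the way back out, small differences in where devices detected the image show up as exactly the kind of per-player drift described above.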
