Hey everyone,
I'm working on an object viewer where users can place objects in a real room using AR, and I want both visionOS (Apple Vision Pro) and iOS devices (iPad, iPhone) to participate in the same shared spatial experience. The idea is that a user with a Vision Pro can place an object, and peers using iPhones/iPads can see the same object in the same position in their AR view.
I've looked into ARKit's shared ARWorldMap and MultipeerConnectivity, but I'm not sure whether that approach extends to visionOS, or whether Apple has an official way to sync spatial data between visionOS and iOS devices.
Has anyone tried sharing a spatial world between visionOS and iOS? Are there any built-in frameworks that allow for a shared multiuser AR session across these devices? If not, what would be the best way to sync object positions between them? Would love to hear if anyone has insights or experience with this! 🚀
Thanks!
Hi @bbthegreat
While there's no built-in abstraction for automatically synchronizing content location across iPhone and Apple Vision Pro, it's achievable. Here's how:
- Establish a common origin by using a physical image as a shared reference point. Both the visionOS and iOS apps can detect this image via image anchors: on visionOS, an ImageTrackingProvider delivers ImageAnchor updates; on iOS, ARKit's image detection delivers ARImageAnchor. Content placed relative to the detected image then lives in a consistent, shared spatial frame across devices.
- Synchronize the content's position across devices using either MultipeerConnectivity or the Network framework; both are available on visionOS and iOS.
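To make the two steps concrete, here's a minimal sketch of the math and payload involved. The type and function names (`SharedObjectPose`, `poseRelativeToImage`, `broadcast`) are hypothetical, not part of any Apple API; the idea is simply to express each object's transform relative to the shared image anchor before sending, and re-anchor it to the locally detected image on arrival:

```swift
import simd
import MultipeerConnectivity

// Hypothetical payload: an object's pose expressed relative to the shared
// reference image, so the same data is meaningful on every device.
struct SharedObjectPose: Codable {
    var objectID: UUID
    // 4x4 transform flattened to 16 floats (column-major),
    // relative to the image anchor.
    var relativeTransform: [Float]

    init(objectID: UUID, relativeTransform matrix: simd_float4x4) {
        self.objectID = objectID
        self.relativeTransform = (0..<4).flatMap { column in
            (0..<4).map { row in matrix[column][row] }
        }
    }

    var matrix: simd_float4x4 {
        var m = matrix_identity_float4x4
        for column in 0..<4 {
            for row in 0..<4 {
                m[column][row] = relativeTransform[column * 4 + row]
            }
        }
        return m
    }
}

// Sender side: convert the object's world-space transform into
// image-anchor space (inverse of the anchor's world transform).
func poseRelativeToImage(objectWorldTransform: simd_float4x4,
                         imageAnchorWorldTransform: simd_float4x4) -> simd_float4x4 {
    imageAnchorWorldTransform.inverse * objectWorldTransform
}

// Receiver side: re-anchor the relative pose to the locally detected image.
func worldTransform(fromRelative relative: simd_float4x4,
                    localImageAnchorTransform: simd_float4x4) -> simd_float4x4 {
    localImageAnchorTransform * relative
}

// Broadcasting a pose update over an established MCSession.
func broadcast(_ pose: SharedObjectPose, over session: MCSession) throws {
    let data = try JSONEncoder().encode(pose)
    try session.send(data, toPeers: session.connectedPeers, with: .reliable)
}
```

On visionOS you would read the anchor transform from the `ImageAnchor` delivered by an `ImageTrackingProvider`; on iOS, from `ARImageAnchor.transform`. The rest of the pipeline (peer discovery, decoding, applying the transform to an entity) follows standard MultipeerConnectivity and RealityKit patterns.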