This is related to the WWDC presentation "What's new in Metal rendering for immersive apps", specifically the macOS spatial streaming to visionOS feature (for reference: the page in the docs). The presentation demonstrates it using a full immersive space and Metal rendering with Compositor Services. I'd like clarity on a few things:
- Is the remote connection wireless, or must the visionOS device be connected via a wired connection?
- Is there a limit to the number of remote devices, and if not, could macOS render different content per remote device simultaneously?
- Can I also use mixed mode with passthrough enabled, instead of only the fully immersive mode?
- Can I use RealityKit instead of Metal? If so, could someone provide or point to an example?
Hi KTRosenberg,
I'm an engineer on visionOS, and I can help with your questions.
With macOS spatial rendering:
- You can connect both wirelessly and through a USB connection with the Vision Pro Developer Strap.
- You can only connect to a single Vision Pro at a time.
- Mixed mode is not supported; only the full and progressive immersion styles are supported.
- RealityKit is not supported; you have to do your own rendering using Metal 3 or Metal 4.
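
For anyone landing here later, the setup described above can be sketched roughly as below. This is a minimal sketch, not a verified sample: it assumes the `RemoteImmersiveSpace` scene type and `CompositorLayer` API shown in the session, and `MetalSpace`, `MyConfiguration`, and `MyMetalRenderer` are hypothetical names you would replace with your own.

```swift
import SwiftUI
import CompositorServices

// Hypothetical configuration type; CompositorLayer takes a value
// conforming to CompositorLayerConfiguration.
struct MyConfiguration: CompositorLayerConfiguration {
    func makeConfiguration(capabilities: LayerRenderer.Capabilities,
                           configuration: inout LayerRenderer.Configuration) {
        // Customize pixel formats, foveation, etc. here if needed.
    }
}

@main
struct SpatialStreamingApp: App {
    var body: some Scene {
        // Declared in the macOS app; content is streamed to the
        // connected Vision Pro (wireless or Developer Strap).
        RemoteImmersiveSpace(id: "MetalSpace") {
            CompositorLayer(configuration: MyConfiguration()) { layerRenderer in
                // Drive your own Metal render loop from the layer renderer.
                // MyMetalRenderer is a placeholder for your renderer type.
                let renderer = MyMetalRenderer(layerRenderer)
                renderer.startRenderLoop()
            }
        }
        // Per the answer above, only full and progressive immersion
        // styles are supported; mixed (passthrough) is not.
        .immersionStyle(selection: .constant(.full), in: .full, .progressive)
    }
}
```

The render loop itself is unchanged from an on-device Compositor Services app: query frames and drawables from the layer renderer and encode your Metal commands against them.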
Hope that helps!
Ricardo