Streaming is available in most browsers and in the Developer app.
-
Set the scene with SwiftUI in visionOS
Explore exciting new APIs to create better windows, volumes, and immersive spaces for your visionOS app. Fine-tune how scenes behave when they are relaunched or locked in place. Adapt volumes to their surroundings with clipping margins and snapping. Stream immersive content from a Mac to Apple Vision Pro. And take existing UIKit-based apps to the next level with volumes and immersive spaces.
Chapters
Resources
- Adopting best practices for persistent UI
- Canyon Crosser: Building a volumetric hike-planning app
- Petite Asteroids: Building a volumetric visionOS game
- Tracking accessories in volumetric windows
Related Videos
WWDC25
-
Hi, I’m Miguel, an engineer on the SwiftUI team. In this video, we’ll cover some of the incredible new capabilities added to scenes in visionOS 26. And perhaps we’ll learn how to stage some of our own.
visionOS has three scene types: windows, volumes, and immersive spaces. Apps can combine these together to create unique and exciting experiences. Today, we’ll see new APIs that apply to all three types, and some that are more focused on volumes and immersive spaces. My friends Maks, Amanda, and Trevor are working on some exciting improvements to BOTanist, a game about helping a robot grow beautiful plants in a floating garden.
As for me, I’ve been working on my own app, building scenes and decor to help our robot friends bring Shakespeare’s characters to life and take on the world of acting.
When opening my app, I’m greeted with a stage selection screen. Let's create a new one.
Now I can set the scene by moving the BOTanist all around the stage.
With this, I can have a robot friend reenact my favorite plays like Robo and Juliet.
Let’s see how I can improve my app with the new scene APIs of visionOS 26.
First, we’ll take a look at new lifecycle APIs to define behaviors when launching and locking windows to rooms. Second, I’ll introduce new volumetric enhancements that adapt to people's surroundings. Next, I’ll add a RemoteImmersiveSpace to preview my scene on Apple Vision Pro from a macOS app.
Then, for existing UIKit apps, we’ll wrap up by adding these amazing volumetric and immersive experiences with the new scene bridging APIs.
So let’s get started with launching and locking.
visionOS 26 brings over some macOS lifecycle APIs that will come in quite handy. We’ll cover APIs to manage scene restoration and app launch as well as an API to create unique windows.
In visionOS 26, people can now persist windows, volumes, and even the new widgets by locking them to particular rooms in their physical surroundings. This helps virtual content feel more present in their space. These locked windows are tied to the room they were used in. Come back to that room at a later time, and the windows spring back to life. Scene restoration with locking is awesome. I can keep all my windows around and get back to them whenever I want.
In general, people will expect to be able to lock all their windows in place, and for the system to restore them. So, prefer restoration for most scenes. The system will do this for you automatically.
However, for some windows, it might not make sense to persist them this way.
Consider disabling scene restoration for transient elements like a welcome screen, context-dependent UI like a tools window tied to a specific app state, or completed one-time actions like a login prompt.
I added an immersive mode to my app so people can fully appreciate our favorite robot in action. The toolbar detaches into a separate tools window in front of them, giving easy access to the stage controls while immersed.
Note that immersive spaces are not restored. So, when coming back to this room, the immersive space will not be brought back. However, if someone had locked the tools window in their space, it would show up all alone with nothing to modify.
I can avoid this unexpected state by adding the restorationBehavior(.disabled) modifier to my WindowGroup to explicitly opt the tools window out of restoration and locking in place.
Now, when coming back to this room at a later time, the extra window does not come back. Launching the app shows the first window for a fresh start.
In UIKit, you can disable restoration for your UI scenes using the new destructionConditions API with a .systemDisconnection property. Check out the documentation to learn more.
In some cases, your app might benefit from dynamically changing which scene to start with.
For example, I want to add a welcome window on the app’s first launch to greet people before the stage selection.
To customize which window to display on launch based on my app state, I can use the defaultLaunchBehavior modifier.
Here, I’ll prioritize the welcome window by marking it as presented if it's the first time launching the app.
I can toggle off this value once that window appears, as I won’t need to show the window anymore.
Note that the role of the chosen launch window must match the preferred default scene session role property in the Application Scene Manifest of your Info.plist.
This means if I set my default scene session role to Window, only regular window scenes will be considered by the system during app launch.
In that case, volumes will be ignored, even if I try to prioritize them with defaultLaunchBehavior. So be sure to match your desired scene with that session role.
The defaultLaunchBehavior modifier has one additional trick up its sleeve that may be useful for us. I talked about how I don’t want the tools panel to return when I come back to a room, and how I could use restorationBehavior to fix that. I've got a similar problem here. Currently, if I dismiss the immersive space by pressing the crown and close the Tools window, I’ll find this window coming back when launching the app again.
This leaves us in that same unexpected state that restoration would have. Instead, I want to resume the app from a safe state by starting with a stage creation window.
I can do this by adding the defaultLaunchBehavior(.suppressed) modifier to my tools window. This tells the system not to bring this window back when relaunching the app from the Home view.
In general, you should prefer the .suppressed defaultLaunchBehavior on secondary scenes to avoid getting people stuck in an unexpected state.
In UIKit, you can achieve the same behavior by adding the userInitiatedDismissal option to your UIScene's destructionConditions.
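As a rough sketch, assuming the same window scene as the UIKit restoration example later on this page, the two conditions might be combined like this (the pairing is my own illustration):

// Sketch: opt a secondary UIKit scene out of restoration and out of
// relaunching after the person closed it themselves.
windowScene.destructionConditions = [
    .systemDisconnection,     // don't restore the scene or lock it in place
    .userInitiatedDismissal   // don't bring it back at launch after manual dismissal
]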
visionOS 26 also adds support for making unique windows. These are windows that cannot be duplicated. Only one unique instance can exist at a time.
Just like on the Mac, you declare them using the Window API instead of WindowGroup.
Use these to prevent duplication of important interfaces, like a game window or a video call.
Or use them to provide supplemental functionality that doesn’t require more than one instance.
I don’t need more than one instance of the welcome window. So I’ll replace it with a unique Window.
However, I’ll keep the main stage volume as a WindowGroup to allow creating multiple stages simultaneously.
Amazing. My app's lifecycle is better than ever. We customized which windows should show up when locking in place and during app launch. And we made sure to keep our windows unique when it made sense.
There’s a slew of new enhancements to volumes in visionOS 26 that can help take your volumes to even greater depths.
I’ll talk about the new surface snapping feature, advancements to presentations, and the Clipping Margins API.
Let's dive right in.
New in visionOS 26, people can snap windows and volumes to their physical environment. They can do so by gently moving the window close to the surface. For restorable windows, this is what locks them in place for persistence. People can snap the back of windows to vertical surfaces like a wall. They can also snap the bottom of volumes to horizontal surfaces, like the floor or a table. And widgets, which are new in visionOS 26, can be snapped to either kind.
Learn more about adding widgets to your visionOS app with “What’s new in widgets.” For windows and volumes, you can even get information about the snapping event.
With my app, I can snap the volume to a table and have it anchored horizontally. This is a start. However, I’d like to make the stage feel more present in my space. I want to have the robot stand directly on the table when the volume is snapped to it.
We can get some information about the snapping state using the new SurfaceSnappingInfo API.
The API gives us a simple isSnapped property to determine the general snapping status of our window.
For more advanced use cases, we can get the ARKit classification of the snapped surface. Note that this level of detail requires user permission. I'll show you how.
To enable detailed snapped-surface information, I first need to set the 'Application Wants Detailed Surface Info' key to YES, as well as the 'Privacy - World Sensing Usage Description' key with a description to display when asking for permission. Both keys go in the app's Info.plist.
Once that’s done, I can jump into the code. Here, I get the surfaceSnappingInfo from the environment.
In the onChange, I check if the scene is currently snapped. And I check if I’m authorized to access the classification of the snapping surface. Checking the authorizationStatus will automatically ask the person for permission if needed.
Now, when snapped to a table, I want to hide the platform under the stage. I’m using a state variable to keep track of this.
With these changes, I can snap my volume to a table and the robot can act its heart out in my own environment. Awesome! I’ve also made it so that walking around the table hides the walls that are in my way, so I can always see into my volume.
I did this by reacting to changes in point of view of the scene using the onVolumeViewpointChange modifier. Check out how Owen added it to BOTanist in “Dive deep into volumes and immersive spaces” from WWDC24 to learn more.
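A minimal sketch of that reaction, assuming a hypothetical updateWallVisibility(for:) helper (the closure parameters follow my understanding of the modifier from that session):

RealityView { content in
    // ... stage content ...
}
.onVolumeViewpointChange { _, newViewpoint in
    // Hypothetical helper: hide the stage walls facing the viewer.
    updateWallVisibility(for: newViewpoint)
}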
I also want people to be able to place new props all around the stage. I can add a popover in my volume's toolbar with different props to add. Awesome! I can finally recreate The Tragedy of King Gear.
Previously, presentations were only supported in windows. With visionOS 2.4, support for nested presentations was added, allowing for things like popovers presented from sheets or context menus presented from ornaments.
Now, with visionOS 26, presentations gained a whole new set of sources. You are free to present from within volumes, ornaments of volumes, attachments to RealityViews, or directly in RealityKit using the PresentationComponent.
To learn more about using presentations in RealityKit with attachments and the PresentationComponent, see “Better Together: SwiftUI and RealityKit.” This isn't limited to a small subset: all presentation types are available, including menus, tooltips, popovers, sheets, alerts, and confirmation dialogs. Check out the documentation to learn how to create these presentations with SwiftUI.
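For example, the prop picker mentioned above could be a popover presented from the volume's toolbar, roughly like this minimal sketch (PropPickerView and the surrounding StageView structure are my own placeholders):

struct StageView: View {
    @State private var showsPropPicker = false

    var body: some View {
        RealityView { content in
            // ... stage content ...
        }
        .toolbar {
            ToolbarItem {
                Button("Add Prop") {
                    showsPropPicker = true
                }
                .popover(isPresented: $showsPropPicker) {
                    PropPickerView() // hypothetical list of props to add
                }
            }
        }
    }
}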
These presentations all have special visual treatments to keep them visible when occluded by 3D content.
By default, the presentation will subtly blend in with the occluding content.
However, this can be customized to prominently break through or hide behind the occluding content. Use the subtle, prominent, or none options to customize this.
These options can be applied to presentations using the presentationBreakthroughEffect modifier. For elements other than presentations, you can achieve the same effect with a breakthroughEffect modifier.
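A small sketch of how that might look on the prop picker popover above; attaching the modifier to the presented content follows the pattern of other presentation modifiers, and the .prominent choice is just for illustration:

.popover(isPresented: $showsPropPicker) {
    PropPickerView() // hypothetical list of props to add
        .presentationBreakthroughEffect(.prominent) // stay visible over occluding 3D content
}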
With presentations, I can now add custom UI anywhere I want. Let's try to add some more.
I added another popover menu to change the stage decor. With this, I can transport our robot friend away from the old theater and onto a tropical island. Perfect for The Tempest! This set has a lot of potential. However, I think it could use some more pizzazz. How about a waterfall? Oh, and some thunderous clouds. Still, I want to make sure that these don’t crowd the center of the action. I can use the new Clipping Margins for this.
With Volumes, the new preferredWindowClippingMargins API lets you render content outside of your scene bounds.
This content is not interactive. Thus, you should use it for visual flourish only.
Note that these bounds may not be granted by the system. To account for this, read the granted margins with the windowClippingMargins environment variable. Let's see it in action. I can specify my desired clipping margins with the preferredWindowClippingMargins API. Here, I want margins at the bottom. I’m making sure to convert my maxWaterfallHeight, which is in meters, into points by multiplying with the pointsPerMeter factor I got from PhysicalMetric.
I then read the granted margins with the windowClippingMargins environment variable. With this, I can scale my waterfall to render within the margins.
I’m taking the minimum of the margins and the waterfall height to be sure we always render the entire waterfall model, regardless of what was granted.
And there we go. That looks much better. The clouds add a nice stormy ambiance, and the waterfall renders below the base plate without shifting the content up, keeping our focus on the island and its robot. I hope the BOTanist is freshly oiled.
And with that, our latest theatrical production feels more real than ever. With surface snapping and clipping margins, the content adapts to our physical space. And with presentations, I can create powerful interfaces to craft the perfect scene.
Now, let’s see what I can do to improve the immersive experience of my app.
Immersive spaces bring your spatial experiences all around you. And visionOS 26 brings in some great new ways to do even more with immersive spaces. I'll introduce a world recentering event, new capabilities with immersion styles, remote immersive space on macOS, and advances to Compositor-based immersive spaces for rendering with Metal. When navigating in their space, people can long press the digital crown to recenter the app’s experience around them. If your app uses ARKit data, this can invalidate positions you might have stored for later use.
You can listen to the world recentering event with the new onWorldRecenter view modifier to be alerted about this.
This is quite useful to recompute and store positions based on the new coordinate system.
visionOS 26 also comes with some new customizations to the different immersion styles available to immersive spaces.
The progressive immersion style is a great way to partially present an immersive space while keeping people grounded in the real world. The immersive content is presented inside a portal that can be resized by turning the digital crown. This range of immersion can be customized in the progressive immersion style.
In visionOS 26, you can also customize the aspect ratio of this portal. You can use the existing landscape aspect ratio or the new portrait aspect ratio. Consider using the portrait aspect ratio for vertical experiences, such as when bringing your iPhone games to Apple Vision Pro, or for experiences that contain a high degree of motion, as having stable surroundings in the periphery can help people feel more comfortable.
You can specify this aspect ratio with a parameter to the progressive style, as you might with the immersion range. In addition to the progressive style, there’s also new customization for immersive spaces in the mixed immersion style.
When setting the immersion style to mixed, the immersive space’s content blends in with people’s surroundings. This is the default style in my app.
In visionOS 26, immersive space content can blend in with system environments too. This means I can watch my robot’s latest production while on the moon.
Use the immersiveEnvironmentBehavior scene modifier with the coexist behavior to allow this. Do this if your mixed immersive space does not require users to be aware of their real world surroundings.
I love the props I've added to my app, but I just know people will want to bring in their own models when creating new scenes. They might create these models in their favorite macOS apps.
To support using these models directly from their Mac, I brought my app to macOS with the same stage creation capabilities.
For faster iteration, wouldn’t it be cool if people could preview their scenes as an immersive space directly without transferring their stage from macOS to visionOS? visionOS 26 and macOS Tahoe add RemoteImmersiveSpaces to help me do just that. With remote immersive spaces, you can use CompositorLayer to render content with Metal using app code and resources from your Mac and display it as an immersive experience on your Vision Pro. Let’s see it work in action in my app.
On my Mac app, I built a new immersive space using Metal and added a “Preview in Vision Pro“ button. Clicking on this asks me to select a target Vision Pro device.
Over on my Vision Pro, I’ll accept the connection request.
And just like that, my immersive space opens up, and I can see the new props I’ve added in my robot’s latest show.
I did this by adding a RemoteImmersiveSpace scene containing my CompositorLayer. This will be presented on visionOS, whereas the rest of my scenes, like my main stage, will still present directly on the Mac.
To learn more about adapting your CompositorLayer and ARKitSession to a remote Vision Pro device, check out “What’s new in Metal rendering for immersive apps.” Using CompositorLayer in my remote immersive space gives me a lot of power to create immersive experiences with Metal. However, CompositorLayer is not a View, and so it cannot be used in contexts that require Views, like my ImmersiveContent. So far, this has meant environment variables and View modifiers were not available to CompositorLayer.
visionOS 26 adds a new CompositorContent builder type, which lets you use the full power of SwiftUI with CompositorLayer. You can now access environment variables, add modifiers, or use state variables just as you can with SwiftUI views.
CompositorContent brings over a whole lot of useful environment variables like scenePhase and openWindow, and some modifiers like onImmersionChange and onWorldRecenter.
All of these make CompositorLayer much more powerful to use in both remote immersive spaces, and regular ones running directly on visionOS.
Upgrading my app to use CompositorContent has been a great way to revisit some of the Immersive Space modifiers that are available to me and to see how they can be applied to my app.
So that’s what’s new with immersive spaces. We've got a world recentering event, new customizations for the immersion styles, immersive spaces from the Mac with RemoteImmersiveSpace, and CompositorContent.
My app is looking amazing with all these capabilities. In fact, I think I want to add some of these cool volumetric experiences to more of my apps.
However, some of my apps were built with UIKit, and UIKit didn’t support volumes and immersive spaces. Now it does, with scene bridging. Scene bridging lets you take existing UIKit apps into the next dimension by adding SwiftUI volumes and immersive spaces.
Consider Safari. It uses SwiftUI views, but is built with the UIKit lifecycle. Safari is making great use of scene bridging for their new Spatial Browsing feature. Let's see how we can do this too.
To bridge a SwiftUI scene into my UIKit app, I start by creating a class type that extends from UIHostingSceneDelegate. With this type, I can declare my SwiftUI scenes in the rootScene property using the familiar scene body syntax.
I can now request this scene as I would any other UIKit scene by creating a UISceneSessionActivationRequest. In this case, I pass in my hosting delegate class, which declares my scenes, and the ID of the scene I’d like to open.
All that’s left to do is to send this request with activateSceneSession.
You can also respond to external events by setting your hosting delegate class in configurationForConnecting. This API also comes with a matching AppKit API to bridge SwiftUI scenes into your existing macOS AppKit apps.
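A rough sketch of that external-event path, assuming a standard UIKit app delegate (the configuration name is a placeholder):

// Route a newly connecting scene session to the hosting delegate (sketch).
func application(
    _ application: UIApplication,
    configurationForConnecting connectingSceneSession: UISceneSession,
    options: UIScene.ConnectionOptions
) -> UISceneConfiguration {
    let configuration = UISceneConfiguration(
        name: "My Scenes",  // placeholder configuration name
        sessionRole: connectingSceneSession.role)
    configuration.delegateClass = MyHostingSceneDelegate.self
    return configuration
}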
My app is now taking full advantage of the new capabilities of visionOS 26, like locking in place, snapping to surfaces, and opening remotely from a Mac. I'm excited to show it off to my friends.
Now, take a look at your apps. Audit your scenes and make sure they take full advantage of locking in place and restoration. Adapt your scenes to people’s surroundings with snapping and clipping margins. And immerse your macOS app's content on Vision Pro with remote immersive spaces.
The curtain falls, but in our app, the show goes on, one scene at a time. Thanks for watching.
-
4:10 - Disabling restoration
// Disabling restoration
WindowGroup("Tools", id: "tools") {
    ToolsView()
}
.restorationBehavior(.disabled)
-
4:36 - Disabling restoration in UIKit
// Disabling restoration
windowScene.destructionConditions = [
    .systemDisconnection
]
-
5:02 - Specifying launch window
// Specifying launch window
@AppStorage("isFirstLaunch") private var isFirstLaunch = true

var body: some Scene {
    WindowGroup("Stage Selection", id: "selection") {
        SelectionView()
    }

    WindowGroup("Welcome", id: "welcome") {
        WelcomeView()
            .onAppear { isFirstLaunch = false }
    }
    .defaultLaunchBehavior(isFirstLaunch ? .presented : .automatic)

    // ...
}
-
6:39 - "suppressed" behavior
// "suppressed" behavior WindowGroup("Tools", id: "tools") { ToolsView() } .restorationBehavior(.disabled) .defaultLaunchBehavior(.suppressed)
-
7:44 - Unique window
// Unique window
@AppStorage("isFirstLaunch") private var isFirstLaunch = true

var body: some Scene {
    // ...

    Window("Welcome", id: "welcome") {
        WelcomeView()
            .onAppear { isFirstLaunch = false }
    }
    .defaultLaunchBehavior(isFirstLaunch ? .presented : .automatic)

    WindowGroup("Main Stage", id: "main") {
        StageView()
    }

    // ...
}
-
10:24 - Surface snapping
// Surface snapping
@Environment(\.surfaceSnappingInfo) private var snappingInfo
@State private var hidePlatform = false

var body: some View {
    RealityView { /* ... */ }
        .onChange(of: snappingInfo) {
            if snappingInfo.isSnapped &&
                SurfaceSnappingInfo.authorizationStatus == .authorized {
                switch snappingInfo.classification {
                case .table:
                    hidePlatform = true
                default:
                    hidePlatform = false
                }
            }
        }
}
-
14:41 - Clipping margins
// Clipping margins
@Environment(\.windowClippingMargins) private var windowMargins
@PhysicalMetric(from: .meters) private var pointsPerMeter = 1

var body: some View {
    RealityView { content in
        // ...
        waterfall = createWaterfallEntity()
        content.add(waterfall)
    } update: { content in
        waterfall.scale.y = Float(min(
            windowMargins.bottom / pointsPerMeter,
            maxWaterfallHeight))
        // ...
    }
    .preferredWindowClippingMargins(
        .bottom,
        maxWaterfallHeight * pointsPerMeter)
}
-
16:44 - World recenter
// World recenter
var body: some View {
    RealityView { content in
        // ...
    }
    .onWorldRecenter {
        recomputePositions()
    }
}
-
17:58 - Progressive immersion style
// Progressive immersion style
@State private var selectedStyle: ImmersionStyle = .progressive

var body: some Scene {
    ImmersiveSpace(id: "space") {
        ImmersiveView()
    }
    .immersionStyle(
        selection: $selectedStyle,
        in: .progressive(aspectRatio: .portrait))
}
-
18:37 - Mixed immersion style
// Mixed immersion style
@State private var selectedStyle: ImmersionStyle = .progressive

var body: some Scene {
    ImmersiveSpace(id: "space") {
        ImmersiveView()
    }
    .immersionStyle(selection: $selectedStyle, in: .mixed)
    .immersiveEnvironmentBehavior(.coexist)
}
-
20:14 - Remote immersive space
// Remote immersive space

// Presented on visionOS
RemoteImmersiveSpace(id: "preview-space") {
    CompositorLayer(configuration: config) { /* ... */ }
}

// Presented on macOS
WindowGroup("Main Stage", id: "main") {
    StageView()
}
-
20:48 - 'CompositorLayer' is a 'CompositorContent'
// 'CompositorLayer' is a 'CompositorContent'
struct ImmersiveContent: CompositorContent {
    @Environment(\.scenePhase) private var scenePhase

    var body: some CompositorContent {
        CompositorLayer { renderer in
            // ...
        }
        .onImmersionChange { oldImmersion, newImmersion in
            // ...
        }
    }
}
-
23:00 - Scene bridging
// Scene bridging

import UIKit
import SwiftUI

// Declare the scenes
class MyHostingSceneDelegate: NSObject, UIHostingSceneDelegate {
    static var rootScene: some Scene {
        WindowGroup(id: "my-volume") {
            ContentView()
        }
        .windowStyle(.volumetric)
    }
}

// Create a request for the scene
let requestWithId = UISceneSessionActivationRequest(
    hostingDelegateClass: MyHostingSceneDelegate.self,
    id: "my-volume")!

// Send a request
UIApplication.shared.activateSceneSession(for: requestWithId)
-