Alternatives to SceneView

Hey there,

Since SceneView has been marked as "deprecated" for SwiftUI, I'm wondering which alternatives should be considered for the following situation:

I have a SwiftUI app (for iOS and iPadOS) where users can view 3D models (USDZ) in a scene, with rotate, scale, and move gestures. The models are downloaded from a web backend and then loaded via local file URLs.

What I tested:

  • I've tried ARView in .nonAR mode and RealityView, but I didn't get the expected behavior: the user should be able to rotate and scale the 3D models in a virtual space.

  • ARView in .nonAR mode still shows the object as in normal AR mode, just without the camera stream.

  • I tried adding gestures to the RealityView on iOS: loading USDZ 3D models worked, but the gestures didn't.

  • Model3D is only available for visionOS (it would be amazing to have it on iOS).

  • I also checked Quick Look preview, but it works rather awkwardly via a file picker etc., which is not how users should load the 3D models in my app.

Maybe I missed something, but I couldn't find anything that helps. I'm pretty much stuck adopting the latest and greatest frameworks/APIs in my app and taking the next steps to port it to visionOS.

Long story short 😃: does anyone have an idea what the alternative to SceneView is for USDZ 3D models?

I appreciate your support!! Thanks in advance!

nvrmnd

RealityKit and RealityView are my suggestions.

I think you were on the right track with gestures + RealityView:

"I tried adding gestures to the RealityView on iOS: loading USDZ 3D models worked, but the gestures didn't."

We can use SwiftUI gestures such as TapGesture, DragGesture, etc. with RealityKit entities, but a bit of setup is needed to make them work:

  • Load your model in a RealityView as an entity.
  • Add components to the entity: InputTargetComponent and CollisionComponent are both required to use system gestures with entities (these first two steps are sketched just below).
  • The gesture code needs to target entities.
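
Here is a minimal sketch of the first two steps; the asset name "robot" and the collision box size are placeholders you'd replace with your own model and bounds:

import SwiftUI
import RealityKit

struct ModelView: View {
    var body: some View {
        RealityView { content in
            // 1. Load the USDZ model as an entity ("robot" is a placeholder asset name).
            if let entity = try? await Entity(named: "robot") {
                // 2. Both components are required before system gestures can hit the entity.
                entity.components.set(InputTargetComponent())
                entity.components.set(CollisionComponent(shapes: [.generateBox(size: [0.3, 0.3, 0.3])]))
                content.add(entity)
            }
        }
        // 3. Attach an entity-targeted gesture (swap in the tapExample gesture from the
        //    next snippet; this inline one just logs the hit).
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in print("Tapped \(value.entity.name)") }
        )
    }
}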

Here's an example of step 3, using targetedToAnyEntity():

// `selected` is a property on the enclosing view, e.g. @State private var selected: Entity?
var tapExample: some Gesture {
    TapGesture()
        .targetedToAnyEntity() // 3. make sure to use this line to target entities
        .onEnded { value in
            if selected === value.entity {
                // If the same entity is tapped, lower it and deselect.
                selected?.position.y = 0
                selected = nil
            } else {
                // Lower the previously selected entity (if any).
                selected?.position.y = 0
                // Raise the new entity and select it.
                value.entity.position.y = 0.1
                selected = value.entity
            }
        }
}
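
Since the question also asks about moving the models, here's a hedged sketch of a drag gesture in the same style. `dragStart` would be another property on the same view, and the 0.001 points-to-meters factor is a guess you'd tune for your scene scale:

@State private var dragStart: SIMD3<Float>? = nil

var dragExample: some Gesture {
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { value in
            // Remember where the entity was when the drag began.
            if dragStart == nil { dragStart = value.entity.position }
            guard let start = dragStart else { return }
            // Map the 2D screen translation onto the entity's x/z plane.
            value.entity.position = start + SIMD3<Float>(
                Float(value.translation.width) * 0.001,
                0,
                Float(value.translation.height) * 0.001
            )
        }
        .onEnded { _ in dragStart = nil }
}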

Hello @ggintli, thank you for your question!

@radicalappdev has provided a great answer (thank you!), I just want to chime in and +1 the suggestion to use RealityView with a TapGesture, DragGesture, etc. I recommend taking a look at the sample project BOT-anist, as it is a cross-platform RealityKit project that contains RealityViews with gestures.

You mentioned you would like to have Model3D for iOS: I recommend filing an enhancement request for this via Feedback Assistant, and include as much detail about your use case as possible.

I know your app is not for visionOS, but for anyone else reading: in visionOS 26 we introduced a new RealityKit API, manipulable(coordinateSpace:operations:inertia:isEnabled:onChanged:), that you can apply to a Model3D to make it easier to implement these interactions.
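
For anyone targeting visionOS 26, a minimal sketch of what that could look like (the asset name "pancakes" is a placeholder, and the manipulable parameters are left at their defaults here):

import SwiftUI
import RealityKit

struct ManipulableModelView: View {
    var body: some View {
        // Built-in move/rotate/scale interactions via the manipulable API (visionOS 26).
        Model3D(named: "pancakes")
            .manipulable()
    }
}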

Thank you!

import SwiftUI
import RealityKit

struct ContentView: View {
    var body: some View {
        RealityView { content in
            // A virtual camera renders the scene without an AR camera feed (iOS 18+).
            content.camera = .virtual
            // content.add(entity)
        }
        // Built-in orbit camera controls for rotating around the content.
        .realityViewCameraControls(.orbit)
    }
}
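
To match the original question (a USDZ downloaded to a local file URL), the same approach could be extended like this; `modelURL` is a placeholder for wherever your download lands:

import SwiftUI
import RealityKit

struct DownloadedModelView: View {
    // Local file URL of a USDZ you have already downloaded (placeholder).
    let modelURL: URL

    var body: some View {
        RealityView { content in
            // Render into a fully virtual scene instead of an AR camera feed.
            content.camera = .virtual
            // Load the downloaded USDZ from its local file URL.
            if let entity = try? await Entity(contentsOf: modelURL) {
                content.add(entity)
            }
        }
        // Built-in orbit controls give rotate/zoom without custom gesture code (iOS 18+).
        .realityViewCameraControls(.orbit)
    }
}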