
App crashes after multiple transitions to a screen containing ARKit using SwiftUI NavigationStack

Hello.

I am currently building an app that uses ARKit.

For the UI, I am using SwiftUI, with NavigationStack + NavigationLink for navigation and screen transitions.

The app needs to go back and forth between the AR screen and other screens.

When the number of screen transitions is small, there is no problem.

However, once the number of transitions grows to 10 or 20, the app crashes at some point.

I am struggling with this problem. (The nature of the application requires many screen transitions.)

The crash log showed the following:

error: read memory from 0x1e387f2d4 failed

Incident Identifier: B23D806E-D578-4A95-8828-2A1E8D6BB7F8
Beta Identifier:     924A85AB-441C-41A7-9BC2-063940BDAF32
Hardware Model:      iPhone16,1
Process:             AR_Crash_Sample [2375]
Path:                /private/var/containers/Bundle/Application/FAC3D662-DB10-434E-A006-79B9515D8B7A/AR_Crash_Sample.app/AR_Crash_Sample
Identifier:          ar.crash.sample.AR.Crash.Sample
Version:             1.0 (1)
AppStoreTools:       16C7015
AppVariant:          1:iPhone16,1:18
Beta:                YES
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Coalition:           ar.crash.sample.AR.Crash.Sample [1464]

Date/Time:           2025-03-07 11:59:14.3691 +0900
Launch Time:         2025-03-07 11:57:47.3955 +0900
OS Version:          iPhone OS 18.3.1 (22D72)
Release Type:        User
Baseband Version:    2.40.05
Report Version:      104

Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: AR_Crash_Sample [2375]

Triggered by Thread:  7

Application Specific Information:
abort() called

Thread 7 name:   Dispatch queue: com.apple.arkit.depthtechnique
Thread 7 Crashed:
0   libsystem_kernel.dylib        	       0x1e387f2d4 __pthread_kill + 8
1   libsystem_pthread.dylib       	       0x21cedd59c pthread_kill + 268
2   libsystem_c.dylib             	       0x199f98b08 abort + 128
3   libc++abi.dylib               	       0x21ce035b8 abort_message + 132
4   libc++abi.dylib               	       0x21cdf1b90 demangling_terminate_handler() + 320
5   libobjc.A.dylib               	       0x18f6c72d4 _objc_terminate() + 172
6   libc++abi.dylib               	       0x21ce0287c std::__terminate(void (*)()) + 16
7   libc++abi.dylib               	       0x21ce02820 std::terminate() + 108
8   libdispatch.dylib             	       0x199edefbc _dispatch_client_callout + 40
9   libdispatch.dylib             	       0x199ee65cc _dispatch_lane_serial_drain + 768
10  libdispatch.dylib             	       0x199ee7158 _dispatch_lane_invoke + 432
11  libdispatch.dylib             	       0x199ee85c0 _dispatch_workloop_invoke + 1744
12  libdispatch.dylib             	       0x199ef238c _dispatch_root_queue_drain_deferred_wlh + 288
13  libdispatch.dylib             	       0x199ef1bd8 _dispatch_workloop_worker_thread + 540
14  libsystem_pthread.dylib       	       0x21ced8680 _pthread_wqthread + 288
15  libsystem_pthread.dylib       	       0x21ced6474 start_wqthread + 8

Perhaps I am using too much memory!

How can I address this problem?
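
(For reference, one way to check whether memory really keeps growing across transitions is to log the process footprint each time the AR screen appears. The helper below is only a sketch around the Mach task_info call; the function name is made up for illustration and is not an ARKit or SwiftUI API.)

import Darwin

/// Sketch: read the process's physical memory footprint (in bytes) via task_info.
/// Returns nil if the Mach call fails.
func currentMemoryFootprint() -> UInt64? {
    var info = task_vm_info_data_t()
    var count = mach_msg_type_number_t(
        MemoryLayout<task_vm_info_data_t>.size / MemoryLayout<natural_t>.size
    )
    let result = withUnsafeMutablePointer(to: &info) { infoPointer in
        infoPointer.withMemoryRebound(to: integer_t.self, capacity: Int(count)) { rebound in
            task_info(mach_task_self_, task_flavor_t(TASK_VM_INFO), rebound, &count)
        }
    }
    guard result == KERN_SUCCESS else { return nil }
    return UInt64(info.phys_footprint)
}

// Example: log the footprint every time the AR screen appears.
// .onAppear { print("footprint: \((currentMemoryFootprint() ?? 0) / 1_048_576) MB") }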

For the AR functionality, I use UIViewRepresentable so that the ARKit view, which is written with UIKit, can be used from SwiftUI:

import ARKit
import AsyncAlgorithms
import AVFoundation
import SCNLine
import SwiftUI

internal struct MeasureARViewContainer: UIViewRepresentable {
    @Binding var tapCount: Int
    @Binding var distance: Double?
    @Binding var currentIndex: Int

    var focusSquare: FocusSquare = FocusSquare()
    let coachingOverlay: ARCoachingOverlayView = ARCoachingOverlayView()

    func makeUIView(context: Context) -> ARSCNView {
        let arView: ARSCNView = ARSCNView()
        arView.delegate = context.coordinator

        let configuration: ARWorldTrackingConfiguration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal, .vertical]

        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            configuration.frameSemantics = [.sceneDepth, .smoothedSceneDepth]
        }

        arView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
        context.coordinator.sceneView = arView
        context.coordinator.scanTarget()

        coachingOverlay.session = arView.session
        coachingOverlay.delegate = context.coordinator
        coachingOverlay.goal = .horizontalPlane
        coachingOverlay.activatesAutomatically = true
        coachingOverlay.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        coachingOverlay.translatesAutoresizingMaskIntoConstraints = false
        arView.addSubview(coachingOverlay)

        return arView
    }

    func updateUIView(_ _: ARSCNView, context: Context) {
        context.coordinator.mode = MeasurementMode(rawValue: currentIndex) ?? .width
        if tapCount == 0 {
            context.coordinator.resetMeasurement()
            return
        }
        if distance != nil {
            return
        }
        DispatchQueue.main.async {
            if context.coordinator.distance == nil {
                context.coordinator.handleTap()
            }
        }
    }

    static func dismantleUIView(_ uiView: ARSCNView, coordinator: Coordinator) {
        uiView.session.pause()
        coordinator.stopScanTarget()
        coordinator.stopSpeech()
        DispatchQueue.main.async {
            uiView.removeFromSuperview()
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    class Coordinator: NSObject, ARSCNViewDelegate, ARSessionDelegate, ARCoachingOverlayViewDelegate {
        var parent: MeasureARViewContainer
        var sceneView: ARSCNView?
        var startPosition: SCNVector3?
        var pointedCount: Int = 0
        var distance: Float?
        var mode: MeasurementMode = .width
        let synthesizer: AVSpeechSynthesizer = AVSpeechSynthesizer()
        var scanTargetTask: Task<Void, Never>?
        var currentResult: ARRaycastResult?

        init(_ parent: MeasureARViewContainer) {
            self.parent = parent
        }
        // ... etc
    }
}

The AR screen is pushed from the following view:

import SwiftUI

internal struct ListView: View {
    @State private var scanData: ScanData = ScanData()

    var body: some View {
        NavigationStack {
            List {
                ListItem(
                    title: "shoulder width",
                    description: "scan shoulder max width",
                    length: $scanData.shoulderWidth
                )
                .overlay {
                    NavigationLink(value: ScanType.shoulderWidth) {
                        EmptyView()
                    }
                    .opacity(0)
                }
                .listRowBackground(Color.white)
                .listRowSeparator(.hidden)
                ListItem(
                    title: "forearm width",
                    description: "scan forearm max width",
                    length: $scanData.forearmWidth
                )
                .overlay {
                    NavigationLink(value: ScanType.foreArm) {
                        EmptyView()
                    }
                    .opacity(0)
                }
                .listRowBackground(Color.white)
                .listRowSeparator(.hidden)
                ListItem(
                    title: "knee thickness",
                    description: "scan knee thickness",
                    length: $scanData.kneeThicknessWidth
                )
                .overlay {
                    NavigationLink(value: ScanType.knee) {
                        EmptyView()
                    }
                    .opacity(0)
                }
                .listRowBackground(Color.white)
                .listRowSeparator(.hidden)
            }
            .listStyle(.plain)
            .navigationDestination(for: ScanType.self) { type in
                MeasureARView(
                    title: "scan for ",
                    bodyImageName: "PersonForearm",
                    startPointBackgroundImageName: "BackgroundForearm",
                    endPointBackgroundImageName: "BackgroundForearmEnd",
                    startPointExplainText: "Tap the start point. Move the cursor and tap the button at the point",
                    lastResult: nil,
                    mode: .width,
                    bindLength: type == .foreArm ? $scanData.forearmWidth :
                        type == .knee ? $scanData.kneeThicknessWidth :
                        type == .shoulderWidth ? $scanData.shoulderWidth : .constant(nil)
                )
            }
        }
    }
}

import ARKit
import AVFoundation
import SwiftUI

internal struct MeasureARView: View {
    let synthesizer: AVSpeechSynthesizer = AVSpeechSynthesizer()
    let title: String
    let bodyImageName: String
    let startPointBackgroundImageName: String
    let endPointBackgroundImageName: String
    let startPointExplainText: String
    let lastResult: Double?
    let mode: MeasurementMode
    @Binding var bindLength: Double?
    @State private var length: Double?
    @State private var tapCount: Int = 0
    @State private var currentIndex: Int = 0

    @Environment(\.dismiss)
    private var dismiss: DismissAction

    init(
        title: String,
        bodyImageName: String,
        startPointBackgroundImageName: String,
        endPointBackgroundImageName: String,
        startPointExplainText: String,
        lastResult: Double?,
        mode: MeasurementMode,
        bindLength: Binding<Double?>
    ) {
        self.title = title
        self.bodyImageName = bodyImageName
        self.startPointBackgroundImageName = startPointBackgroundImageName
        self.endPointBackgroundImageName = endPointBackgroundImageName
        self.startPointExplainText = startPointExplainText
        self.lastResult = lastResult
        self.mode = mode
        self._currentIndex = State(initialValue: mode.rawValue)
        self._bindLength = bindLength
    }

    var body: some View {
        ZStack {
            MeasureARViewContainer(
                tapCount: $tapCount,
                distance: $length,
                currentIndex: $currentIndex
            )
            GeometryReader { geometry in
                SegmentedPicker(
                    width: geometry.size.width,
                    contents: ["thickness", "width"],
                    selection: $currentIndex
                )
                Image(tapCount == 0 ? startPointBackgroundImageName : endPointBackgroundImageName)
                    .resizable()
                    .frame(width: geometry.size.width, height: geometry.size.height)
                    .accessibility(label: Text("imageIcon"))
                    .opacity(0.5)
            }
            Text("+")
                .font(.largeTitle)
                .foregroundStyle(.red)
                .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .center)
            VStack {
                Image(bodyImageName, label: Text(bodyImageName))
                    .resizable()
                    .frame(width: 30, height: 80)
            }
                .padding()
                .background(Color.black.opacity(0.6))
                .cornerRadius(10)
                .foregroundColor(.white)
                .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .bottomLeading)
                .padding()
            Button(action: {
                withAnimation {
                    tapCount += 1
                }
            }, label: {
                ZStack {
                    Circle()
                        .foregroundStyle(Color.black.opacity(0.6))
                        .frame(width: 60, height: 60)
                    Circle()
                        .foregroundStyle(Color.black.opacity(0.6))
                        .frame(width: 80, height: 80)
                }
            })
            .frame(maxWidth: .infinity, maxHeight: .infinity, alignment: .bottom)
            .padding(.vertical)
            .alert("result", isPresented: Binding<Bool>(
                get: {
                    $length.wrappedValue != nil
                },
                set: { newValue in if !newValue {
                    $length.wrappedValue = nil
                }
                }
            )) {
                Button("re scan") {
                    length = nil
                    tapCount = 0
                }
                Button("register") {
                    bindLength = length
                    dismiss()
                }
            } message: {
                if let length {
                    Text("result: \(String(format: "%0.1f", length)) cm")
                }
            }
        }
        .navigationTitle(
            Text(title)
        )
    }
}

I am so sorry.

I made a very silly mistake.

I thought I was terminating the task when I left the screen, but I was not terminating it properly.

Once the infinite loop is terminated properly when leaving the screen, the problem no longer occurs.

Multiple copies of the loop were left running endlessly, and they simply kept using up memory:

    class Coordinator: NSObject, ARSCNViewDelegate, ARSessionDelegate, ARCoachingOverlayViewDelegate {
        var scanTargetTask: Task<Void, Never>?
        var isComplete: Bool = false

        @MainActor
        func scanTarget() {
            // Don't start a second loop if one is already running.
            if scanTargetTask != nil {
                return
            }
            scanTargetTask = Task {
                let timer: AsyncTimerSequence = AsyncTimerSequence(
                    interval: .seconds(0.05),
                    clock: .continuous
                )
                for await _ in timer {
                    // Stop as soon as the measurement is complete
                    // or the task has been cancelled (screen dismissed).
                    if isComplete || Task.isCancelled {
                        return
                    }
                    updateFocusSquare()
                }
            }
        }

        func stopScanTarget() {
            guard let scanTargetTask else {
                return
            }
            print("stopScanTarget")
            scanTargetTask.cancel()
            self.scanTargetTask = nil
        }
    }
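
Since dismantleUIView in MeasureARViewContainer already calls coordinator.stopScanTarget(), cancelling the task there ends the timer loop as soon as the AR screen is popped from the NavigationStack.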