Photos & Camera

Explore technical aspects of capturing high-quality photos and videos, including exposure control, focus modes, and RAW capture options.

Posts under Photos & Camera subtopic

DockKit gimbal reported yaw drifts by upwards of 45 degrees after running for a while
This is an issue with the Insta360 Flow Pro 2. My iOS app uses DockKit to control the gimbal; in particular, my app disables tracking and sends angular velocity commands to control the gimbal's orientation. I only try to modify the yaw (rotation around the vertical axis), never the pitch or roll. Note that I don't send the gimbal to a particular orientation directly; I modify the velocity.

Everything works great for a long period of time: typically a continuous run of 4-6 hours; in the most recent case, I managed about 36 hours of continuous operation before the following problem occurred. I came back to check on the system, and because no visual activity had occurred in the camera's field of view for a while, the phone had commanded the gimbal to rotate back to a yaw angle of 0 degrees. So the phone in the gimbal should have been looking straight ahead (i.e. the 0-degree yaw position), but it was definitely looking off at an angle. I've seen this twice now. The first time, when it should have been looking straight ahead, it was in fact looking 60 degrees off center. This time (caught on video, see below), it was off by 22 degrees from center.

Here's the weird part: the gimbal reports this way-off-center position as zero degrees (well, close enough to zero, like 0.2, which is fine). But, mechanically, the gimbal still knows where zero degrees is: if we double-click the trigger on the Flow Pro 2, which is supposed to reset the gimbal to 0 degrees yaw and pitch, the gimbal responds correctly and reorients to the 0-degree position. However, the yaw values it then reports are not zero but, as shown in my video, about 22 degrees off axis. Power cycling the gimbal and restarting immediately fixes the problem. Also, I switched from my app to the Insta360 app, which caused the phone to flip from landscape to portrait; when I returned to my app and switched back to landscape, the gimbal then started reporting correct yaw angles.

Is there a possibility this is a bug in the DockKit framework? Has anyone seen this? I have a case open with Insta360, but although it's clearly a software issue, it's not clear whether it's in Insta360's code or the DockKit layer. Any ideas for how I can get out of this mode? My concern is that the phone is on a tripod about 10' off the floor and not very accessible. Also, if all goes well, we may have about 50 of these systems running, and having to fix them one by one after a few hours is not good.

For a demonstration of this bug, see the following video: https://octoparry.com/offset.MOV

Any help greatly appreciated.
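A possible mitigation while the root cause is investigated, as a sketch only: integrate the commanded velocity and compare it against the reported yaw, so a drifting unit can be flagged remotely before someone has to climb to the tripod. Here `commandYawVelocity` and `readReportedYaw` are hypothetical wrappers around the app's existing DockKit velocity command and state handler, and open-loop integration drifts too, so the threshold is deliberately coarse.

```swift
import Foundation

// Hypothetical wrappers around the existing DockKit plumbing:
func commandYawVelocity(_ radiansPerSecond: Double) { /* send via DockKit */ }
func readReportedYaw() -> Double { /* from DockKit accessory state updates */ return 0 }

var integratedYaw = 0.0  // radians, accumulated from commanded velocity

// Call once per control tick (dt = seconds since the last tick).
func controlTick(yawVelocity: Double, dt: Double) {
    commandYawVelocity(yawVelocity)
    integratedYaw += yawVelocity * dt

    let reported = readReportedYaw()
    let driftDegrees = abs(reported - integratedYaw) * 180 / .pi
    if driftDegrees > 15 {
        // Flag for remote recovery (e.g. stop motion and alert an operator)
        print("Yaw drift \(driftDegrees)°: reported \(reported), expected \(integratedYaw)")
    }
}
```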
4 replies · 0 boosts · 335 views · Mar ’25
Creating a Live Photo throws Optional(Error Domain=PHPhotosErrorDomain Code=-1 "(null)")
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds), and I am sure they are correct. Two permissions have been added in Xcode: Privacy - Photo Library Usage Description and Privacy - Photo Library Additions Usage Description. Simulator: iPhone 16, iOS 18.3.

The code in ContentView.swift:

```swift
private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        // xxxxx
    } else {
        // xxxxx
    }
}
```

Really need help, thanks
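A common cause of Code=-1 when pairing a photo and video is reportedly a missing shared content identifier: the JPEG and MOV must carry the same UUID (the Apple MakerNote key "17" in the image, and a "com.apple.quicktime.content.identifier" metadata item in the video). A hedged sketch of the image side, assuming this is the cause; `addContentIdentifier` and its URLs are illustrative names:

```swift
import ImageIO
import UniformTypeIdentifiers

// Writes a copy of the JPEG carrying the Apple MakerNote content identifier
// (key "17"). The paired .mov must carry the same UUID as its
// "com.apple.quicktime.content.identifier" metadata item.
func addContentIdentifier(to imageURL: URL, outputURL: URL, identifier: String) -> Bool {
    guard let source = CGImageSourceCreateWithURL(imageURL as CFURL, nil),
          let dest = CGImageDestinationCreateWithURL(outputURL as CFURL,
                                                     UTType.jpeg.identifier as CFString, 1, nil)
    else { return false }
    let properties: [CFString: Any] = [
        kCGImagePropertyMakerAppleDictionary: ["17": identifier]
    ]
    CGImageDestinationAddImageFromSource(dest, source, 0, properties as CFDictionary)
    return CGImageDestinationFinalize(dest)
}

// Usage sketch: let id = UUID().uuidString; stamp both files, then save.
```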
1 reply · 1 boost · 174 views · Mar ’25
Create live photo error
I want to create a Live Photo. The project includes a .jpg image and a .mov video (2 seconds). Two permissions have been added in Xcode: Privacy - Photo Library Usage Description and Privacy - Photo Library Additions Usage Description. Simulator: iPhone 16, iOS 18.3.

The code in ContentView.swift:

```swift
private func saveLivePhoto(imageURL: URL, videoURL: URL, completion: @escaping (Bool, Error?) -> Void) {
    PHPhotoLibrary.shared().performChanges {
        let creationRequest = PHAssetCreationRequest.forAsset()
        let options = PHAssetResourceCreationOptions()
        options.shouldMoveFile = false
        creationRequest.addResource(with: .photo, fileURL: imageURL, options: options)
        creationRequest.addResource(with: .pairedVideo, fileURL: videoURL, options: options)
    } completionHandler: { success, error in
        DispatchQueue.main.async {
            print(error)
            completion(success, error)
        }
    }
}

guard let imageURL = Bundle.main.url(forResource: "livephoto", withExtension: "jpeg"),
      let videoURL = Bundle.main.url(forResource: "livephoto", withExtension: "mov") else {
    showAlertMessage(title: "error", message: "cant find Live Photo ")
    return
}
print("imageURL: \(imageURL)")
print("videoURL: \(videoURL)")
saveLivePhoto(imageURL: imageURL, videoURL: videoURL) { success, error in
    if success {
        // xxxxx
    } else {
        // xxxxx
    }
}
```

Really need help, thanks
0 replies · 0 boosts · 145 views · Mar ’25
Capturing & Processing ProRAW 48MP images is slow
Hey, I have a camera app that captures a ProRAW photo and then runs a few Core Image filters before saving it to the device as a HEIC. However, I'm finding that capturing at 48MP is rather slow. Testing a minimal pipeline on an iPhone 16 Pro:

Shutter press => file received in output: 1.2 ~ 1.6s
CIRAWFilter created using the photo's file representation, then rendered to context without any filters: 0.8 ~ 1s
Saving to device: ~0.15s

Is this the expected time for capture and processing? The native camera app seems to save images within half a second. I'm using QualityPrioritization.balanced and the highest resolution available, which is 48MP.

Would using the CIRAWFilter with the pixel buffer from the photo output be faster? I tried it but couldn't get it to output an image. Are there any other things I could try to speed this up? Is it possible to capture at 24MP instead? Thanks, Alex
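Capturing below 48MP is possible on recent devices via maxPhotoDimensions. A minimal sketch, assuming the active format advertises a roughly 24MP entry in supportedMaxPhotoDimensions (it falls back to the smallest supported size otherwise):

```swift
import AVFoundation

// Cap stills at roughly 24MP instead of 48MP.
func capAt24MP(device: AVCaptureDevice, output: AVCapturePhotoOutput) {
    let dims = device.activeFormat.supportedMaxPhotoDimensions
    // Pick the largest supported dimensions at or below ~24.5 megapixels.
    let target = dims.last { Int($0.width) * Int($0.height) <= 24_500_000 } ?? dims[0]
    output.maxPhotoDimensions = target

    let settings = AVCapturePhotoSettings()
    settings.maxPhotoDimensions = target
    settings.photoQualityPrioritization = .balanced
    // output.capturePhoto(with: settings, delegate: ...)
}
```

Whether a 24MP option (and 24MP ProRAW in particular) is offered depends on the device and format, so checking supportedMaxPhotoDimensions at runtime is the safe route.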
1 reply · 1 boost · 305 views · Mar ’25
Prevent iOS from Switching Between Back Camera Lenses in getUserMedia (Safari/WebView, iOS 18)
I’m developing a hybrid app (WebView / Turbo Native) that uses getUserMedia to access the back camera for a PPG/heart rate measurement feature (the user places their finger on the camera).

Problem: even when I specify constraints like:

```js
{
  video: {
    deviceId: '...',
    facingMode: { exact: 'environment' },
    advanced: [{ zoom: 1.0 }]
  },
  audio: false
}
```

on iPhone 15 (iOS 18), iOS unexpectedly switches between the wide, ultra-wide, and telephoto lenses during the measurement. This breaks the heart rate detection, and it forces the user to move their finger in the middle of the measurement.

Question: is there any way, via getUserMedia/WebRTC, to force iOS to use only the wide-angle lens and prevent automatic lens switching? I know that with AVFoundation (Swift) you can pick .builtInWideAngleCamera, but I’m hoping to avoid building a custom native layer and would prefer to stick with WebView/JavaScript if possible to save time and complexity. Any suggestions, workarounds, or updates from Apple would be greatly appreciated! Thanks a lot!
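For reference, if a small native layer does turn out to be necessary, a minimal AVFoundation sketch that pins capture to the physical wide-angle camera (virtual devices such as the triple camera are what enable automatic lens switching):

```swift
import AVFoundation

let session = AVCaptureSession()
// .builtInWideAngleCamera is a single physical device, so iOS cannot switch lenses.
if let device = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: device) {
    session.addInput(input)
    try? device.lockForConfiguration()
    device.videoZoomFactor = 1.0  // a fixed zoom keeps the PPG signal stable
    device.unlockForConfiguration()
    session.startRunning()
}
```

On the web side, it may also be worth enumerating devices with enumerateDevices() and testing each back-facing deviceId individually; whether Safari exposes a deviceId that maps to the wide-angle camera alone appears to vary by device, so this is only a possible workaround.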
1 reply · 0 boosts · 319 views · Mar ’25
iPhone 16 Camera Control and AVCaptureSlider – Is there a way to detect which slider is active?
I am following the Apple sample code and trying to add a manual focus lens position slider:

```swift
@available(iOS 18.0, *)
private func addCameraControls() {
    if !self.session.controls.isEmpty {
        for control in self.session.controls {
            self.session.removeControl(control)
        }
    }
    self.cameraControlFocusSlider = nil

    // Focus slider
    if self.videoDevice!.isLockingFocusWithCustomLensPositionSupported {
        self.cameraControlFocusSlider = AVCaptureSlider("Focus", symbolName: "dot.square", in: 0.0...1.0)
        self.cameraControlFocusSlider!.setActionQueue(self.sessionQueue) { focusValue in
            // Do manual focus
        }
        if self.session.canAddControl(self.cameraControlFocusSlider!) {
            self.session.addControl(self.cameraControlFocusSlider!)
        }
    }
}
```

There are these AVCaptureSessionControlsDelegate methods:

```swift
final func sessionControlsDidBecomeActive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeActive")
}

final func sessionControlsWillEnterFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillEnterFullscreenAppearance")
}

final func sessionControlsWillExitFullscreenAppearance(_ session: AVCaptureSession) {
    print("sessionControlsWillExitFullscreenAppearance")
}

final func sessionControlsDidBecomeInactive(_ session: AVCaptureSession) {
    print("sessionControlsDidBecomeInactive")
}
```

When self.cameraControlFocusSlider is presented, I have to show the current value of the lens position. The lens position can change from autofocus and also from manual focus by the user using the app UI. Is there a way to see if self.cameraControlFocusSlider is active or being used? Please note that I will have more than one AVCaptureSlider in the final code.
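I'm not aware of a public per-control "is active" query on AVCaptureControl as of iOS 18; the delegate callbacks above are session-level. One workaround is to timestamp each slider's own action callback and treat a slider as "in use" for a short window, so autofocus-driven UI updates don't fight the user's input. A sketch with illustrative names, scaling to several sliders via an ObjectIdentifier-keyed dictionary:

```swift
import AVFoundation
import Foundation

final class SliderActivityTracker {
    private var lastUse: [ObjectIdentifier: Date] = [:]

    // Call from each slider's action handler (on its action queue).
    func noteUse(of slider: AVCaptureSlider) {
        lastUse[ObjectIdentifier(slider)] = Date()
    }

    // Treat a slider as "busy" if its handler fired within the last second.
    func isBusy(_ slider: AVCaptureSlider) -> Bool {
        guard let t = lastUse[ObjectIdentifier(slider)] else { return false }
        return Date().timeIntervalSince(t) < 1.0
    }
}
```

Usage sketch: call `tracker.noteUse(of: slider)` as the first line of each `setActionQueue` handler, and check `tracker.isBusy(slider)` before pushing an autofocus-derived value into that slider.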
0 replies · 0 boosts · 344 views · Mar ’25
Crash in Photos framework
Hi, this is one of our top crashes. It does not contain any of our code in the stack trace, and we can't reproduce it, which makes this crash very hard to understand and fix. We know that most of the crashes are happening on iPhone 13 with iOS 18.x.x. We also see that a lot of cases happen when the app goes into the background (the stack trace contains -[UIApplication _applicationDidEnterBackground]).

2025-03-04_16-06-00.3670_-0500-6a273c7d5da97f098b5cc24898bb9761dc45208e.crash
2025-03-04_20-21-08.6609_-0500-2c08f640900f8a62c4f8a4f6f2a61feb052e66dd.crash
2025-03-04_20-46-27.7138_+0000-4d7ea89b1b564eda22ca63e708f7ad3909c7b768.crash
2 replies · 0 boosts · 348 views · Mar ’25
AVPictureInPictureController (PiP) window goes black when the system camera opens, on projects built with Xcode 16 or later
I found that when my app is built and run with Xcode 16 or later, if I start Picture in Picture (the floating window) and then open the system camera, the content in the floating window is not displayed and the window is black. This does not happen when the same code is built with Xcode 15.4; I do not know why.
5 replies · 0 boosts · 552 views · Feb ’25
Accessing AV External Storage
Is it possible to use AVExternalStorageDevice to access external storage from a connected camera or USB drive (via a USB-C or Lightning connector) on an iPad/iPhone?

I have tested the following code on an iPhone 14 (iOS 18.1.1) and an iPad (10th generation, iPadOS 18.3.1), and both return false:

```swift
// returns false on iPhone 14, iPad (10th gen)
print(AVExternalStorageDeviceDiscoverySession.isSupported)
```

The following returns nil when I try to access the external storage discovery session:

```swift
// returns nil on iOS devices
print(AVExternalStorageDeviceDiscoverySession.shared)
```

The following returns false, without displaying a permission dialog:

```swift
AVExternalStorageDevice.requestAccess(completionHandler: { (granted: Bool) in
    // returns false with no permission dialog
    print(granted)
})
```

What types of iOS devices are supported by AVExternalStorageDeviceDiscoverySession? What situations has it been used for (e.g. connecting to a camera via the external storage protocol, accessing photos from an SD card with an adapter, accessing photos from a USB drive)? Is there any sample code for using the AVExternalStorage API?
0 replies · 0 boosts · 350 views · Feb ’25
Distortion corrected images
Hi, I am currently developing a 3D reconstruction project, which requires images that are distortion-free (rectilinear) and have known intrinsics.

The session I am developing with uses a builtInDualWideCamera, with isGeometricDistortionCorrectionEnabled set to false (to be able to get the intrinsic matrix of the images), isVirtualDeviceConstituentPhotoDeliveryEnabled set to true and isAutoVirtualDeviceFusionEnabled set to false (to get both images), and isCameraCalibrationDataDeliveryEnabled set to true (to actually get the calibration data). The distortion correction parameters such as lensDistortionLookupTable are used: the 42-coefficient mapping array is applied as described in the AVCameraCalibrationData header file, with a simple piecewise linear interpolation.

There are two questions I would like to get support on:

1. A way to store the calibration parameters in each image. My current approach writes the parameters into kCGImagePropertyExifDictionary -> "UserComment". Is there a better approach for writing calibration parameter data into the images? This feels a bit dirty, and there might be a better and neater approach.

2. For the ultra-wide camera's images, the lensDistortionLookupTable contains several zeros at the end of the array. For example (the last 10 elements are zero):

"LensDistortionLookupTable":"0.000000000000000,0.000349554029526,0.001385628827848,0.003071037586778,... ,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000,0.000000000000000"

The problem is that when the complete array (including the zeros) is used to correct the image, the result wraps around like a circle close to the edges, which is completely wrong. In contrast, if the lensDistortionLookupTable is used without the trailing zeros (and the new size accommodated), the image looks better (although not as rectilinear as an image from the iPhone's camera app), but definitely less distorted.

[Image: including zeros (full array)]
[Image: excluding zeros (array size changed)]

Am I missing an important point in the usage of the lensDistortionLookupTable where this case (zeros at the end) is addressed? What is the criterion for shrinking/excluding elements of the array? Any advice is very much welcome.
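For reference, a compact version of the piecewise linear lookup described in the AVCameraCalibrationData header. The trailing-zero trimming here is an assumption based on the observation above, not documented behavior:

```swift
// radius: normalized 0...1 distance from the distortion center toward the
// farthest image corner. Returns the interpolated table value for that radius.
func magnification(for radius: Float, lookupTable: [Float]) -> Float {
    // Assumption: drop trailing zero entries, which otherwise pull edge
    // pixels toward zero and produce the wrap-around artifact described above.
    var table = lookupTable
    while table.count > 1, table.last == 0 { table.removeLast() }
    guard table.count > 1 else { return table.first ?? 0 }

    // Piecewise linear interpolation between the two neighboring entries.
    let position = radius * Float(table.count - 1)
    let index = min(Int(position), table.count - 2)
    let fraction = position - Float(index)
    return table[index] * (1 - fraction) + table[index + 1] * fraction
}
```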
0 replies · 0 boosts · 365 views · Feb ’25
Finding which effect has been selected for a Live Photo
I am writing an iOS app to present a slide show of assets in a Photo album, in a random order, including videos and live photos. I have got it all working quite nicely but for a Live Photo, I need to know what effect is selected (Live, Loop, Bounce, Long Exposure, Live Off) to display the image correctly. I can't find any mention of getting this information in the documentation. Anyone know how to do this? Thanks in advance. Adrian. (Xcode 16.1 iOS 18.0)
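I don't believe the chosen effect is exposed by name in PhotoKit, but PHAsset.playbackStyle captures part of it and may be enough for a slideshow. A hedged sketch; the mapping in the comments is worth verifying against test assets:

```swift
import Photos

// Rough mapping (unverified): Loop/Bounce surface as .videoLooping,
// while Long Exposure renders as a plain still image.
func slideshowTreatment(for asset: PHAsset) -> String {
    switch asset.playbackStyle {
    case .livePhoto:    return "Play as a Live Photo (Live, or Live Off)"
    case .videoLooping: return "Loop or Bounce: repeat the paired video"
    case .image:        return "Still image (possibly Long Exposure)"
    default:            return "Other style: \(asset.playbackStyle.rawValue)"
    }
}
```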
0 replies · 1 boost · 279 views · Feb ’25
kCGImageSourceDecodeToHDR and CGImageSourceCopyPropertiesAtIndex
Hi, I'm using Core Graphics to load a .DNG photo shot by a Leica Q3 camera. The photo is shot in portrait; however, the embedded preview is rotated 90 degrees to landscape. I load the photo like this:

```swift
let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
let source = CGImageSourceCreateWithData(data as CFData, nil)!
let cgimage = CGImageSourceCreateImageAtIndex(source, 0, options)
let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
```

When doing this I can see that the orientation property is 1, indicating that the orientation is 'Up', which it isn't. If I don't specify the kCGImageSourceDecodeToHDR option (essentially setting options to nil), the orientation property is 8 (rotated 90 degrees).

What puzzles me is that a change to the CGImageSourceCreateImageAtIndex call can have an influence on the latter call to CGImageSourceCopyPropertiesAtIndex. I would expect these to work independently?

Cheers, Thomas
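A workaround sketch while the behavior is unclear: read the orientation from a properties call made before the HDR decode request, then apply it yourself when rendering. This assumes the metadata is intact until the HDR decode happens, which matches the observation above:

```swift
import ImageIO

// data: the .DNG bytes, as in the post.
func hdrImageWithOrientation(from data: Data) -> (CGImage?, UInt32) {
    let source = CGImageSourceCreateWithData(data as CFData, nil)!

    // Orientation from the untouched metadata (no decode options involved).
    let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any]
    let orientation = props?[kCGImagePropertyOrientation] as? UInt32 ?? 1

    // HDR decode for the pixels only.
    let options = [kCGImageSourceDecodeRequest: kCGImageSourceDecodeToHDR] as CFDictionary
    let image = CGImageSourceCreateImageAtIndex(source, 0, options)
    return (image, orientation)  // rotate per `orientation` when rendering
}
```

Filing a Feedback with the .DNG attached also seems warranted, since a decode request changing the reported metadata does look surprising.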
1 reply · 0 boosts · 378 views · Feb ’25
Setting the capture device color space to Apple Log does not work
I set the device format and color space to Apple Log and turn off HDR, so why is the movie output still in HDR format rather than ProRes Log? Full runnable demo here: https://github.com/SpaceGrey/ColorSpaceDemo

```swift
session.sessionPreset = .inputPriority

// get the back camera
let deviceDiscoverySession = AVCaptureDevice.DiscoverySession(deviceTypes: [.builtInWideAngleCamera], mediaType: .video, position: .back)
backCamera = deviceDiscoverySession.devices.first!

try! backCamera.lockForConfiguration()
backCamera.automaticallyAdjustsVideoHDREnabled = false
backCamera.isVideoHDREnabled = false

let formats = backCamera.formats
let appleLogFormat = formats.first { format in
    format.supportedColorSpaces.contains(.appleLog)
}
print(appleLogFormat!.supportedColorSpaces.contains(.appleLog))
backCamera.activeFormat = appleLogFormat!
backCamera.activeColorSpace = .appleLog
print("colorspace is Apple Log \(backCamera.activeColorSpace == .appleLog)")
backCamera.unlockForConfiguration()

do {
    let input = try AVCaptureDeviceInput(device: backCamera)
    session.addInput(input)
} catch {
    print(error.localizedDescription)
}

// add output
output = AVCaptureMovieFileOutput()
session.addOutput(output)
let connection = output.connection(with: .video)!
print(output.outputSettings(for: connection))
/*
["AVVideoWidthKey": 1920, "AVVideoHeightKey": 1080, "AVVideoCodecKey": apch, <----- ProRes has been enabled.
"AVVideoCompressionPropertiesKey": {
    AverageBitRate = 220029696;
    ExpectedFrameRate = 30;
    PrepareEncodedSampleBuffersForPaddedWrites = 1;
    PrioritizeEncodingSpeedOverQuality = 0;
    RealTime = 1;
}]
*/
previewSource = DefaultPreviewSource(session: session)
queue.async {
    self.session.startRunning()
}
```
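One setting worth checking (a guess, not a confirmed fix): by default, AVCaptureSession manages the device's color space for you and can override activeColorSpace after you set it. Disabling automatic wide-color configuration before configuring the device should make the manual setting stick:

```swift
// Without this, the session may reset activeColorSpace on your behalf.
session.automaticallyConfiguresCaptureDeviceForWideColor = false
```

Note also that "apch" in the printed output settings is the ProRes 422 HQ codec, so the codec side already looks right; it is the color space that appears to be reverted.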
1 reply · 0 boosts · 356 views · Feb ’25
PHPhotoLibrary.shared().performChanges raised error 3303
I want to modify a photo's EXIF information. This means putting the original image through CGImageDestinationAddImageFromSource and then supplying the modified EXIF information as the properties. Here is the complete process:

```swift
static func setImageMetadata(asset: PHAsset, exif: [String: Any]) {
    let options = PHContentEditingInputRequestOptions()
    options.canHandleAdjustmentData = { (adjustmentData) -> Bool in
        return true
    }
    asset.requestContentEditingInput(with: options, completionHandler: { input, map in
        guard let inputNN = input else { return }
        guard let url = inputNN.fullSizeImageURL else { return }
        let output = PHContentEditingOutput(contentEditingInput: inputNN)
        let adjustmentData = PHAdjustmentData(formatIdentifier: AppInfo.appBundleId(),
                                              formatVersion: AppInfo.appVersion(),
                                              data: Data())
        output.adjustmentData = adjustmentData
        let outputURL = output.renderedContentURL
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil) else { return }
        guard let dest = CGImageDestinationCreateWithURL(outputURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else { return }
        CGImageDestinationAddImageFromSource(dest, source, 0, exif as CFDictionary)
        let d = CGImageDestinationFinalize(dest)
        // d is true, and I checked the content of outputURL; the image has been
        // written correctly, it can be converted to a UIImage, and the image is OK.
        PHPhotoLibrary.shared().performChanges {
            let changeReq = PHAssetChangeRequest(for: asset)
            changeReq.contentEditingOutput = output
        } completionHandler: { succ, err in
            if !succ {
                print(err) // 3303 here, always!
            }
        }
    })
}
```
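I can't say definitively what error 3303 maps to, but one suspect in this flow is the empty Data() handed to PHAdjustmentData. A hedged sketch that stores a small, app-readable description of the edit instead, as a drop-in replacement for the adjustmentData lines above (AppInfo is the post's own helper):

```swift
import Photos

// Assumption: a non-empty, self-describing payload; the format identifier and
// version should match what your canHandleAdjustmentData block expects back.
let payload = (try? JSONSerialization.data(withJSONObject: ["exifEdited": true]))
    ?? Data("exifEdited".utf8)
let adjustmentData = PHAdjustmentData(formatIdentifier: AppInfo.appBundleId(),
                                      formatVersion: AppInfo.appVersion(),
                                      data: payload)
output.adjustmentData = adjustmentData
```

If that changes nothing, comparing the rendered JPEG's dimensions and color space against the original would be the next thing I'd rule out.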
1 reply · 0 boosts · 275 views · Feb ’25
Do we need an MFi authentication chip to send video/commands to iPhone?
Hello, my company is developing a product that will send data to/from the phone over cable and Wi-Fi. I have three questions: Do we need an MFi authentication chip in our product if we plan to send video and commands to the iPhone/iPad over USB or Lightning cable? Likewise, do we need an MFI authentication chip for communication over Wi-Fi? (Informal research suggests that the answer is no to this one.) And, do we even still need MFI certification at all for Wi-Fi comms? (We are not using HomeKit.) Thank you!
0 replies · 0 boosts · 392 views · Feb ’25
PHPickerViewController update preselectedAssetIdentifiers
I am trying to recreate the iOS Messages app photo selection UI, where a PHPickerViewController is displayed half screen, with the message text field and a scrolling photo viewer on top. I have that UI mostly working, but cannot figure out how to allow a user to remove an image from my scrolling photo viewer (just like in the iOS Messages app). When the picker is initially displayed, I can show selected images using the preselectedAssetIdentifiers. However, if the user taps the "x" to remove an image from the scrolling photo viewer, there is no way that I have found to update that selection in the picker. I can dismiss/show a new picker with the animated property set to false, but that creates a very apparent bounce in the screen. Are there any ways I am missing to accomplish this? Here is what I have so far:
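One API that may remove the need to recreate the picker: on iOS 16 and later, PHPickerViewController can change its selection in place. A sketch, assuming the half-screen picker is kept alive while the custom photo strip manages the "x" buttons (the function name is illustrative):

```swift
import PhotosUI

// Called when the user taps "x" on an image in the custom photo strip.
func didRemoveImage(assetIdentifier: String, picker: PHPickerViewController) {
    // Updates the live picker's selection without dismissing or recreating it.
    picker.deselectAssets(withIdentifiers: [assetIdentifier])
}
```

This requires keeping a reference to the presented picker and creating it with PHPickerConfiguration(photoLibrary:) so that asset identifiers are populated.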
1 reply · 0 boosts · 345 views · Feb ’25
[Crash Report] PHAssetResourceManager.writeDataForAssetResource causes app crash on iOS18
We are encountering a critical, intermittently occurring crash when accessing photo data using PHAssetResourceManager.writeDataForAssetResource on iOS 18. The problem does not arise on iOS 17 or earlier versions. We have been unable to identify a consistent reproduction path. Based on user feedback, the issue seems to involve Live Photo and RAW image files.

Our investigation has revealed that the crash occurs in the +[PISchema identifier] method of the PhotoImaging framework. When called manually, this method causes a crash on iOS 18 but works without issues on iOS 17.

Reproduction steps:
1. Fetch a PHAsset.
2. Get the PHAssetResource via [PHAssetResource assetResourcesForAsset:].
3. Call [PHAssetResourceManager writeDataForAssetResource:toFile:options:completionHandler:].

Crash log:

```
Incident Identifier: CFD60092-FDB1-43B4-BA42-3F507F7B8B96
CrashReporter Key:   260b4780989083a54e0cb451930fe9a3bed64862
Hardware Model:      iPhone13,4
AppStoreTools:       16C5031b
AppVariant:          1:iPhone13,4:18
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Date/Time:           2025-02-15 19:07:57.7054 +0800
Launch Time:         2025-02-15 19:07:55.4106 +0800
OS Version:          iPhone OS 18.3.1 (22D72)
Release Type:        User
Baseband Version:    5.20.03
Report Version:      104

Exception Type:  EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Termination Reason: SIGNAL 6 Abort trap: 6
Terminating Process: mCloud_iPhone [11109]
Triggered by Thread: 11

Application Specific Information:
abort() called

Thread 11 name: Dispatch queue: com.apple.NSXPCConnection.m-user.com.apple.photos.service
Thread 11 Crashed:
0   libsystem_kernel.dylib     0x1e850b2d4 __pthread_kill + 8
1   libsystem_pthread.dylib    0x221b4959c pthread_kill + 268
2   libsystem_c.dylib          0x19ec24b08 abort + 128
3   NeutrinoCore               0x1bdcdbdec -[NUAssertionPolicyAbort notifyAssertion:] + 68
4   NeutrinoCore               0x1bdcdbbf4 -[NUAssertionPolicyComposite notifyAssertion:] + 160
5   NeutrinoCore               0x1bdcdc098 -[NUAssertionPolicyUnique notifyAssertion:] + 176
6   NeutrinoCore               0x1bdcdb524 -[NUAssertionHandler handleFailureInFunction:file:lineNumber:currentlyExecutingJobName:description:arguments:] + 156
7   NeutrinoCore               0x1bdcdc4bc _NUAssertFailHandler + 176
8   NeutrinoCore               0x1bdc8ea98 -[NUIdentifier initWithNamespace:name:version:] + 2352
9   NeutrinoCore               0x1bdc8eba8 -[NUIdentifier initWithName:version:] + 84
10  NeutrinoCore               0x1bdc8ec10 -[NUIdentifier initWithName:] + 68
11  PhotoImaging               0x1bda54ce4 +[PISchema identifier] + 36
12  PhotoImaging               0x1bda550fc +[PISchema registeredPhotosSchemaIdentifier] + 32
13  PhotoImaging               0x1bd9d7128 +[PIPhotoEditHelper newComposition] + 28
14  PhotoImaging               0x1bd940798 +[PICompositionSerializer deserializeCompositionFromAdjustments:metadata:formatIdentifier:formatVersion:sidecarData:error:] + 160
15  PhotoImaging               0x1bd9412ec +[PICompositionSerializer deserializeCompositionFromData:formatIdentifier:formatVersion:sidecarData:error:] + 224
16  PhotoLibraryServices       0x1afabf75c -[PLPhotoEditPersistenceManager loadCompositionFrom:formatIdentifier:formatVersion:sidecarData:error:] + 1856
17  PhotoLibraryServices       0x1afabffe4 +[PLPhotoEditPersistenceManager validateAdjustmentData:formatIdentifier:formatVersion:error:] + 108
18  Photos                     0x1af4ac360 __167+[PHContentEditingInputRequestContext contentEditingInputRequestContextForAsset:requestID:managerID:networkAccessAllowed:downloadIntent:progressHandler:resultHandler:]_block_invoke + 260
19  Photos                     0x1af4ac67c -[PHAdjustmentData(ContentEditingInput) _contentEditing_readableByClientWithVerificationBlock:] + 136
20  Photos                     0x1af4ac4b0 -[PHAdjustmentData(ContentEditingInput) _contentEditing_requiredBaseVersionReadableByClient:verificationBlock:] + 88
21  Photos                     0x1af4abb8c -[PHContentEditingInputRequestContext _adjustmentBaseVersionFromResult:request:canHandleAdjustmentData:] + 404
22  Photos                     0x1af4a911c -[PHContentEditingInputRequestContext produceChildRequestsForRequest:reportingIsLocallyAvailable:isDegraded:result:] + 624
23  Photos                     0x1af2c1d10 -[PHMediaRequestContext _produceChildRequestsForRequest:withResult:] + 88
24  Photos                     0x1af2c11e8 -[PHMediaRequestContext mediaRequest:didFinishWithResult:] + 88
25  Photos                     0x1af505184 -[PHAdjustmentDataRequest _finishFromAsynchronousCallback] + 124
26  Photos                     0x1af5050a0 __39-[PHAdjustmentDataRequest startRequest]_block_invoke + 584
27  PhotoLibraryServicesCore   0x1b001be8c __106-[PLAssetsdResourceClient adjustmentDataForAsset:networkAccessAllowed:trackCPLDownload:completionHandler:]_block_invoke.86 + 864
28  CoreFoundation             0x196dd8e34 __invoking___ + 148
29  CoreFoundation             0x196dd7e7c -[NSInvocation invoke] + 428
30  Foundation                 0x195a64ae0 __NSXPCCONNECTION_IS_CALLING_OUT_TO_EXPORTED_OBJECT__ + 16
31  Foundation                 0x195a63514 -[NSXPCConnection _decodeAndInvokeReplyBlockWithEvent:sequence:replyInfo:] + 532
32  Foundation                 0x195a6653c __88-[NSXPCConnection _sendInvocation:orArguments:count:methodSignature:selector:withProxy:]_block_invoke_5 + 188
33  libxpc.dylib               0x221babb80 _xpc_connection_reply_callout + 116
34  libxpc.dylib               0x221b9e2d0 _xpc_connection_call_reply_async + 80
35  libdispatch.dylib          0x19eb6b028 _dispatch_client_callout3 + 20
36  libdispatch.dylib          0x19eb88b64 _dispatch_mach_msg_async_reply_invoke + 340
37  libdispatch.dylib          0x19eb7242c _dispatch_lane_serial_drain + 352
38  libdispatch.dylib          0x19eb73158 _dispatch_lane_invoke + 432
39  libdispatch.dylib          0x19eb7e38c _dispatch_root_queue_drain_deferred_wlh + 288
40  libdispatch.dylib          0x19eb7dbd8 _dispatch_workloop_worker_thread + 540
41  libsystem_pthread.dylib    0x221b44680 _pthread_wqthread + 288
42  libsystem_pthread.dylib    0x221b42474 start_wqthread + 8
```
0 replies · 1 boost · 314 views · Feb ’25
3D scanning and 3D model with texture mapping
I am new here and would appreciate help with the code, or an explanation of what to use in Swift, for an app that can capture LiDAR scans and RGB data from photos, generate a 3D mesh, and create an .OBJ, .MTL, and .JPEG file set for further manipulation of the 3D model. I am able to create a 3D mesh and an .OBJ file from the LiDAR scan, but I cannot generate the .MTL and .JPEG for the texture of the 3D model.
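If the goal is a textured model (rather than a bare LiDAR mesh, which carries no texture of its own), RealityKit's Object Capture photogrammetry is the usual route. A hedged sketch: folderURL and outputURL are placeholders, and the output here is .usdz, which would still need a separate conversion step to get an .OBJ/.MTL/.JPEG set:

```swift
import Foundation
import RealityKit

let folderURL = URL(fileURLWithPath: "/path/to/captured/photos")   // placeholder
let outputURL = URL(fileURLWithPath: "/path/to/model.usdz")        // placeholder

// Build a textured model from a folder of RGB photos (plus depth/gravity
// data when captured on device).
let session = try PhotogrammetrySession(input: folderURL)

Task {
    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Textured model written to \(outputURL.path)")
        }
    }
}

try session.process(requests: [
    .modelFile(url: outputURL, detail: .medium)
])
```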
1 reply · 0 boosts · 345 views · Feb ’25