I found that the aggregate device correctly reports input channels in the standard microphone mode. However, in Voice Isolation mode, it only reports channels from the first sub-device in the aggregate's sub-device list. How should I properly obtain channel information in Voice Isolation mode?
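For reference, a minimal sketch of one possible approach, under the untested assumption that walking the aggregate's sub-devices via kAudioAggregateDevicePropertyActiveSubDeviceList and summing their input channels sidesteps whatever Voice Isolation changes on the aggregate itself:

import CoreAudio

// Sketch (macOS, untested): enumerate the aggregate's active sub-devices and
// sum their input channels directly instead of asking the aggregate device.
func inputChannelCount(ofSubDevicesIn aggregateID: AudioObjectID) -> Int {
    var listAddress = AudioObjectPropertyAddress(
        mSelector: kAudioAggregateDevicePropertyActiveSubDeviceList,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var listSize: UInt32 = 0
    guard AudioObjectGetPropertyDataSize(aggregateID, &listAddress, 0, nil, &listSize) == noErr else { return 0 }
    var subDevices = [AudioObjectID](repeating: 0, count: Int(listSize) / MemoryLayout<AudioObjectID>.size)
    guard AudioObjectGetPropertyData(aggregateID, &listAddress, 0, nil, &listSize, &subDevices) == noErr else { return 0 }

    var total = 0
    for subDevice in subDevices {
        var configAddress = AudioObjectPropertyAddress(
            mSelector: kAudioDevicePropertyStreamConfiguration,
            mScope: kAudioDevicePropertyScopeInput,
            mElement: kAudioObjectPropertyElementMain)
        var configSize: UInt32 = 0
        guard AudioObjectGetPropertyDataSize(subDevice, &configAddress, 0, nil, &configSize) == noErr else { continue }
        let raw = UnsafeMutableRawPointer.allocate(byteCount: Int(configSize), alignment: MemoryLayout<AudioBufferList>.alignment)
        defer { raw.deallocate() }
        guard AudioObjectGetPropertyData(subDevice, &configAddress, 0, nil, &configSize, raw) == noErr else { continue }
        let buffers = UnsafeMutableAudioBufferListPointer(raw.assumingMemoryBound(to: AudioBufferList.self))
        total += buffers.reduce(0) { $0 + Int($1.mNumberChannels) }
    }
    return total
}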
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I have a wallpaper app and I need Photos access. When I try to add it in Capabilities, I don't find it. Is there any solution?
Can I use the IOKit USB library to disable the built-in camera?
Hi,
I'm developing a SwiftUI app using RealityKit and ARKit for an AR measuring feature. I’ve noticed that after navigating away from my AR view and performing extensive cleanup (including removing all anchors/entities, pausing the ARSession, and nil-ing out all references), memory usage remains elevated and sometimes grows with repeated AR sessions.
Each time I enter and exit the AR view, memory increases.
The memory does not return to the baseline after cleanup, even though all custom objects are deallocated.
Are there best practices beyond what I’ve described to ensure all ARKit/RealityKit resources are released after an AR session?
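For context, the cleanup described above looks roughly like this minimal sketch (arView is an assumed optional property; names are illustrative):

import ARKit
import RealityKit

// Minimal teardown sketch along the lines described above.
func tearDownARSession() {
    arView?.session.pause()
    arView?.scene.anchors.removeAll()
    arView?.session.delegate = nil
    arView?.removeFromSuperview()
    arView = nil   // arView is an optional ARView property held by the container
}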
When I create an SFSpeechRecognizer object, I find that SFLocalSpeechRecognitionClient remains in memory and never gets released.
You can create a demo with a single UIButton whose touch action is:
SFSpeechRecognizer(locale: Locale(identifier: "zh_CN"))
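A fuller version of that repro, as a sketch (the button wiring itself is assumed):

import UIKit
import Speech

// Tapping the button creates and immediately discards a recognizer; afterwards an
// SFLocalSpeechRecognitionClient instance can still be seen in the Memory Graph / Instruments.
@objc func didTapButton(_ sender: UIButton) {
    _ = SFSpeechRecognizer(locale: Locale(identifier: "zh_CN"))
}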
Hi Apple Engineers and fellow developers,
I'm experiencing a critical regression with ShazamKit's background operation on iOS 18. ShazamKit's SHManagedSession stops identifying songs in the background after approximately 20 seconds on iOS 18, while the exact same code works perfectly on iOS 17.
The behavior is consistent: the app works perfectly in the foreground, but when it is backgrounded or the device is locked, it initially works for about 20 seconds and then stops identifying new songs. The microphone indicator remains active, suggesting audio access is maintained, but ShazamKit doesn't deliver identified songs in the background until you open the app again. Detection resumes immediately when the app returns to the foreground.
My technical setup uses SHManagedSession for continuous matching, with background modes (including audio) properly configured in Info.plist and Background App Refresh enabled. I've tested this on physical devices running iOS 18.0 through 18.5 with the same results across all versions. The exact same code running on iOS 17 devices works flawlessly in the background.
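The matching loop is essentially the standard SHManagedSession pattern, sketched here in simplified form (handleMatch is illustrative):

import ShazamKit

let session = SHManagedSession()

func startContinuousMatching() {
    Task {
        for await result in session.results {
            switch result {
            case .match(let match):
                handleMatch(match)   // illustrative; these stop arriving ~20 seconds after backgrounding on iOS 18
            default:
                continue
            }
        }
    }
}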
To reproduce: initialize SHManagedSession and start matching, begin song identification in the foreground, then background the app or lock the device and play different songs. They are detected for about 20 seconds; after that, new songs are no longer identified until the app is brought back to the foreground.
This regression has impacted my production app, as users who rely on continuous background music identification are experiencing a broken feature. I submitted this as Feedback ID FB15255903 last September, with no solution so far.
I've created a minimal demo project that reproduces this issue: https://github.com/tfmart/ShazamKitBackground
Has anyone else experienced this ShazamKit background regression on iOS 18? Are there any known workarounds or alternative approaches? Given how long this issue has persisted, could we please get acknowledgment of the regression, an expected timeline for a fix, or any recommended workarounds?
Testing environment is Xcode 16.0+ on iOS 18.0-18.5 across multiple physical device models.
Any guidance would be greatly appreciated.
In Final Cut Pro, keyframes for transform parameters (such as Position, Scale, and Rotation) are automatically set to “Smooth” interpolation. This often results in undesired easing between keyframes, especially when linear motion is required.
Currently, we have to manually adjust each keyframe to "Linear" using the Video Animation Editor, which can be time-consuming when working with many keyframes.
Would it be possible to add an option to set the default keyframe interpolation to "Linear"—either globally in Preferences or per parameter in the Inspector?
This would greatly streamline the animation workflow for many editors.
Thank you for considering this request!
I take it that MusicKit JS is built with TypeScript, based on the attributions in the script: https://js-cdn.music.apple.com/musickit/v3/musickit.js
In the script it points to https://js-cdn.music.apple.com/musickit/v1/acknowledgements.txt – I assume this should be the v3 URL for the v3 version? It returns the same content nonetheless.
This contains attributions for TypeScript.
Currently there's a third-party effort with DefinitelyTyped, which publishes the NPM package @types/musickit-js. The latest supported version available is v1.
However, there is no version compatible with v3.
This makes it hard to use MusicKit JS v3 in a TypeScript project.
Please publish the types, ideally on the CDN alongside the musickit.js file. Also consider publishing an officially Apple-supported DefinitelyTyped package, or helping to maintain the existing @types/musickit-js, to make consuming this even easier.
Hi,
I’m building a PPG-based heart rate feature where the user places their finger over the rear telephoto camera. On iPhone 16 Pro Max, I'm explicitly selecting the telephoto lens like this:
videoDevice = AVCaptureDevice.default(.builtInTelephotoCamera, for: .video, position: .back)
And trying to lock it:
if #available(iOS 15.0, *),
   device.activePrimaryConstituentDeviceSwitchingBehavior != .unsupported {
    try? device.lockForConfiguration()
    device.setPrimaryConstituentDeviceSwitchingBehavior(.locked, restrictedSwitchingBehaviorConditions: [])
    device.unlockForConfiguration()
}
I also lock everything else to prevent dynamic changes:
try device.lockForConfiguration()
device.focusMode = .locked
device.exposureMode = .locked
device.whiteBalanceMode = .locked
device.videoZoomFactor = 1.0
device.automaticallyEnablesLowLightBoostWhenAvailable = false
device.automaticallyAdjustsVideoHDREnabled = false
device.unlockForConfiguration()
Despite this, the camera still switches to another lens, especially under different lighting, even though the user’s finger fully covers the lens.
Questions:
How can I completely prevent lens switching in this scenario?
Would using videoZoomFactor = 3.0 or 5.0 better enforce use of the telephoto lens?
Thanks!
Gal
Hello,
We're seeing an intermittent issue when playing back FairPlay-protected HLS downloads while the device is offline.
Assets are downloaded using AVAggregateAssetDownloadTask with FairPlay protection.
After download, asset.assetCache.isPlayableOffline == true.
On first playback attempt (offline), ~8% of downloads fail.
Retrying playback always works. We recreate the asset and player on each attempt.
During the playback setup, we try to load variants via:
try await asset.load(.variants)
This call sometimes fails with:
Error Domain=NSURLErrorDomain Code=-1009 “The Internet connection appears to be offline.” UserInfo={NSUnderlyingError=0x105654a00 {Error Domain=NSURLErrorDomain Code=-1009 “The Internet connection appears to be offline.” UserInfo={NSDescription=The Internet connection appears to be offline.}}, NSErrorFailingURLStringKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, NSErrorFailingURLKey=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, NSURL=file:///private/var/mobile/Containers/Data/Application/2DDF9D7C-9197-46BE-8690-C23EE75C9E90/Library/com.apple.UserManagedAssets.XVvqfh/Baggage_9DD4E2D3F9C0E68F.movpkg/, AVErrorFailedDependenciesKey=(
“assetProperty_HLSAlternates”
), NSLocalizedDescription=The Internet connection appears to be offline.}
This variant load is used to determine available audio tracks, check for Dolby support, and apply user language preferences.
After this step, the AVPlayerItem also fails via Combine’s publisher for .status.
However, retrying the entire process immediately after (same offline conditions, same asset path, new AVURLAsset) results in successful playback.
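A hypothetical sketch of that retry (a fresh AVURLAsset on the same local URL; our real flow also recreates the player):

import AVFoundation

// Illustrative only: the second attempt with a new asset instance succeeds, even while still offline.
func loadVariantsWithRetry(localFileUrl: URL) async throws -> [AVAssetVariant] {
    do {
        return try await AVURLAsset(url: localFileUrl).load(.variants)
    } catch {
        return try await AVURLAsset(url: localFileUrl).load(.variants)
    }
}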
Assets are represented using the following class:
public class DownloadedAsset: AVURLAsset {
    public let id: String
    public let localFileUrl: URL
    public let fairplayLicenseUrlString: String?
    public let drmToken: String?

    var isProtected: Bool {
        return fairplayLicenseUrlString != nil
    }

    public init(id: String,
                localFileUrl: URL,
                fairplayLicenseUrlString: String?,
                drmToken: String?) {
        self.id = id
        self.localFileUrl = localFileUrl
        self.fairplayLicenseUrlString = fairplayLicenseUrlString
        self.drmToken = drmToken
        super.init(url: localFileUrl, options: nil)
    }
}
We use user-selected quality levels to control bitrate and multichannel (e.g. Dolby 5.1) downloads:
let downloadQuality = UserDefaults.standard.downloadVideoQuality
let bitrate: Int
let shouldDownloadMultichannelTracks: Bool
switch downloadQuality {
case .dataSaver:
    shouldDownloadMultichannelTracks = false
    bitrate = 596564
case .standard:
    shouldDownloadMultichannelTracks = false
    bitrate = 1503844
case .best:
    shouldDownloadMultichannelTracks = true
    bitrate = 7038970
}
var selections = multichannelIdentifiedMediaSelections
if !shouldDownloadMultichannelTracks {
    selections = selections.filter { !$0.isMultichannel }
}
let task = session.aggregateAssetDownloadTask(
    with: asset,
    mediaSelections: selections.map { $0.mediaSelection },
    assetTitle: title,
    assetArtworkData: nil,
    options: [AVAssetDownloadTaskMinimumRequiredMediaBitrateKey: bitrate]
)
Seen on devices running iOS 16, iOS 17, and iOS 18.
What could cause the initial failure of an otherwise valid, offline-ready FairPlay HLS asset?
Could .load(.variants) internally trigger a failed network resolution, even when offline?
Is there an internal caching or initialization behavior in AVFoundation that might explain why the second attempt works?
Any guidance would be appreciated.
Topic: Media Technologies; SubTopic: Streaming; Tags: FairPlay Streaming, iOS, HTTP Live Streaming, AVFoundation
Hi! I am making an app for Apple Vision Pro (visionOS 2.5) that scans the surroundings and recognizes all the text around you. I tried to use AVCaptureSession, but when I run the app from Xcode on a real Vision Pro device, the camera is not accessible. I enabled camera access in my Info.plist (NSCameraUsageDescription: "Used for live text recognition") and checked the camera settings on the device; there are no restrictions. However, I always get a black square with a crossed-out camera icon instead of the image from the camera.
I tried a couple of different apps from GitHub that use AVCaptureSession, and they all display the black square instead of the picture.
What could be wrong with the camera?
Topic: Media Technologies; SubTopic: Photos & Camera
FxPlug is one of Apple's official SDKs, recently updated to version 4.3.2. In theory, the SDK should guarantee that third parties can build plug-ins that are backward compatible with older versions of Final Cut Pro, Motion, and Compressor.
The FxPlug SDK includes two frameworks that third-party developers like me end up bundling inside our plug-ins: FxPlug.framework and PlugInManager.framework.
Behind the scenes, the SDK relies on PlugInKit, but FxPlug.framework provides abstractions so that third parties don't have to handle the intricacies of XPC directly.
The most recent version of FxPlug.framework included with the SDK was possibly built with an error: its Info.plist shows an LSMinimumSystemVersion entry of 14.6, suggesting the binary may have been compiled and linked with MACOSX_DEPLOYMENT_TARGET set to 14.6 by accident.
The problem: when older versions of Final Cut Pro or Motion load a third-party plug-in (itself built with an appropriate deployment target, macOS 11 or 12, for example) on a system earlier than macOS 14.6, the dynamic linker loads Apple's own FxPlug.framework, and the process crashes immediately:
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libobjc.A.dylib 0x7ff81e065955 map_images_nolock + 5399
1 libobjc.A.dylib 0x7ff81e0643d6 map_images + 67
2 dyld 0x10bd551fb invocation function for block in dyld4::RuntimeState::setObjCNotifiers(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 275
3 dyld 0x10bd506c9 dyld4::RuntimeState::withLoadersReadLock(void () block_pointer) + 41
4 dyld 0x10bd550e2 dyld4::RuntimeState::setObjCNotifiers(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 82
5 dyld 0x10bd68d45 dyld4::APIs::_dyld_objc_notify_register(void (*)(unsigned int, char const* const*, mach_header const* const*), void (*)(char const*, mach_header const*), void (*)(char const*, mach_header const*)) + 79
6 libobjc.A.dylib 0x7ff81e064244 _objc_init + 1279
7 libdispatch.dylib 0x7ff81e01d993 _os_object_init + 13
8 libdispatch.dylib 0x7ff81e02b1b8 libdispatch_init + 311
9 libSystem.B.dylib 0x7ff828fd585f libSystem_initializer + 238
10 dyld 0x10bd5ae4f invocation function for block in dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 182
11 dyld 0x10bd81aad invocation function for block in dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 242
12 dyld 0x10bd78e26 invocation function for block in dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 557
13 dyld 0x10bd47db3 dyld3::MachOFile::forEachLoadCommand(Diagnostics&, void (load_command const*, bool&) block_pointer) const + 129
14 dyld 0x10bd78bb7 dyld3::MachOFile::forEachSection(void (dyld3::MachOFile::SectionInfo const&, bool, bool&) block_pointer) const + 179
15 dyld 0x10bd81604 dyld3::MachOAnalyzer::forEachInitializer(Diagnostics&, dyld3::MachOAnalyzer::VMAddrConverter const&, void (unsigned int) block_pointer, void const*) const + 466
16 dyld 0x10bd5ad82 dyld4::Loader::findAndRunAllInitializers(dyld4::RuntimeState&) const + 144
17 dyld 0x10bd6165a dyld4::PrebuiltLoader::runInitializers(dyld4::RuntimeState&) const + 30
18 dyld 0x10bd6e76e dyld4::APIs::runAllInitializersForMain() + 38
19 dyld 0x10bd4c38d dyld4::prepare(dyld4::APIs&, dyld3::MachOAnalyzer const*) + 3443
20 dyld 0x10bd4b4e4 start + 388
Can someone at Apple with the right domain expertise confirm that this is the type of crash you would see when a framework built assuming macOS 14.6 or later runs in an older environment, and therefore lacks the extra code that would ensure backward compatibility with the earlier Objective-C runtime found on macOS 12.x?
Hi,
I am running into a trap. Please check the stack trace; how do I fix this?
Regards, Joël
stack-trace with ExtAudioFileWrite
I have a music app I'm developing, and I'm having a weird issue where I can see Now Playing info on every platform except tvOS. As far as I can tell, I have correctly configured the MPNowPlayingInfoCenter:
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
MPNowPlayingInfoCenter.default().playbackState = .playing
Are there any extra requirements to get my app's Now Playing info showing in Control Center on tvOS? Another strange issue that might be related is that I can use the Apple TV remote to pause audio but not to resume playback, so I feel like there's something I'm missing about registering audio playback on tvOS specifically.
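A sketch of one thing worth checking (an assumption, not a confirmed fix): on tvOS the Siri Remote's play/pause is typically delivered through MPRemoteCommandCenter, so both commands need enabled handlers alongside the nowPlayingInfo above. player here is whatever your playback object is (illustrative):

import MediaPlayer

let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.playCommand.addTarget { _ in
    player.play()          // resume playback when the remote's play/pause is pressed
    return .success
}
commandCenter.pauseCommand.addTarget { _ in
    player.pause()
    return .success
}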
Hi,
I am creating an app that can include videos or images in its data. While
@Attribute(.externalStorage)
helps with images, for AVAssets I would actually like access to the URL behind that data (as it would be wasteful to load and then save the data again just to get a URL).
One key component is to keep all of this clean enough so that I can use (private) CloudKit syncing with the resulting model.
All the best
Christoph
Howdy. I'm trying to access media from a user's song library and receive:
<ICUserIdentityStoreACAccountBackend: 0x148f8af30> Failed to initialize active account, error=Error Domain=ICError Code=-7013 "Client is not entitled to access account store" UserInfo={NSDebugDescription=Client is not entitled to access account store}
I'm told I need to add a Media Library Access Capability. Nothing like this shows up in Xcode under Signing & Capabilities > +Capabilities. Also I can't find anything like this in my account in dev.apple.com.
How do I enable myself and a test user using another iPhone device to access my music and their music respectively?
Thanks!
Topic: Media Technologies; SubTopic: General; Tags: App Tracking Transparency, Media Player, iOS, MusicKit
There is a significant number of crash reports coming from iOS 18 users regarding the AVKit framework. The stack traces start from [AVPlayerController _observeValueForKeyPath:oldValue:newValue:], which seems to come from the internal iOS SDK. We found two kinds of crashes:
UI modification on a background thread
From the stack trace, it seems that when AVPictureInPictureController is deallocated and its view is removed from its superview, the code is somehow executed on a background thread: _AssertAutoLayoutOnAllowedThreadsOnly is highlighted right before the crash.
But I've checked our code around AVPictureInPictureController: in the locations where we deallocate the object, it is always called on the main thread, inside viewDidLoad and deinit of the UIViewController. From the logs, the crash seems to happen when the user opens another piece of content while the PiP player is active, so the current PiP instance is replaced with a new one. My suspicion is that the observation logic inside AVPlayerController could be the hint to this issue; something probably broke there, since this happens only for iOS 18 users and across all our app versions.
Unfortunately, I have not been able to reproduce this issue yet; one of my colleagues reproduced it once but hasn't been able to do it again since. The reports keep rising each day, up to 1.3k events in the last 30 days.
Over-released object
This one has fewer reports than the first, but I decided to include it since the start of the stack trace is similar and it might hold relevant information about the first crash. The timing also seems similar: we deallocate the existing AVPictureInPictureController and later replace it with a new one. It is likewise found only on iOS 18 and also points to [AVPlayerController _observeValueForKeyPath:oldValue:newValue:]. I have been unable to reproduce this issue so far as well.
Both issues happen on both iPhone and iPad.
We'd appreciate any advice on what we can do to avoid this in the future, and any hint as to why it could happen.
I have reported this issue with bug number: FB15620734
I also attached one sample crash report for each of the crashes here.
non ui thread access.crash
over release.crash
The two ScreenCaptureKit WWDC22 sessions show how to capture with the new framework, but the Retina factor is hardcoded to 2 in SCStreamConfiguration.
When used on a non-Retina display, the screen capture floats in the upper-left corner of the image buffer.
There does not seem to be a simple way to retrieve the Retina factor from the SCShareableContent data (when configuring the capture).
When processing the streaming output, the SCStreamFrameInfo attachment is supposed to have a scaleFactor property, but .scaleFactor does not return a value.
I have found that the attachment dictionary contains SCStreamUpdateFrameDisplayResolution. This entry gives me the Retina factor, but it is not an official SCStreamFrameInfo key; I have to list the dictionary's keys to access it.
What is the proper way to handle Retina scale factors with ScreenCaptureKit?
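For reference, a sketch of the workaround being used so far, under the assumption that matching the SCDisplay to its NSScreen via the display ID and reading backingScaleFactor is acceptable (untested on all configurations):

import AppKit
import ScreenCaptureKit

// Size the stream in pixels by deriving the backing scale factor of the captured display.
func configuration(for display: SCDisplay) -> SCStreamConfiguration {
    let screen = NSScreen.screens.first {
        ($0.deviceDescription[NSDeviceDescriptionKey("NSScreenNumber")] as? CGDirectDisplayID) == display.displayID
    }
    let scale = screen?.backingScaleFactor ?? 1.0

    let config = SCStreamConfiguration()
    config.width = Int(CGFloat(display.width) * scale)    // display.width/height are in points
    config.height = Int(CGFloat(display.height) * scale)
    return config
}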
I'm trying to implement AirPlay in my app. I can successfully play back sound and trigger the AirPlay selector sheet. If the target device is a Bluetooth-only device, I can connect with no problem and stream the audio to it, but if the target is an AirPlay-specific device like a HomePod or an Apple TV, when I select it I get a spinning icon indicating that it is trying to connect, and eventually it times out and stops without connecting.
I don't believe it is an AirPlay audio issue, because if I go to a different app (for example, a podcast app), select my HomePods for output, and then switch back to my app, my audio will correctly stream to the HomePod. Not only that, my icon changes color to indicate that it is connected via AirPlay, and it correctly shows the connection. But I cannot then disconnect it using the AirPlay selector.
The issue appears to be on the AirPlay selection side. I have spent several days attempting to troubleshoot it, mostly using ChatGPT to suggest code different from what I have to maybe work around the issue. Those suggestions focus on the audio player section, but that doesn't seem to be where the problem really is.
After upgrading to iOS 18.4, I'm no longer able to establish an AirPlay v1 connection to an audio system. The symptom is that the AirPlay route picker just spins when trying to connect to an audio system; it eventually gives up.
I tested this on an iPhone 14, connecting to a HomePod, an AirPort Express, an Apple TV, and a WiiM Pro. If I try connecting with AirPlay v2 (e.g. using Apple Music), the connection succeeds and audio can be played.
I'm the developer of an app that plays audio over AirPlay while also recording. My app has to use AirPlay v1 because AVAudioSession doesn't allow the .longFormAudio routing policy when the category is .playAndRecord. This issue is a real pain, as it means my app is suddenly broken for many thousands of users.
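For context, a simplified sketch of this kind of session configuration (the exact mode and options shown are assumptions): AirPlay is requested via the category option because the .longFormAudio policy is unavailable with .playAndRecord.

import AVFAudio

// Simplified; real code handles errors and activation ordering.
let audioSession = AVAudioSession.sharedInstance()
try audioSession.setCategory(.playAndRecord, mode: .default, options: [.allowAirPlay])
try audioSession.setActive(true)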
Is anyone else seeing this issue? Any suggestions for a workaround?