Hello, we are embedding a PHPickerViewController with UIKit in our app (adding it as a child view controller, embedding its view, and calling didMove(toParent:)), using the compact mode. We are disabling the following capabilities: .collectionNavigation, .selectionActions, and .search.
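For reference, a minimal sketch of that embedding, assuming a plain host view controller (the class name is illustrative):

import UIKit
import PhotosUI

final class PickerHostViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        var configuration = PHPickerConfiguration()
        configuration.mode = .compact
        configuration.disabledCapabilities = [.collectionNavigation, .selectionActions, .search]

        // Embed the picker as a child view controller.
        let picker = PHPickerViewController(configuration: configuration)
        addChild(picker)
        picker.view.frame = view.bounds
        view.addSubview(picker.view)
        picker.didMove(toParent: self)
    }
}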
One of our users, on iOS 17.2.1 with an iPhone 12, encountered a crash with the following stack trace:
Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0x9fbc __pthread_kill + 8
1 libsystem_pthread.dylib 0x5680 pthread_kill + 268
2 libsystem_c.dylib 0x75b90 abort + 180
3 PhotoFoundation 0x33b0 -[PFAssertionPolicyCrashReport notifyAssertion:] + 66
4 PhotoFoundation 0x3198 -[PFAssertionPolicyComposite notifyAssertion:] + 160
5 PhotoFoundation 0x374c -[PFAssertionPolicyUnique notifyAssertion:] + 176
6 PhotoFoundation 0x2924 -[PFAssertionHandler handleFailureInFunction:file:lineNumber:description:arguments:] + 140
7 PhotoFoundation 0x3da4 _PFAssertFailHandler + 148
8 PhotosUI 0x22050 -[PHPickerViewController _handleRemoteViewControllerConnection:extension:extensionRequestIdentifier:error:completionHandler:] + 1356
9 PhotosUI 0x22b74 __66-[PHPickerViewController _setupExtension:error:completionHandler:]_block_invoke_3 + 52
10 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
11 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
12 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
13 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
14 CoreFoundation 0x3701c __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 16
15 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
16 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
17 GraphicsServices 0x34f8 GSEventRunModal + 164
18 UIKitCore 0x22c62c -[UIApplication _run] + 888
19 UIKitCore 0x22bc68 UIApplicationMain + 340
20 WorkAngel 0x8060 main + 20 (main.m:20)
21 ??? 0x1bd62adcc (Missing)
Please share any ideas as to what might have caused this, or what to look at in such a case. Unfortunately, I haven't been able to reproduce it myself.
General
Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.
I often find that basic actions in MusicKit are incredibly slow compared to Apple's Music app. I've tried different OS versions, devices, and networks, as well as Apple's sample code, throughout the last several years, and it is always the same. Does anyone else have this issue?
Under Sonoma 14.4 the compression option doesn't work with PNG images. It works for JPG/HEIF. Preview can export a PNG file to HEIC with the compression option, so what am I missing? This worked previously. I am trying 0.01 and 0.9 as the compression quality, and the file size is the same for PNG.
Is Preview using some trick to convert the image, such as ciContext.createCGImage?
PS: A compression option of 1.0 was broken under the 14.4 RC, and Preview created an empty file.
import Foundation
import ImageIO

func heifImageDataUsingDestination(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    guard let cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) else { return nil }
    let mutableData = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(mutableData as CFMutableData, "public.heic" as CFString, 1, nil) else { return nil }
    let options = [kCGImageDestinationLossyCompressionQuality: compressionQuality] as CFDictionary
    CGImageDestinationAddImage(imageDestination, cgImage, options)
    guard CGImageDestinationFinalize(imageDestination) else { return nil }
    return mutableData as Data
}
import CoreImage

func heifImageDataUsingCIContext(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let ciImage = CIImage(contentsOf: url) else { return nil }
    let context = CIContext()
    let colorspace = ciImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    let options = [CIImageRepresentationOption(rawValue: kCGImageDestinationLossyCompressionQuality as String): compressionQuality]
    return context.heifRepresentation(of: ciImage, format: .RGBA8, colorSpace: colorspace, options: options)
}
Hi, is there a way to display animated cover art in an app? Not every album has animated cover art, but for the ones that do, I would like to display it instead of the static artwork.
Thank you :)
In SwiftUI there is a built-in component for displaying album artwork called ArtworkImage, but there is no equivalent for UIKit.
My current approach is to use the .url() method to read the image's URL and then download the image or read it from disk, but the performance is much worse than it was previously with MPMediaItem's artwork.
let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit" else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for: without it, I noticed that MusicKit was choking after reading too many artworks at once.
Can someone from Apple please provide us with an example of how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem's artwork did. It would make our lives so much easier.
Hi,
Is it possible to get the various audio variants for a given song ID? I am unable to find any documentation for this online. In particular, could I see whether Dolby Atmos is available for the track?
Here is an example of a request I am sending: https://api.music.apple.com/v1/catalog/us/songs/1168770969
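If it helps, the catalog songs endpoint accepts an extend parameter, and audioVariants appears to be one of the extendable attributes. A hedged sketch using MusicKit's MusicDataRequest; treat the parameter and attribute names as assumptions to verify against the Apple Music API reference:

import MusicKit

func fetchAudioVariants(songID: String) async throws {
    // `extend=audioVariants` is an assumption; check the Apple Music API docs.
    let url = URL(string: "https://api.music.apple.com/v1/catalog/us/songs/\(songID)?extend=audioVariants")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    let response = try await request.response()
    // The response JSON should then include attributes.audioVariants,
    // with values such as "dolby-atmos" or "lossless".
    print(String(decoding: response.data, as: UTF8.self))
}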
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists containing hundreds of songs. When I try to play a collection of more than around 300 songs, I always get the error message:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter whether the songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer infeasible for me.
I tried everything I could think of; I even tried to fall back on MPMusicPlayerController and pass an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks.compactMap {
    guard let song = $0 as? Song else { return nil }
    return QueueEntry(song)
}

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
Hi,
I just generated an HDR10 MVHEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
and uploaded the segment files and prog_index.m3u8 to a web server.
I then found that the HLS stream cannot be played in Safari.
The URL is http://ip/vod/prog_index.m3u8
I also checked that if I remove the Transfer characteristics : PQ tag when generating the MVHEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari.
Is there any way to play HLS PQ video in Safari? Thanks.
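One thing that may be worth checking (an assumption, not a confirmed fix): Safari is typically pointed at a multivariant playlist rather than at prog_index.m3u8 directly, and the HLS spec requires PQ content to declare the VIDEO-RANGE attribute there. A hedged sketch of such a playlist, where the BANDWIDTH, CODECS, and RESOLUTION values are placeholders:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ,RESOLUTION=3840x2160
vod/prog_index.m3u8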
Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference.
It would be nice if they added a notification that playback finished (similar to the other players).
Any suggestions?
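In the meantime, a hedged heuristic sketch (not an official mechanism): when the state flips to paused, compare the playback position to the item's duration to guess "finished" versus a user pause.

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    // If we paused within ~1 second of the end, treat it as "finished".
    let nearEnd = player.currentPlaybackTime >= item.playbackDuration - 1
    print(nearEnd ? "likely end of playback" : "likely a user pause")
}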
I'm trying to set a specific start time for the song using ApplicationMusicPlayer.shared.playbackTime, but it is not working:
musicPlayer.playbackTime = 10
try await musicPlayer.prepareToPlay()
try await musicPlayer.play()
Hi, I have tried the RPBroadcastSampleHandler broadcast extension and RPSystemBroadcastPickerView, but I don't understand how I can mirror my iOS screen to an Android smart TV.
Hello everyone,
I am thrilled about the iPhone Mirroring demo at WWDC24 and have a few thoughts to share.
Will it work over a local network, or can the iPhone be accessed over a global network? Will there be an API to initiate iPhone Mirroring from an app? This would be a great feature for MDMs, allowing administrators to provide support for their users. Could you share more details from the development perspective?
Hello, I'm currently working on a SharePlay feature that allows users to pull 3D models from iCloud and view them via volumes/immersive space on the Vision Pro. I was able to get sharing working with multiple windows recently, so now all that's left is to sync/share the model in the SharePlay session.
As I understand it, we should generally use GroupSessionMessenger for commands and light data like model positioning/syncing properties, whereas bigger pieces of data (images/videos/models) should be sent through GroupSessionJournal, which the group session manages and syncs for all users in the call.
I have a button to get the current user's model data and add it to the journal via
/// modelData is type `Data`
try await journal.add(modelData)
I have also set up a task to observe/receive updates to the journal's attachments when receiving a group session:
for await groupSession in MyModelActivity.sessions() {
    ...
    tasks.insert {
        Task {
            for await attachments in journal.attachments {
                for attachment in attachments {
                    do {
                        let modelData = try await attachment.load(Data.self) // throws error here - `notSupported`
                        let modelUrl = writeModelDataToTempDirectory(modelData: modelData)
                        self.modelUrlToLoadForGroupSession = modelUrl
                    } catch let error {
                        print("Error: \(error)")
                    }
                }
            }
        }
    }
}
I'm not quite sure why an error is thrown when attempting to load the attachment data on the other devices; any thoughts? The documentation for add(_:) and load(_:) says that the attachment should conform to Transferable, and Data should already conform to Transferable.
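One hedged workaround to try (an assumption, not a confirmed fix): wrap the payload in a custom Transferable type with an explicit DataRepresentation, and use that type for both add and load. The ModelAttachment name and the .usdz content type below are illustrative choices, not part of the original code:

import CoreTransferable
import UniformTypeIdentifiers

struct ModelAttachment: Transferable {
    let data: Data

    static var transferRepresentation: some TransferRepresentation {
        // Exports the wrapped bytes as-is and reimports them on the receiver.
        DataRepresentation(contentType: .usdz) { attachment in
            attachment.data
        } importing: { data in
            ModelAttachment(data: data)
        }
    }
}

// Sender:   try await journal.add(ModelAttachment(data: modelData))
// Receiver: let model = try await attachment.load(ModelAttachment.self)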
Hello,
We have a music app reading MPMediaItem. We get items using MPMediaQuery, but we realized that some downloaded tracks from Apple Music were fetched too; not all downloaded tracks, only those that were played recently.
Of course, since these tracks are protected with DRM, we can't play them in our player.
It's weird to get them in our query, because we added predicates in order not to fetch protected assets and iCloud items:
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyHasProtectedAsset)
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyIsCloudItem)
To be sure, we added a second check on each item we fetched:
extension MPMediaItem {
    public func isValid() -> Bool {
        return self.assetURL != nil && !self.isCloudItem && !self.hasProtectedAsset
    }
}
But we still get these items, and their hasProtectedAsset attribute always returns false.
I don't know if it's a bug, but since we can't detect these items as Apple Music downloaded tracks, we can't either:
filter them out so they are not added to our application library
OR
switch to MPMusicPlayerController.applicationMusicPlayer to allow the user to play them
I'm working on a macOS application that captures audio and video. When the user selects a video capture source (most likely an Elgato box), I would like the application to automatically select the audio input from the same device. I was achieving this by pairing video and audio sources that have the same name, but this doesn't work when the user plugs in two capture devices of the same make and model.
With the command system_profiler SPUSBDataType I can list all the USB devices, and I can see that the two Elgato boxes have different serial numbers. If I could find this serial number, I could figure out which AVCaptureDevices come from the same hardware.
Is there a way to get the manufacturer's serial number from the AVCaptureDevice object? Or a way to identify the USB device for an AVCaptureDevice, from which I could get the serial number or some other unique ID?
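For what it's worth, a hedged sketch of enumerating external capture devices and their identifiers. uniqueID is documented to be unique per device, so it may be enough to tell two identical boxes apart even though it does not, as far as I know, expose the USB serial directly:

import AVFoundation

// List external capture devices with their identifying properties.
// Note: .external requires macOS 14; earlier SDKs used .externalUnknown.
let discovery = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.external],
    mediaType: nil,
    position: .unspecified
)
for device in discovery.devices {
    print(device.localizedName, device.modelID, device.uniqueID)
}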
Hi everyone, I'm currently developing an iOS app using React Native and recently got accepted into the Apple Music Global Affiliate Program. To fully utilize this opportunity, I need to implement the following functionalities:
Authorize Apple Music usage
Play Apple Music within my app
Identify if a user has an Apple Music subscription
Initiate and complete Apple Music subscription within my app
I've successfully implemented the first three functionalities using the react-native-apple-music module. Now, I need your help to understand how I can directly trigger the Apple Music subscription process from within my app.
Thank you for your help!
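On the native iOS side, StoreKit's setup sheet is the usual way to present an Apple Music signup flow. A hedged sketch; verify availability of these APIs in current SDKs, and the token value is a placeholder:

import StoreKit
import UIKit

// Presents Apple's Music subscription sheet from a host view controller.
func presentAppleMusicSubscription(from presenter: UIViewController) {
    let setupController = SKCloudServiceSetupViewController()
    let options: [SKCloudServiceSetupOptionsKey: Any] = [
        .action: SKCloudServiceSetupAction.subscribe,
        .affiliateToken: "<your-affiliate-token>", // placeholder value
    ]
    setupController.load(options: options) { didSucceed, error in
        if didSucceed {
            presenter.present(setupController, animated: true)
        } else if let error {
            print("Setup failed: \(error)")
        }
    }
}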
Hello Apple,
I am concerned about the new iOS Screen Mirroring feature.
I have an app that is only meant to be viewed on iPhones (not Macs or other computers) due to security reasons.
I am assuming that Screen Mirroring uses AirPlay underneath. Otherwise, is there an API planned or coming that can disable this functionality, or is there a way for my app to opt out of iOS Screen Mirroring?
Thanks.
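For what it's worth, a hedged detection sketch (not a true opt-out): UIScreen reports whether it is being captured or mirrored, so sensitive content can be hidden while that is the case.

import UIKit

// isCaptured is true while the screen is recorded, mirrored, or sent via AirPlay.
NotificationCenter.default.addObserver(
    forName: UIScreen.capturedDidChangeNotification,
    object: nil,
    queue: .main
) { notification in
    guard let screen = notification.object as? UIScreen else { return }
    if screen.isCaptured {
        // Hide or blur sensitive views here.
        print("Screen is being captured or mirrored")
    }
}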
I am looking to do OOB (out-of-band) pairing using a QR code with a device from an iOS app. I referred to the documentation but could not find whether this is feasible or not. A few forum posts say it is not feasible, a few say it is. May I know the latest state from the Apple development support team?
I have an IOSurface and I want to turn it into a CIImage. However, the constructor of CIImage takes an IOSurfaceRef instead of an IOSurface.
On most platforms this is not an issue because the two types are toll-free bridgeable... except for Mac Catalyst, where this fails.
I observed the same back in Xcode 13 on macOS, but there I could force-cast the IOSurface to an IOSurfaceRef:
let image = CIImage(ioSurface: surface as! IOSurfaceRef)
This cast fails at runtime on Catalyst.
I found that unsafeBitCast(surface, to: IOSurfaceRef.self) actually works on Catalyst, but it feels very wrong.
Am I missing something? Why aren't the types bridgeable on Catalyst?
Also, there should ideally be an init for CIImage that takes an IOSurface instead of a ref.
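To keep the unchecked cast contained, a hedged sketch of the workaround mentioned above, isolated in one helper:

import CoreImage
import IOSurface

// Reinterprets the IOSurface as an IOSurfaceRef. unsafeBitCast performs no
// checking, so this relies on the two types remaining toll-free bridged.
func makeCIImage(from surface: IOSurface) -> CIImage {
    let ref = unsafeBitCast(surface, to: IOSurfaceRef.self)
    return CIImage(ioSurface: ref)
}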
How can I add multiple resolutions to a CMIO camera extension?
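A hedged sketch based on Apple's camera extension template: the stream source advertises its formats through its formats property, so returning one CMIOExtensionStreamFormat per resolution should expose multiple resolutions. The helper name and the chosen sizes are illustrative:

import CoreMedia
import CoreMediaIO

// Builds one stream format for a given resolution at 30 fps.
func makeFormat(width: Int32, height: Int32) -> CMIOExtensionStreamFormat? {
    var description: CMVideoFormatDescription?
    CMVideoFormatDescriptionCreate(
        allocator: kCFAllocatorDefault,
        codecType: kCVPixelFormatType_32BGRA,
        width: width, height: height,
        extensions: nil,
        formatDescriptionOut: &description
    )
    guard let description else { return nil }
    let frameDuration = CMTime(value: 1, timescale: 30)
    return CMIOExtensionStreamFormat(
        formatDescription: description,
        maxFrameDuration: frameDuration,
        minFrameDuration: frameDuration,
        validFrameDurations: nil
    )
}

// In the CMIOExtensionStreamSource conformance:
// var formats: [CMIOExtensionStreamFormat] {
//     [makeFormat(width: 1920, height: 1080),
//      makeFormat(width: 1280, height: 720)].compactMap { $0 }
// }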