I'm building a camera app that does some post-processing after the photo has been taken. With 12MP images the processing is reasonably fast, but larger images (24MP and up) are very slow.
I created a very simple example to demonstrate the issue: it loads an image and then renders it to JPEG data.
import UIKit
import CoreImage

let context = CIContext()

// Load the test image from the bundle.
let imageUrl = Bundle.main.url(forResource: "12mp", withExtension: "jpg")!
let imageData = try! Data(contentsOf: imageUrl)
let ciImage = CIImage(data: imageData)!

// Time the render to JPEG data.
let start = CFAbsoluteTimeGetCurrent()
let jpegData = context.jpegRepresentation(of: ciImage, colorSpace: context.workingColorSpace!)
print(jpegData?.count as Any)
print("Resize Completed: " + String(CFAbsoluteTimeGetCurrent() - start))
Running this code on an iPhone 16 Pro with different images produces these benchmarks:
12MP => 0.03s
24MP => 1.22s
48MP => 2.98s
I understand that processing time will increase with resolution, but the growth doesn't look linear. I have tried setting different CIContext options, such as .useSoftwareRenderer: false, but it has made no difference.
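For example, one of the configurations I tried (the timings above were unchanged):

// One CIContext configuration tried; it made no measurable difference.
let context = CIContext(options: [.useSoftwareRenderer: false])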
From profiling, it looks like JPEG decoding is the bottleneck for a 48MP image.
Is there any way this can be improved?
Hi,
I am trying to enable the default MIDINetworkSession in a Catalyst app on macOS like this:
MIDINetworkSession.default().isEnabled = true
MIDINetworkSession.default().connectionPolicy = .anyone
In the App Sandbox I have both incoming and outgoing network connections enabled, and I also added the NSLocalNetworkUsageDescription key to the Info.plist. Bonjour services are also added to the Info.plist:
<key>NSBonjourServices</key>
<array>
    <string>_apple-midi._udp.</string>
</array>
Nevertheless the session stays disabled. Running the same code works just fine on iOS.
Is there any special setup I need to make on macOS to enable the MIDINetworkSession?
Thanks!
I am building an app for macOS and trying to implement code to add songs to a library playlist (included below). The issue is that if I use MusicKit to load a user's library playlists, the playlist ID (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem is that fetching a user's library playlists from the Apple Music API does not return all of the library playlists that MusicKit does, and it also does not give me artwork for playlists that use the collage of album covers, so I would prefer to use MusicKit to get the playlists.
I have also tried retrieving a single playlist with the Apple Music API using the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app, I cannot use MusicKit to add songs to library playlists.
Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use that playlist ID to add songs to the playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if the playlist's ID value comes from a playlist retrieved via the Apple Music API.
Also, does anyone know why the playlist IDs are not universal and differ between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API and work with my code. Thanks everyone for any and all help!
func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization Header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")

            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
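For completeness, this is roughly how I fetch library playlists from the Apple Music API so that their IDs work with the endpoint above (a minimal sketch; the response model is an assumption and decoding is omitted):

// Sketch: fetch library playlists via the Apple Music API; these IDs match the tracks endpoint.
func fetchLibraryPlaylistsViaAPI() async throws -> Data {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists")!
    let request = MusicDataRequest(urlRequest: URLRequest(url: url))
    let response = try await request.response()
    return response.data // decode into an assumed LibraryPlaylistsResponse model
}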
Hello!
I am having trouble setting start times for songs when using the ApplicationMusicPlayer.
When I initialize a new MusicPlayer.Queue.Entry using the following constructor, I am seeing strange results:
init(
_ playableMusicItem: PlayableMusicItem,
startTime: TimeInterval? = nil,
endTime: TimeInterval? = nil
)
It appears that any value I provide for startTime is also applied to the endTime. For example:
MusicPlayer.Queue.Entry(playable, startTime: TimeInterval(30), endTime: TimeInterval(183))
provides the following console output:
MusicPlayer.Queue.Entry(id: "3D6A3DA3-595E-4657-8DBA-DDD245BBB7EF", transientItem: PlayableMusicItem, startTime: 30.0, endTime: 30.0)
I have also tried setting the endTime to nil with the same result. Does anyone have any experience setting start times for songs using the MusicKit ApplicationMusicPlayer?
Any feedback is greatly appreciated!
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it turns off and I can use it again; it only lasts for a few seconds.
I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken?
I have tried restarting my phone.
Hi, I'm wondering about one of the properties in the MPNowPlayingInfoCenter: MPNowPlayingInfoPropertyElapsedPlaybackTime. The docs say that updating this property frequently is not required, because the system can automatically calculate elapsed playback time based on the infrequent values we provide.
Is performance harmed by updating this property every second? Should I add some filtering/throttling to update this property infrequently? Am I overthinking this, and it doesn't matter either way?
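For context, this is roughly what I call every second today (a minimal sketch; the AVPlayer-based setup is an assumption):

import AVFoundation
import MediaPlayer

// Sketch: push elapsed time and rate into the now playing info dictionary.
func updateElapsedTime(for player: AVPlayer) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = player.currentTime().seconds
    info[MPNowPlayingInfoPropertyPlaybackRate] = player.rate
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}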
Kind regards.
Hi -
Of course I may be doing something wrong, but I'm getting exactly the opposite of what I would expect from
ApplicationMusicPlayer.shared.state.playbackStatus
It returns .playing when the music is paused and .paused when the music is playing.
Am I holding it wrong?
Thanks,
Daniel
What projection method does the immersive space use: ERP (equirectangular), fisheye, or cube map?
We want to achieve the same effect as Apple Immersive Video.
I have a custom app running on a Mac Studio with Ventura that grabs a snapshot image from a network camera. It then adds some extra information into the EXIF "MakerNote" field. However, the metadata cannot be read back out of the image when running Ventura; it can, however, be read from the same image file on a Mac that is not running Ventura.
It would appear Apple has removed support for reading MakerNote in Ventura but still supports writing MakerNote in Ventura.
This code is about seven years old, is written in Objective-C, and has worked without issue until Ventura came along.
Calls used:
CGImageDestinationAddImageFromSource(); // used to write the image to disk with the extra metadata - works on Ventura
CGImageSourceCopyPropertiesAtIndex(); // used to read the metadata from an image - does not return "MakerNote" data
Is there a new way to read EXIF "MakerNote" data from image files that was introduced with Ventura?
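For reference, a minimal Swift sketch of the read path in question (the original code is Objective-C; this assumes the MakerNote, when present, appears under the standard ImageIO EXIF keys):

import Foundation
import ImageIO

// Sketch: read image properties and look for the EXIF MakerNote entry.
func readMakerNote(from url: URL) -> Any? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let exif = properties[kCGImagePropertyExifDictionary] as? [CFString: Any] else {
        return nil
    }
    return exif[kCGImagePropertyExifMakerNote]
}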
I run a tracking company and have my own tracking platform. I'm looking for a solution that uses a tag device for animals, like AirTag, but running on my platform.
Is there a way to allow my platform to interface with the Find My network to get the location data of my tags?
Hey there, I'm trying to display all of the user's albums using the MediaPlayer library. Many albums return nil for artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason for what shows up and what doesn't. All downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
import MediaPlayer

var albums: [MPMediaItemCollection] = []

let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}

for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
We develop a video playback app on Apple TV which has the following two features:
Its content browsing screen installs a gesture recognizer for presses on the PlayPause Siri remote button in order to directly launch playback. The gesture recognizer is attached to the content browsing UIViewController's view (a minimal sketch of this setup follows below).
It presents its own custom playback UI with an AVPlayerLayer for the video and supports MPNowPlayingSession in order to publish current playback information and respond to remote commands. It also supports switching between fullscreen and Picture in Picture playback.
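Minimal sketch of the PlayPause press recognizer from the first feature (target and action names are assumed):

// In the content browsing view controller (e.g. in viewDidLoad): recognize PlayPause presses on the Siri remote.
let playPauseRecognizer = UITapGestureRecognizer(target: self, action: #selector(handlePlayPausePress))
playPauseRecognizer.allowedPressTypes = [NSNumber(value: UIPress.PressType.playPause.rawValue)]
view.addGestureRecognizer(playPauseRecognizer)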
Both features work fine, i.e. playback is launched when pressing the PlayPause Siri remote button and, during playback, the playback info is properly advertised on other devices and remote commands are triggered as expected.
However, when pressing the PlayPause Siri remote button while the video is playing in PiP, the "pause" remote command is sometimes triggered instead of the .playPause gesture recognizer. The issue may not occur on the first press, but it does for subsequent PlayPause presses. Navigating a bit in the app UI seems to help prevent the issue from occurring.
Finally, the issue only occurs if the video is playing. If the video is paused, the PlayPause Siri remote button gesture is always recognized instead of the remote command.
Please note that, before using MPNowPlayingSession (and the corresponding MPRemoteCommandCenter), the app was using the default MPRemoteCommandCenter to support remote commands and the issue did not occur.
We can't reproduce this issue with the Apple TV app, so there's probably something we are not doing right. Does anyone have a clue?
My app reports a lot of crashes from iOS 18.2 users.
I have been able to narrow down the issue to this line of code:
CGImageDestinationFinalize(imageDestination)
The error is Thread 93: EXC_BAD_ACCESS (code=1, address=0x146318000)
But I have no idea why this suddenly started to crash.
Here is the code of the function:
private func estimateSizeUsingThumbnailMethod(fromImageURL url: URL, imageSettings: ImageSettings) -> (Int, Int) {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions),
          let imageProperties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let imageWidth = imageProperties[kCGImagePropertyPixelWidth] as? CGFloat,
          let imageHeight = imageProperties[kCGImagePropertyPixelHeight] as? CGFloat else {
        return (0, 0)
    }

    let maxImageSize = max(imageWidth, imageHeight)
    let thumbMaxSize = min(2400, maxImageSize) // Use original size if possible, but not if larger than 2400, in this case we'll extrapolate from thumbnail

    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: thumbMaxSize as CFNumber,
    ] as CFDictionary

    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        DLog("CGImage thumb creation error")
        return (0, 0)
    }

    let data = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
        DLog("CGImage destination creation error")
        return (0, 0)
    }

    let destinationProperties = [
        kCGImageDestinationLossyCompressionQuality: imageSettings.quality.compressionRatio() // Set jpeg compression ratio
    ] as CFDictionary

    CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
    CGImageDestinationFinalize(imageDestination) // <----- CRASHES HERE with EXC_BAD_ACCESS

    ...
}
So far, I'm stuck. Any idea that could help would be greatly appreciated, as I'm afraid this crash will propagate to the official release of iOS 18.2.
I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI, I have filed FB15554956 on this. You may also see my code at https://github.com/bolsinga/itunes_json
https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=200&offset=400
This is the API used to get Parquet file URLs. I need all the URLs in one API call; right now, if I don't provide the limit, it defaults to 100, and the maximum is 200.
How can I get all the records in one call, or at least the count of Parquet records in one call?
Hi,
I need to get the total number of Parquet files present in the Apple Music Feed API for songs and artists. There are limit and offset options, but the limit is capped at 200 records and the offset is open-ended.
How can I get the total number of Parquet files without querying the Apple Music Feed API multiple times?
Need help regarding this. Thanks!
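For reference, this is roughly the paging loop we use today and would like to avoid (a minimal sketch; the developer-token header and response decoding are assumptions, represented by a hypothetical parsePartURLs helper):

// Sketch: page through the feed parts with limit/offset until a page comes back empty.
func fetchAllPartURLs(export: String) async throws -> [URL] {
    var urls: [URL] = []
    var offset = 0
    let limit = 200
    while true {
        let page = URL(string: "https://api.media.apple.com/v1/feed/exports/\(export)/parts?limit=\(limit)&offset=\(offset)")!
        let (data, _) = try await URLSession.shared.data(from: page) // auth headers omitted
        let pageURLs = try parsePartURLs(from: data) // hypothetical helper that decodes the JSON response
        if pageURLs.isEmpty { break }
        urls.append(contentsOf: pageURLs)
        offset += limit
    }
    return urls
}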
I'm using iCloud Music Library. I’m using macOS 14.1 (23B74) and iOS 17.1.
I’m using MusicKit to find songs that do not have artwork. On iOS, Song.artwork will be nil for items I know do not have artwork. On macOS, Song.artwork is not nil; however, when the songs are shown in Music.app, they do not have artwork. Is this expected? Alternatively, is there a more correct way to determine that a Song has no artwork?
I have also filed FB13315721.
Thank you for any tips!
I am attempting to do batch transcription of audio files exported from Voice Memos, and I am running into an interesting issue. If I only transcribe a single file it works every time, but if I try to batch it, only the last one works and the others fail with "No speech detected". I assumed it must be something about concurrency, so I implemented what I think should remove any chance of transcriptions running in parallel. With a mocked-up unit of work, everything looked good. So I added the transcription back in, and:
1: It still fails on all but the last file. This happens if I am processing 10 files or just 2.
2: It no longer processes in order; any file can be the last one that succeeds. It also does not seem to be related to file size: I have had paragraph-sized notes finish last, but also a single short sentence.
I left the mocked processFile() for reference.
Any insights would be greatly appreciated.
import Speech
import SwiftUI

struct ContentView: View {
    @State private var processing: Bool = false
    @State private var fileNumber: String?
    @State private var fileName: String?
    @State private var files: [URL] = []

    let locale = Locale(identifier: "en-US")
    let recognizer: SFSpeechRecognizer?

    init() {
        self.recognizer = SFSpeechRecognizer(locale: self.locale)
    }

    var body: some View {
        VStack {
            if files.count > 0 {
                ZStack {
                    ProgressView()
                    Text(fileNumber ?? "-")
                        .bold()
                }
                Text(fileName ?? "-")
            } else {
                Image(systemName: "folder.badge.minus")
                Text("No audio files found")
            }
        }
        .onAppear {
            files = getFiles()
            Task {
                await processFiles()
            }
        }
    }

    private func getFiles() -> [URL] {
        do {
            let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
            let path = documentsURL.appendingPathComponent("Voice Memos").absoluteURL
            let contents = try FileManager.default.contentsOfDirectory(at: path, includingPropertiesForKeys: nil, options: [])
            let files = (contents.filter { $0.pathExtension == "m4a" }).sorted { url1, url2 in
                url1.path < url2.path
            }
            return files
        }
        catch {
            print(error.localizedDescription)
            return []
        }
    }

    private func processFiles() async {
        var fileCount = files.count
        for file in files {
            fileNumber = String(fileCount)
            fileName = file.lastPathComponent
            await processFile(file)
            fileCount -= 1
        }
    }

//    private func processFile(_ url: URL) async {
//        let seconds = Double.random(in: 2.0...10.0)
//        await withCheckedContinuation { continuation in
//            DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
//                continuation.resume()
//                print("\(url.lastPathComponent) \(seconds)")
//            }
//        }
//    }

    private func processFile(_ url: URL) async {
        let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
        recognitionRequest.requiresOnDeviceRecognition = false
        recognitionRequest.shouldReportPartialResults = false

        await withCheckedContinuation { continuation in
            recognizer?.recognitionTask(with: recognitionRequest) { (transcriptionResult, error) in
                guard transcriptionResult != nil else {
                    print("\(url.lastPathComponent.uppercased())")
                    print(error?.localizedDescription ?? "")
                    return
                }
                if ((transcriptionResult?.isFinal) == true) {
                    if let finalText: String = transcriptionResult?.bestTranscription.formattedString {
                        print("\(url.lastPathComponent.uppercased())")
                        print(finalText)
                    }
                }
            }
            continuation.resume()
        }
    }
}
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item’s title. MusicLibraryRequests let you filter on the item’s ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.
With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.
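For example, the MPMediaQuery version of that lookup is roughly:

import MediaPlayer

// Albums whose album artist contains "Aesop Rock".
let artistPredicate = MPMediaPropertyPredicate(value: "Aesop Rock",
                                               forProperty: MPMediaItemPropertyAlbumArtist,
                                               comparisonType: .contains)
let albumsQuery = MPMediaQuery.albums()
albumsQuery.addFilterPredicate(artistPredicate)
let matchingAlbums = albumsQuery.collections ?? []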
In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf) function.
I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.
I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:
filter(matching: \.title, contains: "Abbey Road")
filter(matching: \.artistName, equalTo: "Between The Buried And Me")
I noticed that filter(text:) is case insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:
filter(matching: \.artistName, equalTo: "Between The Buried And Me", caseSensitive: false)
I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.
My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
I've submitted that last bit as FB10185789.
Thanks for bearing with my walls of text today. Keep up the great work!
I'm creating an app that listens to another app's sound; in this use case, screen data is not needed.
But if I don't call SCStream#addStreamOutput(_, type: .screen, ...), the console shows this error:
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame
Currently I'm setting SCStreamConfiguration#minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance.
Is there any way to disable screen capture and only capture the app's audio?
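For reference, this is roughly my current workaround configuration (a minimal sketch; the content filter and stream creation are omitted, and the exact values are assumptions):

import ScreenCaptureKit
import CoreMedia

// Sketch of the workaround: capture audio, keep the mandatory video path as cheap as possible.
let configuration = SCStreamConfiguration()
configuration.capturesAudio = true
configuration.excludesCurrentProcessAudio = true
configuration.width = 2   // smallest practical video size, since video output can't be disabled
configuration.height = 2
configuration.minimumFrameInterval = CMTime(value: 10, timescale: 1) // roughly 0.1 fps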