Explore best practices for creating inclusive apps that cater to users with diverse abilities

Posts under General subtopic

User interface in Ainu.
I have been mulling this over for many years: user interface support for Uralic and Siberian languages. Ainu of Japan is only supported by writing in roman letters and rendering into Katakana with a few small modified characters; there is no user interface, spell or grammar checker, dictionary, or translator. Of course Ainu has few terms for modern vocabulary, but I am studying the language in order to find words and coin new ones, for example iPhone: hoomi-ye-p, "electric speak thing". I am looking for other people who have the same idea.
Replies: 3 · Boosts: 1 · Views: 1.3k · Activity: Dec ’16

Having trouble with the Accessibility API of the ApplicationServices framework
After replacing Big Sur (macOS 11.0) with the latest 11.5, my app's AXObserverAddNotification method fails. Here is sample code I tested from StackOverflow: https://stackoverflow.com/questions/853833/how-can-my-app-detect-a-change-to-another-apps-window

AXUIElementRef app = AXUIElementCreateApplication(82695); // the pid for the front-running Xcode 12.5.1
CFTypeRef frontWindow = NULL;
AXError err = AXUIElementCopyAttributeValue(app, kAXFocusedWindowAttribute, &frontWindow);
if (err != kAXErrorSuccess) {
    NSLog(@"failed with error: %i", err);
}
NSLog(@"app: %@  frontWindow: %@", app, frontWindow);

The 'frontWindow' reference is never created and I get error number -25204. It seems like the latest Big Sur 11.5 has revised the Accessibility API, or perhaps there is some permission switch I am unaware of that would make things work. What am I doing wrong?
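Error -25204 maps to kAXErrorCannotComplete. In practice, this and the related kAXErrorAPIDisabled very often come down to the calling process not (or no longer) being trusted for Accessibility, for example after the binary was rebuilt or re-signed, so it is worth confirming the permission grant before suspecting an API change. A minimal sketch of that check from Swift, assuming a normal app process:

import ApplicationServices

// Ask macOS whether this process is trusted for Accessibility; passing the
// prompt option shows the system dialog that points the user at
// System Preferences > Security & Privacy > Privacy > Accessibility.
let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
let trusted = AXIsProcessTrustedWithOptions(options)
print("Accessibility trusted:", trusted)

If trusted comes back false, AXUIElementCopyAttributeValue on another app's elements is expected to fail regardless of macOS version.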
Replies: 2 · Boosts: 0 · Views: 751 · Activity: Aug ’21

How do I make Siri announce the local currency in notifications?
I'm currently testing the announce notifications feature and I can't seem to find out how to make Siri read aloud the local currency instead of dollars. My locale is es-CL (Chile). It uses the currency symbol $ and reads as Pesos locally, or Chilean Pesos elsewhere; the number 5000.1 is represented as 5.000,1. This is the notification content:

let content = UNMutableNotificationContent()
content.body = "¡Has recibido un pago por $5.000!"

Siri reads it aloud as "¡Has recibido un pago por 5.000 Dolares!", which translates to "You have received a payment for 5,000 Dollars", instead of the expected "¡Has recibido un pago por 5.000 Pesos!" -> "You have received a payment for 5,000 Pesos". I've tried changing the development region of the app, interpolating the string with NumberFormatter.localizedString(from: 5000, number: .currency), and with other styles (.currencyAccounting, .currencyISOCode and .currencyPlural) without good results. The last one seems to work but it's not ideal, since it outputs "5.000 pesos chilenos" which gets read as "5 pesos chilenos", which is not the correct amount (bug); it's as if you're not in Chile, and I personally prefer a symbol instead of words. I'm testing with my device, which is set up with the region "Chile". Could someone help me find a solution?
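For reference, a minimal sketch of pinning the formatter to the Chilean peso explicitly (the locale identifier and currency code are assumptions about the poster's configuration), which at least removes any ambiguity about which currency the string carries; whether Siri then reads the symbol as "pesos" still depends on how the announcement feature interprets the locale:

import Foundation

// Format 5000 as Chilean pesos regardless of the device's current region.
let formatter = NumberFormatter()
formatter.numberStyle = .currency
formatter.locale = Locale(identifier: "es_CL")
formatter.currencyCode = "CLP"
let amount = formatter.string(from: 5000) ?? "$5.000"
// amount is "$5.000" under es_CL / CLP.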
Replies: 5 · Boosts: 1 · Views: 1.2k · Activity: Jan ’23

xcstrings file is not being updated
I'm using Xcode 15.2 and have migrated my (macOS) project to use an xcstrings file a while back. Now when I check the xcstrings file, all items are marked as "stale". When I add new localized strings in code, they don't show up in the xcstrings file. The xcstrings file is built correctly (into .lproj/Localizable.strings) when building. Where can I check which source files are checked to update xcstrings status? "xcstringstool" appears to have a "sync" feature which reads "stringsdata" files, but there is no information in the xcstringstool help on where the stringsdata files come from. If I create a new project I can see a "stringsdata" file being generated for each source file in the intermediate build products folder.
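In case it is useful for comparison: the stringsdata files the sync step reads are, as far as I know, emitted per source file when the "Use Compiler to Extract Swift Strings" (SWIFT_EMIT_LOC_STRINGS) build setting is enabled for the target, and only string literals that go through localization APIs are extracted; keys the compiler no longer sees get marked stale. A small sketch of the forms that normally land in the catalog (the keys are placeholder examples):

import SwiftUI

// Extracted into the String Catalog by the build:
let title = String(localized: "home.title", comment: "Title on the home screen")
let legacy = NSLocalizedString("home.subtitle", comment: "Subtitle on the home screen")

struct HomeView: View {
    var body: some View {
        Text("home.body") // Text literals are treated as localization keys and extracted too
    }
}

Plain String literals that never pass through one of these APIs are not synced into the catalog.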
Replies: 2 · Boosts: 0 · Views: 2.6k · Activity: Feb ’24

VoiceOver spells word letter by letter
We currently have an odd issue with VoiceOver spelling a word letter by letter while the same word is spoken as a whole for other items. The app is in German. I have a View in SwiftUI whose button traits are removed, then a label "Start Tab 1 von 5" is added. "Tab" is spoken as a whole word here, all fine. If I change the label to "Tab-Schaltfläche" or, for example, "SimplyGo Tab 3 von 5", then "Tab" is spoken as "T A B", letter by letter. Is there a way to force VoiceOver to speak it as a whole?
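One thing that might be worth trying (untested against this particular VoiceOver heuristic): give the element an attributed accessibility label and explicitly mark the text as not to be spelled out character by character. The view and label below are only illustrative:

import SwiftUI

struct TabItemView: View {
    var body: some View {
        var label = AttributedString("SimplyGo Tab 3 von 5")
        // Hint to speech synthesis that this run should be read as words,
        // not spelled out; whether VoiceOver honors it here is unverified.
        label.accessibilitySpeechSpellsOutCharacters = false
        return Color.clear
            .accessibilityElement()
            .accessibilityLabel(Text(label))
    }
}

If that has no effect, setting accessibilitySpeechLanguage to "de-DE" on the attributed label is another knob to experiment with.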
Replies: 4 · Boosts: 0 · Views: 1.2k · Activity: May ’24

Disabling New Hand Gesture Features in Vision Pro App on visionOS 2
Question: Hi everyone, I'm developing a Vision Pro app using the latest visionOS 2, and I've encountered some issues with the new hand gestures introduced in this update. My app is designed to display a UI element when a user's palm is detected. However, the new hand gestures for navigating key functions like Home View, Control Center, and adjusting the volume are interfering with my app's functionality.

What I'm Trying to Achieve:
- Detect when a user's palm is open and display a UI element.
- Ensure that my app's custom hand gestures are not disturbed by the new default gestures in visionOS 2.

Problem: The new hand gestures in visionOS 2 (such as those for Home View, Control Center, and volume adjustment) are activating while my app is open, causing disruptions to my app's functionality. I want to disable these system-level gestures when my app is running.
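As far as I know there is no public switch to turn the visionOS system hand gestures off while an app is frontmost, so the practical route for the palm-detection half is first-party hand tracking plus feedback to Apple about the gesture conflicts. A rough sketch of the tracking side (assumes an open immersive space and a hands-tracking usage description in the Info.plist):

import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()

func trackHands() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates {
        let anchor = update.anchor
        guard anchor.isTracked else { continue }
        // Inspect anchor.handSkeleton joint transforms here to decide
        // whether the palm is open, then show or hide the UI element.
    }
}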
Replies: 5 · Boosts: 2 · Views: 1.8k · Activity: Jun ’24

(After upgrading to iOS 17.5.1) MFi hearing devices appeared to be paired, but app is unable to resolve the connection to the peripheral
Hey Apple, we (our customer support teams) have received feedback from customers complaining that their hearing devices (hearing aids) appear to be connected to MFi and the OS controls are working, but the audio stream isn't working, and the app is unable to resolve a connection to the device via CBCentralManager.retrieveConnectedPeripherals(withServices:). The issues appear to disappear once the user unpairs and re-pairs the hearing devices in the Accessibility > Hearing Devices options (they might also need to uninstall and reinstall the app, as it gets stuck in an invalid state), but we are unable to replicate this issue in our environments given it is intermittent, and once we have upgraded a device to iOS 17.5.1 we don't have a mechanism to revert to an earlier version. So far, we have received about 30 reports over the past 2 weeks. Most of our customers complain about the app not connecting to the devices, but the fact that the audio stream is not happening could hint at a deeper problem than our app. Are you aware of a problem affecting MFi hearing devices restoring after the OS upgrade process?
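For anyone comparing notes, this is roughly the lookup path being described; the service UUID below is a placeholder rather than a real hearing-device service, and the call assumes the central has already reached the .poweredOn state:

import CoreBluetooth

// Placeholder UUID; the actual service depends on the vendor's MFi implementation.
let hearingServiceUUID = CBUUID(string: "FDF0")

let central = CBCentralManager()
// Only meaningful once central.state == .poweredOn. An empty result here,
// while the system-level Hearing Devices controls still work, matches the
// behavior in the reports above.
let connected = central.retrieveConnectedPeripherals(withServices: [hearingServiceUUID])
print("Resolved peripherals:", connected)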
Replies: 2 · Boosts: 1 · Views: 1k · Activity: Jun ’24

Caller does not have kTCCServiceVoiceBanking access to personal voices. No speech will be generated
Hello, I'm trying to leverage Personal Voice to read a phrase in my iOS application. My implementation works correctly on an iPhone 15, but does not work when I run the iOS application on an M2 MacBook Air. Here are some snippets from my implementation.

This is how I request Personal Voice:

AVSpeechSynthesizer.requestPersonalVoiceAuthorization() { status in
    if status == .authorized {
        var personalVoices = AVSpeechSynthesisVoice.speechVoices().filter { $0.voiceTraits.contains(.isPersonalVoice) }
    }
}

This is how I'm attempting to read:

let utterance = AVSpeechUtterance(string: textToRead)
if let voice = personalVoices.first {
    utterance.voice = voice
}
var synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)

I get the following error messages when I try this:

Cannot use AVSpeechSynthesizerBufferCallback with Personal Voices, defaulting to output channel.
Caller does not have kTCCServiceVoiceBanking access to personal voices. No speech will be generated
Voice not allowed to render speech! Will not set up synthesizer. Bailing now

Any suggestions on how to mitigate this issue?
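Separate from the kTCCServiceVoiceBanking question on the Mac, one thing that stands out in the snippets above is lifetime: personalVoices is scoped to the authorization closure and the synthesizer is a local variable, so neither is guaranteed to still be around when speak runs. A minimal sketch that keeps the synthesizer alive and only speaks once authorization has come back (same APIs, just restructured):

import AVFoundation

final class PersonalVoiceReader {
    // Keep the synthesizer alive for the duration of speech.
    private let synthesizer = AVSpeechSynthesizer()

    func speak(_ text: String) {
        AVSpeechSynthesizer.requestPersonalVoiceAuthorization { [weak self] status in
            guard let self, status == .authorized else { return }
            let personalVoices = AVSpeechSynthesisVoice.speechVoices()
                .filter { $0.voiceTraits.contains(.isPersonalVoice) }
            let utterance = AVSpeechUtterance(string: text)
            utterance.voice = personalVoices.first
            self.synthesizer.speak(utterance)
        }
    }
}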
Replies: 0 · Boosts: 0 · Views: 786 · Activity: Jun ’24

iOS 18 beta bug (Dictation)
About half the time or more when dictating text, if dictation mode is manually deactivated immediately when done speaking, the last word is duplicated. For example, if you dictate a text message (without using Siri) using the microphone button on the keyboard, saying ‘I’m on my way, be there soon.’, and hit send or stop the dictation as soon as you are done talking, the dictated text will read ‘I’m on my way, Be there soon soon.’ Currently running iOS 18 beta 1, and I’ve experienced this multiple times.
Replies: 3 · Boosts: 1 · Views: 1.1k · Activity: Jun ’24

Xcode Accessibility audit in UI Tests gives less than stellar results
I have added a UI test that uses the newish app.performAccessibilityAudit() on each of the screens in my SwiftUI app. I get a handful of test failures for various accessibility issues. Some of them are related to iOS issues (like the "legal" button in a map view not having a big enough tappable area; and, I think, there is a contrast issue with selected tabs in a tab bar, or maybe it is a false positive). Two of the error types I get are around Text elements that use default font styles (like headline) but that apparently get clipped at times, and apparently "Dynamic Type font sizes are unsupported". For some of the errors it supplies a screenshot of the whole screen along with an image of the element that is triggering the error. For these text errors it only provides the overall image. After looking at the video I saw that part of the test runs the text size all the way up, and I could see that at the largest size or two the text was getting cut off despite using default settings for the frames on all the Text elements. After a bit of trial and error I managed to get the text to wrap properly even at the largest of sizes, and most of the related errors went away running on an iPhone simulator. I switched to an iPad simulator and the errors re-appeared even though the video doesn't show the text getting clipped. Does anyone have any tips for actually fixing these issues? Or am I in false positive land here? Also, is there any way to get more specific info out of the test failures? Looking at the entire screen and trying to guess where the issue is kinda sucks.
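On the "more specific info" question: performAccessibilityAudit takes an optional issue handler, and the XCUIAccessibilityAuditIssue passed to it exposes the offending element and a description, which is usually enough to locate the view; returning true from the handler skips issues you have decided are false positives. A short sketch (the audit types and the logging are just examples):

import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testAuditWithDetails() throws {
        let app = XCUIApplication()
        app.launch()
        try app.performAccessibilityAudit(for: [.dynamicType, .textClipped]) { issue in
            // Log enough context to find the view behind the failure.
            print("Audit issue:", issue.compactDescription)
            if let element = issue.element {
                print("Offending element:", element.debugDescription)
            }
            return false // return true to ignore a known false positive
        }
    }
}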
Replies: 1 · Boosts: 2 · Views: 960 · Activity: Jun ’24

Cannot activate FaceTime and iMessage service
I tried to activate FaceTime and iMessage on my iPhone 15 Pro after updating to iOS 18, but I could not activate them for my service provider phone number; they are only activated for my iCloud mail ID. I also tried everything mentioned on the Apple Support page, with no luck yet. I contacted my service provider too; they asked me to clarify this with Apple.
Replies: 1 · Boosts: 0 · Views: 518 · Activity: Jun ’24

Battery Health draining fast
My iPhone’s battery health was 100% until 4 months after I bought it. After updating to iOS 17.5 it is dropping very quickly: it was 100% when I updated my iPhone, and after that it drops 1% battery health every 4 days. I updated my iPhone on the 21st of May, and after 1 month it is at 92%. Any solution for this?
Replies: 1 · Boosts: 0 · Views: 530 · Activity: Jul ’24

Localization for multiple targets
I have one project with an XYZ scheme and target, and Localizable.strings under the XYZ target for localization. I want to create an ABC target (a duplicate of XYZ) and set custom language support for it. Let's say I have English, French and German for XYZ; I want Hindi, Japanese and Chinese for ABC. I did the steps below:
I went to Manage Schemes and duplicated XYZ (duplicate scheme = ABC).
I added a new localization file only for ABC (LocalizationForABC.strings) and made sure it is reflected in File Inspector -> Target (only ABC selected) and also checked in Build Phases -> Copy Bundle Resources (LocalizationForABC exists).
When I run the ABC target under, let's say, French, it works fine, but when I build the project ABC and remove French from XYZ, ABC is broken and it only runs in English. Am I missing something here?
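Not a direct fix for the target-membership issue, but keeping each target's strings in an explicitly named table makes it visible which file a lookup resolves against, and avoids both targets silently falling back to the same Localizable table; a small sketch with placeholder keys:

import Foundation

// XYZ target: default table (Localizable.strings)
let xyzGreeting = NSLocalizedString("greeting", comment: "Greeting shown at launch")

// ABC target: explicit table matching its own strings file
let abcGreeting = NSLocalizedString("greeting",
                                    tableName: "LocalizationForABC",
                                    comment: "Greeting shown at launch")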
Replies: 3 · Boosts: 0 · Views: 1k · Activity: Jul ’24

Developing a CarPlay app for a fragrance system
Greetings, I am currently conceptualizing an application designed to interface with CarPlay, enabling control over aftermarket automotive fragrance systems and ambient lighting within vehicles. Having perused your development guidelines, I am interested in understanding how my project might be classified within your framework. Specifically, I am exploring whether it would be appropriate to categorize this endeavor under the 'Driving Task' category, given its direct interaction with the vehicle environment. Your insights on this matter would be greatly appreciated. Best regards!
Replies: 0 · Boosts: 0 · Views: 426 · Activity: Jul ’24

“Paste” Text Keyboard Feature
After double tapping a text space, allow pressing and holding “Paste” to see a pop-up list of the last 5 text copies you’ve made, while still keeping the single “Paste” touch for the most recent one. This would help when we accidentally copied or cut different text before pasting what we were saving to paste later.
Replies: 1 · Boosts: 0 · Views: 533 · Activity: Jul ’24

Toolkit/API for Interacting with Active Editable Text Boxes on macOS
I am developing an application that needs to interact with active editable text boxes on macOS, similar to how Grammarly functions. Specifically, my application needs to:
- Read and write text in the currently active editable text box where the keyboard input is directed.
- Move the cursor and mark (select) text within the text box.
Ideally, the solution should support cross-platform functionality, covering both macOS and Windows. Does anyone know of any toolkits, libraries, or APIs that can facilitate this kind of functionality on macOS?
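On the macOS side, the usual route for this kind of Grammarly-style integration is the Accessibility (AXUIElement) API; Windows would need a separate implementation (for example UI Automation). A rough sketch of reading the focused element's text, assuming the app has already been granted Accessibility permission:

import ApplicationServices

// Find whichever UI element currently has keyboard focus, system-wide,
// and read its text value.
let systemWide = AXUIElementCreateSystemWide()
var focused: CFTypeRef?
if AXUIElementCopyAttributeValue(systemWide,
                                 kAXFocusedUIElementAttribute as CFString,
                                 &focused) == .success,
   let element = focused, CFGetTypeID(element) == AXUIElementGetTypeID() {
    var value: CFTypeRef?
    AXUIElementCopyAttributeValue(element as! AXUIElement,
                                  kAXValueAttribute as CFString,
                                  &value)
    print("Focused text:", value as? String ?? "<no text value>")
    // kAXSelectedTextRangeAttribute and AXUIElementSetAttributeValue cover
    // moving the cursor and replacing text, where the target app allows it.
}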
Replies: 1 · Boosts: 0 · Views: 570 · Activity: Jul ’24

AccessibilityUIServer has microphone locked
Just installed iOS 18 Beta 3. I am seeing AccessibilityUIServer holding the microphone, and this is causing missing notification sounds, the inability to use Siri by voice, and a grayed-out volume control. If I start to play anything with sound, AccessibilityUIServer releases the microphone and I am able to use the app. Calls still work, since AccessibilityUIServer releases it and the phone rings. Feedback ID is FB14241838.
Replies: 12 · Boosts: 9 · Views: 6.2k · Activity: Jul ’24

FaceTime background macOS 15 Beta
I want to raise a FaceTime background setting that I have been advocating for a long time. It is about the Accessibility Lens, through which we want to change the background of FaceTime video calls. I want to make suggestions or comments on the features we would like to have. Is there somewhere on this forum we can discuss the FaceTime ecosystem specifically?
Replies: 0 · Boosts: 0 · Views: 492 · Activity: Jul ’24

VoiceOver sound volume decreases
Hi, we encounter an issue while using VoiceOver in our app. On a given page, VoiceOver is running through the items, but at some point (randomly) the volume of the VoiceOver speech decreases dramatically. It is only possible to restore the original volume by killing the app and launching it again. It does not happen systematically and is pretty difficult to reproduce, but it happens on a regular basis. No interaction occurs with other apps (music or navigation apps), and there is no interaction with Bluetooth or other AirPlay devices. The code does not involve any VoiceOver changes. We are looking for any clue that can lead to the issue's root cause. Thanks for the help!
Replies: 1 · Boosts: 0 · Views: 476 · Activity: Jul ’24