Accessibility

Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.

Posts under Accessibility tag

138 Posts

Post · Replies · Boosts · Views · Activity

Focus issues with ScrollView in iOS 18
When using an app via an external keyboard, @FocusState and .focused used to work just fine until iOS 17. Vertical-axis text fields were also accessible without any issues. But after the iOS 18 update, adding the focused modifier removes elements from the focus order of the external keyboard. One such example: when a button using the focused modifier and @FocusState is inside a ScrollView, and that view is opened via a NavigationLink, the button is not accessible via a Bluetooth (external) keyboard. TextEditor / vertical-axis TextFields also seem to be affected in the external-keyboard focus order when placed inside a ScrollView. Is this a known iOS 18 issue with ScrollView, and is there any tip to get this fixed? Sample code that can reproduce this issue:

```swift
struct ContentView: View {
    @State private var showBottomSheet: Bool = false
    @State private var goToNextView: Bool = false
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool

    var body: some View {
        NavigationView {
            VStack {
                Text("Hello, world!")
                // This button works fine with a Bluetooth keyboard in all versions
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)

                Button("Goto another view") {
                    goToNextView = true
                }

                NavigationLink(
                    destination: View2(),
                    isActive: $goToNextView
                ) {
                    EmptyView()
                }
                .accessibility(hidden: true)
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack {
                    Text("Hello World! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}

#Preview {
    ContentView()
}

struct View2: View {
    @FocusState private var focused: Bool
    @AccessibilityFocusState private var voFocused: Bool
    @State private var showBottomSheet: Bool = false

    var body: some View {
        ScrollView {
            VStack {
                Text("check")
                // In iOS 18, this button doesn't get focused with a Bluetooth / external keyboard.
                // The issue occurs when these three combine in iOS 18: a button using FocusState,
                // inside a view that has a ScrollView, opened via NavigationLink.
                Button("Trigger a bottomsheet") {
                    showBottomSheet = true
                }
                .focused($focused)
                .accessibilityFocused($voFocused)

                Button("Test button") { }
            }
            .sheet(isPresented: $showBottomSheet, onDismiss: {
                focused = true
                voFocused = true
            }, content: {
                VStack {
                    Text("Hello World! I'm in a bottomsheet")
                    Button("Close me") {
                        showBottomSheet = false
                    }
                }
            })
            .padding()
        }
    }
}
```
Replies: 0 · Boosts: 1 · Views: 524 · Activity: Feb ’25

CGEvent Not Working
I am trying to simulate a paste command, and it does not paste. It worked at one point with the same code and is now causing issues. My code looks like this:

```swift
func simulatePaste() {
    guard let source = CGEventSource(stateID: .hidSystemState) else {
        print("Failed to create event source")
        return
    }

    let keyDown = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: true)
    let keyUp = CGEvent(keyboardEventSource: source, virtualKey: CGKeyCode(9), keyDown: false)

    keyDown?.flags = .maskCommand
    keyUp?.flags = .maskCommand

    keyDown?.post(tap: .cgAnnotatedSessionEventTap)
    keyUp?.post(tap: .cgAnnotatedSessionEventTap)

    print("Simulated Cmd + V")
}
```

I know that there are some issues around permissions, so in my Info.plist I have this:

```xml
<string>NSApplication</string>
<key>NSAppleEventsUsageDescription</key>
<string>This app requires permission to send keyboard input for pasting from the clipboard.</string>
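```

I have also disabled the sandbox. It does ask me whether I want to give the app permissions, but after approving it, it still doesn't paste.

A note on the permission involved: posting synthetic keyboard events with CGEvent generally depends on the app being trusted under Privacy & Security > Accessibility, not on the Apple Events usage description. As a minimal sketch (the helper name is hypothetical), a check like this confirms whether the process is actually trusted before posting:

```swift
import ApplicationServices

// Returns true if this process is trusted to control the computer
// (System Settings > Privacy & Security > Accessibility). Passing the
// prompt option asks macOS to show the permission dialog if it is not.
func ensureAccessibilityTrusted() -> Bool {
    let options = [kAXTrustedCheckOptionPrompt.takeUnretainedValue() as String: true] as CFDictionary
    return AXIsProcessTrustedWithOptions(options)
}

// Hypothetical usage before simulating the paste:
// if ensureAccessibilityTrusted() { simulatePaste() }
```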
Replies: 1 · Boosts: 0 · Views: 360 · Activity: Feb ’25

Components with Earcon haptic feedback for VoiceOver users
I want to understand which component types are intended to have an associated hint text, haptic feedback, or earcon for VoiceOver screen reader users. Is there a list somewhere, or an HIG guideline, for which transition types should have a sound? Some transitions in Apple apps generally include different beep sounds, such as:
- opening a new screen
- screen dimming when a VoiceOver user swipes from the header / navbar to the body
- a scraping sound when swiping up or down a page
- the beginning or end of the body section
- in Calculator, when swiping from one row to the next
- opening a pop-up menu

I would also appreciate any direction on which code strings are associated with these sounds, and how custom components can produce these sounds, haptics, or hints where they are expected. On the other hand, I don't want to get that info and then dictate that every component needs a specific beep type, since these sounds appear to be used for specific purposes.
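
There is no public list that maps every system earcon to an API, but a couple of the behaviors described do map to documented calls. A minimal sketch of the pieces a custom SwiftUI component can control itself (a hint string, and the screen-change notification that plays VoiceOver's standard "new screen" sound); the view and strings here are illustrative:

```swift
import SwiftUI
import UIKit

struct CustomCard: View {
    @State private var showDetails = false

    var body: some View {
        Button("Show details") {
            showDetails = true
            // Posting .screenChanged tells VoiceOver that a new screen appeared;
            // VoiceOver plays its screen-change earcon and moves focus to the
            // element (or reads the string) passed as the argument.
            UIAccessibility.post(notification: .screenChanged, argument: "Details view")
        }
        // The hint is the extra phrase VoiceOver reads after a short pause.
        .accessibilityHint("Opens the details view")
    }
}
```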
Replies: 3 · Boosts: 1 · Views: 730 · Activity: May ’25

tvOS: GCController does not send button press events for "Button A" and "Button Center" when VoiceOver is On
When turning VoiceOver on, GCController does not send button press events for "Button A" and "Button Center". This happens when using the Siri 2nd generation remote (with dedicated arrow buttons on the circle around the center button) and also when using the iOS remote. I didn't test it on the old Siri 1st generation remote with a touchpad and no arrow buttons. Example:

```swift
gameController.microGamepad?.allButtons.forEach { button in
    button.valueChangedHandler = { [weak self] _, _, _ in
        self?.buttonHandler(gameController: gameController, button: button)
    }
}

private func buttonHandler(gameController: GCController, button: GCControllerButtonInput) {
    print("BUTTON: Pressed \(button.description) isPressed=\(button.isPressed) isTouched=\(button.isTouched)")
}
```

VoiceOver ON (incorrect behavior):

```
BUTTON: Pressed Direction Pad Left (value: 0.030, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Down (value: 0.079, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Down (value: 0.000, pressed: 0) isPressed=false isTouched=false
```

VoiceOver OFF (correct behavior):

```
BUTTON: Pressed Direction Pad Left (value: 0.137, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Direction Pad Up (value: 0.078, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button Center (value: 1.000, pressed: 1) isPressed=true isTouched=true
BUTTON: Pressed Button A (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Button Center (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Left (value: 0.000, pressed: 0) isPressed=false isTouched=false
BUTTON: Pressed Direction Pad Up (value: 0.000, pressed: 0) isPressed=false isTouched=false
```

For detection I could use Direction Pad Left/Right/Up/Down, treat values between -0.7 and +0.7 as a center button press, and handle it that way. I already do that on the old Siri remote, where I need to distinguish the center button from the arrows (Up/Down switches TV channels, Left/Right skips forward/back), but for the new Siri remote it would be an unnecessary workaround. Does anybody know why the center/select button is not detected when VoiceOver is on? Is there another way of detecting it using GCController? I don't want to use SwiftUI's onTapGesture for this one particular case. Is this an unexpected bug in the tvOS APIs, or is there a specific reason why the center button is not handled by GCController when VoiceOver is on? Thanks.
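
For reference, a minimal sketch of the direction-pad workaround described above, assuming the micro gamepad's dpad values are available; the ±0.7 window and the handler wiring come from the post, while the function name is hypothetical:

```swift
import GameController

// Workaround sketch: when VoiceOver is on and Button A / Button Center events
// are not delivered, infer a "center" press from the direction pad axes.
func handleDpadChange(_ dpad: GCControllerDirectionPad) {
    let x = dpad.xAxis.value
    let y = dpad.yAxis.value

    if abs(x) < 0.7 && abs(y) < 0.7 {
        print("Treating press as center/select")
    } else if abs(x) > abs(y) {
        print(x > 0 ? "Right" : "Left")
    } else {
        print(y > 0 ? "Up" : "Down")
    }
}

// Hypothetical wiring:
// controller.microGamepad?.dpad.valueChangedHandler = { dpad, _, _ in handleDpadChange(dpad) }
```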
Replies: 0 · Boosts: 0 · Views: 584 · Activity: Jan ’25

PerformAccessibilityAudit and sufficientElementDescription clarification
Hi, I am writing in the hope of receiving some clarification about the rationale of the audit type sufficientElementDescription, in the context of the Accessibility Audit API. Please see my test below: And another example in the context of Xcode, where the strings visible in the UI are also set as accessibility labels of their respective elements. Thanks for your help!
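
For context, a minimal sketch of how this audit type is typically run from a UI test (assuming Xcode 15 or later, where performAccessibilityAudit is available); the test class is illustrative:

```swift
import XCTest

final class AccessibilityAuditTests: XCTestCase {
    func testSufficientElementDescriptions() throws {
        let app = XCUIApplication()
        app.launch()

        // Run only the sufficientElementDescription audit. The handler can
        // inspect each reported issue; returning true ignores that issue,
        // returning false lets it fail the test.
        try app.performAccessibilityAudit(for: .sufficientElementDescription) { issue in
            print("Audit issue: \(issue.compactDescription)")
            return false
        }
    }
}
```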
Replies: 1 · Boosts: 0 · Views: 446 · Activity: Jan ’25

Making VoiceOver more concise on a SwiftUI Menu
A common UI idiom in Apple's first party iOS apps is a circle icon with three dots in the upper right of the screen. This serves as a pop-up menu of more options. Some examples include:
- Apple Music, Library tab
- Photos, Album view
- Reminders

In all these cases, VoiceOver reads this element as "More, Button". In my SwiftUI app, I've implemented a visually identical button.

```swift
Menu {
    // Button for Menu Item 1
    // Button for Menu Item 2
    // ...
} label: {
    Image(systemName: "ellipsis.circle")
        .accessibilityHidden(true)
}
.accessibilityLabel("More")
```

However, the VoiceOver output in my app is much more verbose. It speaks "More, Button, Pop Up Button, Double Tap To Activate The Picker". Any guidance on how to make this more concise in line with the apps mentioned above?
Replies: 2 · Boosts: 1 · Views: 460 · Activity: Jan ’25

How to determine the default duration of a long-press
Using gesture recognizers, it is easy to implement a long-press gesture on iOS to open a menu, show a preview, or something else, and you can provide the duration the user must hold down a finger before the gesture recognizer fires. But I could not yet find out how to determine the default duration for a long-press gesture that is configured in the system settings, within the "Accessibility" settings under "Haptic Touch" (the available options there are fast, standard and slow). Is it possible to read out this setting, so my app can adapt to this system setting as well?
Replies: 1 · Boosts: 0 · Views: 296 · Activity: Jan ’25

Sandbox Permissions for Clipboard Monitoring and Modification in a macOS App
Hello, I’m developing a sandboxed macOS app using Qt, which will be distributed via the Mac App Store. The app:
- Monitors the clipboard to store copied items.
- Overrides the paste function of the operating system via keyboard shortcuts.
- Modifies clipboard content, replacing what the user pastes with stored data.

So, I have some questions:
- Can a sandboxed app continuously read and modify clipboard content?
- What entitlements are required?
- What permissions should I request from the user to ensure that my app works?

Any guidance would be greatly appreciated! Thanks in advance!
Beril Bayram
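
For the clipboard-monitoring part, a minimal sketch in Swift for illustration (the same idea applies from Qt/C++): macOS has no pasteboard-change notification, so apps poll NSPasteboard's changeCount and read new content when it increases. Plain pasteboard reads and writes work inside the App Sandbox; intercepting the system paste shortcut and posting synthetic key events is the part that needs extra privileges. The class name and polling interval are illustrative:

```swift
import AppKit

final class ClipboardWatcher {
    private let pasteboard = NSPasteboard.general
    private var lastChangeCount: Int
    private var timer: Timer?

    init() {
        lastChangeCount = pasteboard.changeCount
    }

    func start() {
        // Poll changeCount; it increments every time something new is copied.
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self = self, self.pasteboard.changeCount != self.lastChangeCount else { return }
            self.lastChangeCount = self.pasteboard.changeCount
            if let copied = self.pasteboard.string(forType: .string) {
                print("New clipboard item: \(copied)")
            }
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```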
Replies: 5 · Boosts: 1 · Views: 505 · Activity: Jan ’25

Issue with “Vocal Shortcuts” disabling after one day of use
Hello Apple Community, I'm experiencing an issue with "Vocal Shortcuts" on iOS. I created a trigger in Vocal Shortcuts to run a specific shortcut, and it works perfectly on the first day. However, by the next day, the voice command stops functioning entirely. To make it work again, I have to disable and then re-enable Vocal Shortcuts in the settings. I've tested this on multiple devices (iPhone 11, iPhone 13, and iPhone X), all running the latest iOS version, and the same problem occurs on each one. Is there any additional configuration needed, or could this be a bug? Any advice or insights would be greatly appreciated! Thank you in advance,
Replies: 2 · Boosts: 0 · Views: 562 · Activity: Jan ’25

Getting precise text position with Swift for macOS
Hey there! Hope you are starting the year with great joy.

My situation: I'm building a new product that is based on detecting certain text on screen in real time. The product targets only the Mac and is built with Swift.

My problem: I need to get the exact position of a text element with the Apple Accessibility API, but I can't figure it out. I managed to get the AXUIElement where the text is placed, but its position is too broad and off target.

My discoveries so far: I've tried OCR, but it is too slow for what I'm building, so the only possible way I can think of is the Accessibility API.

Thank you in advance.
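
One approach worth noting for this kind of problem: many text elements expose the parameterized attribute kAXBoundsForRangeParameterizedAttribute, which returns the screen rectangle of a specific character range rather than the element's overall frame. A minimal sketch, assuming an already-obtained AXUIElement, Accessibility permission, and an element that actually supports this attribute:

```swift
import ApplicationServices

// Returns the on-screen bounds of a character range inside a text element,
// which is finer-grained than the element's AXPosition/AXSize.
func boundsOfText(in element: AXUIElement, range: CFRange) -> CGRect? {
    var mutableRange = range
    guard let rangeValue = AXValueCreate(.cfRange, &mutableRange) else { return nil }

    var result: CFTypeRef?
    let error = AXUIElementCopyParameterizedAttributeValue(
        element,
        kAXBoundsForRangeParameterizedAttribute as CFString,
        rangeValue,
        &result
    )
    guard error == .success,
          let boundsValue = result,
          CFGetTypeID(boundsValue) == AXValueGetTypeID() else { return nil }

    // The attribute returns an AXValue wrapping a CGRect in screen coordinates.
    var rect = CGRect.zero
    guard AXValueGetValue(boundsValue as! AXValue, .cgRect, &rect) else { return nil }
    return rect
}
```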
Replies: 2 · Boosts: 0 · Views: 492 · Activity: Jan ’25

Clarification on Entitlements, Privacy Manifest, and Info.plist for System-Wide Mouse Click Monitoring and Typing Simulation in macOS App
I am currently developing a macOS application that listens for system-wide mouse clicks to simulate typing with user-provided text. The app requires Accessibility permissions to function properly, and I want to ensure compliance with Apple’s latest privacy and security guidelines.
- The app listens to global mouse clicks.
- It simulates keyboard input with user-provided text.

I would like detailed guidance on the following aspects:
- What specific entitlements are required to allow system-wide mouse click monitoring and simulating user input?
- Should App Sandbox be enabled or disabled?
- Which keys are required in the Info.plist to explain global mouse click monitoring and keyboard input simulation?
- What should the configuration of the Privacy Manifest be?
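
For context, a minimal sketch of the click-monitoring half using a listen-only CGEvent tap, a common way to observe system-wide mouse clicks; it requires the user to grant Accessibility (or Input Monitoring) access, and the entitlement, sandbox, and Privacy Manifest questions are separate. The function name is illustrative:

```swift
import CoreGraphics

// Observes left and right mouse-down events system-wide with a listen-only tap.
func startClickMonitoring() -> CFMachPort? {
    let mask = (1 << CGEventType.leftMouseDown.rawValue) | (1 << CGEventType.rightMouseDown.rawValue)

    let tap = CGEvent.tapCreate(
        tap: .cgSessionEventTap,
        place: .headInsertEventTap,
        options: .listenOnly,
        eventsOfInterest: CGEventMask(mask),
        callback: { _, type, event, _ in
            // Called for every matching event in the user's login session.
            print("Mouse click of type \(type.rawValue) at \(event.location)")
            return Unmanaged.passUnretained(event)
        },
        userInfo: nil
    )

    if let tap = tap {
        let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
        CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
        CGEvent.tapEnable(tap: tap, enable: true)
    }
    return tap
}
```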
Replies: 0 · Boosts: 0 · Views: 386 · Activity: Jan ’25

Registering a macOS app for dynamic text sizing in macOS 15
macOS 15 includes a neat section in System Settings to change the dynamic text size, as outlined here: https://support.apple.com/guide/mac-help/make-text-and-icons-bigger-mchld786f2cd/mac However, it's not immediately clear (a) how to get one's app into this list, and (b) whether the usual methods from iOS for reacting to text size even work on macOS. Does anyone have any experience here? Or should I implement my own controls in my app's settings and call it a day? For context, my app is a macOS-native SwiftUI app.
Replies: 1 · Boosts: 0 · Views: 546 · Activity: Jan ’25

SwiftUI image has isAccessibilityElement == false
My SwiftUI app uses an Image with a tap gesture:

```swift
Image(systemName: "xmark.circle.fill")
    .accessibilityIdentifier(kTextFieldClearButton)
    .foregroundColor(.secondary)
    .padding(.trailing, 6)
    .onTapGesture {
        dataSource.textFieldText = ""
    }
```

In a UI test, I want to tap this image to execute its action:

```swift
let clearButton = app.images[kTextFieldClearButton]
clearButton.tap()
```

However, the action is not executed. I then set a breakpoint at clearButton.tap() to execute lldb commands. Here are the results:

```
(lldb) p clearButton.isHittable
t = 439.54s Find the "TextFieldClearButton" Imag (Bool) true e
```

It is a little strange that "Image" has been interrupted by (Bool) true, but the image is hittable. p clearButton.isAccessibilityElement gives:

```
(lldb) p clearButton.isAccessibilityElement
(Bool) false
```

I don't understand why this Image is not an accessibility element. I thought SwiftUI views are accessible by default. What can I do to make it accessible so that clearButton.tap() works as expected?
Replies: 0 · Boosts: 0 · Views: 420 · Activity: Dec ’24

Application "help" menu does not open main help book page
Following the official documentation, I'm trying to create a set of three localised Help Books. The Help Books should be available in Spanish, English and Polish. Presently, I'm trying to complete the English version.

App structure: this is a plugin application consisting of the main app and the plugin. The main app structure looks as follows:

```
. <XcodeProject Top>
├── Localizable.xcstrings
├── MyAppExtension
│   ├── MyAppExtension.swift
│   └── <other swift files>.swift
├── MyApp
│   ├── Info.plist
│   ├── +Array.swift
│   ├── +ButtonStyle.swift
│   ├── <other app swift files>.swift
├── Resources
│   └── MyApp.help
└── MyApp.help
    └── Contents
        ├── Info.plist
        └── Resources
            ├── English.lproj
            │   ├── ExactMatch.plist
            │   ├── InfoPlist.strings
            │   ├── MyApp.helpindex
            │   ├── MyApp.html
            │   └── pgs
            └── shrd
```

MyApp / MyApp.help / Info.plist file consists of the following values:
- Bundle name: MyApp
- HPDBookAccessPath: MyApp.html
- HPDBookTitle: My App Help
- Default localization: en_gb

MyApp / Info.plist file contains the following entries:
- Help Book directory name: MyApp.help
- Help Book Identifier: MyApp Help

Build phase: the Copy Bundle Resources phase copies MyApp.help into MyApp/Resources.

Questions:
- Is the provided folder structure valid for creating localised Help Books?
- Is there anything missing from the Info.plist files, or in the wrong place?
- Why does MyApp -> Help open the main help menu, not the app's help?
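
For reference, a sketch of the raw Info.plist keys behind the Xcode display names used above ("Help Book directory name" is CFBundleHelpBookFolder, "Help Book Identifier" is CFBundleHelpBookName); the values shown are taken from the post or are illustrative. One thing commonly worth double-checking: the identifier the app declares in CFBundleHelpBookName is generally expected to match the identifier the help bundle declares for itself, and a mismatch typically makes the Help menu fall back to the generic macOS help.

```xml
<!-- MyApp/Info.plist (app target) -->
<key>CFBundleHelpBookFolder</key>
<string>MyApp.help</string>
<key>CFBundleHelpBookName</key>
<string>MyApp Help</string>

<!-- MyApp.help/Contents/Info.plist (help bundle), keys listed in the post -->
<key>HPDBookAccessPath</key>
<string>MyApp.html</string>
<key>HPDBookTitle</key>
<string>My App Help</string>
```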
Replies: 3 · Boosts: 0 · Views: 473 · Activity: Dec ’24

Accessibility Localization Questions
For practice, I have implemented accessibility labels and announcements in a very simple test app (all SwiftUI, all iOS 18). The app is not localized; the default language is English. When running this on a German phone, odd things happen in the localization. My accessibility labels are read with an accent, but when they contain a URL, the "dots" are read as the German "Punkt" (with an English accent). When I provide the same text as an accessibility announcement, the same text (which is in English) is read with a German voice. I am also providing a Button with an "arrow.clockwise" image, and VoiceOver reads this, in an English voice, as "Refresh, Button". This is great and was to be expected. However, when the button is disabled, VoiceOver reads "Refresh, grau dargestellt, Button", all in an English voice. Is this an error? Am I doing it wrong? The video at the link should show the issue: https://share.icloud.com/photos/0757FJW2Q3fsA_cdhMX6ls46Q
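
Regarding the announcement being read with a German voice: one documented way to pin the spoken language of an announcement is to post an attributed string carrying the accessibilitySpeechLanguage attribute (a UIKit-level API). A minimal sketch, assuming English text; the function name is illustrative:

```swift
import UIKit

// Forces VoiceOver to speak an announcement with an English voice by
// attaching the accessibilitySpeechLanguage attribute. Without it, the
// announcement is spoken by the device's current VoiceOver voice.
func announceInEnglish(_ text: String) {
    let attributed = NSAttributedString(
        string: text,
        attributes: [.accessibilitySpeechLanguage: "en-US"]
    )
    UIAccessibility.post(notification: .announcement, argument: attributed)
}
```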
Replies: 2 · Boosts: 0 · Views: 1.2k · Activity: Dec ’24

Xcode 16.1 broken accessibilityLabel
Accessibility broke after updating to Xcode 16.1. There is a call to accessibilityLabel that sets an a11y label for the title of a view. This used to work (pronounced by VoiceOver) with Xcode 15.4 + iOS 17.5. With Xcode 16.1 + iOS 18.1, on a physical device and in the iOS Simulator, Accessibility Inspector shows no a11y label set. I tried Xcode 16.2 beta 3 with the same result: accessibilityLabel does not work and the a11y label is not set.
Replies: 1 · Boosts: 0 · Views: 497 · Activity: Dec ’24

Fullscreen API web standard is unacceptably missing on iPhone
It is outrageous that Apple continue to fail to implement the Fullscreen API web standard for web apps on iPhone only, which is so important to accessibility and web app functionality. The only possible reason for this block is commercial: to promote iOS apps instead of browser-based web apps.

To quote a client from a major agency just now - a typical enquiry:

"We value accessibility greatly, and we noticed that the embedded player is missing a full screen button on iPhone. Everything else works perfectly fine, including a full screen button that appears on the mobile webpage on Android devices. Is there any way we can include a button to enable full screen view for our viewers in your player that are going to watch it on iOS devices?"

To which, as usual, we have to reply:

"Apple unfortunately block fullscreen mode from being used with all web applications on iPhone. Apple will allow this to be displayed fullscreen on MacBooks and iPads, but currently not on iPhone - so we have to hide the fullscreen button there. So fullscreen works on all devices and browsers apart from on iPhone. As you've seen with Android, all other devices and browsers follow the universal 'Fullscreen API' web standard to allow full screen. You're probably familiar with seeing the fullscreen button on normal linear videos on iPhone. These use Apple's native video player, which doesn't let buttons and scripts be used on top of it - just a single video, not an interactive web application. Our player looks like a video player but it is actually a web app combining multiple different video clips connected together by code and styling. They block it on iPhones for reasons known only to them, but the assumption is that it is to incentivise people to make iOS apps instead of web apps. The web development community is hopeful that Apple will change this unfortunate restriction soon, but we have been waiting a long time in vain."

We have to send this to a lot of people. It's a very bad look for Apple. In less than a month it will be 2025. We have been waiting years for this.

The web standard documentation showing universal support on other devices and browsers is here: https://developer.mozilla.org/en-US/docs/Web/API/Fullscreen_API

This is not acceptable. It is time for Apple to stop blocking this important accessibility web standard for commercial reasons - only on iPhone. To whoever is in charge of these decisions in the Safari/WebKit team: please just enable the Fullscreen API for web apps on iPhone as soon as possible.
Replies: 3 · Boosts: 2 · Views: 944 · Activity: Dec ’24

German VoiceOver says "millibars" instead of "megabytes"
In SwiftUI, iOS 18.1.1, Xcode 16.1, the following control:

```swift
Text(12345678, format: .byteCount(style: .binary))
```

displays text with the MB (megabytes) unit, but German VoiceOver reads it as "millibars". I tried explicitly specifying the units with:

```swift
Text(12345678, format: .byteCount(style: .memory, allowedUnits: .mb))
```

but the result is the same (German VoiceOver still says "millibars"). Aside from creating my own accessibility label, is there any way to work around that?
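
The post asks for something other than a custom accessibility label, but for completeness, a sketch of that label-based workaround using ByteCountFormatStyle's spellsOutUnit option, which makes VoiceOver read the unit as a word ("megabytes") instead of the misread "MB"; the view name is illustrative:

```swift
import SwiftUI

struct FileSizeText: View {
    let bytes: Int64 = 12_345_678

    var body: some View {
        Text(bytes, format: .byteCount(style: .binary))
            // Visually still shows "MB"; VoiceOver reads the spelled-out unit,
            // avoiding the "millibars" misreading by the German voice.
            .accessibilityLabel(
                Text(bytes, format: .byteCount(style: .binary, spellsOutUnit: true))
            )
    }
}
```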
Replies: 3 · Boosts: 0 · Views: 492 · Activity: Dec ’24