In VoiceOver, when using Group Navigation style, the cursor first focuses on the semantic group. To navigate inside the group, a two-finger swipe (left or right) can be used. This behavior works for default containers like the Navigation Bar, Tab Bar, and Tool Bar.
How can I achieve the same behavior for a custom view?
I tried setting accessibilityContainerType = .semanticGroup, but it only works for Mac Catalyst. Is there an equivalent approach for iOS?
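For reference, a minimal sketch of the approach described above, assuming a hypothetical custom container view (GaugeClusterView is an illustrative name); on iOS the container type appears to have no effect on Group Navigation, which is exactly the behavior in question:

import UIKit

final class GaugeClusterView: UIView {
    override func didMoveToWindow() {
        super.didMoveToWindow()

        // Expose the view as a single semantic group to assistive technologies.
        // Documented for Mac Catalyst; on iOS this appears to be ignored by
        // VoiceOver's Group Navigation style.
        accessibilityContainerType = .semanticGroup

        // Keep the subviews as individual accessibility elements inside the group.
        isAccessibilityElement = false
    }
}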
Accessibility
Make your apps function for a broad range of users using Accessibility APIs across all Apple platforms.
Posts under Accessibility tag
VoiceOver reads out all visible content on the screen, which is essential for visually challenged users. However, this raises a privacy concern—what if a user accidentally focuses on sensitive information, like a bank account password, and it gets read aloud?
How can developers prevent VoiceOver from exposing confidential data while still maintaining accessibility? Are there best practices or recommended approaches to handle such scenarios effectively?
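A minimal sketch of some common techniques, assuming a hypothetical screen with a password field and a balance label (the names and strings are illustrative, not from any particular app):

import UIKit

final class AccountViewController: UIViewController {
    let passwordField = UITextField()
    let balanceLabel = UILabel()

    override func viewDidLoad() {
        super.viewDidLoad()

        // 1. Secure text entry: VoiceOver announces masked characters
        //    instead of the actual password.
        passwordField.isSecureTextEntry = true

        // 2. Override what VoiceOver announces for a sensitive value
        //    while keeping the element reachable.
        balanceLabel.text = "$12,345.67"
        balanceLabel.accessibilityLabel = "Balance hidden"

        // 3. Or remove the element from the accessibility tree entirely
        //    until the user explicitly chooses to reveal it.
        // balanceLabel.isAccessibilityElement = false
    }
}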
I’m currently focused on an element at the bottom of the screen. What is the proper way to quickly navigate to the top element? By default, there’s a four-finger single tap to move to the first element, but should I use the Rotor action instead to focus on the element I need?
For example, in the Contacts app while adding a new contact, if I enter a value in a field at the bottom, there’s no quick way to directly save the contact. I have to manually navigate all the way to the top to tap the Done button, which feels a bit inconvenient.
Is there a better way to handle this using VoiceOver?
I have a UITextField in my application for entering a state. If the user taps it, a UIPickerView pops up and lets them select a state (they can still type as well).
The issue relates to Full Keyboard Access. If the UITextField is selected with an external keyboard, the UIPickerView appears, but to reach it the user has to tab through the entire view controller because the picker sits at the end of the focus order.
What would be nice is to a) move focus directly to the UIPickerView (have it highlighted in blue and scrollable with the keyboard right away), or b) make the UIPickerView the next view reached when tabbing or using the arrow keys.
I've tried using:
UIAccessibility notifications (both .screenChanged and .layoutChanged, with and without a delay). This ended up only announcing the view, but didn't help with full keyboard access.
Making the UIPickerView a first responder when it appears.
Attempting to change the accessibilityElements order (but with so many views and views within views, this isn't really a viable option either).
Pressing tab + -> (tab and right arrow button) will quickly take the user to the end of the chain of accessibility elements, in other words, to the UIPickerView. But there has to be a cleaner way of just automatically setting the focus to the UIPickerView or making it the next element by pressing the arrow key.
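One direction that might be worth trying (a rough sketch, not a confirmed fix; StatePickerViewController and statePicker are hypothetical names) is to point the focus system at the picker when it appears and request a focus update:

import UIKit

final class StatePickerViewController: UIViewController {
    let statePicker = UIPickerView()
    private var pickerShouldTakeFocus = false

    override var preferredFocusEnvironments: [UIFocusEnvironment] {
        // While the picker is on screen, prefer it for keyboard focus.
        pickerShouldTakeFocus ? [statePicker] : super.preferredFocusEnvironments
    }

    func showPicker() {
        statePicker.isHidden = false
        pickerShouldTakeFocus = true

        // Ask UIKit to re-evaluate focus; with Full Keyboard Access enabled
        // this may move the focus ring to the picker.
        setNeedsFocusUpdate()
        updateFocusIfNeeded()
    }
}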
Hello,
I struggle to do some UI testing using accessibility identifiers when I wrap some SwiftUI view using UIHostingController (our app is mainly coded using UIKit).
Considering this SwiftUI view (simplified for this post):
HStack {
    Text(self.title.uppercased())
        .albusTheme(.header)
        .lineLimit(self.isMultiline ? nil : 1)
        .multilineTextAlignment(.leading)
        .accessibilityAddTraits(.isStaticText)
        .accessibilityIdentifier("section_title")
}
This view and its controller are embedded as a UITableViewHeaderFooterView in a UITableView.
This is an extract of recursiveDescription output:
| | | | | | <_UITableViewHeaderFooterContentView: 0x1076ad720; frame = (0 0; 393 40); layer = <CALayer: 0x6000006b1720>>
| | | | | | | <_TtGC13ListComponent19SwiftUIFieldContentV20ListComponentLibrary17FormSectionHeader_: 0x1076ab980; baseClass = UIControl; frame = (0 0; 393 40); layer = <CALayer: 0x6000006b1da0>>
| | | | | | | | <_TtGC7SwiftUI14_UIHostingViewV20ListComponentLibrary17FormSectionHeader_: 0x1078f9600; frame = (0 0; 393 40); gestureRecognizers = <NSArray: 0x600000e25d70>; backgroundColor = UIExtendedSRGBColorSpace 0.0666667 0.133333 0.227451 1; layer = <SwiftUI.UIHostingViewDebugLayer: 0x6000006b19a0>>
| | | | | | | | | <_TtCOCV7SwiftUI11DisplayList11ViewUpdater8Platform13CGDrawingView: 0x106985550; frame = (16 12.6667; 147.667 14.6667); anchorPoint = (0, 0); opaque = NO; autoresizesSubviews = NO; layer = <_TtCOCV7SwiftUI11DisplayList11ViewUpdater8PlatformP33_65A81BD07F0108B0485D2E15DE104A7514CGDrawingLayer: 0x6000026b8240>>
CGDrawingView seems to hide the underlying view hierarchy. Is there a way to reach these accessibility identifiers when SwiftUI is embedded in UIKit like this?
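For what it's worth, a minimal sketch of how the identifier could be queried from a UI test; because the text is drawn by CGDrawingView, the identifier may surface as a staticText or as an otherElement rather than on a concrete UIView:

import XCTest

final class SectionHeaderUITests: XCTestCase {
    func testSectionTitleIsExposed() {
        let app = XCUIApplication()
        app.launch()

        // The accessibility hierarchy is independent of the UIView hierarchy
        // printed by recursiveDescription, so try both element types.
        let byStaticText = app.staticTexts["section_title"]
        let byOtherElement = app.otherElements["section_title"]
        let title = byStaticText.exists ? byStaticText : byOtherElement
        XCTAssertTrue(title.waitForExistence(timeout: 2))
    }
}

If neither query matches, one workaround to try (unverified) is to combine the children and move the identifier to the container, e.g. .accessibilityElement(children: .combine).accessibilityIdentifier("section_title") on the HStack, so the combined element carries the identifier.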
In some places of our app we make use of NSAccessibilityElement subclasses to vend some extra items to accessibility clients.
We need to know which item has the VoiceOver focus so we can keep track of it.
setAccessibilityFocused: does not get called when accessibility clients focus NSAccessibilityElements. This method is only called when accessibility clients focus view-based accessibility elements (i.e. when a NSView subclass gets focused).
At the same time we need to programmatically move VoiceOver focus to those items when something happens. Those accessibility elements inherit from NSObject so we can't make them first responder.
Is this the expected behavior? What are our options in terms of reacting to VoiceOver cursor moving around? What are our options in terms of programmatically moving the VoiceOver cursor to a different element?
Here's a sample project that demonstrates the first part of the issue: https://github.com/vendruscolo/apple-rdars/tree/master/DTS12368714%20-%20NSAccessibilityElement%20focus%20tracking
If you run the app, a window will show up. It contains a button and a red square. If you enable VoiceOver you'll be able to move the cursor over the red square, and a message will be logged. You'll also notice there's an extra element after the red square. That element is available to VoiceOver, however when it gets focused, no message gets logged.
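For context, a rough sketch of the two hooks in question (RedSquareElement stands in for the NSAccessibilityElement subclass from the sample project; posting .focusedUIElementChanged is a commonly suggested, but not guaranteed, way to move the VoiceOver cursor):

import AppKit

final class RedSquareElement: NSAccessibilityElement {
    // The focus-tracking hook that, as described above, is apparently not
    // called for non-view accessibility elements.
    override func setAccessibilityFocused(_ focused: Bool) {
        super.setAccessibilityFocused(focused)
        NSLog("RedSquareElement focused: \(focused)")
    }
}

func moveVoiceOverFocus(to element: NSAccessibilityElement) {
    // Announce that the focused UI element changed, pointing at the element.
    NSAccessibility.post(element: element, notification: .focusedUIElementChanged)
}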
iOS 18.3.1, iPhone 16 Pro.
I pick photos using connected physical keyboard from the user's photo library using:
.photosPicker(isPresented: $viewModel.isImagePickerPresented, selection: $viewModel.selectedImageItem, matching: .images)
When the picker appears, accessibility focus moves to the Dynamic Island instead of the Cancel button. There is no way to navigate the photos picker with the keyboard without tapping the view to move focus into it manually. I noticed the same behavior in the Notes app.
Using the new floating tab bar in iOS 18 on iPad with an RTL language, the tab bar is correctly flipped and the right buttons become left buttons, but the buttons are pushed off the screen on initial load. They come back after some navigating/reloading. I have recreated this in a fresh project.
I’ve tried implementing the accessibilityPerformMagicTap() method in a specific UIViewController, its view, and even in AppDelegate, but I am not receiving any callbacks.
I directly overrode this method in the mentioned areas, but it never gets triggered when performing a magic tap.
How can I properly observe and handle the accessibilityPerformMagicTap() action?
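For reference, a minimal sketch of the override, assuming a hypothetical PlayerViewController; the gesture is delivered to the element with VoiceOver focus first and then travels up the responder chain, so an override in a view controller (or the app delegate) is only reached if nothing closer to the focused element handles it:

import UIKit

final class PlayerViewController: UIViewController {
    override func accessibilityPerformMagicTap() -> Bool {
        // Perform the app's most important action, e.g. toggle playback.
        togglePlayback()
        return true   // true marks the gesture as handled
    }

    private func togglePlayback() { /* ... */ }
}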
I’m currently exploring VoiceOver accessibility in iOS and looking for the best way to reduce the number of swipes required to navigate a UITableView. I’ve come across a couple of potential solutions but am unsure which is preferred.
Solution 1: Grouping Subviews in Each Cell
Combine all subviews inside a UITableViewCell into a single accessibility element.
Provide a concise and meaningful accessibilityLabel.
Use custom actions (UIAccessibilityCustomAction) or accessibilityActivationPoint to handle interactions on specific elements within the cell.
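A minimal sketch of Solution 1, assuming a hypothetical cell with a title, a subtitle, and a delete button:

import UIKit

final class ContactCell: UITableViewCell {
    let titleLabel = UILabel()
    let subtitleLabel = UILabel()
    let deleteButton = UIButton(type: .system)

    func configureAccessibility() {
        // Collapse the whole cell into a single VoiceOver stop.
        isAccessibilityElement = true
        accessibilityLabel = "\(titleLabel.text ?? ""), \(subtitleLabel.text ?? "")"

        // Keep the button reachable through the actions rotor instead of
        // requiring an extra swipe per cell.
        accessibilityCustomActions = [
            UIAccessibilityCustomAction(name: "Delete") { [weak self] _ in
                self?.deleteTapped()
                return true
            }
        ]
    }

    private func deleteTapped() { /* ... */ }
}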
Solution 2: Using UIAccessibilityContainerDataTableCell & UIAccessibilityContainerDataTable
Implement UIAccessibilityContainerDataTable for structured table navigation.
Make each cell conform to UIAccessibilityContainerDataTableCell, defining its row and column positions.
However, I’m finding this approach a bit complex, and I need guidance on properly implementing these protocols.
Additionally, in my case, VoiceOver is not navigating to Section 2—I’m not sure why.
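A rough, unverified sketch of how Solution 2 might look for a single-column table (names are hypothetical; header elements and multi-section handling are left out):

import UIKit

final class GridCell: UITableViewCell, UIAccessibilityContainerDataTableCell {
    var row = 0
    var column = 0

    func accessibilityRowRange() -> NSRange { NSRange(location: row, length: 1) }
    func accessibilityColumnRange() -> NSRange { NSRange(location: column, length: 1) }
}

final class GridTableView: UITableView, UIAccessibilityContainerDataTable {
    func accessibilityDataTableCellElement(forRow row: Int, column: Int) -> UIAccessibilityContainerDataTableCell? {
        cellForRow(at: IndexPath(row: row, section: 0)) as? GridCell
    }

    func accessibilityRowCount() -> Int { numberOfRows(inSection: 0) }
    func accessibilityColumnCount() -> Int { 1 }
}

The table view would also advertise itself as a data table, e.g. by setting accessibilityContainerType = .dataTable.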
Questions:
Which of these approaches is generally preferred for better VoiceOver navigation?
How do I properly implement UIAccessibilityContainerDataTable so that all sections and rows are navigable?
Any best practices or alternative recommendations?
Would really appreciate any insights or guidance!
Hi,
I am setting an accessibilityLabel and accessibilityHint property of a UIAlertAction. However, VoiceOver is only reading the label out. Usually, the label is read out, followed by a short pause and then the hint. Is this a known issue, where hints do not work for this element? I can append the hint to the label, but interested to know if there's something I'm doing wrong.
Regards.
Hello,
I am a student studying accessibility.
I aim to analyze the smartphone usage patterns of visually impaired individuals.
Therefore, I would like to log the VoiceOver usage records of visually impaired iPhone users.
Is there a way to output VoiceOver logs, similar to the AccessibilityService API on Android?
Thank you in advance for your responses.
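As far as I know, there is no public iOS equivalent of Android's AccessibilityService that can observe other apps. Within your own study app, though, VoiceOver status and focus changes can be logged; a minimal sketch:

import UIKit

final class VoiceOverLogger {
    func start() {
        print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")

        // Log whenever VoiceOver is toggled on or off.
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.voiceOverStatusDidChangeNotification,
            object: nil, queue: .main) { _ in
                print("VoiceOver running: \(UIAccessibility.isVoiceOverRunning)")
        }

        // Log each element that receives VoiceOver focus inside this app.
        NotificationCenter.default.addObserver(
            forName: UIAccessibility.elementFocusedNotification,
            object: nil, queue: .main) { note in
                let element = note.userInfo?[UIAccessibility.focusedElementUserInfoKey]
                print("VoiceOver focused: \(String(describing: element))")
        }
    }
}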
In our application we are using OTP login. With Full Keyboard Access enabled, when entering the OTP, on iOS 17 focus moves to the next text field after each digit, but on iOS 18 focus stays on the first OTP field and does not move to the next one.
In our application we are using a popover view with VoiceOver enabled. When the user is navigating inside the popover and reaches the last element, we need a right swipe to dismiss the popover.
In our application we are using a search bar in a popover view, with Full Keyboard Access enabled and an external keyboard in use. When focus is on the search bar, the next Tab key press should dismiss the search bar and move focus to the next UI element.
In our application we are using a UITableView for data population, and the table view cell contains a button. With Full Keyboard Access enabled, only the table view cell receives focus, not the button. We need the cell and the button to be focusable separately.
I am trying to grant Input Monitoring permission using MDM (Mobile Device Management), but I am facing issues. While I am able to deny the permission, I am unable to grant it.
In some profile configurator tools, I noticed a note stating:
"Allows the application to use CoreGraphics and HID APIs to listen to (receive) CGEvents and HID events from all processes. Access to these events cannot be given in a profile; it can only be denied."
This seems to suggest that granting Input Monitoring permission via an MDM profile may not be possible.
Has anyone successfully granted Input Monitoring permission using MDM, or is there an alternative way to achieve this on managed macOS devices?
In our application we are using UIAlertController, presented as a popover. With Full Keyboard Access enabled, pressing the Esc key on an external keyboard does not dismiss it. We need to dismiss the alert controller with the Esc key.
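One approach that might be worth trying (a rough sketch, assuming the alert is presented from this hypothetical HostViewController): register a key command for the Escape key and dismiss the presented controller when it fires.

import UIKit

final class HostViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        // Escape key from an external keyboard.
        addKeyCommand(UIKeyCommand(action: #selector(dismissAlert),
                                   input: UIKeyCommand.inputEscape))
    }

    @objc private func dismissAlert() {
        presentedViewController?.dismiss(animated: true)
    }
}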
Hello all.
Currently I am trying to get WKWebView to scroll with a physical keyboard and it just will not work. I tried allowsKeyboardScrolling and it did not work. UIWebView works, but it's no longer supported. I'm trying to get Full Keyboard Access to work to make our app more accessible, but WKWebView does not want to play nice.
Has anyone else had issues trying to use WKWebView with an external keyboard, and if so did you find any solutions? Greatly appreciated!
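For reference, allowsKeyboardScrolling is a UIScrollView property (iOS 17+), so it would be set on the web view's scrollView rather than on the WKWebView itself; a small, unverified sketch:

import UIKit
import WebKit

func configureKeyboardScrolling(for webView: WKWebView) {
    if #available(iOS 17.0, *) {
        webView.scrollView.allowsKeyboardScrolling = true
    }
    // The web content also needs key focus, e.g. via Full Keyboard Access
    // or by making the web view first responder.
    webView.becomeFirstResponder()
}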
The Home button cannot be pressed. Attempting to enable Developer Mode says "press home to continue." The usual AssistiveTouch accessibility switch shown on iOS is not available on this screen, so Developer Mode cannot be enabled. This is an accessibility issue.