Explore best practices for creating inclusive apps that cater to users with diverse abilities


Posts under General subtopic

Post | Replies | Boosts | Views | Activity

SpaceBar Not functioning as expected
When I am doing a file search, in TextEdit, and on certain websites, the space bar quits functioning as soon as I start typing. If I hold down the Option key, the space bar works as normal. I have checked every setting I can think of and nothing has helped.
3
0
98
Apr ’25
Attaching procedural audio to an ARKit SCNNode
I’m developing an ARKit application where I aim to attach procedurally generated audio to detected planes in the environment. While using a static audio file with SCNAudioSource and SCNAudioPlayer works as expected, integrating procedural audio via AVAudioSourceNode does not produce any sound, nor does it generate any error messages (Stack Overflow post).

Working implementation with a static audio file:

    let audioPlayer = SCNAudioPlayer(source: audioSource)
    node.addAudioPlayer(audioPlayer)

Attempted implementation with procedural audio:

    // Audio generation code
    let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
    node.addAudioPlayer(audioPlayer)

In this setup, the AVAudioSourceNode successfully generates audio when connected directly to an AVAudioEngine. However, when used with SCNAudioPlayer and attached to an SCNNode, it fails to produce sound. What doesn’t work is creating procedural audio with an AVAudioNode, as documented here: Apple docs.

Additionally, I explored the WWDC18 AR game project, SwiftShot, which uses SCNAudioPlayer(avAudioNode:). After updating it for the latest Xcode, the graphics work correctly, but the audio does not play.

I also noted that the Apple documentation for this initializer states: "Using this initializer is typically not necessary. Instead, call the audioPlayerWithAVAudioNode: method, which returns a cached audio player object if one for the specified AVAudioNode object has already been created and is available for use." However, the audioPlayerWithAVAudioNode: method does not appear to be available in Swift.

Any insights or guidance on this matter would be greatly appreciated.
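For reference, a minimal sketch of the procedural setup the post describes, with an assumed 440 Hz sine-wave render block standing in for the elided "Audio generation code" and an assumed 44.1 kHz output format; names such as audioNode and node are illustrative only:

    import SceneKit
    import AVFoundation

    // Hypothetical sine-wave generator standing in for the post's audio generation code.
    let sampleRate = 44_100.0
    var phase = 0.0
    let audioNode = AVAudioSourceNode { _, _, frameCount, audioBufferList -> OSStatus in
        let buffers = UnsafeMutableAudioBufferListPointer(audioBufferList)
        let phaseStep = 2.0 * Double.pi * 440.0 / sampleRate   // 440 Hz tone
        for frame in 0..<Int(frameCount) {
            let value = Float(sin(phase))
            phase += phaseStep
            for buffer in buffers {
                let samples: UnsafeMutableBufferPointer<Float> = UnsafeMutableBufferPointer(buffer)
                samples[frame] = value
            }
        }
        return noErr
    }

    // Attach the procedural node to a SceneKit node, as in the post.
    let node = SCNNode() // stands in for the detected plane's node
    let audioPlayer = SCNAudioPlayer(avAudioNode: audioNode)
    node.addAudioPlayer(audioPlayer)

As the post notes, this same source node produces sound when wired directly into an AVAudioEngine, so the sketch only illustrates the SceneKit attachment path that reportedly stays silent.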
0
0
120
Apr ’25
SwiftUI List Accessibility VoiceOver
I have been working on a feature where I have a List in SwiftUI with previous and next page loading: the user can scroll up and down to load the previous/next page of data. Recently, I hit an accessibility issue while testing with VoiceOver. When the user lands on the listing screen and swipes across the screen from the navigation, and focus reaches the list, it should highlight the first visible item. But when the user swipes back: should it load the previous page and announce the previous item, or should focus return to the navigation items? If it loads the previous item, what if the user wants to go back to the navigation to switch to other actions, and vice versa? Did anyone come across this kind of issue? What is the standard expected behavior when a list supports both previous and next page scrolling? I tried different gestures (https://support.apple.com/en-in/guide/iphone/iph3e2e2281/ios), but it isn't working.
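One option that is sometimes suggested for keeping VoiceOver focus predictable across page loads is to drive focus explicitly with AccessibilityFocusState; a minimal sketch, assuming a simple Item model with a stable id and a hypothetical loadPreviousPage() helper, neither of which is from the post:

    import SwiftUI

    struct Item: Identifiable {
        let id: Int
        let title: String
    }

    struct PagedListView: View {
        @State private var items: [Item] = (0..<20).map { Item(id: $0, title: "Item \($0)") }
        // Tracks which row currently holds VoiceOver focus.
        @AccessibilityFocusState private var focusedItemID: Int?

        var body: some View {
            List(items) { item in
                Text(item.title)
                    .accessibilityFocused($focusedItemID, equals: item.id)
            }
            .onChange(of: focusedItemID) { newValue in
                // When focus reaches the first loaded row, prepend the previous page
                // and keep VoiceOver focus on the row the user was already on.
                if let id = newValue, id == items.first?.id {
                    loadPreviousPage()
                    focusedItemID = id
                }
            }
        }

        // Hypothetical pager: prepends an earlier page of items.
        private func loadPreviousPage() {
            let firstID = items.first?.id ?? 0
            let newItems = ((firstID - 10)..<firstID).map { Item(id: $0, title: "Item \($0)") }
            items.insert(contentsOf: newItems, at: 0)
        }
    }

This keeps the announcement on the item the user was focused on rather than letting focus jump back to the navigation, but whether that is the "standard" behavior is exactly the open question in the post.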
2
0
96
Apr ’25
Using WebSocket for BCI Click Input in VisionOS - FocusState vs. System-Level Limitations
Hi everyone,

My team and I are developing an accessibility-focused visionOS app (MindTap) as part of a university project, aiming to support individuals with locked-in syndrome by using Brain-Computer Interface (BCI) signals to trigger interactions (e.g., tapping) within the Apple Vision Pro environment.

Problem 1: Simulating eye tracking in the Simulator. We are testing onHover with "Send pointer to the device" under I/O > Input in the Simulator, and while it mostly works (a bit laggy), we found that onHover won't function on actual Vision Pro hardware. From what I understand, we should be using FocusState for proper gaze interaction, but testing this requires the physical device. Is there any workaround or official Apple-recommended way to simulate focus-based gaze detection without a real Vision Pro?

Problem 2: The WebSocket-triggered "click" doesn't work outside the app. We successfully use WebSocket to send a custom signal (a "1" from the brain signal device) to trigger an action inside our app. However, when the user opens a third-party app like Apple News, the WebSocket-triggered "click" no longer works. We suspect this is due to sandbox restrictions or a lack of system-level permissions. Is it possible in any way to:

- Trigger interaction events outside the app using custom input (like BCI via WebSocket)?
- Access system-wide click/tap simulation APIs from within visionOS apps?
- Integrate this with accessibility services (like Voice Control or AssistiveTouch)?

We'd appreciate any official guidance or tips from others building similar accessibility apps with alternative input methods on visionOS. Thanks in advance for any insight you can provide!
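For what it's worth, the FocusState pattern mentioned in Problem 1 would look roughly like the sketch below. This is an assumption about the intended approach, not a confirmed way to capture gaze on device, and the WebSocket hookup is only indicated by comments:

    import SwiftUI

    struct BCITargetView: View {
        // System focus state for this control.
        @FocusState private var isFocused: Bool
        @State private var lastAction = "none"

        var body: some View {
            Button("Select") {
                lastAction = "activated"   // normal activation path (pinch, Select, etc.)
            }
            .focusable(true)
            .focused($isFocused)
            .onChange(of: isFocused) { focused in
                // Hypothetical integration point: while this control has focus,
                // a "1" arriving over the app's WebSocket could invoke the same
                // action the Button performs.
                if focused { lastAction = "focused" }
            }
        }
    }

Nothing in this sketch addresses Problem 2; triggering taps in other apps would still depend on system-level capabilities that a sandboxed visionOS app does not appear to have.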
0
0
68
Apr ’25
SwiftUI tvOS Accessibility VoiceOver - prevent reading all items in ScrollView over and over
Hi, I'm trying to fix a tvOS view for the VoiceOver accessibility feature:

    TabView { // 5 tabs
        Text(title)
        Button(play)
        ScrollView { // Live
            LazyHStack { 200 items }
        }
        ScrollView { // Continue watching
            LazyHStack { 500 items }
        }
    }

When the view shows up, VoiceOver reads: "Home tab 1 of 5, Item 2". I'm not sure why it reads "Item 2" of the first cell in the scroll view; maybe because it just got loaded by the LazyHStack. VoiceOver should only read "Home tab 1 of 5".

When moving focus to the scroll view it reads: "Live, Item 1" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the second item it reads: "Item 2" and after a slight delay "Item 1, Item 2, Item 3, Item 4".
When moving focus to the third item it reads: "Item 3" and after a slight delay "Item 1, Item 2, Item 3, Item 4".

It should just read what is focused, ideally "Live, Item 1, 1 of 200", then after moving focus to item 2 "Item 2, 2 of 200", this time without the word "Live" because we are on the same scroll view (the same horizontal list).

Currently the app is unusable. We have visually impaired testers, and this rotor reading everything on the screen is totally confusing, because users don't know where they are and what is actually focused. This is a video streaming app and we are streaming all the time, even on the home page in the background; it binge-plays one item after another, usually there is a never-ending live stream playing, and the user can switch the TV channel, but we continue to play. VoiceOver should only read what's focused after user interaction. The original Apple TV app does not do that, so it cannot be caused by some verbose accessibility setting; it reads back only the focused item in scrolling lists.

How do I disable reading content that is not focused? I tried:

- .accessibilityLabel(isFocused ? title : "")
- .accessibilityHidden(!isFocused)
- .accessibilityHidden(true) - tried on various levels in the view hierarchy
- .accessibilityElement(children: .ignore) - even the focused item is not read back by VoiceOver
- .accessibilityElement(children: .contain) - tried on various levels in the view hierarchy
- .accessibilityElement(children: .combine) - tried on various levels in the view hierarchy
- .accessibilityAddTraits(.isHeader) - tried on various levels in the view hierarchy
- .accessibilityRemoveTraits(.isHeader) - tried on various levels in the view hierarchy (the last two were basically an attempt to hack it)
- .accessibilityRotor("", ranges: []) - another hack that I tried on the ScrollView, the LazyHStack, and the top-level view
- 50+ other attempts at configuring accessibility modifiers attached to views

I have seen all the accessibility videos and tried all the sample code projects. I haven't found a solution anywhere, an internet search didn't find anything, and AI didn't help as it can only provide code that someone else wrote before. Any idea how to fix this? Thanks.
1
0
87
Apr ’25
Handling VoiceOver Focus When Screen Changes (Push, Present, and SplitViewController)
I have some doubts about how VoiceOver handles focus when the screen updates. When a new UIViewController is pushed onto a UINavigationController or presented modally, how does VoiceOver decide which element to focus on? Is there a way to control or customize this behavior? In a UISplitViewController, when an item is selected in the primary view controller, the focus should shift to the relevant content in the secondary view controller. How can we ensure that VoiceOver correctly moves focus to the right element in the secondary panel?
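In case it helps the discussion, a minimal sketch of the notification-based way to steer VoiceOver focus after a push, a modal presentation, or a split-view selection; the detailHeading label is illustrative only:

    import UIKit

    final class DetailViewController: UIViewController {
        // Illustrative element we want VoiceOver to land on first.
        private let detailHeading = UILabel()

        override func viewDidAppear(_ animated: Bool) {
            super.viewDidAppear(animated)
            // .screenChanged tells VoiceOver a new screen appeared; passing an
            // element moves focus to it instead of the default first element.
            UIAccessibility.post(notification: .screenChanged, argument: detailHeading)
        }
    }

For in-place updates such as refreshing the secondary pane of a UISplitViewController, posting .layoutChanged with the target element is the usual counterpart.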
0
0
102
Apr ’25
VisionOS - Gamepad steals focus
I am developing a visionOS app for controlling an underwater ROV. I have ornaments with telemetry and buttons around a central video feed view. I have custom button mappings, such as "A" for locking the depth of the drone. However, when I look at buttons or certain ornaments, my custom gamepad logic is prevented from running. This means that when a SwiftUI Button gains focus on visionOS, pressing the controller's A button triggers the system's default "click" on that Button rather than my custom buttonA handler. Essentially, focus interception by the system is stealing my A-press events and preventing my custom gamepad logic from running. Is there a way to disable the built-in gamepad interaction and only allow my custom gamepad mappings?
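For context, the kind of custom mapping the post describes is typically wired up through GameController handlers; a minimal sketch, where lockDepth() is a hypothetical stand-in for the ROV command and the focus-interception problem is unchanged:

    import GameController

    final class GamepadInput {
        private var connectObserver: NSObjectProtocol?

        func start() {
            connectObserver = NotificationCenter.default.addObserver(
                forName: .GCControllerDidConnect, object: nil, queue: .main
            ) { [weak self] note in
                guard let controller = note.object as? GCController else { return }
                self?.bind(controller)
            }
        }

        private func bind(_ controller: GCController) {
            // Custom "A" mapping; on visionOS the system may still deliver A as a
            // click to whatever SwiftUI view currently has focus.
            controller.extendedGamepad?.buttonA.pressedChangedHandler = { [weak self] _, _, pressed in
                if pressed { self?.lockDepth() }
            }
        }

        // Hypothetical ROV command.
        private func lockDepth() { print("depth hold engaged") }
    }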
1
0
105
Apr ’25
Camera Crashes
Hi everybody, I'm trying to build a QR code scanner and generator app for iOS. Whenever I try to implement the camera, the app crashes with this message: "This app has crashed because it attempted to access privacy-sensitive data without a usage description. The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data." I tried reducing the app to nothing but the camera, with the same result. Any ideas? Thank you and best regards, Horst Schippers
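As the crash message indicates, the fix is adding an NSCameraUsageDescription entry to the target's Info.plist; a minimal sketch of the permission flow once the key is present (the description string is illustrative):

    import AVFoundation

    // Info.plist must contain, for example:
    //   Key:   NSCameraUsageDescription
    //   Value: "The camera is used to scan QR codes."
    func requestCameraAccess(completion: @escaping (Bool) -> Void) {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            completion(true)
        case .notDetermined:
            // Triggers the system prompt; crashes if the usage description is missing.
            AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
        default:
            completion(false)
        }
    }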
0
0
44
Apr ’25
VoiceOver Not Scrolling to Focused TableView Cell
I have a view dynamically overlaid on a UITableView with proper padding (added when certain conditions are met). When VoiceOver focuses on a cell beneath this overlay, the focused element does not scroll into view. I’ve noticed similar behavior in Apple’s first-party Podcasts app. Please find the attached image for reference. How can I resolve this issue and ensure VoiceOver scrolls the focused cell into view?
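One workaround that is sometimes suggested (an assumption, not something confirmed in the post) is to listen for VoiceOver focus changes and scroll the focused cell clear of the overlay yourself; a rough sketch, where overlayHeight stands in for the height of the overlaid view:

    import UIKit

    final class ListViewController: UITableViewController {
        // Assumed: height of the view overlaid on top of the table.
        private let overlayHeight: CGFloat = 120
        private var focusObserver: NSObjectProtocol?

        override func viewDidLoad() {
            super.viewDidLoad()
            // Reserve space so rows can always scroll out from under the overlay.
            tableView.contentInset.bottom = overlayHeight

            focusObserver = NotificationCenter.default.addObserver(
                forName: UIAccessibility.elementFocusedNotification,
                object: nil, queue: .main
            ) { [weak self] note in
                guard
                    let self,
                    let cell = note.userInfo?[UIAccessibility.focusedElementUserInfoKey] as? UITableViewCell,
                    let indexPath = self.tableView.indexPath(for: cell)
                else { return }
                // Bring the VoiceOver-focused row fully into the visible area.
                self.tableView.scrollToRow(at: indexPath, at: .none, animated: false)
            }
        }
    }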
1
0
75
Apr ’25
VoiceOver Text Recognition Announcing Hidden Labels
I have a UIImageView as the background of a custom UIView subclass. The image itself does not contain any text. On top of this image view, I have added two UILabels. To improve accessibility, I made the entire view a single accessibility element and set a proper accessibilityLabel. Additionally, I disabled accessibility for the UIImageView and the labels by setting isAccessibilityElement = false. However, when VoiceOver Recognition's Text Recognition feature is enabled, VoiceOver still detects and announces the text inside the UILabels at the end, after reading my custom accessibility properties. This text should not be announced. It seems that VoiceOver treats the UILabel content as part of the UIImageView. Additionally, when using the Explore Image rotor action, the entire subview is recognized as a single image. Is this the expected behavior? If so, is there a way to disable VoiceOver's text recognition for this view while keeping custom accessibility intact?

    class BackgroundLabelView: UIView {
        private let backgroundImageView = UIImageView()
        private let backgroundImageView2 = UIImageView()
        private let titleLabel = UILabel()
        private let subtitleLabel = UILabel()

        override init(frame: CGRect) {
            super.init(frame: frame)
            setupView()
        }

        required init?(coder: NSCoder) {
            super.init(coder: coder)
            setupView()
            configureAceesibility()
        }

        private func configureAceesibility() {
            backgroundImageView.isAccessibilityElement = false
            backgroundImageView2.isAccessibilityElement = false
            titleLabel.isAccessibilityElement = false
            subtitleLabel.isAccessibilityElement = false
            isAccessibilityElement = true
            accessibilityTraits = .button
        }

        func configure(backgroundImage: UIImage?, title: String, subtitle: String) {
            backgroundImageView.image = backgroundImage
            titleLabel.text = title
            subtitleLabel.text = subtitle
            accessibilityLabel = "Holiday Offer ," + title + "," + subtitle
        }

        private func setupView() {
            backgroundImageView2.contentMode = .scaleAspectFill
            backgroundImageView2.clipsToBounds = true
            backgroundImageView2.translatesAutoresizingMaskIntoConstraints = false
            backgroundImageView2.image = UIImage(resource: .bannerfestival)
            addSubview(backgroundImageView2)

            backgroundImageView.contentMode = .scaleAspectFit
            backgroundImageView.clipsToBounds = true
            backgroundImageView.translatesAutoresizingMaskIntoConstraints = false
            addSubview(backgroundImageView)

            titleLabel.font = UIFont.systemFont(ofSize: 18, weight: .bold)
            titleLabel.textColor = .white
            titleLabel.translatesAutoresizingMaskIntoConstraints = false
            titleLabel.numberOfLines = 0
            addSubview(titleLabel)

            subtitleLabel.font = UIFont.systemFont(ofSize: 14, weight: .regular)
            subtitleLabel.textColor = .white.withAlphaComponent(0.8)
            subtitleLabel.translatesAutoresizingMaskIntoConstraints = false
            subtitleLabel.numberOfLines = 0
            addSubview(subtitleLabel)

            NSLayoutConstraint.activate([
                backgroundImageView2.leadingAnchor.constraint(equalTo: leadingAnchor),
                backgroundImageView2.trailingAnchor.constraint(equalTo: trailingAnchor),
                backgroundImageView2.heightAnchor.constraint(equalToConstant: 200),
                backgroundImageView.centerYAnchor.constraint(equalTo: centerYAnchor),
                backgroundImageView.topAnchor.constraint(equalTo: topAnchor),
                backgroundImageView.leadingAnchor.constraint(greaterThanOrEqualTo: leadingAnchor),
                backgroundImageView.trailingAnchor.constraint(equalTo: trailingAnchor),
                backgroundImageView.bottomAnchor.constraint(equalTo: bottomAnchor),
                titleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
                titleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
                titleLabel.bottomAnchor.constraint(equalTo: centerYAnchor, constant: -4),
                subtitleLabel.leadingAnchor.constraint(equalTo: leadingAnchor, constant: 16),
                subtitleLabel.trailingAnchor.constraint(lessThanOrEqualTo: centerXAnchor),
                subtitleLabel.topAnchor.constraint(equalTo: centerYAnchor, constant: 4)
            ])
        }

        override func layoutSubviews() {
            super.layoutSubviews()
            backgroundImageView.layer.cornerRadius = layer.cornerRadius
        }
    }
2
0
84
Apr ’25
iOS 18.3.1+ widget: loading a local color image crashes the widget
Environment: Xcode 16.2

WidgetKit view:

    Image(uiImage: UIImage(named: "jp_jump")!)
        .resizable()
        .scaledToFit()
        .frame(width: 58, height: 16)
        .padding(EdgeInsets(top: 0, leading: 16, bottom: 0, trailing: 0))

"jp_jump" is a local color image; loading it crashes the widget with:

    Thread 4: EXC_RESOURCE (RESOURCE_TYPE_MEMORY: high watermark memory limit exceeded) (limit=30 MB)
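One commonly suggested mitigation (an assumption, not from the post) for the widget extension's roughly 30 MB memory ceiling is to decode large assets at the small display size instead of full resolution, since a decoded UIImage costs memory proportional to its pixel dimensions rather than its file size. A sketch using ImageIO thumbnailing, which assumes the image ships as a bundled file resource rather than an asset-catalog entry:

    import ImageIO
    import SwiftUI
    import UIKit

    // Decodes an image file directly at a small pixel size, avoiding a
    // full-resolution decode inside the memory-constrained widget process.
    func downsampledImage(at url: URL, maxPixelSize: CGFloat) -> UIImage? {
        let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
        guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
        let thumbnailOptions = [
            kCGImageSourceCreateThumbnailFromImageAlways: true,
            kCGImageSourceCreateThumbnailWithTransform: true,
            kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
        ] as CFDictionary
        guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
        return UIImage(cgImage: cgImage)
    }

    struct JumpBadge: View {
        var body: some View {
            // File name and extension are illustrative.
            if let url = Bundle.main.url(forResource: "jp_jump", withExtension: "png"),
               let image = downsampledImage(at: url, maxPixelSize: 116) {   // ~2x of the 58 pt width
                Image(uiImage: image)
                    .resizable()
                    .scaledToFit()
                    .frame(width: 58, height: 16)
            }
        }
    }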
3
0
60
Mar ’25
Developer Mode Restart without HomeButton
After enabling Developer Mode on my iPhone and restarting it, the device asks me to press the Home button to confirm. Unfortunately, my Home button is broken, so I can’t access Developer Mode. The iPhone itself still works, but I can’t enable the mode. Is there any way to bypass this without the Home button?
3
0
66
Mar ’25
Already Enrolled, but Now Asked to Re-Enroll – Certificates Revoked, No Response
Our company enrolled in the Apple Developer Program as an organization in July 2024. Everything was fine for several months, but in early January 2025, our developer noticed that the certificates were missing. When we logged into our developer account, we were shocked to see a page prompting us to “Enroll Today”—as if we had never joined in the first place. Clicking the enrollment button led us to an error page stating we cannot enroll. We immediately reached out to Apple Developer Support via email, but despite multiple attempts, we received no response. Strangely, our apps remain live on the App Store, App Store Connect functions as usual, and we continue receiving payments every month. However, we are completely blocked from developing and releasing updates. Today, I managed to reach Apple by phone. After being transferred to a senior representative, I was told they couldn’t tell me why this was happening. They only confirmed that a request had been made and that I should “wait.” That’s it—no explanation, no timeline, nothing. While it’s somewhat reassuring that they acknowledge the issue, I’ve already seen other developers with the same problem go unanswered for months. My suspicion? This account might be linked to an individual developer account from way back in 2015 when Apple’s registration process was far less strict. Could that be the issue? No idea—because Apple won’t say a word. Meanwhile, both of our apps have been exposed to several bugs, and customers are waiting for updates. If there’s still no response from Apple, I have no choice but to register a new account—purely to continue supporting our users. CASE ID: 102508598957
4
0
416
Mar ’25
Guided Access Mode From Background
My team is designing an app for retail associates who need to share managed iPads. We keep the device in Guided Access mode, locked to our login app, until an auth token is obtained; then the iPad is opened for general use. Upon sign-out we need to re-enter Guided Access mode, and we can do this easily via manual sign-out. But with idle sign-out, i.e., after 60 minutes of inactivity, we need to be able to make a call from the background (even in a locked state), sign out the user, clear the PIN code, and enter Single App Mode before restarting, so that once the device restarts, the app is in a locked state again until the next user provides credentials that can obtain a new auth token. We are struggling to see if this is even possible. Our bosses will be displeased if we tell them it isn't, so anybody with any tips would be very appreciated.
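For reference, programmatic entry into Single App Mode is exposed through UIAccessibility.requestGuidedAccessSession, which only succeeds on a supervised device authorized for Autonomous Single App Mode via MDM; whether it can be invoked from the background at the idle-timeout moment is exactly the open question in the post. A minimal sketch:

    import UIKit

    // Attempts to lock the device into this app (Autonomous Single App Mode).
    // Requires a supervised device with the capability granted through MDM.
    func enterSingleAppMode() {
        UIAccessibility.requestGuidedAccessSession(enabled: true) { success in
            print(success ? "Single App Mode engaged" : "Request denied (check supervision/MDM)")
        }
    }

    // The matching call to release the device after a successful sign-in.
    func exitSingleAppMode() {
        UIAccessibility.requestGuidedAccessSession(enabled: false) { success in
            print(success ? "Single App Mode released" : "Request denied")
        }
    }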
2
0
190
Mar ’25
AVSpeechSynthesisProviderVoice audioFileSettings field
Hello, AVSpeechSynthesisVoice has an audioFileSettings attribute:

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(identifier: voiceSelected!)
    print("- voice \(utterance.voice!.audioFileSettings)")
    // ["AVLinearPCMIsBigEndianKey": 0, "AVLinearPCMIsFloatKey": 1, "AVLinearPCMIsNonInterleaved": 1,
    //  "AVNumberOfChannelsKey": 1, "AVSampleRateKey": 22050, "AVFormatIDKey": 1819304813,
    //  "AVLinearPCMBitDepthKey": 32]

It is declared in AVSpeechSynthesisVoice:

    @available(iOS 13.0, *)
    open var audioFileSettings: [String : Any] { get }

    @available(iOS 17.0, *)
    open var voiceTraits: AVSpeechSynthesisVoice.Traits { get }

How can we specify the audioFileSettings attributes in an AVSpeechSynthesisProviderVoice? Because AVSpeechSynthesisProviderVoice has no such field:

    AVSpeechSynthesisProviderVoice {
        open var name: String { get }
        open var identifier: String { get }
        open var primaryLanguages: [String] { get }
        open var supportedLanguages: [String] { get }
        open var voiceSize: Int64
        open var version: String
        open var gender: AVSpeechSynthesisVoiceGender
        open var age: Int
    }

Regards
2
0
95
Mar ’25
Assistive Access Bugs
Hi! I have noticed a few glitches, as well as some overall unfortunate cons, with Assistive Access mode:

- Alarms, timers, stopwatch, etc. do not sound or alert. However, I have an infant monitor app and I do get that sound alert, so I know it is possible. Do I need to download a separate alarm app for it to work?
- Cannot make FaceTime calls with favorite contacts.
- Find My iPhone cannot jump to the Maps app.
- Camera cannot zoom in or out.
- Photos cannot be deleted, edited, or shared in a shared album in the Photos app.
- Photos/videos cannot be sent in Messages.
- Spotify cannot be accessed from the Lock Screen.
- Apps do not stay open if you lock the phone screen or leave it on too long without touching the screen (auto-locks).
- There is no flashlight option. I downloaded an app to have this feature, but without being touched the screen will lock, which shuts off the flashlight feature in the app until I unlock the phone again.
1
0
77
Mar ’25
Accessibility full keyboard access issue.
In our application we are using a UIAlertController, presented as a popover. When accessibility Full Keyboard Access is enabled and we try to dismiss that alert with the Esc key on an external keyboard, nothing happens. We need to be able to dismiss the UIAlertController with an Esc key press from the external keyboard.
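One workaround that is sometimes used (an assumption, not confirmed to work with Full Keyboard Access in every configuration) is to register an explicit Escape key command on the presenting controller and dismiss the alert yourself; a minimal sketch:

    import UIKit

    final class HostViewController: UIViewController {
        override var keyCommands: [UIKeyCommand]? {
            // Map the hardware Esc key to our own dismissal handler.
            [UIKeyCommand(input: UIKeyCommand.inputEscape,
                          modifierFlags: [],
                          action: #selector(dismissAlert))]
        }

        override var canBecomeFirstResponder: Bool { true }

        @objc private func dismissAlert() {
            // Dismiss whatever is currently presented as a popover, e.g. the alert.
            presentedViewController?.dismiss(animated: true)
        }
    }

Whether the key command reaches this controller while the alert holds first-responder status is the part that would need testing on a device with Full Keyboard Access enabled.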
2
0
506
Mar ’25