App Store Connect Help
Voice Control evaluation criteria
Description
Using Apple’s Voice Control feature, users with reduced mobility or dexterity can navigate an app’s interface using only their voice. When looking at the screen, they can use commands like “tap,” “click,” or “swipe” to interact with on-screen elements.
Goals
Note: Most of the work that goes into making your app accessible to VoiceOver users will also make it more accessible for Voice Control users. For this reason, we recommend you start with the VoiceOver evaluation criteria before evaluating Voice Control.
The following sections provide more detail about how to determine whether your app supports Voice Control well. The goal is to help ensure that users with disabilities can complete all of your app’s common tasks; performing this evaluation will help you determine whether to indicate that your app supports Voice Control on the App Store.
Getting started with testing
To accurately indicate support, you need to test your app proficiently with Voice Control on every platform your app supports. Even if you already have basic proficiency in testing with Voice Control, take some time to go beyond the basics.
Review the resources below to learn more about using Voice Control for each device your app supports.
- For iPhone, watch “How to use Voice Control on iPhone, iPad, and iPod touch,” then visit Use Voice Control commands to interact with iPhone and Use Voice Control on your iPhone, iPad, or iPod touch.
- For iPad, watch “How to use Voice Control on iPhone, iPad, and iPod touch,” then visit Use Voice Control commands to interact with iPad and Use Voice Control on your iPhone, iPad, or iPod touch.
- For Mac, visit Get started with Voice Control on Mac.
- For Apple Vision Pro, visit Use Voice Control to interact with Apple Vision Pro.
Indicating support for Voice Control
You may say your app supports Voice Control if users are able to navigate and interact with your app using only voice commands. Users should be able to complete all of the common tasks, actions, and functions of your app using only Voice Control, without touching the display.
Using only their voice, users should be able to activate all tappable or clickable elements on the screen.
- Speak “Show numbers” to make sure all buttons, links, and other interactive elements display a number.
- Speak “Show names” to make sure all buttons, links, and other interactive elements display a label, also known as “alternative text.” Refer to the VoiceOver evaluation criteria for more tips on writing good default labels, and review how to use accessibilityInputLabels(_:) for Voice Control (see the sketch after this list).
- If third-party or user-generated content is required in your common tasks, refer to the detailed guidance for third-party content in the Overview of Accessibility Nutrition Labels.
- Use commands like “tap” or “click” plus the name or number of a control to drive your app’s common workflows. For example, “Tap Compose,” “Click Back,” or “Tap 4.”
- Match Voice Control labels to the visible text. If a Voice Control label differs from the visible text in your app (for example, “Leave call” instead of “End call”), users may be confused and forced to speak “Show names” to figure out why Voice Control isn’t working as they expected.
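As one example, the SwiftUI sketch below gives a button several spoken input labels so Voice Control recognizes likely phrasings. The button title, the alternative labels, and the endCall() helper are hypothetical; the key point is that the first input label matches the visible text, so “Tap End Call” behaves as users expect.

```swift
import SwiftUI

// A minimal sketch: the visible title, the alternative phrases, and
// the endCall() helper are hypothetical stand-ins for your own UI.
struct CallControls: View {
    var body: some View {
        Button("End Call") {
            endCall()
        }
        // Voice Control accepts any of these phrases; keep the first one
        // identical to the visible text so “Tap End Call” just works.
        .accessibilityInputLabels(["End Call", "Hang Up", "End"])
    }

    private func endCall() {
        // Hypothetical call-handling logic.
    }
}
```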
Using only their voice, users should be able to operate all interactions in your app, including those that are more complex.
- If your app uses swipes, long presses, secondary clicks, or other ways to show additional interactive controls, make sure those are accessible to Voice Control users. For example, use a custom action (see the sketch after this list). Elements with custom actions are indicated with a double chevron; try speaking “Show actions for <number>.”
- If your app reveals a hidden user interface (for example, on hover or swipe), make sure that the user has a speech-only way to show the hidden elements or trigger the same behavior through an action or context menu.
- Ensure scrolling works in your app. For example, “Scroll down.” For panning views like maps or layout canvases, try out speech commands such as “Pan left” and “Zoom out.”
- If other multi-touch gestures are required, try speaking the gesture, such as “Swipe up with two fingers.”
- If your app has a recording option, or if you have a custom dictation function, ensure users can start and stop the recording using only their voice.
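For instance, if a row exposes Delete and Reply through a swipe gesture, a sketch like the one below (MessageRow, delete(), and reply() are hypothetical names) also exposes them as named accessibility actions that Voice Control can list and invoke by voice.

```swift
import SwiftUI

// A minimal sketch: MessageRow, delete(), and reply() are hypothetical.
struct MessageRow: View {
    let message: String

    var body: some View {
        Text(message)
            // Expose swipe-revealed controls as named actions. Voice Control
            // marks the element with a double chevron, and “Show actions
            // for <number>” lists “Delete” and “Reply” as spoken choices.
            .accessibilityAction(named: "Delete") { delete() }
            .accessibilityAction(named: "Reply") { reply() }
    }

    private func delete() { /* hypothetical delete logic */ }
    private func reply() { /* hypothetical reply logic */ }
}
```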
Using only their voice, users should be able to dictate and edit text in any text field.
- Make sure that users can dictate text in any text field in your app. Speak “Type” followed by the text you’d like to enter.
- Try speaking commands like “Select” and the text you want selected. Ensure the text is selected, then speak “Delete that” and verify the text is deleted.
- If you use standard text fields from Apple frameworks, Voice Control is supported automatically (see the sketch after this list). However, if you use your own custom text entry or a third-party framework, extra work may be required. For example, Apple has an open-source accessibility plug-in for Unity-based apps.
- If your app uses any custom text fields, text entry behavior, or input validation, review and test the rest of Voice Control’s text commands very carefully.
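As a point of comparison, a standard SwiftUI text field like the sketch below (ComposeView and the “Subject” field are hypothetical) picks up Voice Control dictation and editing commands with no extra work beyond a clear label.

```swift
import SwiftUI

// A minimal sketch: ComposeView and the “Subject” field are hypothetical.
struct ComposeView: View {
    @State private var subject = ""

    var body: some View {
        // A standard text field supports “Type …,” “Select …,” and
        // “Delete that” automatically; the label gives Voice Control
        // users a name to speak, such as “Tap Subject.”
        TextField("Subject", text: $subject)
            .accessibilityLabel("Subject")
    }
}
```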
Additional suggestions
- Some users with physical or cognitive impairments may need more time, or may wish to avoid distracting auto-play defaults. If aspects of your app auto-play or auto-hide on a time delay, consider allowing users to cancel or extend that delay (see the first sketch after this list).
- Consider implementing SiriKit and App Intents to make your app better for all users, including Voice Control users. Review “Bring your app to Siri” for more details (a minimal App Intent sketch follows this list).
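One way to honor the first suggestion is to pair any auto-hide timer with a visible, speakable control that cancels it. The sketch below is a minimal, hypothetical example: the banner text, the 5-second delay, and the “Keep” button are all stand-ins.

```swift
import SwiftUI

// A minimal sketch: the banner text, the “Keep” button, and the
// 5-second delay are hypothetical stand-ins.
struct AutoHideBanner: View {
    @State private var isVisible = true
    @State private var hideTask: Task<Void, Never>?

    var body: some View {
        if isVisible {
            HStack {
                Text("Upload complete")
                // A visible, speakable way to cancel the auto-hide, so a
                // user who needs more time can say “Tap Keep.”
                Button("Keep") { hideTask?.cancel() }
            }
            .onAppear {
                hideTask = Task {
                    try? await Task.sleep(nanoseconds: 5_000_000_000)
                    if !Task.isCancelled { isVisible = false }
                }
            }
        }
    }
}
```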
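For the second suggestion, a minimal App Intent can expose one of your app’s common tasks to Siri and the system. In the hypothetical sketch below, ComposeMessageIntent and its title are stand-ins for a task in your own app.

```swift
import AppIntents

// A minimal sketch: ComposeMessageIntent and its title are hypothetical.
struct ComposeMessageIntent: AppIntent {
    static var title: LocalizedStringResource = "Compose Message"

    // Launch the app when the intent runs, since composing a message
    // is an in-app task.
    static var openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        // Navigate to the compose screen here (hypothetical).
        return .result()
    }
}
```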
Even after you’re able to indicate support for Voice Control for your app’s common tasks, there are likely further accessibility improvements you can make. Re-evaluate your app’s support for Voice Control every time you update your app. Set a goal to make your app more accessible to more people in every release.