Hello!
My question is about 1) whether we can use any or all accessibility features within a sandboxed app, and 2) what steps we need to take to do so.
Using accessibility permissions, my app was working fine in Xcode. It used NSEvent.addGlobalMonitorForEvents and addLocalMonitorForEvents, along with CGEvent.tapCreate. However, after downloading the same app from the App Store, the code was not working. I believe this is due to differences in how accessibility permissions are managed in Xcode compared to production.
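For context, here's roughly the monitoring setup I'm describing (a minimal sketch; the variable names are mine, not from my actual app). The global monitor only delivers events once the relevant privacy privilege has been granted:

```swift
import AppKit

// A global monitor observes key-downs delivered to *other* apps
// (observe-only; it cannot modify or consume them). A local monitor
// observes events delivered to this app and may alter them.
var globalMonitor: Any?
var localMonitor: Any?

globalMonitor = NSEvent.addGlobalMonitorForEvents(matching: .keyDown) { event in
    // Only fires if the user has granted the app the required
    // privacy privilege in System Settings.
    print("global key down, keyCode:", event.keyCode)
}

localMonitor = NSEvent.addLocalMonitorForEvents(matching: .keyDown) { event in
    print("local key down, keyCode:", event.keyCode)
    return event  // return the event unchanged so normal delivery continues
}
```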
Is it possible for my app to get access to all accessibility features while being distributed on the App Store, though? Do I need to add or request any special entitlements, like <key>com.apple.security.accessibility</key><true/>?
Thanks so much for the help. I have done a lot of research on this online but found some conflicting information, so wanted to post here for a clear answer.
Here’s my understanding of your goals:

- You’re distributing a Mac app via the App Store.
- You want that app to watch for keyboard events, even when it’s inactive.
- When it detects a relevant event, your app performs some action.

Is that correct?
If so, then CGEventTap should work for you. The user will need to grant your app the System Settings > Privacy & Security > Input Monitoring privilege, but once they do it’ll be able to use CGEventTap to monitor keyboard events.
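For reference, a minimal listen-only tap might look like this. This is a sketch under the assumptions above, not a drop-in implementation: CGEvent.tapCreate returns nil until the user grants the Input Monitoring privilege, and the preflight/request calls require macOS 10.15 or later:

```swift
import CoreGraphics

// Preflight the Input Monitoring privilege; the request call shows
// the system prompt the first time it's invoked.
if !CGPreflightListenEventAccess() {
    CGRequestListenEventAccess()
}

// Listen-only tap for key-down events, session wide.
let mask = CGEventMask(1 << CGEventType.keyDown.rawValue)

let callback: CGEventTapCallBack = { _, type, event, _ in
    if type == .keyDown {
        print("key down, keycode:", event.getIntegerValueField(.keyboardEventKeycode))
    }
    return Unmanaged.passUnretained(event)
}

if let tap = CGEvent.tapCreate(
    tap: .cgSessionEventTap,       // events for this login session
    place: .headInsertEventTap,
    options: .listenOnly,          // observe only; cannot modify events
    eventsOfInterest: mask,
    callback: callback,
    userInfo: nil
) {
    let source = CFMachPortCreateRunLoopSource(kCFAllocatorDefault, tap, 0)
    CFRunLoopAddSource(CFRunLoopGetCurrent(), source, .commonModes)
    CGEvent.tapEnable(tap: tap, enable: true)
    CFRunLoopRun()  // blocks; in a real app, run this on a dedicated thread
} else {
    print("tap creation failed; Input Monitoring may not be granted")
}
```

Note that a listen-only tap needs Input Monitoring, whereas a tap that modifies or consumes events (.defaultTap) requires the Accessibility privilege instead.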
Share and Enjoy
—
Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"