Hello,
I'm seeking clarification on the use of the Accessibility and Input Monitoring APIs in sandboxed apps distributed through the App Store.
I understand that Accessibility permissions are generally restricted for App Store apps. However, I've seen several recently released apps request these permissions directly on first launch. I'm aware that apps submitted prior to 2012 may have legacy access to certain APIs, but the apps I'm referring to appear to be recent, released within the past year.
While it's possible these apps were approved despite the restrictions, I want to make sure I'm not overlooking something. I also came across a recent discussion on this topic, and one post in particular stood out: Link
I’d really appreciate some clarification on what's officially allowed. Specifically:
Are Accessibility permissions ever allowed for App Store apps? If so, under what circumstances?
Is Input Monitoring permitted for apps on the App Store? (The referenced post says yes, but it's from 2022, so I'd like to confirm it still applies.)
The linked post suggests that event generation might be allowed on the App Store, though the author hadn't explored that privilege in detail and recommended opening a Technical Support Incident with DTS. I've done that and have a support case open; would it be possible to take a closer look at this?
For context, my app (currently distributed outside the App Store) uses CGEventCreateMouseEvent and CGEventPost to synthesize mouse events and modify mouse behavior.
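To be concrete, here's a minimal sketch of the kind of call sequence I mean; the event type and coordinates are just illustrative placeholders, not my actual logic:

```c
// Illustrative only: synthesize a mouse-moved event and post it at the
// HID level. This is the pattern that prompts for Accessibility access
// in my app today.
#include <ApplicationServices/ApplicationServices.h>

int main(void) {
    // Create a synthetic "mouse moved" event targeting an arbitrary point.
    CGPoint target = CGPointMake(500.0, 300.0);
    CGEventRef move = CGEventCreateMouseEvent(NULL,               // no explicit event source
                                              kCGEventMouseMoved,
                                              target,
                                              kCGMouseButtonLeft); // ignored for move events
    if (move != NULL) {
        // Inject the event so it behaves like real user input.
        CGEventPost(kCGHIDEventTap, move);
        CFRelease(move);
    }
    return 0;
}
```

(Compiled with `clang main.c -framework ApplicationServices` for testing outside the sandbox.)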
Thank you