In the docs it says this about UiAutomation:
Class for interacting with the device's UI by simulating user actions and introspection of the screen content. It relies on the platform accessibility APIs to introspect the screen and to perform some actions on the remote view tree. It also allows injecting arbitrary raw input events simulating user interaction with keyboards and touch devices. One can think of a UiAutomation as a special type of AccessibilityService which does not provide hooks for the service life cycle and exposes other APIs that are useful for UI test automation.
The APIs exposed by this class are low-level to maximize flexibility when developing UI test automation tools and libraries. Generally, a UiAutomation client should be using a higher-level library or implement high-level functions. For example, performing a tap on the screen requires constructing and injecting touch down and up events, which have to be delivered to the system by a call to injectInputEvent(android.view.InputEvent, boolean).
The APIs exposed by this class operate across applications enabling a client to write tests that cover use cases spanning over multiple applications. For example, going to the settings application to change a setting and then interacting with another application whose behavior depends on that setting.
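To make the tap example above concrete, here is a minimal sketch of what constructing and injecting the touch down/up pair could look like; the helper class name and the uiAutomation/x/y parameters are just placeholders for illustration:

import android.app.UiAutomation;
import android.os.SystemClock;
import android.view.InputDevice;
import android.view.MotionEvent;

final class TapHelper {
    // Injects a touch-down followed by a touch-up at (x, y) on the touchscreen.
    static void tap(UiAutomation uiAutomation, float x, float y) {
        long downTime = SystemClock.uptimeMillis();
        MotionEvent down = MotionEvent.obtain(downTime, downTime, MotionEvent.ACTION_DOWN, x, y, 0);
        down.setSource(InputDevice.SOURCE_TOUCHSCREEN);
        uiAutomation.injectInputEvent(down, true); // true = wait until the event is processed

        MotionEvent up = MotionEvent.obtain(downTime, SystemClock.uptimeMillis(), MotionEvent.ACTION_UP, x, y, 0);
        up.setSource(InputDevice.SOURCE_TOUCHSCREEN);
        uiAutomation.injectInputEvent(up, true);

        down.recycle();
        up.recycle();
    }
}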
How exactly is UiAutomation different from a regular AccessibilityService, given that it doesn't inherit from it in the source code?
public final class UiAutomation {
    // Note the declaration: it extends Object, not AccessibilityService.
    private static final String LOG_TAG = UiAutomation.class.getSimpleName();
    // omitted the rest...
Accessibility Service:
should only be used to assist users with disabilities in using Android devices and apps. They run in the background and receive [...] AccessibilityEvents [...]. Such events denote some state transition in the user interface, for example, the focus has changed, a button has been clicked, etc. Such a service can optionally request the capability for querying the content of the active window.
This is a powerful tool that can give you access to basically everything that happens on your phone, hence the user needs to explicitly enable such a service in the phone settings (generally under Accessibility/Installed Services).
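For contrast, a regular AccessibilityService is something you subclass, declare in the manifest, and that the user enables by hand. A minimal skeleton looks roughly like this (the class name is just illustrative):

import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;

public class MyAccessibilityService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Called for each state transition the service registered for,
        // e.g. TYPE_VIEW_CLICKED or TYPE_WINDOW_STATE_CHANGED.
    }

    @Override
    public void onInterrupt() {
        // Called when the system wants to interrupt the feedback this service provides.
    }
}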
UiAutomation:
Well, you basically quoted the description in your question, but here is a more detailed one:
In a certain way, UiAutomation acts as a decorated AccessibilityService (decorator design pattern). You can verify this by reading the source code of the methods getServiceInfo and setServiceInfo, which internally use the connected service counterparts (check the imports for a quick look).
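A quick sketch of what that looks like from a client's point of view; the helper class is just illustrative, and the uiAutomation instance is assumed to be obtained as shown further below:

import android.accessibilityservice.AccessibilityServiceInfo;
import android.app.UiAutomation;

final class ServiceInfoTweak {
    // Reads the current AccessibilityServiceInfo, adds a flag, and writes it back,
    // much like a real AccessibilityService could do for itself.
    static void includeNotImportantViews(UiAutomation uiAutomation) {
        AccessibilityServiceInfo info = uiAutomation.getServiceInfo();
        info.flags |= AccessibilityServiceInfo.FLAG_INCLUDE_NOT_IMPORTANT_VIEWS;
        uiAutomation.setServiceInfo(info);
    }
}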
This is a wrapper that adds some functionality: for example, it lets you navigate the view hierarchy without needing to enable an accessibility service in the phone settings every time you run an instrumented test. This is achieved with the aid of the android.app.Instrumentation class and other testing framework components. The system can safely grant the needed permissions because an instrumentation can only be started through adb or from within a system app.
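Here is a rough sketch of how you would get hold of UiAutomation in an instrumented test; the class and method names are arbitrary, and it assumes the AndroidX test runner:

import android.app.Instrumentation;
import android.app.UiAutomation;
import android.view.accessibility.AccessibilityNodeInfo;
import androidx.test.platform.app.InstrumentationRegistry;
import org.junit.Test;

public class UiAutomationExampleTest {

    @Test
    public void exploreActiveWindow() {
        // Instrumentation hands out the UiAutomation instance; no Settings toggle needed.
        Instrumentation instrumentation = InstrumentationRegistry.getInstrumentation();
        UiAutomation uiAutomation = instrumentation.getUiAutomation();

        // Navigate the view hierarchy of whatever app is currently in the foreground,
        // across application boundaries.
        AccessibilityNodeInfo root = uiAutomation.getRootInActiveWindow();
        if (root != null) {
            // e.g. inspect children, find nodes by text, perform actions, etc.
        }
    }
}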
This is not really an AccessibilityService (not even a real Service), but it's probably called that way for simplicity and to draw attention to the underlying environment.