I was wondering if there is a way to implement the same type of functionality as iOS's Assistive Gesture on Android. I know that actions can be performed on behalf of the user through an AccessibilityService, but is it possible to use the AccessibilityService in the same manner?
It sounds like you want AccessibilityNodeInfo.performAction(int). This allows an accessibility service to perform an action (click, long-press, etc.) on a view or virtual view in the app hierarchy.
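For illustration, here is a minimal sketch of a service that clicks a node on the user's behalf. The class name and the target text "OK" are placeholders; the calls (`getRootInActiveWindow()`, `findAccessibilityNodeInfosByText()`, `performAction()`) are standard AccessibilityService/AccessibilityNodeInfo APIs:

```java
import android.accessibilityservice.AccessibilityService;
import android.view.accessibility.AccessibilityEvent;
import android.view.accessibility.AccessibilityNodeInfo;

import java.util.List;

public class ClickHelperService extends AccessibilityService {

    @Override
    public void onAccessibilityEvent(AccessibilityEvent event) {
        // Root of the active window; null unless the service can retrieve window content.
        AccessibilityNodeInfo root = getRootInActiveWindow();
        if (root == null) {
            return;
        }

        // Look up nodes by visible text; "OK" is just an example target.
        List<AccessibilityNodeInfo> targets = root.findAccessibilityNodeInfosByText("OK");
        for (AccessibilityNodeInfo node : targets) {
            if (node.isClickable()) {
                // Perform a click on behalf of the user.
                node.performAction(AccessibilityNodeInfo.ACTION_CLICK);
                break;
            }
        }
        root.recycle();
    }

    @Override
    public void onInterrupt() {
        // Required override; nothing to clean up in this sketch.
    }
}
```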
To do so, your AccessibilityService will need to be set up to retrieve window content. You can read more about this in the "Retrieving window content" section of the AccessibilityService documentation.
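A rough example of that configuration, assuming a service config file at `res/xml/accessibility_service_config.xml` (the file name and event/feedback settings are placeholders; the key attribute is `android:canRetrieveWindowContent`):

```xml
<accessibility-service xmlns:android="http://schemas.android.com/apk/res/android"
    android:accessibilityEventTypes="typeAllMask"
    android:accessibilityFeedbackType="feedbackGeneric"
    android:canRetrieveWindowContent="true"
    android:notificationTimeout="100" />
```

This file is then referenced from the service's `<meta-data android:name="android.accessibilityservice" ... />` entry in the manifest, and the service itself must declare the `android.permission.BIND_ACCESSIBILITY_SERVICE` permission.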