I'm looking for a way to capture the (X, Y) locations of touches the user makes anywhere on my app's screens (relative to the root view of a view controller).
Is there a way for me to subclass something (UIResponder?) or add a category on UIView so I can intercept touches and process them, while still letting them reach the underlying content (buttons, gesture recognizers, etc.)?
I was thinking of implementing "touchesBegan:", but in my experience that frequently breaks existing button or gesture recognizer behavior.
This is possible, within limits. There is a sample project on GitHub by Todd Reed; you can take that code and adapt it to your needs.
From a quick look at the code, it keeps a custom UIWindow on top (installed via swizzled methods) and renders the touches on that window. It also overrides the sendEvent: method of the UIApplication class, which dispatches every event down the view hierarchy, so touches can be observed without being consumed.
This is a much more elegant solution than attaching a UIGestureRecognizer to each view controller. Many analytics SDKs use the same approach very effectively.
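The core of the technique can be sketched with a UIApplication subclass that overrides sendEvent:. This is a minimal illustration, not Todd Reed's actual code; the class name TouchLoggingApplication is an assumption, and you would replace the print with your own recording logic. Because the event is always forwarded to super, buttons and gesture recognizers keep working normally.

```swift
import UIKit

// Hypothetical UIApplication subclass that observes every touch
// without consuming it. sendEvent(_:) is invoked for all events
// before they travel down the responder chain.
final class TouchLoggingApplication: UIApplication {
    override func sendEvent(_ event: UIEvent) {
        if event.type == .touches, let touches = event.allTouches {
            for touch in touches where touch.phase == .began {
                // Location in the touch's window coordinate space.
                let point = touch.location(in: touch.window)
                print("Touch began at \(point)")
            }
        }
        // Always forward the event so normal handling continues.
        super.sendEvent(event)
    }
}
```

To install a custom UIApplication subclass you call UIApplicationMain yourself (e.g. from a main.swift, dropping the @UIApplicationMain/@main attribute from the app delegate) and pass NSStringFromClass(TouchLoggingApplication.self) as the principal class name.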