As an exercise with accessibility, and as a personal challenge to myself, I decided I'd like to write a relatively simple app. The app shows an MKMapView of the United States, and when you tap anywhere on it, it uses an MKReverseGeocoder to show you the locality, state, and country where you tapped. This works fine, although I have to hijack the touch events by adding a WildcardGestureRecognizer to the MKMapView. This works great with VoiceOver turned off.
When I turn VoiceOver on and tap on the map, it says "map". If I double-tap, it makes a little clicky noise which indicates that you've activated it. To be honest, I'm at a loss for how to intercept these events. I know the general solution is to put a transparent view above the whole screen and pass touches down, but will that work with VoiceOver?
For the record, the WildcardGestureRecognizer I'm using is the one found here: How to intercept touches events on a MKMapView or UIWebView objects?
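Roughly, the idea is a recognizer that observes every touch without ever claiming it, so the map's own pan/zoom gestures keep working. The linked answer is Objective-C; this is a loose Swift sketch of the same idea, and the callback and function names here are just illustrative:

```swift
import UIKit
import UIKit.UIGestureRecognizerSubclass  // needed to override the touches* methods
import MapKit

// A "wildcard" recognizer: it reports every touch but never wins,
// so it does not interfere with the map view's built-in gestures.
final class WildcardGestureRecognizer: UIGestureRecognizer {

    var touchesBeganCallback: ((Set<UITouch>, UIEvent) -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        touchesBeganCallback?(touches, event)
    }

    override func canPrevent(_ preventedGestureRecognizer: UIGestureRecognizer) -> Bool {
        return false
    }

    override func canBePrevented(by preventingGestureRecognizer: UIGestureRecognizer) -> Bool {
        return false
    }
}

// Attaching it to the map and turning a touch into a coordinate:
func attachWildcard(to mapView: MKMapView) {
    let wildcard = WildcardGestureRecognizer()
    wildcard.touchesBeganCallback = { [weak mapView] touches, _ in
        guard let mapView = mapView, let touch = touches.first else { return }
        let point = touch.location(in: mapView)
        let coordinate = mapView.convert(point, toCoordinateFrom: mapView)
        // Reverse-geocode `coordinate` here (CLGeocoder on current SDKs;
        // MKReverseGeocoder at the time of the original post).
        print("tapped \(coordinate.latitude), \(coordinate.longitude)")
    }
    mapView.addGestureRecognizer(wildcard)
}
```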
The problem is that when you turn on VoiceOver, touch events are blocked by the system. To prove it, put a trace in your touchesBegan function. It should fire fine until you turn on VoiceOver.
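Something like this in the recognizer's touchesBegan makes the difference obvious (illustrative Swift; adapt it to wherever your touches handler lives):

```swift
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
    // With VoiceOver off this prints on every tap;
    // with VoiceOver on, the system swallows the touch and nothing prints.
    print("touchesBegan: \(touches.count) touch(es)")
    super.touchesBegan(touches, with: event)
}
```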
The little clicky sound you hear when you double-tap is the VoiceOver override gesture. VoiceOver has its own set of gestures, but you can override them with a double-tap-and-hold.
For example, swiping down does not scroll a page with VoiceOver on. But if you double-tap and hold, wait for the clicky sound, and then swipe down, it will scroll.