Edit: I just noticed that it sometimes works and sometimes doesn't, and I don't know why.
I am making an app aimed mostly at blind users, so VoiceOver will probably be active. In one view I need to implement my interactions and gestures myself. There I am trying to create a zone that is directly interactable, so that the functionality behind it works as if VoiceOver were not active, even when it is. But when I do this, instead of printing text on a double tap, VoiceOver always announces something like "Zone, direct interaction" (the test device is not set to English).
Does anyone have an idea what the problem could be?
This is my View:
struct MyView: View {
    var body: some View {
        TestView()
            .accessibilityAddTraits(.allowsDirectInteraction)
    }
}
And this is the TestView:
struct TestView: View {
    var body: some View {
        Rectangle()
            .onTapGesture(count: 2) { print("A View was tapped") }
            .onAppear { print("A View was created") }
    }
}
It seems that a view which takes up all the available space cannot be made directly interactable by default. The user has to enable this themselves, either through the rotor or in the settings:

Settings App/Accessibility/VoiceOver/Rotor Actions/Direct Touch Apps/

Once the app is checked in that setting, the view marked with the .allowsDirectInteraction trait works as if VoiceOver were not active. Note, though, that this should only be used for elements that are accessible on their own, and not as an easy way to avoid making those parts of the app accessible!
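As a side note, if you can target iOS 17, SwiftUI also has an accessibilityDirectTouch(_:options:) modifier that makes this more predictable without the user having to dig through the rotor settings: with the .requiresActivation option, VoiceOver users double-tap the element once to hand subsequent touches directly to the app. A minimal sketch of how it could be combined with the view above (the accessibility label text is my own, and availability details should be checked against Apple's documentation):

```swift
import SwiftUI

struct MyView: View {
    var body: some View {
        TestView()
            // Mark the zone as a direct touch area; .requiresActivation
            // means VoiceOver users double-tap once to opt in, after
            // which touches pass straight through to the gestures.
            .accessibilityDirectTouch(true, options: .requiresActivation)
            // Hypothetical label so the element still announces
            // its purpose before the user activates it.
            .accessibilityLabel("Interaction zone")
    }
}
```

With .silentOnTouch instead of .requiresActivation, VoiceOver stays quiet over the area and touches pass through immediately, which may fit better if the zone is self-explanatory (for example, a musical keyboard).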