I’m currently working on a SwiftUI project where I need to implement true multitouch support on a view that consists of multiple buttons. The existing implementation supports dragging across these buttons to update an activeButtonIndex state, highlighting the button currently being dragged over. However, it only supports single-touch interaction. I want to enhance it to support multitouch, so that multiple buttons can be active simultaneously when multiple fingers are used.
I would like to maintain all of the current functionality, and I’m looking for a SwiftUI-only solution, without resorting to UIKit interop.
I tried referencing Apple’s developer documentation article “Composing SwiftUI gestures,” but to no avail.
Here’s the simplified version of my current code:
import SwiftUI

struct ContentView: View {
    @State private var activeButtonIndex: Int? = nil
    @State private var buttonFrames: [CGRect] = Array(repeating: .zero, count: 11)

    private let buttonCount = 11

    var body: some View {
        VStack {
            activeButtonView
            buttonsView
        }
        .gesture(dragGesture)
    }

    private var activeButtonView: some View {
        Text("Active Button \(activeButtonIndex.map { "\n\($0 + 1)" } ?? "\nNone")")
            .multilineTextAlignment(.center)
            .padding()
    }

    private var buttonsView: some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(0 ..< buttonCount, id: \.self) { index in
                    Button(action: {}) {
                        EmptyView()
                    }
                    .frame(width: 70, height: 210)
                    .background(backgroundForButton(at: index))
                    .overlay(GeometryReader { geometry in
                        // Record each button's frame in global coordinates for hit testing.
                        Color.clear.onAppear {
                            buttonFrames[index] = geometry.frame(in: .global)
                        }
                    })
                }
            }
        }
    }

    private func backgroundForButton(at index: Int) -> Color {
        activeButtonIndex == index ? .green : .blue
    }

    private var dragGesture: some Gesture {
        DragGesture(minimumDistance: 0, coordinateSpace: .global)
            .onChanged { value in
                updateActiveButton(with: value.location)
            }
            .onEnded { _ in
                activeButtonIndex = nil
            }
    }

    private func updateActiveButton(with location: CGPoint) {
        for (index, frame) in buttonFrames.enumerated() {
            if frame.contains(location) {
                activeButtonIndex = index
                return
            }
        }
        activeButtonIndex = nil
    }
}
Has anyone implemented something similar, or can anyone guide me on how to approach adding multitouch support in this context? Any advice or examples would be greatly appreciated.
iOS 18: Multi-finger / multitouch recognition is now possible in SwiftUI using SpatialEventGesture. Apple documentation link: https://developer.apple.com/documentation/swiftui/spatialeventgesture
Apple example:
struct ParticlePlayground: View {
    @State var model = ParticlesModel()

    var body: some View {
        Canvas { context, size in
            for particle in model.particles {
                context.fill(Path(ellipseIn: particle.frame),
                             with: .color(particle.color))
            }
        }
        .gesture(
            SpatialEventGesture()
                .onChanged { events in
                    for event in events {
                        if event.phase == .active {
                            // Update particle emitters.
                            model.emitters[event.id] = ParticlesModel.Emitter(
                                location: event.location
                            )
                        } else {
                            // Remove emitters when no longer active.
                            model.emitters[event.id] = nil
                        }
                    }
                }
                .onEnded { events in
                    for event in events {
                        // Remove emitters when no longer active.
                        model.emitters[event.id] = nil
                    }
                }
        )
    }
}
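Adapting this to the button layout in the question, here is a minimal sketch (assuming iOS 18, and assuming SpatialEventGesture's coordinateSpace initializer parameter reports event locations in global coordinates): the single activeButtonIndex becomes a Set<Int> I'm introducing, named activeButtonIndices, rebuilt from all active touches on every change so each finger can highlight its own button.

```swift
import SwiftUI

// Sketch only, not tested against a device: multitouch button highlighting
// via SpatialEventGesture, reusing the question's frame-caching approach.
struct MultitouchContentView: View {
    @State private var activeButtonIndices: Set<Int> = []
    @State private var buttonFrames: [CGRect] = Array(repeating: .zero, count: 11)

    private let buttonCount = 11

    var body: some View {
        ScrollView(.horizontal) {
            LazyHStack {
                ForEach(0 ..< buttonCount, id: \.self) { index in
                    Button(action: {}) {
                        EmptyView()
                    }
                    .frame(width: 70, height: 210)
                    // A button is green while any finger is over it.
                    .background(activeButtonIndices.contains(index) ? Color.green : Color.blue)
                    .overlay(GeometryReader { geometry in
                        Color.clear.onAppear {
                            buttonFrames[index] = geometry.frame(in: .global)
                        }
                    })
                }
            }
        }
        .gesture(
            SpatialEventGesture(coordinateSpace: .global)
                .onChanged { events in
                    // Rebuild the active set from every currently active touch,
                    // so lifting one finger does not clear the others.
                    var indices: Set<Int> = []
                    for event in events where event.phase == .active {
                        if let index = buttonFrames.firstIndex(where: { $0.contains(event.location) }) {
                            indices.insert(index)
                        }
                    }
                    activeButtonIndices = indices
                }
                .onEnded { _ in
                    // The whole gesture has ended; no touches remain.
                    activeButtonIndices = []
                }
        )
    }
}
```

One caveat carried over from the original code: caching frames in onAppear means they can go stale once the ScrollView scrolls, so the hit testing may need to re-read frames on geometry changes as well.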