Since installing iOS 11.1 we have observed that a strip along the top and bottom of the screen has become unresponsive to touches. This affects both a landscape app and a portrait app that I develop (both use the Metal API for rendering). The issue also appears in some other developers' apps, e.g. Golf Clash, but not all of them; since we use an in-house engine, there is no shared code between our apps and those of other developers.
Is this an intended behaviour change, or a known issue with a workaround? I can't find any information on Stack Overflow, in the Apple Developer forums, or in Apple's documentation.
Perhaps it relates to the safe areas introduced in iOS 11, which our titles do not fully support yet (though I wasn't expecting safe areas to affect touch, or to appear on existing devices). Perhaps it relates to the new Notification Center swipe-down and Control Center swipe-up behaviour: they used to reveal a tab on the first swipe and only open on a second swipe, but now they open immediately. However, going into Settings and disabling Control Center access within apps doesn't fix it.
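Possibly relevant: iOS 11 added a UIViewController property for deferring the system's edge-swipe gestures. I haven't verified that it changes the touch behaviour described above, but a minimal sketch would look like this:

```swift
import UIKit

class GameViewController: UIViewController {
    // Ask iOS to defer its own edge gestures (Notification Center and
    // Control Center swipes) so the first swipe is delivered to the app
    // and the system shows only a grab tab. Available from iOS 11.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return [.top, .bottom]
    }
}
```

If the edges to defer change at runtime, calling `setNeedsUpdateOfScreenEdgesDeferringSystemGestures()` prompts the system to re-read the property.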
Observed on both an iPhone 6 and an iPhone 6s, so it's unlikely to relate to 3D Touch features, as the iPhone 6 does not support 3D Touch.
Edit: I will investigate further after some sleep, but I think the touch events for the offending regions are in fact being generated; the problem is one of timing. Normally the touchesBegan event is received about 20-50ms before the touchesEnded event, even for quick presses. In these top and bottom areas, however, I receive the touchesEnded event almost immediately after touchesBegan (less than 1ms later). My working theory is that iOS is holding back the touch events to see if the touch becomes a swipe gesture, and only delivers touchesBegan once it decides the touch won't be handled as a swipe. I will probably have to improve my input code so that a touch that ends as soon as it begins doesn't fall through the cracks and end up ignored by the game code (which unfortunately polls the touch state rather than being event-driven).
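For reference, this is roughly how the began-to-ended delivery latency can be measured (a minimal sketch; the view subclass and logging are illustrative):

```swift
import UIKit

class TouchTimingView: UIView {
    // Delivery time of touchesBegan for each live touch.
    private var beganAt: [UITouch: CFTimeInterval] = [:]

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        let now = CACurrentMediaTime()
        for touch in touches { beganAt[touch] = now }
    }

    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) {
        let now = CACurrentMediaTime()
        for touch in touches {
            guard let began = beganAt.removeValue(forKey: touch) else { continue }
            // In the affected edge regions on iOS 11.1 this prints < 1ms,
            // suggesting touchesBegan was withheld until the system decided
            // the touch was not an edge swipe.
            print(String(format: "touch lived %.2f ms", (now - began) * 1000))
        }
    }

    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches { beganAt.removeValue(forKey: touch) }
    }
}
```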
Answering my own question here:
It does indeed appear that iOS 11.1 changes the timing of touch events at the top and bottom of the screen, presumably to handle the new single-swipe gestures for opening Notification Center and Control Center. The touchesBegan and touchesEnded events arrive at almost exactly the same time.
I believe that if I filed a Radar, Apple would consider the behaviour change as designed (and reasonably so). However, apps that follow the pattern of processing touch events into a touch state, and then polling that touch state to detect button presses, are likely to break: these touches are removed from the touch state immediately after being added, with no opportunity for the polling code to observe them. A sketch of the failing pattern follows.
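To make the failure concrete, here is a minimal sketch of that polling pattern (names like `TouchState` and `pressButton` are illustrative, not from any particular engine):

```swift
import CoreGraphics

// Fragile pattern: UIKit callbacks write into a shared state object,
// and the game loop polls it once per frame.
final class TouchState {
    private(set) var isDown = false
    private(set) var location = CGPoint.zero

    func touchBegan(at point: CGPoint) { isDown = true; location = point }
    func touchEnded() { isDown = false }
}

// Game loop, once per frame:
//     if touchState.isDown { pressButton(under: touchState.location) }
//
// On iOS 11.1 a tap near the screen edge delivers touchesBegan and
// touchesEnded back to back in the same run-loop turn, so isDown flips
// to true and back to false between two polls and the tap is never seen.
```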
The best fix is probably to make input handling event-driven throughout the app. I felt that was impractical in my case, given the amount of code that would be affected across multiple projects. My fix was to detect the situation and not allow a touchesEnded event to affect the touch state until the app had had a chance to notice that the touch had taken place.
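A minimal sketch of that workaround, assuming a single-touch state (again, the names are illustrative): the release is latched until at least one poll has observed the press.

```swift
import CoreGraphics

// A touchesEnded that arrives before the game loop has polled the press
// is latched, and applied on the next poll instead of immediately.
final class LatchedTouchState {
    private(set) var isDown = false
    private(set) var location = CGPoint.zero
    private var pendingEnd = false        // release arrived before a poll
    private var polledSinceBegan = false  // has the game loop seen this press?

    func touchBegan(at point: CGPoint) {
        isDown = true
        location = point
        polledSinceBegan = false
        pendingEnd = false
    }

    func touchEnded() {
        if polledSinceBegan {
            isDown = false    // normal path: the press was already observed
        } else {
            pendingEnd = true // defer the release until one poll sees the press
        }
    }

    // Called once per frame by the game loop.
    func poll() -> (isDown: Bool, location: CGPoint) {
        let snapshot = (isDown, location)
        polledSinceBegan = true
        if pendingEnd {
            isDown = false
            pendingEnd = false
        }
        return snapshot
    }
}
```

The effect is that even a began/ended pair delivered within the same run-loop turn remains visible as "down" for exactly one poll, so a quick edge tap is no longer lost.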
As an aside, similar behaviour can be observed on Android if you enable Magnification Gestures in the Accessibility settings: again, the OS holds back touch events from the app until it determines that a gesture (in this case a triple-tap) isn't taking place. I'm pleased that in fixing this iOS 11.1 issue, I've also made my Android apps compatible with Magnification Gestures.