I have an app that displays video in a subview. It would be nice to offer the option of displaying that video on a second screen, such as an Apple TV, and to use the freed-up space on the iPad to show additional controls.
I've found all sorts of help about how to do that, but I'm hitting a wall even before getting out of the starting gate.
In order to detect that the app has started up in a multiple display environment, all the sample code features a line like...
if (UIScreen.Screens.Length > 1) {
// ...
}
(I'm doing this in C#/Xamarin, though I doubt the problem is related to that; anyway, the snippets are in C#)
My problem is that the array of screens always has a length of 1 no matter what I do. The iPad is running iOS 11.2.5, and if I turn on mirroring, the iPad is mirrored, but, again, the array of screens only has a single item.
There are also a couple of observers to detect screens being added/removed while the app is running. I haven't seen Xamarin-specific code, but I presume it looks like:
NSNotificationCenter.DefaultCenter.AddObserver(this, UIScreen.DidConnectNotification, NSKeyValueObservingOptions.New, IntPtr.Zero);
NSNotificationCenter.DefaultCenter.AddObserver(this, UIScreen.DidDisconnectNotification, NSKeyValueObservingOptions.New, IntPtr.Zero);
Anyway, those never fire even if I add/remove the Apple TV or enter/exit Mirroring Mode on the iPad.
Oh; also if I do
avPlayer.AllowsExternalPlayback = true;
avPlayer.UsesExternalPlaybackWhileExternalScreenIsActive = true;
then that works as expected, too. The video now appears full-screen on the Apple TV and the UIView on the iPad containing the avPlayer greys out rather than showing the video.
However, that's not what I'm looking for. I would like to control the layout of both screens, and that approach does neither. (While I do want the video to be full screen on the Apple TV, I don't want it to be an AVPlayerViewController, and I do want to repurpose the screen real estate taken up by the iPad's video view.)
At the end of the day, all I think I need is to get UIScreen.Screens.Length to equal 2 when I launch the app.
What's the secret of getting UIScreen to detect/report a second display?
When an app is launched with screen mirroring already enabled, the UIScreen.screens array initially contains only the device's screen. Shortly after launch, iOS posts a UIScreenDidConnect notification to advise your app that a second screen is connected.
What you will see at launch is that the captured property of your main screen is true if mirroring is enabled; however, you can't actually access the second screen until after the notification is posted. Note that captured could also indicate that screen recording is in progress.
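Since you're in Xamarin, a minimal sketch of that launch-time check (assuming Xamarin.iOS exposes iOS 11's UIScreen.isCaptured as the Captured property) might look like:
// At launch the external screen is not yet in UIScreen.Screens,
// but the main screen already reports that it is being mirrored or recorded.
if (UIScreen.MainScreen.Captured)
{
    Console.WriteLine("Mirroring or screen recording is active; waiting for UIScreenDidConnectNotification.");
}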
Although this seems slightly counter-intuitive, it actually makes your coding a little simpler; you need to observe the UIScreenDidConnect and UIScreenDidDisconnect notifications anyway, and now you don't need to write any special code to handle the case where the app is launched with a second screen already attached.
You can use something like this in your didFinishLaunching:
let nc = NotificationCenter.default
nc.addObserver(forName: NSNotification.Name.UIScreenDidConnect, object: nil, queue: nil) { (notification) in
print("Screen connected")
self.enableExternalDisplay()
}
nc.addObserver(forName: NSNotification.Name.UIScreenDidDisconnect, object: nil, queue: nil) { (notification) in
print("Screen disconnected")
self.disableExternalDisplay()
}
UPDATE
Actually, it looks like you are using the key/value observing overload of AddObserver in your code, when what you want is notification observing. Something like:
NSNotificationCenter.DefaultCenter.AddObserver(UIScreen.DidConnectNotification, OnScreenConnected);
And then you need to implement an OnScreenConnected method.
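For completeness, here is a minimal sketch of what OnScreenConnected (and a matching OnScreenDisconnected) could look like, creating a UIWindow attached to the newly connected UIScreen so you control that screen's layout yourself. The ExternalViewController type is hypothetical; substitute whatever view controller hosts your full-screen video view:
UIWindow externalWindow;

void OnScreenConnected(NSNotification notification)
{
    // The newly connected UIScreen is delivered as the notification's object.
    var externalScreen = (UIScreen)notification.Object;

    // Create a window sized to the external screen and attach it to that screen.
    externalWindow = new UIWindow(externalScreen.Bounds)
    {
        Screen = externalScreen,
        RootViewController = new ExternalViewController(), // hypothetical controller hosting the video view
        Hidden = false // showing a non-key window on the external screen is enough
    };
}

void OnScreenDisconnected(NSNotification notification)
{
    // Tear down the external window and move the video back into the iPad's layout.
    externalWindow = null;
}
You would register OnScreenDisconnected with UIScreen.DidDisconnectNotification in the same way as the connect observer above.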