I'm working on implementing screen capturing of a Mac app using CGDisplayStream, similar to the question asked here, but in Swift. Below is the code I have in my app's single ViewController:
override func viewDidAppear() {
    super.viewDidAppear()
    let backgroundQueue = DispatchQueue(label: "com.app.queue",
                                        qos: .background,
                                        target: nil)
    let displayStream = CGDisplayStream(dispatchQueueDisplay: 0,
                                        outputWidth: 100,
                                        outputHeight: 100,
                                        pixelFormat: Int32(k32BGRAPixelFormat),
                                        properties: nil,
                                        queue: backgroundQueue) { (status, code, iosurface, update) in
        switch status {
        case .frameBlank:
            print("FrameBlank")
        case .frameIdle:
            print("FrameIdle")
        case .frameComplete:
            print("FrameComplete")
        case .stopped:
            print("Stopped")
        }
        self.update()
    }
    displayStream?.start()
}

func update() {
    print("WORKING")
}
What seems to be happening is that the queue isn't being set up properly, but I'm not sure. When the app starts, self.update() is called once, but only once. Since the display stream appears to have started properly, I would expect the handler to be called repeatedly. Anyone have any ideas? Am I not setting up the queue properly? Thank you!
The problem is that no reference to displayStream is kept outside of viewDidAppear, so the stream is deallocated when that method returns.
Making it a property of the view controller should solve the problem:
class ViewController: NSViewController {
    var displayStream: CGDisplayStream?

    override func viewDidAppear() {
        super.viewDidAppear()
        // ...
        displayStream = CGDisplayStream(...)
        displayStream?.start()
    }

    override func viewWillDisappear() {
        super.viewWillDisappear()
        displayStream?.stop()
        displayStream = nil
    }
}
Releasing the stream in viewWillDisappear breaks the retain cycle (the handler captures self, and self now retains the stream) and allows the view controller to be deallocated (if it is part of a view controller hierarchy).
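
Alternatively, the cycle can be avoided in the first place by capturing self weakly in the stream handler. Here is a minimal sketch of that variant, assuming the same setup as the question; I've also swapped the hard-coded 0 for CGMainDisplayID() to target the main display explicitly:

import Cocoa

class ViewController: NSViewController {
    var displayStream: CGDisplayStream?

    override func viewDidAppear() {
        super.viewDidAppear()
        let backgroundQueue = DispatchQueue(label: "com.app.queue",
                                            qos: .background,
                                            target: nil)
        // CGMainDisplayID() identifies the main display instead of a hard-coded ID.
        displayStream = CGDisplayStream(dispatchQueueDisplay: CGMainDisplayID(),
                                        outputWidth: 100,
                                        outputHeight: 100,
                                        pixelFormat: Int32(k32BGRAPixelFormat),
                                        properties: nil,
                                        queue: backgroundQueue) { [weak self] status, _, _, _ in
            // Weak capture: the handler no longer keeps the view controller alive.
            // For brevity this sketch only reacts to completed frames.
            if status == .frameComplete {
                self?.update()
            }
        }
        displayStream?.start()
    }

    override func viewWillDisappear() {
        super.viewWillDisappear()
        displayStream?.stop()
        displayStream = nil
    }

    func update() {
        print("WORKING")
    }
}

With the weak capture, stopping the stream in viewWillDisappear is still good practice, but the view controller can be deallocated even if you forget to do so.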