I'm having some issues with the getUserMedia API, which is now available in WKWebView (as of iOS 14.3).
On a page in the web view, I access the camera with the following code:
navigator.mediaDevices.getUserMedia({
  video: {
    facingMode: "user"
  },
  audio: false
}).then(function(webcamStream) {
  document.querySelector("#video").srcObject = webcamStream; /* #video is an HTML <video> tag on the page */
}).catch(function(err) {
  console.log("fail", err);
});
This... mostly works. But unlike in Safari (and now Chrome), where the video element simply displays the video track of the webcamStream MediaStream object, WKWebView opens a "Live Broadcast" panel, and the video track pauses whenever that panel is closed. Is there any way to replicate the behaviour in Safari and Chrome, where no panel pops up?
Image of the "Live Broadcast" panel
Thanks
For Safari iOS web-based apps, the linked article Video Playback on Safari notes that the element needs a playsinline attribute, that it won't autoplay unless it is muted, and that it pauses when it scrolls out of frame: <video id="video" autoplay playsinline muted></video>
You may also need to call play() on the video element (not on the stream; MediaStream has no play() method) to get playback started.
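Putting those suggestions together, a minimal sketch might look like the following. It assumes a `<video id="video" autoplay playsinline muted>` element exists on the page; the explicit play() call on the video element is the part iOS tends to require:

```javascript
// Constraints matching the question: front camera, no audio.
const constraints = { video: { facingMode: "user" }, audio: false };

// Attach the camera stream to an in-page <video> element and start it.
async function startCamera(videoEl) {
  const stream = await navigator.mediaDevices.getUserMedia(constraints);
  videoEl.srcObject = stream; // render the stream inline
  await videoEl.play();       // play() on the *element*, not the stream
  return stream;
}
```

Calling startCamera from a user gesture (e.g. a tap handler) also helps avoid iOS autoplay restrictions.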
https://developer.apple.com/documentation/webkitjs
https://developer.apple.com/documentation/webkit/delivering_video_content_for_safari
As for recording, https://webkit.org/blog/11353/mediarecorder-api/ has info on the MediaRecorder interface, available in Safari 14.0.3(?). I don't have much info for native iOS apps.