I'm following this tutorial to take a picture with the HoloLens 2 camera and display it on a billboard (a Quad object). When I run the code, even using Holographic Emulation and playing the scene on a connected HoloLens 2, I get an error because the script cannot access the camera (Failed to initialize IMediaCapture (hr = 0xC00DABE0)). This does not happen if I build the app and deploy it to the HoloLens 2.
My question is: is there a way to grant Unity access to this camera, so that when I hit Play and enter Game Mode (with Holographic Emulation enabled and the HoloLens 2 connected), the script can access it?
Again, the script works when I actually deploy it to the HoloLens 2, but having to build the project in Unity and then in VS for every small test takes too long. I'm using Unity 2019.4.26f and VS 2019.
Code in case the link doesn't work:
using UnityEngine;
using System.Collections;
using System.Linq;
using UnityEngine.Windows.WebCam;

public class PhotoCaptureExample : MonoBehaviour
{
    PhotoCapture photoCaptureObject = null;
    Texture2D targetTexture = null;

    // Use this for initialization
    void Start()
    {
        Resolution cameraResolution = PhotoCapture.SupportedResolutions.OrderByDescending((res) => res.width * res.height).First();
        targetTexture = new Texture2D(cameraResolution.width, cameraResolution.height);

        // Create a PhotoCapture object
        PhotoCapture.CreateAsync(false, delegate (PhotoCapture captureObject) {
            photoCaptureObject = captureObject;

            CameraParameters cameraParameters = new CameraParameters();
            cameraParameters.hologramOpacity = 0.0f;
            cameraParameters.cameraResolutionWidth = cameraResolution.width;
            cameraParameters.cameraResolutionHeight = cameraResolution.height;
            cameraParameters.pixelFormat = CapturePixelFormat.BGRA32;

            // Activate the camera
            photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result) {
                // Take a picture
                photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
            });
        });
    }

    void OnCapturedPhotoToMemory(PhotoCapture.PhotoCaptureResult result, PhotoCaptureFrame photoCaptureFrame)
    {
        // Copy the raw image data into our target texture
        photoCaptureFrame.UploadImageDataToTexture(targetTexture);

        // Create a gameobject that we can apply our texture to
        GameObject quad = GameObject.CreatePrimitive(PrimitiveType.Quad);
        Renderer quadRenderer = quad.GetComponent<Renderer>();
        quadRenderer.material = new Material(Shader.Find("Unlit/Texture"));

        quad.transform.parent = this.transform;
        quad.transform.localPosition = new Vector3(0.0f, 0.0f, 3.0f);

        quadRenderer.material.SetTexture("_MainTex", targetTexture);

        // Deactivate our camera
        photoCaptureObject.StopPhotoModeAsync(OnStoppedPhotoMode);
    }

    void OnStoppedPhotoMode(PhotoCapture.PhotoCaptureResult result)
    {
        // Shutdown our photo capture resource
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
    }
}
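One small hardening that may make the editor failure easier to diagnose (this is my own addition, not part of the original sample): `PhotoCapture.PhotoCaptureResult` exposes a `success` flag and an HRESULT, so the `StartPhotoModeAsync` callback can log the failure and clean up instead of silently calling `TakePhotoAsync`. A hedged sketch of that callback:

```csharp
// Sketch: check result.success before taking the photo, so a failure such
// as 0xC00DABE0 surfaces as a readable log message rather than a later crash.
photoCaptureObject.StartPhotoModeAsync(cameraParameters, delegate (PhotoCapture.PhotoCaptureResult result) {
    if (!result.success)
    {
        // hResult carries the native error code (e.g. 0xC00DABE0).
        Debug.LogErrorFormat("StartPhotoModeAsync failed (hr = 0x{0:X8})", result.hResult);
        photoCaptureObject.Dispose();
        photoCaptureObject = null;
        return;
    }
    // Take a picture
    photoCaptureObject.TakePhotoAsync(OnCapturedPhotoToMemory);
});
```

This doesn't grant editor access to the camera, but it makes the Play Mode failure explicit in the Console instead of leaving the capture object in a half-initialized state.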
It is highly recommended to refer to Mixed Reality Capture to get a video stream to render on a texture, or to Research Mode to get in-depth camera/sensor data.
Please also double-check that the shader name passed to Shader.Find is correct in this line:
quadRenderer.material = new Material(Shader.Find("Unlit/Texture"));
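On that note, `Shader.Find` returns null when the shader cannot be found (for example, if "Unlit/Texture" is stripped from the build), which makes the `Material` constructor fail. A small defensive sketch (my addition, not in the original sample):

```csharp
// Guard against Shader.Find returning null; if this triggers, add the shader
// under Project Settings > Graphics > Always Included Shaders.
Shader unlitTexture = Shader.Find("Unlit/Texture");
if (unlitTexture == null)
{
    Debug.LogError("Unlit/Texture shader not found in the player build.");
    return;
}
quadRenderer.material = new Material(unlitTexture);
```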