Tags: c#, android, unity-game-engine, vuforia

I want to develop with Vuforia Engine and Unity and use the front camera to read an ImageTarget


I would like to use Vuforia and Unity in some of my native Android apps to achieve AR. Image Target recognition currently works well with the rear camera, but not with the front camera. While researching, I found references saying that front-camera support was dropped after a certain version. Is this no longer possible?

The Vuforia Engine version I am using is 10.27. The following is the code I tried in Unity.


using UnityEngine;
using UnityEngine.Android;

public class LoadAndUseFrontCamera : MonoBehaviour
{
    private string fileName = "test.jpg";
    private string imagePath;        // Destination for a saved snapshot (not used below).
    public Texture2D defaultTexture; // Fallback shown when no camera is available.
    private WebCamTexture webcamTexture;

    void Start()
    {
        // WebCamTexture needs the CAMERA runtime permission on Android 6+.
        if (!Permission.HasUserAuthorizedPermission(Permission.Camera))
            Permission.RequestUserPermission(Permission.Camera);

        imagePath = GetExternalStoragePublicDirectory("Pictures", fileName);
        StartFrontCamera();
    }

    void StartFrontCamera()
    {
        WebCamDevice[] devices = WebCamTexture.devices;

        if (devices.Length > 0)
        {
            for (int i = 0; i < devices.Length; i++)
            {
                Debug.Log($"deviceName: {devices[i].name}, frontCamera: {devices[i].isFrontFacing}");
            }

            
            // Find the first front-facing camera.
            int frontCameraIndex = -1;
            for (int i = 0; i < devices.Length; i++)
            {
                if (devices[i].isFrontFacing)
                {
                    frontCameraIndex = i;
                    break;
                }
            }

            if (frontCameraIndex != -1)
            {
              
                // Stream the selected front camera into a texture.
                webcamTexture = new WebCamTexture(devices[frontCameraIndex].name);
                Renderer renderer = GameObject.Find("ARObject")?.GetComponent<Renderer>();

                if (renderer != null)
                {
                    renderer.material.mainTexture = webcamTexture;
                    webcamTexture.Play(); 
                }
                else
                {
                    Debug.LogError("Renderer notfound");
                }
            }
            else
            {
                Debug.LogWarning("frontCamera notfound");
                ApplyDefaultTexture();
            }
        }
        else
        {
            Debug.LogWarning("cameraDevice notfound");
            ApplyDefaultTexture();
        }
    }

    void ApplyDefaultTexture()
    {
        Renderer renderer = GameObject.Find("ARObject")?.GetComponent<Renderer>();
        if (renderer != null)
        {
            renderer.material.mainTexture = defaultTexture;
        }
        else
        {
            Debug.LogError("Renderer notfound");
        }
    }

    void OnApplicationPause(bool pauseStatus)
    {
        if (webcamTexture != null)
        {
            if (pauseStatus)
            {
                webcamTexture.Pause(); // App moved to the background.
            }
            else
            {
                webcamTexture.Play(); // App resumed; restart the feed.
            }
        }
    }

    void OnDestroy()
    {
        if (webcamTexture != null)
        {
            webcamTexture.Stop(); 
        }
    }
    // Helper (not called above) that flips a texture on both axes,
    // i.e. a 180-degree rotation; front-camera frames are often mirrored.
    Texture2D FlipTexture(Texture2D originalTexture)
    {
        Texture2D flippedTexture = new Texture2D(originalTexture.width, originalTexture.height);
        for (int y = 0; y < originalTexture.height; y++)
        {
            for (int x = 0; x < originalTexture.width; x++)
            {
                Color pixel = originalTexture.GetPixel(originalTexture.width - x - 1, originalTexture.height - y - 1);
                flippedTexture.SetPixel(x, y, pixel);
            }
        }

        flippedTexture.Apply();

        return flippedTexture;
    }

    string GetExternalStoragePublicDirectory(string directoryType, string fileName)
    {
#if UNITY_ANDROID && !UNITY_EDITOR
        // Note: getExternalStoragePublicDirectory is deprecated as of Android 10 (API 29).
        using (AndroidJavaClass environment = new AndroidJavaClass("android.os.Environment"))
        {
            using (AndroidJavaObject file = environment.CallStatic<AndroidJavaObject>("getExternalStoragePublicDirectory", directoryType))
            {
                return file.Call<string>("getAbsolutePath") + "/" + fileName;
            }
        }
#else
        // In the Editor or on non-Android platforms, fall back to persistentDataPath.
        return Application.persistentDataPath + "/" + fileName;
#endif
    }
}

Solution

  • Vuforia and AR Foundation are frameworks that work on top of ARCore and ARKit.

    It is up to these underlying SDKs to choose the hardware camera.

    These SDKs use the front camera only for face tracking.
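
    For example, with AR Foundation the facing direction is only a request; the underlying SDK decides whether, and with which tracking features, to honor it. A minimal sketch, assuming AR Foundation 4+ and an ARCameraManager reference assigned in the Inspector:

using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

public class RequestFrontCamera : MonoBehaviour
{
    [SerializeField] private ARCameraManager cameraManager; // assumed to be assigned in the Inspector

    void Start()
    {
        // Only a request: ARCore/ARKit may refuse it or restrict tracking
        // features (see the quoted list of behavior changes below).
        cameraManager.requestedFacingDirection = CameraFacingDirection.User;
    }

    void Update()
    {
        // currentFacingDirection reports what the SDK actually chose.
        if (cameraManager.currentFacingDirection != CameraFacingDirection.User)
            Debug.Log("The SDK did not grant the user-facing camera.");
    }
}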

    From the Google ARCore documentation:

    When the front camera is selected, ARCore's behavior changes in the following ways:

      • The display will be mirrored. Specifically, Camera.getProjectionMatrix(float[], int, float, float) will include a horizontal flip in the generated projection matrix, and APIs that reason about things in screen space, such as Frame.transformCoordinates2d(Coordinates2d, float[], Coordinates2d, float[]), will mirror screen coordinates. OpenGL apps should consider using glFrontFace to render mirrored assets without changing their winding direction.

      • Camera.getTrackingState() will always return TrackingState.PAUSED.

      • All forms of Frame.hitTest() will always return an empty list.

      • Camera.getDisplayOrientedPose() will always return an identity pose.

      • Session.createAnchor(Pose) will always throw NotTrackingException.

      • Planes will never be detected.

      • Session.configure(Config) will throw if the supplied configuration requests Cloud Anchors or Augmented Images.
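
    In Unity's built-in render pipeline, a rough analogue of that glFrontFace advice is GL.invertCulling, which flips the winding-order test while a camera renders. A small sketch, assuming the script sits on the camera that draws the mirrored feed:

using UnityEngine;

public class MirroredCulling : MonoBehaviour
{
    // Invert triangle winding while this (mirrored) camera renders,
    // then restore it so other cameras are unaffected.
    void OnPreRender()  { GL.invertCulling = true; }
    void OnPostRender() { GL.invertCulling = false; }
}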

    So you can see that with the front camera, Camera.getTrackingState() always returns PAUSED and Session.createAnchor(Pose) always throws, and both are needed for image tracking. With ARCore and ARKit, therefore, the front camera works only with face tracking, as the sketch below illustrates.
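
    If you specifically need the front camera, face tracking is the supported path. A hedged sketch using AR Foundation 4/5 (the ARFaceManager reference and component setup are assumptions; enabling face tracking is what drives the user-facing camera on ARCore/ARKit):

using UnityEngine;
using UnityEngine.XR.ARFoundation;

public class FrontCameraFaceTracking : MonoBehaviour
{
    [SerializeField] private ARFaceManager faceManager; // assumed to be assigned in the Inspector

    void OnEnable()  { faceManager.facesChanged += OnFacesChanged; }
    void OnDisable() { faceManager.facesChanged -= OnFacesChanged; }

    void OnFacesChanged(ARFacesChangedEventArgs args)
    {
        // Faces are the only trackables the front camera supports here;
        // Image Targets are not available in this mode.
        foreach (ARFace face in args.added)
            Debug.Log($"Face detected: {face.trackableId}");
    }
}

    In short, for Image Target detection you will need to stay on the rear camera.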