ARFoundation is Unity's unified API for accessing augmented reality functionality on both iOS (via ARKit) and Android (via ARCore).
Context:
They provide an API to access the camera image on the CPU, which can be found here: https://docs.unity3d.com/Packages/[email protected]/manual/cpu-camera-image.html
What I have tried:
I set up a minimal scene as described in the tutorials, containing nothing but a directional light, an ARSession, and an AR Session Origin with an ARPlaneManager and an ARPointCloudManager.
I then added the example script given in the documentation here: https://docs.unity3d.com/Packages/[email protected]/manual/cpu-camera-image.html#synchronously-convert-to-grayscale-and-color
only replacing
if (!cameraSubsystem.TryGetLatestImage(out image))
    return;
with
if (!cameraSubsystem.TryGetLatestImage(out image))
{
    Debug.LogError("getting latest image failed again");
    image.Dispose();
    /* CameraImage is a struct, so it's never null.
       It probably only needs to be disposed of if it was actually acquired,
       which would explain why the example code doesn't dispose of it here.
       Still, in case TryGetLatestImage fails the first few times and then
       starts working, I want to make sure it isn't failing because too many
       image instances are being held.
       The documentation says: "The CameraImage is a struct which represents
       a native resource. When you are finished using it, you must call
       Dispose on it to release it back to the system. Although you may hold
       a CameraImage for multiple frames, most platforms have a limited
       number of them, so failure to Dispose them may prevent the system
       from providing new camera images." */
    return;
}
Debug.Log("getting latest image succeeded for once");
Someone else seems to have run into the same issue, so I asked this question on the Unity forum as well: https://forum.unity.com/threads/arfoundation-camera-api-not-working-on-android.663979/#post-4510765
Problem:
The ARCameraBackground correctly displays the camera feed, and the camera's TrackedPoseDriver updates the virtual camera position correctly: I can see the ARPlaneManager's and ARPointCloudManager's effects working (planes are detected and rendered properly, and depth is clearly perceived, with feature points displayed as particles as the camera moves).
BUT: even though the AR experience works fine, I get spammed with the error log and never get past it. The frame-received event fires over and over, but the attempt to get the latest image always fails.
I need to access the camera image on the CPU. I tried both the synchronous and the asynchronous examples (a sketch of the asynchronous path is below); both return false when I try to get the latest image. Any help or hint is much appreciated, thank you =)
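For completeness, this is roughly the asynchronous path I tried, dropped into the same MonoBehaviour as the sketch above (plus "using System.Collections;" for IEnumerator). Again, ConvertAsync, AsyncCameraImageConversion and AsyncCameraImageConversionStatus are the preview-era names from the linked docs and may differ in newer versions. TryGetLatestImage is the call that fails for me, so the conversion code below is never reached:

void OnCameraFrameReceivedAsync(ARCameraFrameEventArgs eventArgs)
{
    var cameraSubsystem = ARSubsystemManager.cameraSubsystem;
    if (cameraSubsystem == null)
        return;

    CameraImage image;
    if (!cameraSubsystem.TryGetLatestImage(out image))
        return; // this is the call that always fails for me

    // Kick off the conversion; it completes over the next several frames.
    var request = image.ConvertAsync(new CameraImageConversionParams
    {
        inputRect = new RectInt(0, 0, image.width, image.height),
        outputDimensions = new Vector2Int(image.width, image.height),
        outputFormat = TextureFormat.RGBA32
    });

    // The request holds its own reference to the native image,
    // so the CameraImage can be disposed right away.
    image.Dispose();

    StartCoroutine(WaitForConversion(request));
}

IEnumerator WaitForConversion(AsyncCameraImageConversion request)
{
    // Wait until the conversion has finished.
    while (!request.status.IsDone())
        yield return null;

    if (request.status != AsyncCameraImageConversionStatus.Ready)
    {
        Debug.LogErrorFormat("Async conversion failed with status {0}", request.status);
        request.Dispose();
        yield break;
    }

    NativeArray<byte> rawData = request.GetData<byte>();
    // ... process the RGBA32 pixels in rawData here ...

    // Disposing the request also invalidates rawData.
    request.Dispose();
}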
EDIT: it actually works on Android for me, but not on iOS.
The issue seems to have been linked to the build settings. If you ever encounter this problem, experiment with your build settings.
I don't know exactly what the problem was in my case, so I can't say precisely what matters, but for example, if you load and instantiate AR GameObject prefabs via code, Unity might strip them if managed code stripping is enabled (see the link.xml sketch below).
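If stripping turns out to be the culprit, a link.xml file in the Assets folder can tell Unity's linker to keep specific assemblies. This is just an illustration of the mechanism; the assembly names below are examples, so verify them against the ARFoundation packages actually installed in your project:

<!-- Assets/link.xml: prevents the managed code stripper from removing
     these assemblies. Assembly names are examples; check the packages
     in your own project for the exact names. -->
<linker>
    <assembly fullname="Unity.XR.ARFoundation" preserve="all" />
    <assembly fullname="Unity.XR.ARKit" preserve="all" />
    <assembly fullname="Unity.XR.ARCore" preserve="all" />
</linker>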