Tags: augmented-reality, arkit, realitykit

Is there a way to use multiple cameras (.ar and .nonAR) for the same scene in RealityKit?


In RealityKit, I would like to create a main AR camera (cameraMode: .ar) and a second, picture-in-picture virtual camera (cameraMode: .nonAR) that shows an orthogonal view. The goal of the second camera is to visualize the model entities in the same scene from a different perspective, without the camera feed.

While I do not have experience with SceneKit, it appears that in non-AR scenes this can be done with two views of the same scene by setting the second view's pointOfView property to a different camera node. See this Stack Overflow question and answer.
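For context, here is a minimal non-AR SceneKit sketch of that approach (playground-style; the camera and view names are mine, not from the linked answer): two SCNViews share one SCNScene, and each view renders it from its own camera node via pointOfView.

import SceneKit
import UIKit

// Two SCNViews render the same SCNScene, each from its own camera node.
let scene = SCNScene()

// Main perspective camera
let mainCameraNode = SCNNode()
mainCameraNode.camera = SCNCamera()
mainCameraNode.position = SCNVector3(0, 0, 5)
scene.rootNode.addChildNode(mainCameraNode)

// Second, top-down orthographic camera for the picture-in-picture view
let topCameraNode = SCNNode()
topCameraNode.camera = SCNCamera()
topCameraNode.camera?.usesOrthographicProjection = true
topCameraNode.position = SCNVector3(0, 5, 0)
topCameraNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)
scene.rootNode.addChildNode(topCameraNode)

let mainView = SCNView(frame: CGRect(x: 0, y: 0, width: 800, height: 600))
mainView.scene = scene
mainView.pointOfView = mainCameraNode

let pipView = SCNView(frame: CGRect(x: 20, y: 20, width: 160, height: 160))
pipView.scene = scene
pipView.pointOfView = topCameraNode
pipView.backgroundColor = .black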

I cannot find a similar property in RealityKit. Is this possible in RealityKit? Has anyone tried this using SceneKit as the renderer as opposed to RealityKit?
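For reference, in a .nonAR ARView the viewpoint is normally controlled by adding a camera entity to the scene rather than through a pointOfView-style property. Below is a minimal sketch of such a second view on its own, using PerspectiveCamera (I am not aware of an orthographic camera entity in RealityKit); the function and variable names are illustrative.

import RealityKit
import UIKit

// Standalone .nonAR view whose viewpoint comes from a PerspectiveCamera
// entity placed in the scene like any other entity.
func makeVirtualCameraView() -> ARView {
    let view = ARView(frame: .zero, cameraMode: .nonAR,
                      automaticallyConfigureSession: false)
    view.environment.background = .color(.black)   // no camera feed

    let cameraAnchor = AnchorEntity(world: SIMD3<Float>(0, 0, 0))
    let camera = PerspectiveCamera()
    camera.position = [0, 2, 0]                    // above the content
    camera.look(at: .zero, from: camera.position, relativeTo: nil)
    cameraAnchor.addChild(camera)
    view.scene.addAnchor(cameraAnchor)

    return view
}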

Without changing the position of the nonAR camera, I have tried to create the two views described above in SwiftUI, but receive multiple errors:

-[MTLTextureDescriptorInternal validateWithDevice:]:1248: failed assertion `Texture Descriptor Validation MTLTextureDescriptor has width (4294967295) greater than the maximum allowed size of 16384. MTLTextureDescriptor has height (4294967295) greater than the maximum allowed size of 16384. MTLTextureDescriptor has invalid pixelFormat (0).

Sample code below:

import SwiftUI
import RealityKit


// Global views: one AR view (camera feed) and one virtual (nonAR) view
let arViewOne: ARView = ARView(frame: .zero, cameraMode: .ar, automaticallyConfigureSession: true)
let arViewTwo: ARView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: true)

let boxAnchor = try! Experience.loadBox()

struct ContentView: View {
    var body: some View {
        HStack {
            ARViewContainerOne().edgesIgnoringSafeArea(.all)
            ARViewContainerTwo().edgesIgnoringSafeArea(.all)
        }
    }
}

struct ARViewContainerOne: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        arViewOne.scene.anchors.append(boxAnchor)
        return arViewOne
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

struct ARViewContainerTwo: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {
        return arViewTwo
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}

Solution

  • I have been unable to get it to work in RealityKit. With a caveat, I have been able to produce the desired effect using SceneKit. However, I ran into a problem: the second camera always displays the camera feed, and when I try to change the scene background color, it will not change. I need the AR camera to show the camera feed and the nonAR camera to show a plain background instead of the feed.

    While looking for a way to solve this, I found another developer trying to solve the same problem. That developer uses a somewhat inconvenient, less-than-ideal workaround: creating two camera views in SceneKit, one AR and one nonAR (a sketch of this setup is included at the end of this answer).

    Ideally, I would like to get this to work in RealityKit. But this is an OK workaround.

    I believe it was Andy Fedoroff who provided an answer about applying different background opacities to a RealityKit view, which worked for me on a different project. I have not been able to find an equivalent in SceneKit. The RealityKit version looks like this:

       arView.environment.background = .cameraFeed(exposureCompensation: -4.0) // -4 is partial
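
    For illustration, here is a hypothetical sketch of the two-view SceneKit setup mentioned above: an ARSCNView drives the AR session and camera feed, while a plain SCNView renders the same scene from a second camera node. The class and node names are illustrative, and, per the caveat above, the second view may still end up showing the camera feed because both views share the same scene background.

       import ARKit
       import SceneKit
       import UIKit

       // Sketch of the workaround: one AR view plus one virtual (nonAR) view
       // rendering the same SceneKit scene from a second camera node.
       final class TwoCameraViewController: UIViewController {
           let arView = ARSCNView()
           let pipView = SCNView()

           override func viewDidLoad() {
               super.viewDidLoad()

               arView.frame = view.bounds
               view.addSubview(arView)

               // Second, top-down camera node in the shared scene
               let topCameraNode = SCNNode()
               topCameraNode.camera = SCNCamera()
               topCameraNode.camera?.usesOrthographicProjection = true
               topCameraNode.position = SCNVector3(0, 2, 0)
               topCameraNode.eulerAngles = SCNVector3(-Float.pi / 2, 0, 0)
               arView.scene.rootNode.addChildNode(topCameraNode)

               // Picture-in-picture view rendering the same scene graph
               pipView.frame = CGRect(x: 20, y: 40, width: 160, height: 160)
               pipView.scene = arView.scene
               pipView.pointOfView = topCameraNode
               pipView.backgroundColor = .black   // may be covered by the shared scene background
               view.addSubview(pipView)
           }

           override func viewWillAppear(_ animated: Bool) {
               super.viewWillAppear(animated)
               arView.session.run(ARWorldTrackingConfiguration())
           }
       }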