java, libgdx, augmented-reality, vuforia

How can I render a LibGDX ModelInstance with the Vuforia AR library?


I know most questions on this community should include at least some code, but I'm totally lost here; I don't even know where to start. What I want to do is use the Vuforia AR library to render LibGDX 3D ModelInstances. However, I don't know how to make Vuforia render the ModelInstances, or how to use a LibGDX camera as its camera.

I've done external research, but I haven't been able to find any useful information. Can anyone help me get started with this?


Solution

  • OK, so I finally managed to combine both libraries. I'm not sure what I'm doing is the most efficient way of working, but it has worked for me.

    First of all, I based my work on the Vuforia sample apps, specifically the FrameMarkers example.

    I opened an empty LibGDX project, imported the Vuforia jar, and copied over the SampleApplicationControl, SampleApplicationException, SampleApplicationGLView, SampleApplicationSession, FrameMarkerRenderer and FrameMarker classes.

    Next, I added some attributes to LibGDX's AndroidLauncher class and initialized all the Vuforia machinery:

    public class AndroidLauncher extends AndroidApplication implements SampleApplicationControl{
        private static final String LOGTAG = "FrameMarkers";
    
    
        // Our OpenGL view:
        public SampleApplicationGLView mGlView;
        public SampleApplicationSession vuforiaAppSession;
        // Our renderer:
        public FrameMarkerRenderer mRenderer;
        MyGDX gdxRender;
        // The textures we will use for rendering:
        public Vector<Texture> mTextures;
        public RelativeLayout mUILayout;
    
        public Marker[] dataSet;
    
        public GestureDetector mGestureDetector;
    
    
        public LoadingDialogHandler loadingDialogHandler = new LoadingDialogHandler(
            this);
    
        // Alert Dialog used to display SDK errors
        private AlertDialog mErrorDialog;
    
        boolean mIsDroidDevice = false;
        @Override
        protected void onCreate (Bundle savedInstanceState) {
            super.onCreate(savedInstanceState);
            vuforiaAppSession = new SampleApplicationSession(this);
            vuforiaAppSession.setmActivity(this);
            AndroidApplicationConfiguration config = new AndroidApplicationConfiguration();
    
    
            // Load any sample specific textures:
            mTextures = new Vector<Texture>();
            loadTextures();
            startLoadingAnimation();
            vuforiaAppSession.initAR(ActivityInfo.SCREEN_ORIENTATION_PORTRAIT);
            gdxRender = new MyGDX (vuforiaAppSession);
            gdxRender.setTextures(mTextures);
            initialize(gdxRender, config);
    
            mGestureDetector = new GestureDetector(this, new GestureListener());
    
            mIsDroidDevice = android.os.Build.MODEL.toLowerCase().startsWith(
                "droid");
        }
    }
    

    I needed to set the activity, so I added a setmActivity() method to SampleApplicationSession.
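    The setter itself is trivial; here is a minimal, Android-free sketch of the pattern (the `Activity` class is stubbed so it compiles off-device, and `getmActivity` is a hypothetical accessor added only for illustration — in the real project the field already exists in the Vuforia sample and only the setter is new):

    ```java
    // Stub standing in for android.app.Activity so the sketch compiles off-device.
    class Activity {}

    class SampleApplicationSession {
        // The Vuforia sample keeps the current Activity in a field like this;
        // the hypothetical setter below just lets the LibGDX launcher inject it.
        private Activity mActivity;

        public void setmActivity(Activity activity) {
            this.mActivity = activity;
        }

        public Activity getmActivity() {
            return mActivity;
        }
    }
    ```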

    After that, I implemented the LibGDX ApplicationAdapter class and passed the vuforiaAppSession in as an attribute, so I could access everything I had initialized:

    public class MyGDX extends ApplicationAdapter  {
        ModelInstance modelInstanceHouse;
        private AnimationController controller;
        Matrix4 lastTransformCube;
        // Constants:
        static private float kLetterScale = 25.0f;
        static private float kLetterTranslate = 25.0f;
        // OpenGL ES 2.0 specific:
        private static final String LOGTAG = "FrameMarkerRenderer";
        private int shaderProgramID = 0;
        private Vector<com.mygdx.robot.Texture> mTextures;
        //SampleApplicationSession vuforiaAppSession;
        PerspectiveCamera cam;
        ModelBuilder modelBuilder;
        Model model;
        ModelInstance instance;
        ModelBatch modelBatch;
        static boolean render;
        public SampleApplicationSession vuforiaAppSession;
    
        public MyGDX ( SampleApplicationSession vuforiaAppSession){
            super();
            this.vuforiaAppSession = vuforiaAppSession;
    
        }
    }
    

    The last important thing to keep in mind is the render() method. I based it on the render method of FrameMarkerRenderer, which has a boolean that gets activated when the camera starts. So I simply set that variable during the Vuforia AR initialization and checked it in the render() method. I had to set the camera to an identity matrix and multiply the model by the modelViewMatrix.

    @Override
    public void render() {
        if (render) {
            // Clear color and depth buffer
            GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT);
            // Get the state from Vuforia and mark the beginning of a rendering
            // section
            State state = Renderer.getInstance().begin();
            // Explicitly render the Video Background
            Renderer.getInstance().drawVideoBackground();
    
            GLES20.glEnable(GLES20.GL_DEPTH_TEST);
    
            // If the video background is reflected (front camera), the projection
            // matrix has been reflected as well, so standard counter-clockwise
            // face culling would render the models "inside out". Detect this and
            // flip the culling direction accordingly.
            GLES20.glEnable(GLES20.GL_CULL_FACE);
            GLES20.glCullFace(GLES20.GL_BACK);
            cam.update();
            modelBatch.begin(cam);
    
            if (Renderer.getInstance().getVideoBackgroundConfig().getReflection() == VIDEO_BACKGROUND_REFLECTION.VIDEO_BACKGROUND_REFLECTION_ON)
                GLES20.glFrontFace(GLES20.GL_CW);   // Front camera
            else
                GLES20.glFrontFace(GLES20.GL_CCW);  // Back camera
    
            // Set the viewport
            int[] viewport = vuforiaAppSession.getViewport();
            GLES20.glViewport(viewport[0], viewport[1], viewport[2], viewport[3]);
    
            // Did we find any trackables this frame?
            for (int tIdx = 0; tIdx < state.getNumTrackableResults(); tIdx++)
            {
                // Get the trackable:
                TrackableResult trackableResult = state.getTrackableResult(tIdx);
                float[] modelViewMatrix = Tool.convertPose2GLMatrix(
                        trackableResult.getPose()).getData();
    
                // Choose the texture based on the target name:
                int textureIndex = 0;
    
                // Check the type of the trackable:
                assert (trackableResult.getType() == MarkerTracker.getClassType());
                MarkerResult markerResult = (MarkerResult) (trackableResult);
                Marker marker = (Marker) markerResult.getTrackable();
                textureIndex = marker.getMarkerId();
                float[] modelViewProjection = new float[16];
                Matrix.translateM(modelViewMatrix, 0, -kLetterTranslate, -kLetterTranslate, 0.f);
                Matrix.scaleM(modelViewMatrix, 0, kLetterScale, kLetterScale, kLetterScale);
                Matrix.multiplyMM(modelViewProjection, 0, vuforiaAppSession.getProjectionMatrix().getData(), 0, modelViewMatrix, 0);
                SampleUtils.checkGLError("FrameMarkers render frame");
                cam.view.idt();
                cam.projection.idt();
                cam.combined.idt();
                Matrix4 temp3 = new Matrix4(modelViewProjection);
                modelInstanceHouse.transform.set(temp3);
                modelInstanceHouse.transform.scale(0.05f, 0.05f, 0.05f);
                controller.update(Gdx.graphics.getDeltaTime());
                modelBatch.render(modelInstanceHouse);
            }
            GLES20.glDisable(GLES20.GL_DEPTH_TEST);
            modelBatch.end();

            // Close the rendering section opened by Renderer.getInstance().begin()
            Renderer.getInstance().end();
        }
    }
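    The flag handoff itself can be sketched Android-free. This hypothetical ArRenderer stands in for MyGDX: the static render flag stays false until the AR initialization callback flips it, and until then every frame is skipped (the counter below stands in for the actual GL and ModelBatch work):

    ```java
    // Android-free sketch of the guard-flag pattern used above: render() draws
    // nothing until the Vuforia initialization code flips the flag.
    class ArRenderer {
        static volatile boolean render = false; // set true once the camera starts
        int framesDrawn = 0;

        void renderFrame() {
            if (!render) {
                return; // AR session not ready yet: skip the whole GL pass
            }
            framesDrawn++; // stands in for the video background + modelBatch work
        }
    }
    ```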
    

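    To see why zeroing the camera works: libGDX composes cam.combined (projection × view) with the instance transform, so forcing combined to identity leaves only modelInstanceHouse.transform, which now carries Vuforia's full model-view-projection matrix. A quick column-major check (the same memory layout OpenGL and android.opengl.Matrix use), with the multiply written from scratch here rather than taken from either library:

    ```java
    // Column-major 4x4 matrix helpers, matching the OpenGL memory layout.
    class Mat4 {
        static float[] mul(float[] a, float[] b) {
            float[] out = new float[16];
            for (int col = 0; col < 4; col++)
                for (int row = 0; row < 4; row++) {
                    float s = 0f;
                    for (int k = 0; k < 4; k++)
                        s += a[k * 4 + row] * b[col * 4 + k];
                    out[col * 4 + row] = s;
                }
            return out;
        }

        static float[] identity() {
            float[] m = new float[16];
            m[0] = m[5] = m[10] = m[15] = 1f;
            return m;
        }
    }
    ```

    With combined forced to identity, mul(identity, mvp) equals mvp, so the batch places the model exactly where Vuforia's pose matrix puts it.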
    It's a lot of code, but I hope it helps people who are trying to integrate both libraries. I don't think this is efficient, but it's the only solution I've come up with.