libgdx resolution/density scaling assets


Quick question,

I am working on a game in libGDX and have hit a snag. I'm trying to scale my assets, and I'm using Gdx.graphics.getDensity() to get the density and then setting my asset size using that value as a multiplier. The issue I'm having is that a tablet with a 2560x1600 resolution reports a density of 2.0, yet the Nexus 5 emulator, which is only 1080x1920, reports a density of 2.652... how does the tablet have a smaller density than the smaller phone?

What should I be using as my scaling multiplier if density is not a reliable indicator of the screen size my app is running on?


Solution

  • To answer your first question, it's a hardware thing. The Nexus 5 probably has a higher PPI (pixels per inch) than the tablet, even though its resolution is smaller.
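
    For reference, libGDX also exposes the physical pixels-per-inch directly, and on most backends getDensity() works out to roughly PPI divided by 160. Here is a minimal sketch (the class name DensityProbe is just for illustration) that logs both values so you can compare devices:

    ```java
    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;

    public class DensityProbe extends ApplicationAdapter {
        @Override
        public void create() {
            // Scale factor relative to a ~160 dpi baseline screen
            float density = Gdx.graphics.getDensity();
            // Physical pixels per inch along the x axis
            float ppiX = Gdx.graphics.getPpiX();
            Gdx.app.log("DensityProbe", "density=" + density
                    + " ppiX=" + ppiX
                    + " density*160=" + (density * 160f));
        }
    }
    ```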

    As to your second question, I would propose an alternative to Gdx.graphics.getDensity(): if you are trying to make sure that your asset stays the same size relative to the screen (e.g. an asset drawn at 80x80 px on a 2560x1600 display would be drawn at 40x40 px on a 1280x800 display), then I think you want to leave your assets the same size and change the size of your camera instead.

    When you create a game in libGDX (or any game engine), you need to keep in mind the difference between three sets of sizes/coordinates in your game:

    • Viewport size is the size of the space on your screen where your game will draw. It is measured in pixels and defaults to your window size (in a fullscreen game, the size of the screen).
    • Game world size/coordinates are completely arbitrary. They are measured in whatever unit you want (be it meters, inches, bananas, F-16s, etc.). You'll know you're working in game world units because the values are floats.
    • Camera size is the amount of 'world' you can see at one time, measured in game units. You can make your camera show a 1280x800 portion of the world, or a 3x5 portion, or a 137.6x42 portion. The tricky thing is that libGDX's OrthographicCamera class defaults to your viewport size, acting as though 1 game unit == 1 pixel (see the sketch just after this list).
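
    Here is a minimal sketch of that last point (the class name WorldUnitCamera is just for illustration): instead of accepting the pixel-sized default, the camera is told to show an 8x5 game-unit slice of the world, no matter how many pixels the window has.

    ```java
    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.graphics.OrthographicCamera;

    public class WorldUnitCamera extends ApplicationAdapter {
        private OrthographicCamera camera;

        @Override
        public void create() {
            camera = new OrthographicCamera();
            // First argument false = y axis points up;
            // 8x5 is the visible world size in game units, not pixels.
            camera.setToOrtho(false, 8f, 5f);
        }
    }
    ```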

    Hopefully that made sense. Let's look at the implications:

    Say I have an 800x600 GU (game unit) asset at 0,0 in my game world. My viewport size is 2560x1600, and my camera is scaled to 2560x1600 as well. My 800x600 GU object will render as an 800x600 px image on the screen.

    Now suppose I want to port my game to other screen sizes with the same aspect ratio (just to keep things simple for now). I still have a 2560x1600 viewport, but I change my camera to 8x5 GU and my game object to 2.5x1.875 GU. These sizes are set explicitly, independent of the viewport size. This makes 1 GU = 320 px. The net result is that my game object still renders as an 800x600 px image on the screen.

    Now let's see how this works at a 1280x800 px resolution: my viewport is 1280x800 px, but my camera stays at 8x5 GU and my game object at 2.5x1.875 GU. Because of the camera size, 1 GU = 160 px, so my game object renders at 400x300 px, which is proportionally the same size on the smaller screen.
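
    A rough sketch of that worked example (the class name ProportionalRender and the texture file object.png are placeholders): the object is drawn at 2.5x1.875 GU through an 8x5 GU camera, so it fills the same fraction of the screen whether the viewport is 2560x1600 px or 1280x800 px.

    ```java
    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.Gdx;
    import com.badlogic.gdx.graphics.GL20;
    import com.badlogic.gdx.graphics.OrthographicCamera;
    import com.badlogic.gdx.graphics.Texture;
    import com.badlogic.gdx.graphics.g2d.SpriteBatch;

    public class ProportionalRender extends ApplicationAdapter {
        private OrthographicCamera camera;
        private SpriteBatch batch;
        private Texture texture;

        @Override
        public void create() {
            camera = new OrthographicCamera();
            camera.setToOrtho(false, 8f, 5f);    // camera measured in game units
            batch = new SpriteBatch();
            texture = new Texture("object.png"); // placeholder asset
        }

        @Override
        public void render() {
            Gdx.gl.glClearColor(0, 0, 0, 1);
            Gdx.gl.glClear(GL20.GL_COLOR_BUFFER_BIT);
            camera.update();
            // Draw in game units; the camera maps them to screen pixels.
            batch.setProjectionMatrix(camera.combined);
            batch.begin();
            batch.draw(texture, 0f, 0f, 2.5f, 1.875f); // position and size in GU
            batch.end();
        }

        @Override
        public void dispose() {
            batch.dispose();
            texture.dispose();
        }
    }
    ```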

    Hopefully that made sense without pictures. When you start targeting screens with different aspect ratios (e.g. you go from an 8x5 screen to a 3x2 screen), you need a little extra logic to keep the aspect ratio of your viewport the same as your camera's (otherwise things get stretched). Fortunately, libGDX provides viewport classes that will do this for you.
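
    For example, a minimal sketch using FitViewport, which keeps the full 8x5 world visible and letterboxes screens with a different aspect ratio (ExtendViewport, ScreenViewport, etc. make different trade-offs; the class name AspectRatioSafe is just for illustration):

    ```java
    import com.badlogic.gdx.ApplicationAdapter;
    import com.badlogic.gdx.graphics.OrthographicCamera;
    import com.badlogic.gdx.utils.viewport.FitViewport;

    public class AspectRatioSafe extends ApplicationAdapter {
        private OrthographicCamera camera;
        private FitViewport viewport;

        @Override
        public void create() {
            camera = new OrthographicCamera();
            // World size in game units; the viewport manages the camera for you.
            viewport = new FitViewport(8f, 5f, camera);
        }

        @Override
        public void resize(int width, int height) {
            // Recompute the drawn screen area and letterbox bars; true centers the camera.
            viewport.update(width, height, true);
        }
    }
    ```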

    You can find an external tutorial by the libGDX community that talks more about this issue.