libgdx coordinate system differences between rendering and touch input


Solution 1

To detect collision I use camera.unproject(vector3). I set the vector3 like this:

Vector3 vector3 = new Vector3();
vector3.x = Gdx.input.getX();
vector3.y = Gdx.input.getY();
vector3.z = 0;

Now pass this vector to camera.unproject(vector3) and use the x and y of the unprojected vector to draw your character.
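
As a minimal sketch, applied to the render loop from the question (the camera field is an assumption; _sourceTexture, x, and y come from the question's code):

// Reuse a single vector to avoid allocating one every frame
private final Vector3 touchPoint = new Vector3();

@Override
public void render(float delta) {
    if (Gdx.input.justTouched()) {
        touchPoint.set(Gdx.input.getX(), Gdx.input.getY(), 0);
        // Converts the y-down screen coordinates into the camera's y-up world coordinates
        camera.unproject(touchPoint);
        x = touchPoint.x;
        y = touchPoint.y;
    }
    super.getGame().batch.draw(_sourceTexture, x, y);
}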

Solution 2

You're doing it right. Libgdx generally provides coordinate systems in their "native" format (in this case the native touch screen coordinates, and the default OpenGL coordinates). This doesn't create any consistency but it does mean the library doesn't have to get in between you and everything else. Most OpenGL games use a camera that maps relatively arbitrary "world" coordinates onto the screen, so the world/game coordinates are often very different from screen coordinates (so consistency is impossible). See Changing the Coordinate System in LibGDX (Java)

There are two ways you can work around this. One is transform your touch coordinates. The other is to use a different camera (a different projection).

To fix the touch coordinates, just subtract the touch y from the screen height. That's a bit of a hack. More generally, you want to "unproject" from the screen into the world (see the Camera.unproject() variations). This is probably the easiest approach.
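
For example (a sketch, assuming the class implements InputProcessor and has a camera field):

@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
    // unproject() converts the y-down screen coordinates into the camera's world space
    Vector3 worldTouch = new Vector3(screenX, screenY, 0);
    camera.unproject(worldTouch);
    // worldTouch.x and worldTouch.y are now in the same space the SpriteBatch draws in
    return true;
}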

Alternatively, to fix the camera, see "Changing the Coordinate System in LibGDX (Java)", or this post on the libgdx forum. Basically, you define a custom camera and then set the SpriteBatch to use it instead of the default:

// Create a full-screen camera:
camera = new OrthographicCamera(Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
// Set it to an orthographic projection with "y down" (the first boolean parameter)
camera.setToOrtho(true, Gdx.graphics.getWidth(), Gdx.graphics.getHeight());
camera.update();

// Create a sprite renderer and use the above camera's projection
batch = new SpriteBatch();
batch.setProjectionMatrix(camera.combined);

While fixing the camera works, it is "swimming upstream" a bit. You'll run into other renderers (ShapeRenderer, the font renderers, etc.) that will also default to the "wrong" camera and need to be fixed up.
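
For example (a sketch, assuming a shapeRenderer field), each renderer needs the same projection matrix handed to it:

shapeRenderer.setProjectionMatrix(camera.combined);
shapeRenderer.begin(ShapeRenderer.ShapeType.Line);
// This rectangle is now interpreted in the y-down camera's coordinates
shapeRenderer.rect(10, 10, 100, 50);
shapeRenderer.end();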

Solution 3

I had the same problem; I simply did this:

@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {
    // Flip y so it matches the y-up rendering coordinates
    screenY = Gdx.graphics.getHeight() - screenY;
    // ... use screenX / screenY here ...
    return true;
}

Every time you want to take input from the user, don't use Gdx.input.getY(); use (Gdx.graphics.getHeight() - Gdx.input.getY()) instead. That worked for me.
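
Applied to the render loop from the question, the flip would look roughly like this (a sketch; the variables are the ones from the question):

if (Gdx.input.justTouched()) {
    x = Gdx.input.getX();
    // Flip the y-down touch coordinate into the y-up drawing coordinate
    y = Gdx.graphics.getHeight() - Gdx.input.getY();
}
super.getGame().batch.draw(_sourceTexture, x, y);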

Solution 4

The link below discusses this problem.

Projects the given coords in world space to screen coordinates.

You need to use the method project(Vector3 worldCoords) in class com.badlogic.gdx.graphics.Camera.

private Camera camera;
............

@Override
public boolean touchDown(int screenX, int screenY, int pointer, int button) {

Create an instance of the vector and initialize it with the coordinates from the input event handler:

    Vector3 worldCoors = new Vector3(screenX, screenY, 0);

Project worldCoors, given in world space, to screen coordinates:

    camera.project(worldCoors);

Use the projected coordinates:

    world.hitPoint((int) worldCoors.x, (int) worldCoors.y);

    OnTouch();

    return true;
}

Comments

  • Brian Mains almost 2 years

    I have a screen (BaseScreen implements the Screen interface) that renders a PNG image. On click of the screen, it moves the character to the position touched (for testing purposes).

    public class DrawingSpriteScreen extends BaseScreen {
        private Texture _sourceTexture = null;
        float x = 0, y = 0;
    
        @Override
        public void create() {
            _sourceTexture = new Texture(Gdx.files.internal("data/character.png"));
        }
    
        .
        .
    }
    

    During rendering of the screen, if the user touched the screen, I grab the coordinates of the touch, and then use these to render the character image.

    @Override
    public void render(float delta) {
        if (Gdx.input.justTouched()) {
            x = Gdx.input.getX();
            y = Gdx.input.getY();
        }
    
        super.getGame().batch.draw(_sourceTexture, x, y);
    }
    

The issue is that the coordinates for drawing the image start from the bottom left (as noted in the LibGDX Wiki), while the coordinates for the touch input start from the upper left corner. So the issue I'm having is that when I click on the bottom right, it moves the image to the top right. My coordinates may be X 675 Y 13, which on touch would be near the top of the screen. But the character shows at the bottom, since the drawing coordinates start from the bottom left.

    Why is that? Why are the coordinate systems reversed? Am I using the wrong objects to determine this?