Good evening all
I am trying to get my OpenGL program to use a single texture (a texture atlas). It is 256 x 256 and I load it as a normal texture.
If the texture runs from 0,0 to 1,1, then I believe each square is 0.2 in size (a 5 x 5 grid of tiles).
The issue is that only the 1st and 3rd textures work. The 2nd is odd, the 4th seems to be the 2nd and 3rd textures combined, and the 5th is the 2nd, 3rd and 4th put together.
So I built a simple function to return the texture coords:
int yy = textureId / 5;
int xx = textureId % 5;
float size = 1.0f / 5;
float[] textureCoordinateDataMap = createTexture(size * xx, size * yy, size, size);
public float[] createTexture(float x, float y, float xx, float yy)
{
    float[] textureCoordinateDataMap =
    {
        // Front face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
        // Right face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
        // Back face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
        // Left face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
        // Top face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
        // Bottom face
        x, y,
        x, yy,
        xx, y,
        x, yy,
        xx, yy,
        xx, y,
    };
    return textureCoordinateDataMap;
}
I have tried hardcoding the values; the only ones that work are 0,0 / 0.4,0 / 0,0.4 / 0.4,0.4.
If you want to use the values in this array as texture coordinates, you will need to add the size to the left/lower coordinate to get the right/top coordinate. Right now you use the size itself as the right/top coordinate. One way is to add the size in the function call:
float[] textureCoordinateDataMap = createTexture(
    size * xx, size * yy, size * (xx + 1), size * (yy + 1));
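To make the fix concrete, here is a minimal, self-contained sketch of the idea. The class and method names (`AtlasCoords`, `tileCoords`) are illustrative, not from your code; the point is that the upper-right corner is the lower-left corner *plus* the tile size, not the size itself:

```java
// Hypothetical helper: maps a tile index in an N x N atlas to the four
// corner texture coordinates {u0, v0, u1, v1}. The right/top edge is
// computed as left/bottom edge + tile size, which is the fix described above.
public class AtlasCoords {
    public static float[] tileCoords(int textureId, int tilesPerRow) {
        int row = textureId / tilesPerRow;    // which row the tile is in
        int col = textureId % tilesPerRow;    // which column the tile is in
        float size = 1.0f / tilesPerRow;      // tile size in UV space
        float u0 = size * col;                // left edge
        float v0 = size * row;                // bottom edge
        float u1 = size * (col + 1);          // right edge = left + size
        float v1 = size * (row + 1);          // top edge  = bottom + size
        return new float[] { u0, v0, u1, v1 };
    }

    public static void main(String[] args) {
        // Tile 7 in a 5-wide atlas sits at column 2, row 1.
        float[] c = tileCoords(7, 5);
        System.out.printf("u0=%.1f v0=%.1f u1=%.1f v1=%.1f%n",
                          c[0], c[1], c[2], c[3]);
    }
}
```

With the buggy version, tile 7 would have used `size` (0.2) as its right/top edge regardless of position, which is why only tiles touching the origin looked correct.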