Using libGDX, I've written a very simple (and my first) fragment shader which takes two textures: the first is the image to draw to the screen and the second is an alpha-transparency mask. Here is the fragment shader:
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords;
uniform sampler2D u_texture;
uniform sampler2D u_mask;
void main() {
    vec4 texColor = texture2D(u_texture, v_texCoords);
    vec4 maskColor = texture2D(u_mask, v_texCoords);
    gl_FragColor = vec4(
        vec3(v_color * texColor),
        maskColor.a
    );
}
u_texture is the image to draw and u_mask is the texture with the transparency information.
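For reference, a minimal sketch of how such a shader can be compiled and attached to the batch used in the rendering code below (the file paths here are placeholders for wherever the shader sources live):
ShaderProgram.pedantic = false; // don't fail on unused uniforms/attributes
ShaderProgram alphaMaskShader = new ShaderProgram(
        Gdx.files.internal("shaders/default.vert"),    // placeholder path
        Gdx.files.internal("shaders/alphamask.frag")); // placeholder path
if (!alphaMaskShader.isCompiled()) {
    throw new GdxRuntimeException(alphaMaskShader.getLog());
}
SpriteBatch shaderBatch = new SpriteBatch();
shaderBatch.setShader(alphaMaskShader); // everything drawn through this batch uses the shader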
However, what I really want to do is utilise the Sprite and TextureAtlas classes to refer to a couple of TextureRegion instances for the shader. Here is my rendering code:
shaderBatch.setProjectionMatrix(camera.combined);
shaderBatch.begin();
// ... some housekeeping code ...
Sprite overlapAlphaMask = maskTextureAtlas.createSprite("mask", 4);
Sprite overlapSprite = spriteTextureAtlas.createSprite("some-tile", 8);
// Or some other index in the texture atlas
overlapAlphaMask.getTexture().bind(1);
alphaMaskShader.setUniformi("u_mask", 1);
overlapSprite.getTexture().bind(0);
alphaMaskShader.setUniformi("u_texture", 0);
shaderBatch.draw(overlapSprite, worldX, worldY, 1.0f, 1.0f);
Although this is at least running and rendering something, it is picking up the wrong texture region from the maskTextureAtlas. My guess is there's more to do here, as the shader has no knowledge of the overlapAlphaMask sprite: how to draw it, what its texture co-ordinates are, and so on. I'm assuming the SpriteBatch.draw() method takes care of picking up the correct information from the overlapSprite that's passed in, so I expect the vec2 v_texCoords in the shader is set correctly to draw it, but those co-ordinates are wrong for the second texture / sampler2D uniform. This is my first attempt at using shaders, so I'm sure I'm missing something basic!
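To illustrate why a single set of co-ordinates can't serve both samplers: each sprite created from an atlas maps to its own sub-rectangle (UV range) of its atlas page, which a purely diagnostic dump of the two regions makes obvious:
// Diagnostic only: the two regions occupy different sub-rectangles of their
// respective atlas pages, so one set of UVs can't address both correctly.
Gdx.app.log("uv", "sprite: " + overlapSprite.getU() + "," + overlapSprite.getV()
        + " -> " + overlapSprite.getU2() + "," + overlapSprite.getV2());
Gdx.app.log("uv", "mask:   " + overlapAlphaMask.getU() + "," + overlapAlphaMask.getV()
        + " -> " + overlapAlphaMask.getU2() + "," + overlapAlphaMask.getV2());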
--- Update ---
So far my googling has revealed that I may need to pass something extra through the vertex shader. I'm using this (default?) libGDX vertex shader:
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords;
void main() {
    v_color = a_color;
    v_texCoords = a_texCoord0;
    gl_Position = u_projTrans * a_position;
}
As Dietrich Epp has pointed out, what I need is to send the extra texture co-ordinates through the vertex buffers to both the vertex and fragment shaders. I've managed to achieve this by writing my own implementation of the SpriteBatch class, which has two Textures set at a time (not TextureRegions) and batches drawing with a couple of slightly extended shaders, as follows:
Vertex Shader
attribute vec4 a_position;
attribute vec4 a_color;
attribute vec2 a_texCoord0;
attribute vec2 a_texCoord1;
uniform mat4 u_projTrans;
varying vec4 v_color;
varying vec2 v_texCoords0;
varying vec2 v_texCoords1;
void main() {
    v_color = a_color;
    v_texCoords0 = a_texCoord0;
    v_texCoords1 = a_texCoord1;
    gl_Position = u_projTrans * a_position;
}
This is the default vertex shader, but with the single attribute vec2 a_texCoord0 replaced by a pair of tex co-ord attributes, and the resulting varying vec2 v_texCoords0 and v_texCoords1 passed along to the fragment shader. The extra attribute vec2 is then sent in through the vertex buffer.
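Concretely, each vertex now carries a second UV pair, so the per-sprite vertex data grows accordingly. A sketch of the layout (the constant names are my own, mirroring the stock SpriteBatch):
// Per vertex: x, y, packed colour, u0, v0, u1, v1 = 7 floats
// (the stock SpriteBatch uses 5: x, y, colour, u, v).
static final int VERTEX_SIZE = 2 + 1 + 2 + 2;
// Four vertices per sprite quad.
static final int SPRITE_SIZE = 4 * VERTEX_SIZE;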
Fragment Shader
#ifdef GL_ES
precision mediump float;
#endif
varying vec4 v_color;
varying vec2 v_texCoords0;
varying vec2 v_texCoords1;
uniform sampler2D u_texture0;
uniform sampler2D u_texture1;
void main() {
    vec4 texColor = texture2D(u_texture0, v_texCoords0);
    vec4 maskColor = texture2D(u_texture1, v_texCoords1);
    gl_FragColor = vec4(
        vec3(v_color * texColor),
        maskColor.a
    );
}
Similarly, the fragment shader now receives two sampler2D textures to sample from, and a pair of texture co-ordinates to refer to for each.
The key to the changes in the SpriteBatch class is extending the Mesh definition in the constructor:
mesh = new Mesh(Mesh.VertexDataType.VertexArray, false, size * 4, size * 6,
        new VertexAttribute(VertexAttributes.Usage.Position, 2, ShaderProgram.POSITION_ATTRIBUTE),
        new VertexAttribute(VertexAttributes.Usage.ColorPacked, 4, ShaderProgram.COLOR_ATTRIBUTE),
        new VertexAttribute(VertexAttributes.Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE + "0"),
        new VertexAttribute(VertexAttributes.Usage.TextureCoordinates, 2, ShaderProgram.TEXCOORD_ATTRIBUTE + "1"));
The final line above is the new addition: the second pair of co-ordinates for a_texCoord1. The draw() method is passed two Sprite or TextureRegion instances, and is extended slightly to pack the two extra floats into each vertex of the vertex array (see the sketch after the uniform set-up below). Finally, a slight addition to the shader set-up binds both textures and sets the uniforms like so:
shader.setUniformi("u_texture0", 0);
shader.setUniformi("u_texture1", 1);
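For completeness, a rough sketch of the shape of the extended draw/flush logic, binding both textures and packing the second UV pair into each vertex. The names texture0/texture1, region0/region1, vertices and idx are my own shorthand, not the exact code:
// In flush(): bind the mask to unit 1 first, then the main texture to
// unit 0, so that texture unit 0 is left active.
texture1.bind(1);
texture0.bind(0);

// In draw(): one corner of the quad; the other three corners follow the
// same pattern with the remaining positions and the matching UVs of each region.
vertices[idx++] = x;                // position
vertices[idx++] = y;
vertices[idx++] = color;            // packed colour float
vertices[idx++] = region0.getU();   // a_texCoord0 (image region)
vertices[idx++] = region0.getV2();
vertices[idx++] = region1.getU();   // a_texCoord1 (mask region)
vertices[idx++] = region1.getV2();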
And it works! My temporary solution had been to draw the two sprites to the screen over the top of each other, writing the transparency information into the display buffer, but because SpriteBatch batches per texture, and my images are split across several textures, that approach caused an unacceptable loss of performance, which this solution has fixed :)