I have already lost two days trying to figure out this issue, with no luck.
I have written a COLLADA animation renderer for Android using OpenGL ES 2.0, with skinning done in the vertex shader. The code is almost complete and runs just fine on my HTC Desire S.
But when I run the same code on a Trident set-top box with a PowerVR chipset, my geometry is not displayed. After a day of debugging, I found that this happens because the bone matrix indices I get in the shader are != -1.
I verified that they are == -1 on my phone, but != -1 on the set-top box.
What could possibly be wrong? Please save me from this big trouble.
Sorry for not putting up the code earlier. Here is the vertex shader. I expect vec2 boneIndices to hold -1 in both [0] and [1], but that is not the case on PowerVR.
attribute vec4 vPosition;
attribute vec2 vTexCoord;
attribute vec2 boneIndices;
attribute vec2 boneWeights;
uniform mat4 boneMatrices[BNCNT];
uniform mat4 modelMatrix;
uniform mat4 viewMatrix;
uniform mat4 projectionMatrix;
varying mediump vec2 fTexCoord;
varying mediump vec3 col;
void main() {
    vec4 tempPosition = vPosition;

    // First bone influence; -1 is meant to be the "no bone" sentinel
    int index = int(boneIndices.x);
    col = vec3(1.0, 0.0, 0.0);
    if (index >= 0) {
        col.y = 1.0;
        tempPosition = (boneMatrices[index] * vPosition) * boneWeights.x;
    }

    // Second bone influence
    index = int(boneIndices.y);
    if (index >= 0) {
        col.z = 1.0;
        tempPosition = (boneMatrices[index] * vPosition) * boneWeights.y + tempPosition;
    }

    gl_Position = projectionMatrix * viewMatrix * modelMatrix * tempPosition;
    fTexCoord = vTexCoord;
}
Setting up the attribute pointers:
// Interleaved layout, 13 floats per vertex:
// position(3) | texcoord(2) | 4 unbound floats | boneIndices(2) | boneWeights(2)
glVertexAttribPointer(position,    3, GL_FLOAT, GL_FALSE, 13*sizeof(GLfloat), 0);
glVertexAttribPointer(texCoord,    2, GL_FLOAT, GL_FALSE, 13*sizeof(GLfloat), (GLvoid*)( 3*sizeof(GLfloat)));
glVertexAttribPointer(boneIndices, 2, GL_FLOAT, GL_FALSE, 13*sizeof(GLfloat), (GLvoid*)( 9*sizeof(GLfloat)));
glVertexAttribPointer(boneWeights, 2, GL_FLOAT, GL_FALSE, 13*sizeof(GLfloat), (GLvoid*)(11*sizeof(GLfloat)));
glEnableVertexAttribArray(position);
glEnableVertexAttribArray(texCoord);
glEnableVertexAttribArray(boneIndices);
glEnableVertexAttribArray(boneWeights);
My vertex and index buffers:
GLfloat vertices[13*6] =
{ //   x,          y,       z,   u,v,  r,g,b,a,  boneIdx, boneWt
    -0.5*size, -0.5*size, 0,   0,1,  1,1,1,1,  -1,-1,   0,0,
    -0.5*size,  0.5*size, 0,   0,0,  1,1,1,1,  -1,-1,   0,0,
     0.5*size,  0.5*size, 0,   1,0,  1,1,1,1,  -1,-1,   0,0,
    -0.5*size, -0.5*size, 0,   0,1,  1,1,1,1,  -1,-1,   0,0,
     0.5*size,  0.5*size, 0,   1,0,  1,1,1,1,  -1,-1,   0,0,
     0.5*size, -0.5*size, 0,   1,1,  1,1,1,1,  -1,-1,   0,0 };
GLushort indices[]= {0,1,2, 3,4,5};
I expect the indices to be -1 in the shader, but they are not.
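One way to see what actually arrives in the shader (a debugging sketch only, not part of the final shader) is to write the raw attribute value straight into the varying colour and read it off the screen:

// Debug only: visualize the raw boneIndices.x float as a colour to check
// whether the problem is in the attribute data or in the int() conversion.
// If the float is (close to) -1, red reads about 0.5; if it is 0, red is 0.
col = vec3(clamp(boneIndices.x * -0.5, 0.0, 1.0), 0.0, 0.0);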
After days of frustration, I finally found the problem myself.
The culprit was the int() conversion, which returns 0 even when I pass -1.
The observed behavior is that it returns:
0 for -1
-1 for -2
-2 for -3, and so on.
I am not sure whether this is a driver/hardware bug, or whether it comes from the floating-point representation, with -1 arriving as something like -0.9999999.
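For what it's worth, the observed mapping is exactly what truncation toward zero would produce if every index arrives slightly above its exact value (the -0.9999999 figures below are illustrative, not measured):

// If the attribute fetch reconstructs -1.0 as roughly -0.9999999, then
// int(), which drops the fractional part (truncation toward zero) in GLSL ES, gives:
//   int(-0.9999999)  ->  0    (I expected -1)
//   int(-1.9999999)  -> -1    (I expected -2)
//   int(-2.9999999)  -> -2    (I expected -3)
// which matches the table above exactly.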
Can anybody shed a little more light on this?
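In case it helps anyone hitting the same thing, here is a minimal sketch of the workaround I would try (untested on the PowerVR box): convert the attribute by rounding to the nearest integer instead of truncating, so a value like -0.9999999 still maps to -1. The toBoneIndex helper name is mine; the rest of the shader stays as above.

// Round to the nearest integer instead of truncating toward zero, so a
// slightly-off value such as -0.9999999 still converts to -1.
int toBoneIndex(float f) {
    return int(floor(f + 0.5));
}

// Then, inside main(), the two conversions become:
//     int index = toBoneIndex(boneIndices.x);
//     ...
//     index = toBoneIndex(boneIndices.y);
// with everything else unchanged.

An alternative that sidesteps the sentinel altogether is to always store a valid bone index and give unused influences a weight of 0.0, so the shader never has to branch on a negative index at all.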