Tags: javascript, c++, opengl, v8, embedded-v8

Does glBindTexture() require a GLuint pointer?


I'm trying to implement OpenGL texture bindings for a V8/JavaScript embedding.

My question is pretty simple:

Does OpenGL's glBindTexture() function require a pointer to a GLuint, or does it only require a valid GLuint?

The Khronos docs here say it only requires a GLuint:

http://www.khronos.org/opengles/documentation/opengles1_0/html/glBindTexture.html
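For reference, the prototype given there takes the texture name by value rather than through a pointer:

void glBindTexture(GLenum target, GLuint texture);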

The problem is the following:

I have a custom data type in the V8 JavaScript context that is used for initializing and loading the textures. After loading, the textures are generated with:

// blabla...
GLuint texture;
glGenTextures(1, &texture);            // writes the new texture name into 'texture', hence the pointer
glBindTexture(GL_TEXTURE_2D, texture); // only reads the name, passed by value
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0, GL_RGBA, GL_UNSIGNED_BYTE, (GLvoid*) image_data);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
// ...blabla
return texture;

This method returns the GLuint from the C++ side, and the cast v8::Integer value is attached as the property "id" on an object in the JavaScript context:

var texture = new Texture('./texture.png');
texture.load();
texture.onload = function() {
  gl.bindTexture(gl.TEXTURE_2D, texture.id); // this is the casted v8::Integer value
};
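For completeness, here is a minimal sketch of how that C++-to-JavaScript handover could look with the old (pre-isolate) V8 API implied by v8::Arguments; LoadTexture() is a hypothetical helper standing in for the loading code above:

#include <v8.h>
#include <GL/gl.h>

// Hypothetical helper wrapping the glGenTextures()/glTexImage2D() snippet above.
GLuint LoadTexture();

v8::Handle<v8::Value> TextureLoadCallback(const v8::Arguments& args) {
  v8::HandleScope scope;

  GLuint texture = LoadTexture();

  // A GLuint is a plain 32-bit handle, so it fits into a v8::Integer and can
  // be attached to the Texture object as its "id" property.
  args.This()->Set(v8::String::New("id"),
                   v8::Integer::NewFromUnsigned(texture));

  return scope.Close(v8::Undefined());
}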

When the bindTexture method is called on the global gl (namespace) object in V8, the following happens on the native side:

v8::Handle<v8::Value> GLglBindTextureCallback(const v8::Arguments& args) {

  // blabla...
  int arg0 = args[0]->IntegerValue();          // GLenum target, e.g. GL_TEXTURE_2D
  unsigned int arg1 = args[1]->Uint32Value();  // GLuint texture name
  glBindTexture((GLenum) arg0, (GLuint) arg1); // the name is passed by value, no pointer needed
  // ... blabla

}
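As an aside, the callback is declared to return a v8::Handle<v8::Value>, but the visible part never returns one; presumably the elided part does. A minimal complete sketch under that assumption, using the same old V8 API:

v8::Handle<v8::Value> GLglBindTextureCallback(const v8::Arguments& args) {
  v8::HandleScope scope;

  GLenum target = (GLenum) args[0]->IntegerValue();
  GLuint texture = (GLuint) args[1]->Uint32Value();

  glBindTexture(target, texture);  // the GLuint is passed by value

  return scope.Close(v8::Undefined());
}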

Solution

  • Why do you think it would need a pointer? You just pass it a GLuint.
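In other words, only the calls that hand texture names back to you take a pointer, because they write into your variable; glBindTexture() merely reads the name:

GLuint name;
glGenTextures(1, &name);             // writes the generated name into 'name', hence the pointer
glBindTexture(GL_TEXTURE_2D, name);  // only reads the name, so a plain GLuint is enough
glDeleteTextures(1, &name);          // operates on an array of names, hence a pointer again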