I started reading OpenGL 4.0 Shading Language, version 1. At the start of the book they show code to find out the OpenGL version, but it does not work. I would like to know how to check whether I set it up correctly.
#include <GL/glew.h>
#include <GL/gl.h>
#include <GL/glu.h>
#include <iostream>
#include <stdio.h>
int main(int argc, char** argv) {
    /*
    GLenum err = glewInit();
    if( GLEW_OK != err ) {
        fprintf(stderr, "Error initializing GLEW: %s\n",
                glewGetErrorString(err) );
    }
    */
    const GLubyte *renderer = glGetString( GL_RENDERER );
    const GLubyte *vendor = glGetString( GL_VENDOR );
    const GLubyte *version = glGetString( GL_VERSION );
    const GLubyte *glslVersion = glGetString( GL_SHADING_LANGUAGE_VERSION );

    GLint major, minor;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);

    printf("GL Vendor            : %s\n", vendor);
    printf("GL Renderer          : %s\n", renderer);
    printf("GL Version (string)  : %s\n", version);
    printf("GL Version (integer) : %d.%d\n", major, minor);
    printf("GLSL Version         : %s\n", glslVersion);
}
The code gives me null outputs. I do not believe this is correct, because before starting this book I found code online for displaying a box in OpenGL 2.0, and that worked.
GL Vendor : (null)
GL Renderer : (null)
GL Version (string) : (null)
GL Version (integer) : 0.0
GLSL Version : (null)
I have also looked at this link, which gives information about my graphics card, but I think that has nothing to do with the development packages I have installed.
I need a way to confirm that I have OpenGL 4.0 installed. I would prefer either a terminal command or a fix for my code.
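For the terminal route, one common check on Linux is glxinfo; this is a sketch assuming the mesa-utils package (which provides glxinfo on Debian/Ubuntu) is installed:

# install the utility if it is missing (Debian/Ubuntu):
#   sudo apt-get install mesa-utils

# print only the version-related lines from the full report
glxinfo | grep -i "opengl.*version"

glxinfo creates its own OpenGL context before querying, so it reports the same strings that glGetString returns once a context exists.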
Edit: I have an additional question about the code McLovin provided: I noticed that I get an extra line, 'Segmentation fault (core dumped)'. I have an AMD 7970 GPU. Is this output an error?
You need to create an OpenGL context before you can query any data about OpenGL. A context is essentially the set of state associated with the OpenGL instance your application uses. It is usually created by opening a window with a library such as freeglut, SFML, GLFW, or SDL.
The following code uses freeglut:
#include <stdio.h>
#include <GL/gl.h>
#include <GL/freeglut.h>
int main(int argc, char **argv)
{
    glutInit(&argc, argv);          /* initialize GLUT */
    glutInitContextVersion(4, 4);   /* request an OpenGL 4.4 context */

    /* creating the window also creates the OpenGL context; since there
       is no main loop, the window closes immediately */
    glutCreateWindow("You won't see this window");

    /* we can now query data for the specific OpenGL context we created */
    const GLubyte *renderer = glGetString( GL_RENDERER );
    const GLubyte *vendor = glGetString( GL_VENDOR );
    const GLubyte *version = glGetString( GL_VERSION );
    const GLubyte *glslVersion = glGetString( GL_SHADING_LANGUAGE_VERSION );

    GLint major, minor;
    glGetIntegerv(GL_MAJOR_VERSION, &major);
    glGetIntegerv(GL_MINOR_VERSION, &minor);

    printf("GL Vendor            : %s\n", vendor);
    printf("GL Renderer          : %s\n", renderer);
    printf("GL Version (string)  : %s\n", version);
    printf("GL Version (integer) : %d.%d\n", major, minor);
    printf("GLSL Version         : %s\n", glslVersion);

    return 0;
}
For example, this is the output it gave me on my machine (Ubuntu 12.04):
GL Vendor : NVIDIA Corporation
GL Renderer : GeForce GTX 660/PCIe/SSE2
GL Version (string) : 4.4.0 NVIDIA 331.20
GL Version (integer) : 4.4
GLSL Version : 4.40 NVIDIA via Cg compiler