Tags: macos, opengl, sdl, zig, glad

Trying to load OpenGL 4.1 but only getting OpenGL 2.1 on macOS


I am trying to set up an OpenGL-based project on my MacBook Pro 14 (M2 Pro, macOS Sonoma 14.5). I have basically managed to get everything working, but no matter what I do I cannot make the program use OpenGL 4.1; a 2.1 context is created instead.

I am using SDL3, Glad and Zig for this, but I am just loading the same libraries as if I were to use C or C++. Here is my initialization code:

pub fn initialize_sdl() !SDLComponents {
    c.SDL_Log("Initializing SDL");

    // Very important to set these before initializing SDL
    _ = c.SDL_GL_LoadLibrary(null);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_ACCELERATED_VISUAL, 1);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG, c.GL_TRUE);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_PROFILE_MASK, c.SDL_GL_CONTEXT_PROFILE_CORE);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_MAJOR_VERSION, 4);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_MINOR_VERSION, 1);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_DOUBLEBUFFER, 1);
    _ = c.SDL_GL_SetAttribute(c.SDL_GL_DEPTH_SIZE, 24);

    if (c.SDL_Init(c.SDL_INIT_VIDEO) != 0) {
        c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to initialize SDL: %s", c.SDL_GetError());
        return error.SDLInitializationFailed;
    }

    const window: *c.SDL_Window = c.SDL_CreateWindow("Powder3D", 400, 400, c.SDL_WINDOW_OPENGL | c.SDL_WINDOW_RESIZABLE) orelse {
        c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to create window: %s", c.SDL_GetError());
        return error.SDLInitializationFailed;
    };

    const open_gl_context: c.SDL_GLContext = c.SDL_GL_CreateContext(window) orelse {
        c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to create OpenGL context: %s", c.SDL_GetError());
        return error.SDLInitializationFailed;
    };

    if (c.gladLoadGLLoader(sdlGLGetProcAddress) == 0) {
        c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to load OpenGL functions with glad");
        return error.GLADInitializationFailed;
    }

    const versionStringPtr = c.glGetString(c.GL_VERSION) orelse {
        c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Could not get GL version");
        return error.OPENGLInitializationFailed;
    };

    const versionString = std.mem.span(versionStringPtr);
    std.debug.print("OpenGL version: {s}\n", .{versionString});

    return .{ .window = window, .context = open_gl_context };
}

The reported OpenGL version (printed at the bottom of the code above) is "2.1 Metal - 88.1". After skimming through the web, most people seem to have fixed this by setting SDL_GL_CONTEXT_PROFILE_MASK to SDL_GL_CONTEXT_PROFILE_CORE, but that does not work for me.


Solution

  • SDL_GL_SetAttribute() calls need to occur after SDL_Init(SDL_INIT_VIDEO)/SDL_VideoInit() but before SDL_CreateWindow() to take effect (see the reordered sketch after the header excerpt below):

    /**
     * Set an OpenGL window attribute before window creation.
     *
     * This function sets the OpenGL attribute `attr` to `value`. The requested
     * attributes should be set before creating an OpenGL window. You should use
     * SDL_GL_GetAttribute() to check the values after creating the OpenGL
     * context, since the values obtained can differ from the requested ones.
     *
     * \param attr an SDL_GLattr enum value specifying the OpenGL attribute to set
     * \param value the desired value for the attribute
     * \returns 0 on success or a negative error code on failure; call
     *          SDL_GetError() for more information.
     *
     * \since This function is available since SDL 3.0.0.
     *
     * \sa SDL_GL_GetAttribute
     * \sa SDL_GL_ResetAttributes
     */
    extern DECLSPEC int SDLCALL SDL_GL_SetAttribute(SDL_GLattr attr, int value);
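
  • As a concrete illustration, here is a minimal sketch of the question's initialize_sdl() with the calls reordered. It reuses the names from the snippet above (c, std, SDLComponents, sdlGLGetProcAddress) and assumes the same SDL3/glad bindings, so treat it as a sketch rather than a drop-in file. At the end it calls SDL_GL_GetAttribute(), as the header comment recommends, to confirm which context version was actually negotiated:

    pub fn initialize_sdl() !SDLComponents {
        // 1. Bring up the video subsystem first.
        if (c.SDL_Init(c.SDL_INIT_VIDEO) != 0) {
            c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to initialize SDL: %s", c.SDL_GetError());
            return error.SDLInitializationFailed;
        }
        _ = c.SDL_GL_LoadLibrary(null);

        // 2. Request a 4.1 core-profile context: after SDL_Init(), before SDL_CreateWindow().
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_ACCELERATED_VISUAL, 1);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_PROFILE_MASK, c.SDL_GL_CONTEXT_PROFILE_CORE);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_FORWARD_COMPATIBLE_FLAG, c.GL_TRUE);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_MAJOR_VERSION, 4);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_CONTEXT_MINOR_VERSION, 1);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_DOUBLEBUFFER, 1);
        _ = c.SDL_GL_SetAttribute(c.SDL_GL_DEPTH_SIZE, 24);

        // 3. Only now create the OpenGL window and its context.
        const window: *c.SDL_Window = c.SDL_CreateWindow("Powder3D", 400, 400, c.SDL_WINDOW_OPENGL | c.SDL_WINDOW_RESIZABLE) orelse {
            c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to create window: %s", c.SDL_GetError());
            return error.SDLInitializationFailed;
        };
        const open_gl_context: c.SDL_GLContext = c.SDL_GL_CreateContext(window) orelse {
            c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to create OpenGL context: %s", c.SDL_GetError());
            return error.SDLInitializationFailed;
        };

        // 4. Load function pointers and check what the driver actually handed back.
        if (c.gladLoadGLLoader(sdlGLGetProcAddress) == 0) {
            c.SDL_LogError(c.SDL_LOG_CATEGORY_RENDER, "Unable to load OpenGL functions with glad");
            return error.GLADInitializationFailed;
        }
        var major: c_int = 0;
        var minor: c_int = 0;
        _ = c.SDL_GL_GetAttribute(c.SDL_GL_CONTEXT_MAJOR_VERSION, &major);
        _ = c.SDL_GL_GetAttribute(c.SDL_GL_CONTEXT_MINOR_VERSION, &minor);
        std.debug.print("Negotiated OpenGL context: {d}.{d}\n", .{ major, minor });

        return .{ .window = window, .context = open_gl_context };
    }

    With the attributes applied between SDL_Init() and SDL_CreateWindow(), the same glGetString(GL_VERSION) check from the question should report a 4.1 core-profile context instead of the legacy 2.1 one.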