I am trying to create an OpenGL texture using Core Graphics. My setup code is in prepareOpenGL(). While drawing into the initialized CGContext, I get an EXC_BAD_ACCESS fault. However, when I po the context in LLDB (po context!, see below), I get an address with all of the CGContext's information. I have also tried evaluating an expression through LLDB, and the output is nil. Address Sanitizer, Thread Sanitizer, and zombies were non-revealing. My question: why does the context crash with EXC_BAD_ACCESS, as if it were nil, when the Optional<CGContext> is not nil? Calling this code from draw(_:) instead gives the same results.
import Cocoa
import CoreGraphics
import OpenGL.GL3

class MyView: NSOpenGLView {
    var textureTBO: GLuint = 0

    override func prepareOpenGL() {
        let bitmapRows = 10
        let bitmapColumns = 10
        let floatsPerPoint = 4

        var bitmapBuffer = UnsafeMutablePointer<CGFloat>.allocate(capacity: bitmapRows * bitmapColumns * floatsPerPoint)

        if let context = CGContext(data: &bitmapBuffer,
                                   width: bitmapColumns,
                                   height: bitmapRows,
                                   bitsPerComponent: 8,
                                   bytesPerRow: bitmapColumns * floatsPerPoint,
                                   space: CGColorSpaceCreateDeviceRGB(),
                                   bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue) {
            context.setFillColor(red: 0, green: 0, blue: 1, alpha: 0.5)
            context.fill(CGRect(x: 0, y: 0, width: 5, height: 5)) //=> EXC_BAD_ACCESS, code=2
            context.setFillColor(red: 1, green: 0, blue: 1, alpha: 0.5)
            context.fill(CGRect(x: 5, y: 5, width: 5, height: 5))
            context.setFillColor(red: 0, green: 1, blue: 0, alpha: 0.5)
            context.fill(CGRect(x: 0, y: 5, width: 5, height: 5))
            context.setFillColor(red: 0.5, green: 0, blue: 0.5, alpha: 0.5)
            context.fill(CGRect(x: 5, y: 0, width: 5, height: 5))

            glGenTextures(1, &textureTBO)
            glBindTexture(GLenum(GL_TEXTURE_2D), textureTBO)
            glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
            glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
            glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_S), GL_REPEAT)
            glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_WRAP_T), GL_REPEAT)
        }
        glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GLint(GL_RGB), 256, 256, 0, GLenum(GL_RGBA), GLenum(GL_UNSIGNED_INT), &bitmapBuffer)
        bitmapBuffer.deallocate()
    }
}
LLDB output - the optional is not nil
(lldb) po context!
<CGContext 0x7b3000000300> (kCGContextTypeBitmap)
<<CGColorSpace 0x7b1800002280> (kCGColorSpaceDeviceRGB)>
width = 256, height = 256, bpc = 8, bpp = 32, row bytes = 2048
kCGImageAlphaNoneSkipLast | 0 (default byte order)
LLDB output - is this telling me that a color cannot be set because the context actually is nil?
(lldb) expression context?.setFillColor(red: 0, green: 0, blue: 1, alpha: 0.5)
(Void?) $R4 = nil
LLDB output - an exercise in futility: doing the same thing and expecting a different result, haha
(lldb) expression context?.fill(CGRect(x: 0, y: 0, width: 5, height: 5))
error: Execution was interrupted, reason: EXC_BAD_ACCESS (code=2, address=0x7b08000c7ff0).
The process has been returned to the state before expression evaluation.
It seems that this can be done, since I found this answer. I appreciate your thoughts, thank you!
data: &bitmapBuffer passes a pointer to the pointer to the buffer. Because of the &, Core Graphics treats the address of the bitmapBuffer variable itself as the start of the bitmap, so the fills write over memory it does not own, producing the EXC_BAD_ACCESS. Replace it with data: bitmapBuffer.
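For illustration, a minimal sketch of the corrected call, keeping everything else from the question unchanged:

if let context = CGContext(data: bitmapBuffer, // the buffer pointer itself, no &
                           width: bitmapColumns,
                           height: bitmapRows,
                           bitsPerComponent: 8,
                           bytesPerRow: bitmapColumns * floatsPerPoint,
                           space: CGColorSpaceCreateDeviceRGB(),
                           bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue) {
    context.setFillColor(red: 0, green: 0, blue: 1, alpha: 0.5)
    context.fill(CGRect(x: 0, y: 0, width: 5, height: 5)) // no longer crashes
}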
Not related to the crash, but with 8 bits per component the buffer elements fit in UInt8:
var bitmapBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: bitmapRows * bitmapColumns * floatsPerPoint)
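Putting both fixes together, here is a sketch of how the pieces could line up (assuming it runs inside the question's MyView so textureTBO is in scope; the glTexImage2D size and GL_UNSIGNED_BYTE type are my assumptions, chosen to match the 10×10 RGBA8 bitmap rather than the 256×256 / GL_UNSIGNED_INT values in the question):

let bitmapRows = 10
let bitmapColumns = 10
let bytesPerPixel = 4 // RGBA, one byte per component

// Capacity is now in bytes, matching bitsPerComponent: 8.
let bitmapBuffer = UnsafeMutablePointer<UInt8>.allocate(capacity: bitmapRows * bitmapColumns * bytesPerPixel)
defer { bitmapBuffer.deallocate() } // safe: glTexImage2D copies the pixels

if let context = CGContext(data: bitmapBuffer,
                           width: bitmapColumns,
                           height: bitmapRows,
                           bitsPerComponent: 8,
                           bytesPerRow: bitmapColumns * bytesPerPixel,
                           space: CGColorSpaceCreateDeviceRGB(),
                           bitmapInfo: CGImageAlphaInfo.noneSkipLast.rawValue) {
    context.setFillColor(red: 0, green: 0, blue: 1, alpha: 0.5)
    context.fill(CGRect(x: 0, y: 0, width: 5, height: 5))

    glGenTextures(1, &textureTBO)
    glBindTexture(GLenum(GL_TEXTURE_2D), textureTBO)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MIN_FILTER), GL_LINEAR)
    glTexParameteri(GLenum(GL_TEXTURE_2D), GLenum(GL_TEXTURE_MAG_FILTER), GL_LINEAR)
    // Upload the bitmap: again pass bitmapBuffer, not &bitmapBuffer.
    glTexImage2D(GLenum(GL_TEXTURE_2D), 0, GLint(GL_RGBA),
                 GLsizei(bitmapColumns), GLsizei(bitmapRows), 0,
                 GLenum(GL_RGBA), GLenum(GL_UNSIGNED_BYTE), bitmapBuffer)
}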