Tags: ios, swift, uiimage, mediapipe, cvpixelbuffer

How to convert a UIImage to a CVPixelBuffer 32BGRA for mediapipe?


I am using MediaPipe to develop an iOS application. I need to feed image data into MediaPipe, but MediaPipe only accepts a 32BGRA CVPixelBuffer.

How can I convert a UIImage to a 32BGRA CVPixelBuffer?

I am using this code:

        let frameSize = CGSize(width: self.cgImage!.width, height: self.cgImage!.height)
        
        var pixelBuffer:CVPixelBuffer? = nil
        let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(frameSize.width), Int(frameSize.height), kCVPixelFormatType_32BGRA , nil, &pixelBuffer)
        
        if status != kCVReturnSuccess {
            return nil
        }
        
        CVPixelBufferLockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags.init(rawValue: 0))
        let data = CVPixelBufferGetBaseAddress(pixelBuffer!)
        let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
        let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)
        let context = CGContext(data: data, width: Int(frameSize.width), height: Int(frameSize.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(pixelBuffer!), space: rgbColorSpace, bitmapInfo: bitmapInfo.rawValue)
        
        
        context?.draw(self.cgImage!, in: CGRect(x: 0, y: 0, width: self.cgImage!.width, height: self.cgImage!.height))
        
        CVPixelBufferUnlockBaseAddress(pixelBuffer!, CVPixelBufferLockFlags(rawValue: 0))
        
        return pixelBuffer

but it throws an error inside MediaPipe: mediapipe/0 (11): signal SIGABRT


If I use AVCaptureVideoDataOutput instead, everything works fine.

By the way, I am using Swift.


Solution

  • Maybe you can try this. The key difference from your code is that the pixel buffer is created with the kCVPixelBufferCGImageCompatibilityKey, kCVPixelBufferCGBitmapContextCompatibilityKey, and kCVPixelBufferIOSurfacePropertiesKey attributes; MediaPipe most likely requires an IOSurface-backed buffer, which is also what AVCaptureVideoDataOutput produces. Also, I have a question for you: do you know how to use a static image for face recognition in MediaPipe? If you know, please tell me. Thank you.

    func pixelBufferFromCGImage(image: CGImage) -> CVPixelBuffer? {
        // IOSurface backing and CG compatibility make the buffer usable by
        // both CoreGraphics and consumers such as MediaPipe.
        let options = [
            kCVPixelBufferCGImageCompatibilityKey as String: NSNumber(value: true),
            kCVPixelBufferCGBitmapContextCompatibilityKey as String: NSNumber(value: true),
            kCVPixelBufferIOSurfacePropertiesKey as String: [:]
        ] as CFDictionary

        let size = CGSize(width: image.width, height: image.height)
        var pxbuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferCreate(
            kCFAllocatorDefault,
            Int(size.width),
            Int(size.height),
            kCVPixelFormatType_32BGRA,
            options,
            &pxbuffer)
        guard status == kCVReturnSuccess, let pxbuffer = pxbuffer else { return nil }

        CVPixelBufferLockBaseAddress(pxbuffer, [])
        // Make sure the buffer is unlocked on every exit path.
        defer { CVPixelBufferUnlockBaseAddress(pxbuffer, []) }
        guard let pxdata = CVPixelBufferGetBaseAddress(pxbuffer) else { return nil }

        // BGRA layout: 32-bit little-endian with premultiplied alpha first.
        let bitmapInfo = CGBitmapInfo(rawValue: CGBitmapInfo.byteOrder32Little.rawValue | CGImageAlphaInfo.premultipliedFirst.rawValue)

        guard let context = CGContext(
            data: pxdata,
            width: Int(size.width),
            height: Int(size.height),
            bitsPerComponent: 8,
            bytesPerRow: CVPixelBufferGetBytesPerRow(pxbuffer),
            space: CGColorSpaceCreateDeviceRGB(),
            bitmapInfo: bitmapInfo.rawValue) else {
            return nil
        }
        context.draw(image, in: CGRect(x: 0, y: 0, width: size.width, height: size.height))
        // Note: CGContextRelease is unavailable in Swift; Core Foundation
        // objects are automatically memory managed, so no manual release is needed.
        return pxbuffer
    }
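
    For completeness, here is a small sketch of how you might wrap the function above for the original UIImage case. The extension name `toBGRAPixelBuffer` is just an illustrative choice, and it assumes `pixelBufferFromCGImage` is in scope:

    ```swift
    import UIKit

    extension UIImage {
        // Hypothetical convenience wrapper: converts this image to a
        // 32BGRA, IOSurface-backed CVPixelBuffer via pixelBufferFromCGImage.
        func toBGRAPixelBuffer() -> CVPixelBuffer? {
            // A UIImage built from CIImage data can have a nil cgImage.
            guard let cgImage = self.cgImage else { return nil }
            return pixelBufferFromCGImage(image: cgImage)
        }
    }
    ```

    Note that `cgImage` can be nil (for example, when the UIImage wraps a CIImage), so the optional return is worth handling at the call site.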