I'm building a camera app that captures a photo in the BGRA format, and applies a Core Image filter on it before saving it to the Photos app. On the iPhone 7 Plus, the input photo is in the Display P3 color space, but the output is in the sRGB color space.
How do I prevent this from happening?
Here's my code:
let sampleBuffer: CMSampleBuffer = ...
let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!
// Propagate the sample buffer's attachments (EXIF, color information, etc.) as image properties.
let metadata = CMCopyDictionaryOfAttachments(nil, sampleBuffer, kCMAttachmentMode_ShouldPropagate)!
let ciImage = CIImage(cvImageBuffer: pixelBuffer,
                      options: [kCIImageProperties: metadata])
NSLog("\(ciImage.colorSpace)")
let context = CIContext()
let data = context.jpegRepresentation(of: ciImage,
                                      colorSpace: ciImage.colorSpace!,
                                      options: [:])!
// Save this using PHPhotoLibrary.
This prints:
Optional(<CGColorSpace 0x1c40a8a60> (kCGColorSpaceICCBased; kCGColorSpaceModelRGB; Display P3))
(In my actual code, I apply a filter to the CIImage, which creates another CIImage, which I save. But I can reproduce this problem even with the original CIImage, so I've eliminated the filter.)
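For reference, the filter step I eliminated looks roughly like this (CIPhotoEffectMono here is just a stand-in filter name, not necessarily the one I use):
let filter = CIFilter(name: "CIPhotoEffectMono")!
filter.setValue(ciImage, forKey: kCIInputImageKey)
let filteredImage = filter.outputImage!  // another CIImage, saved the same way as above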
How do I apply a Core Image filter to a P3 image and save it as a P3 image, not sRGB?
Notes:
(1) This is on iPhone 7 Plus running iOS 11.
(2) I'm using the wide camera, not tele, dual or front.
(3) If I ask AVFoundation to give me a JPEG-encoded image rather than BGRA, and save it without involving Core Image, this problem doesn't occur: the color space isn't reduced to sRGB. (See the sketch after this list.)
(4) I tried using kCIImageColorSpace, but it made no difference:
let p3 = CGColorSpace(name: CGColorSpace.displayP3)!
let ciImage = CIImage(
    cvImageBuffer: pixelBuffer,
    options: [kCIImageProperties: metadata,
              kCIImageColorSpace: p3])
(5) I tried passing kCIContextOutputColorSpace in addition to the above, as an option when creating the CIContext, but it again made no difference:
let context = CIContext(options: [kCIContextOutputColorSpace: CGColorSpace(name: CGColorSpace.displayP3)!])
(6) The code that takes a Data and saves it to PHPhotoLibrary is not the problem, since it works in case (3) above.
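Here's a sketch of the JPEG-from-AVFoundation path from note (3) that preserves P3; the capture session and delegate wiring are assumed:
let photoOutput = AVCapturePhotoOutput()  // assumed to be attached to a configured AVCaptureSession
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.jpeg])
photoOutput.capturePhoto(with: settings, delegate: self)

// In the AVCapturePhotoCaptureDelegate (iOS 11 API):
func photoOutput(_ output: AVCapturePhotoOutput,
                 didFinishProcessingPhoto photo: AVCapturePhoto,
                 error: Error?) {
    guard let data = photo.fileDataRepresentation() else { return }
    // Saving this Data via PHPhotoLibrary keeps the Display P3 color space.
}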
I've had the same issue, and I think it may be a bug in context.jpegRepresentation(of:colorSpace:options:).
I've had more success using ImageIO to create the JPEG data, as shown in the createJPEGData function below. For example:
let eaglContext = EAGLContext(api: .openGLES2)!
let options: [String: Any] = [kCIContextWorkingColorSpace: CGColorSpace(name: CGColorSpace.extendedSRGB)!,
                              kCIContextOutputPremultiplied: true,
                              kCIContextUseSoftwareRenderer: false]
let ciContext = CIContext(eaglContext: eaglContext, options: options)

let colorSpace = CGColorSpace(name: CGColorSpace.displayP3)!
guard let imageData = createJPEGData(from: image,
                                     jpegQuality: 0.9,
                                     outputColorSpace: colorSpace,
                                     context: ciContext) else {
    return
}

PHPhotoLibrary.shared().performChanges({ () -> Void in
    let creationRequest = PHAssetCreationRequest.forAsset()
    creationRequest.addResource(with: .photo,
                                data: imageData,
                                options: nil)
}, completionHandler: { (success: Bool, error: Error?) -> Void in
    // Handle errors, etc.
})
// Requires: import CoreImage, import ImageIO, and import MobileCoreServices (for kUTTypeJPEG).
func createJPEGData(from image: CIImage,
                    jpegQuality: Float,
                    outputColorSpace: CGColorSpace,
                    context: CIContext) -> Data? {
    let jpegData: CFMutableData = CFDataCreateMutable(nil, 0)
    if let destination = CGImageDestinationCreateWithData(jpegData, kUTTypeJPEG, 1, nil) {
        // Render through Core Image into the requested output color space.
        if let cgImage = context.createCGImage(image,
                                               from: image.extent,
                                               format: kCIFormatRGBA8,
                                               colorSpace: outputColorSpace) {
            // Carry the image's metadata across and apply the JPEG quality.
            var properties = image.properties
            properties[kCGImageDestinationLossyCompressionQuality as String] = jpegQuality
            CGImageDestinationAddImage(destination, cgImage, properties as CFDictionary)
            if CGImageDestinationFinalize(destination) {
                return jpegData as Data
            }
        }
    }
    return nil
}
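If you want to confirm the result before saving, you can inspect the ICC profile embedded in the generated JPEG; this check is my addition, not required for the fix:
// Optional sanity check: print the profile name of the JPEG produced by createJPEGData.
if let source = CGImageSourceCreateWithData(imageData as CFData, nil),
   let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any] {
    print(props[kCGImagePropertyProfileName] ?? "no profile")  // expect "Display P3"
}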