I'm back with another question about moving my app from Android to iOS (which is turning out to be harder than I thought...).
OK, so this time I'm trying to colorize the following image:
On Android I used this code:
public static Bitmap doNalaFilter(Bitmap src) {
    // First pass: multiply by white (no change) and add a red offset of 0x80.
    Bitmap bmBrown0 = Bitmap.createBitmap(src.getWidth(), src.getHeight(), src.getConfig());
    Canvas cBrown0 = new Canvas(bmBrown0);
    Paint paintBrown0 = new Paint();
    paintBrown0.setColorFilter(new LightingColorFilter(Color.WHITE, 0x800000));
    cBrown0.drawBitmap(src, 0, 0, paintBrown0);

    // Second pass: multiply the red-shifted copy back with the original.
    Bitmap bmBrown = Bitmap.createBitmap(src.getWidth(), src.getHeight(), src.getConfig());
    Canvas cBrown = new Canvas(bmBrown);
    cBrown.drawBitmap(src, 0, 0, null);
    Paint paintBrown = new Paint();
    paintBrown.setXfermode(new PorterDuffXfermode(PorterDuff.Mode.MULTIPLY));
    cBrown.drawBitmap(bmBrown0, 0, 0, paintBrown);
    return bmBrown;
}
That gives me this image:
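To be clear about what the Android code is doing (as I understand it): LightingColorFilter(Color.WHITE, 0x800000) multiplies every channel by white, which changes nothing, and then adds 0x80 to the red channel; the second drawBitmap with PorterDuff.Mode.MULTIPLY multiplies that red-shifted copy back with the original image. Per channel the math is roughly this (a small Swift sketch just to illustrate, the function name is mine):

func nalaChannel(_ src: Int, offset: Int) -> Int {
    let brightened = min(src + offset, 255)   // LightingColorFilter(WHITE, add)
    return src * brightened / 255             // PorterDuff MULTIPLY with the original
}
// Red uses offset 0x80; green and blue use offset 0, so they come out as g*g/255 and b*b/255.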
But in Swift I'm trying the following code:
func nalaFilter() -> UIImage? {
    let inImage = CIImage(image: self)
    let SRGBImage = inImage?.applyingFilter("CILinearToSRGBToneCurve")
    let brownMatrix = CIFilter(name: "CIMultiplyBlendMode")
    let brownRect = CGRect(
        x: (SRGBImage?.extent.origin.x)!,
        y: (SRGBImage?.extent.origin.y)!,
        width: (SRGBImage?.extent.size.width)!,
        height: (SRGBImage?.extent.size.height)!)
    let brownColor = CIColor(red: 128.0/255.0, green: 0.0, blue: 0.0)
    let brownOverlay = CIImage(color: brownColor)
    let brownCroppedImage = brownOverlay.cropped(to: brownRect)
    brownMatrix?.setValue(SRGBImage, forKey: kCIInputImageKey)
    brownMatrix?.setValue(brownCroppedImage, forKey: kCIInputBackgroundImageKey)
    let brownOutImage = brownMatrix?.outputImage
    let linearImage = brownOutImage?.applyingFilter("CISRGBToneCurveToLinear")
    let cgImage = CIContext().createCGImage(linearImage!, from: linearImage!.extent)
    return UIImage(cgImage: cgImage!)
}
And I'm getting this!
Does anybody have an idea of how to write Swift code that works the same way as the Android version?
Thanks in advance!
In the end I came up with the following code, which solves my problem:
func nalaFilter() -> UIImage? {
    let brownColor = UIColor(red: 128.0/255.0, green: 0.0, blue: 0.0, alpha: 1.0)
    let brownRect = CGRect(origin: .zero, size: self.size)

    UIGraphicsBeginImageContextWithOptions(brownRect.size, true, 0.0)

    // Colored image: a flat brown fill, snapshotted before anything else is drawn.
    brownColor.setFill()
    UIRectFill(brownRect)
    let brownColoredImage = UIGraphicsGetImageFromCurrentImageContext()

    // Blend the original with the brown fill using color dodge.
    let brownContext = UIGraphicsGetCurrentContext()
    brownContext!.setFillColor(UIColor.white.cgColor)
    brownContext!.fill(brownRect)
    self.draw(in: brownRect, blendMode: .normal, alpha: 1)
    brownColoredImage?.draw(in: brownRect, blendMode: .colorDodge, alpha: 1)
    let outBrown0Image = UIGraphicsGetImageFromCurrentImageContext()

    // Multiplied image: multiply the dodged result back with the original.
    self.draw(in: brownRect, blendMode: .normal, alpha: 1)
    outBrown0Image?.draw(in: brownRect, blendMode: .multiply, alpha: 1)
    let outBrownImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return outBrownImage
}
This gives exactly the same result as my Android version.
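One note for anyone else doing this: nalaFilter() uses self.size and self.draw, so it has to live in an extension of UIImage. I call it like this (the asset name is just an example):

let source = UIImage(named: "nala")   // example asset name
let tinted = source?.nalaFilter()     // same brown cast as the Android output

For the record, I think the same two-step pipeline (add a red offset of 0x80, then multiply with the original) could probably also be done purely in Core Image. This is an untested sketch, the function name is mine, and Core Image's linear working space means the output might not match the UIKit version exactly:

import UIKit
import CoreImage

func nalaFilterCI(_ image: UIImage) -> UIImage? {
    guard let input = CIImage(image: image) else { return nil }

    // Add 0x80 to the red channel (the LightingColorFilter step).
    let redShifted = input.applyingFilter("CIColorMatrix", parameters: [
        "inputBiasVector": CIVector(x: 128.0 / 255.0, y: 0, z: 0, w: 0)
    ])

    // Multiply the red-shifted copy with the original (the PorterDuff MULTIPLY step).
    let multiplied = redShifted.applyingFilter("CIMultiplyBlendMode", parameters: [
        kCIInputBackgroundImageKey: input
    ])

    guard let cgImage = CIContext().createCGImage(multiplied, from: input.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}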