Here is the code that I'm using to scale images on iOS, e.g. to scale a 500x500 image down to 100x100 and then store the scaled copy:
+ (UIImage *)image:(UIImage *)originalImage scaledToSize:(CGSize)desiredSize {
    // Create a bitmap context at the screen's scale, draw the image into it, and grab the result.
    UIGraphicsBeginImageContextWithOptions(desiredSize, YES, [UIScreen mainScreen].scale);
    [originalImage drawInRect:CGRectMake(0, 0, desiredSize.width, desiredSize.height)];
    UIImage *finalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return finalImage;
}
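For context, I call it roughly like this (the class name here is just a placeholder for the class that owns the method):

    // Hypothetical call site; "ImageUtils" is a placeholder name.
    UIImage *original = [UIImage imageNamed:@"photo"];   // e.g. a 500x500 source image
    UIImage *thumbnail = [ImageUtils image:original scaledToSize:CGSizeMake(100, 100)];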
Now I need to implement the same functionality in my macOS app. How can I do that? I saw a question like this, but I still can't understand the logic of doing it on macOS.
After some searching I found a question like mine, but in Swift. So I translated it to Objective-C, and here is the result:
+ (NSImage *)image:(NSImage *)originalImage scaledToSize:(NSSize)desiredSize {
    NSImage *newImage = [[NSImage alloc] initWithSize:desiredSize];
    [newImage lockFocus];
    // Draw the entire original image into the new image's bounds.
    [originalImage drawInRect:NSMakeRect(0, 0, desiredSize.width, desiredSize.height)
                     fromRect:NSMakeRect(0, 0, originalImage.size.width, originalImage.size.height)
                    operation:NSCompositingOperationSourceOver
                     fraction:1.0];
    [newImage unlockFocus];
    newImage.size = desiredSize;
    return [[NSImage alloc] initWithData:[newImage TIFFRepresentation]];
}
But there's still an issue: if desiredSize = NSMakeSize(50, 50);
it returns an image that is 50 by 50 pixels. I guess it should have something to do with the screen scale.
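I imagine the fix is to render into a bitmap whose pixel dimensions already account for the backing scale factor, something along these lines (just a sketch I put together, not sure it's the right approach):

    #import <Cocoa/Cocoa.h>

    + (NSImage *)image:(NSImage *)originalImage scaledToSize:(NSSize)desiredSize {
        // Multiply the point size by the screen's backing scale factor so that a
        // 50x50 point image gets 100x100 pixels on a 2x Retina display.
        CGFloat scale = [NSScreen mainScreen].backingScaleFactor;
        NSBitmapImageRep *rep = [[NSBitmapImageRep alloc]
            initWithBitmapDataPlanes:NULL
                          pixelsWide:(NSInteger)(desiredSize.width * scale)
                          pixelsHigh:(NSInteger)(desiredSize.height * scale)
                       bitsPerSample:8
                     samplesPerPixel:4
                            hasAlpha:YES
                            isPlanar:NO
                      colorSpaceName:NSCalibratedRGBColorSpace
                         bytesPerRow:0
                        bitsPerPixel:0];
        rep.size = desiredSize; // point size stays 50x50, pixel size is scaled

        // Draw the original image into a graphics context backed by the bitmap rep.
        [NSGraphicsContext saveGraphicsState];
        [NSGraphicsContext setCurrentContext:
            [NSGraphicsContext graphicsContextWithBitmapImageRep:rep]];
        [originalImage drawInRect:NSMakeRect(0, 0, desiredSize.width, desiredSize.height)
                         fromRect:NSZeroRect
                        operation:NSCompositingOperationSourceOver
                         fraction:1.0];
        [NSGraphicsContext restoreGraphicsState];

        NSImage *newImage = [[NSImage alloc] initWithSize:desiredSize];
        [newImage addRepresentation:rep];
        return newImage;
    }

But I'm not sure whether hard-coding the main screen's scale is the right way to handle this.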
Here is the Swift code that I translated: