I'm encountering a peculiar issue in my iOS app related to image resizing and uploading. I have implemented a method to resize images to fit within a specific size range (minimum size: 320px, maximum size: 1536px) before uploading them to a server. The resizing seems to work perfectly, as verified by inspecting the UIImage directly after resizing. However, the moment I convert the resized image to data (JPEG format) in preparation for the upload, it seems to revert to its original size or does not maintain the resized dimensions.
Here's the resizing method I'm using:
func resizedImage(minimumSize: CGFloat, maximumSize: CGFloat) -> UIImage? {
    let originalWidth = size.width
    let originalHeight = size.height

    var scaleFactor: CGFloat
    if originalWidth < minimumSize || originalHeight < minimumSize {
        scaleFactor = max(minimumSize / originalWidth, minimumSize / originalHeight)
    } else if originalWidth > maximumSize || originalHeight > maximumSize {
        scaleFactor = min(maximumSize / originalWidth, maximumSize / originalHeight)
    } else {
        scaleFactor = 1.0
    }

    let newWidth = originalWidth * scaleFactor
    let newHeight = originalHeight * scaleFactor

    UIGraphicsBeginImageContextWithOptions(CGSize(width: newWidth, height: newHeight), false, 0)
    draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
    let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return resizedImage
}
After resizing, when I attempt to convert the resized UIImage into data like so:
if let resizedImage = resizedImage(minimumSize: 320, maximumSize: 1536) {
    let imageData = resizedImage.jpegData(compressionQuality: 1.0)
    // Proceed to upload imageData
}
Despite the resizing, the imageData seems to reflect the original image size rather than the resized one. This issue occurs before the upload, during the preparation of the image data for the request body. The goal is to ensure that the image uploaded to the server adheres to the specified size limits. Is there something I'm overlooking in the resizing or data conversion process that could cause the image to revert to its original dimensions?
Any advice or insights on how to ensure the resized image maintains its dimensions when converted to data for upload would be greatly appreciated. Thank you!
Using UIGraphicsBeginImageContextWithOptions with a scale of 0 as the last parameter means that it will use the scale of the device. E.g., if rendering an image into a size of 100×100 with a scale of 0 on a device featuring a Retina 3× screen, the resulting UIImage will have a size with "logical dimensions" of 100×100pt, but its scale will be 3. That means that the resulting JPEG will be 300×300px.

So, check not only the size of the resulting image, but its scale, too. In short, consider using a scale of 1, not 0, if you want the resulting image to match your intended min/max dimensions, in pixels, as specified in the parameters to your function.
FWIW, UIGraphicsBeginImageContextWithOptions is now deprecated and has been replaced with UIGraphicsImageRenderer. So, if you want to use a scale of 1, perhaps:
extension UIImage {
    func resizedImage(minimumSize: CGFloat, maximumSize: CGFloat) -> UIImage {
        let originalWidth = size.width
        let originalHeight = size.height

        let resizeFactor: CGFloat = if originalWidth < minimumSize || originalHeight < minimumSize {
            max(minimumSize / originalWidth, minimumSize / originalHeight)
        } else if originalWidth > maximumSize || originalHeight > maximumSize {
            min(maximumSize / originalWidth, maximumSize / originalHeight)
        } else {
            1
        }

        let width = originalWidth * resizeFactor
        let height = originalHeight * resizeFactor
        let bounds = CGRect(x: 0, y: 0, width: width, height: height)

        let format = UIGraphicsImageRendererFormat()
        format.scale = 1

        return UIGraphicsImageRenderer(bounds: bounds, format: format).image { _ in
            draw(in: bounds)
        }
    }
}
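For completeness, a hypothetical call site (`sourceImage` is a placeholder for whatever UIImage you are uploading, not a name from your code), showing that with format.scale = 1 the points and pixels coincide, so the encoded JPEG matches the computed dimensions:

```swift
import UIKit

// Hypothetical usage: `sourceImage` stands in for the UIImage being uploaded.
let resized = sourceImage.resizedImage(minimumSize: 320, maximumSize: 1536)
assert(resized.scale == 1)  // with format.scale = 1, points == pixels

if let imageData = resized.jpegData(compressionQuality: 0.8) {
    // imageData now encodes an image whose dimensions match the
    // size computed by resizedImage; proceed to upload.
}
```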