Tags: ios, swift, nsdata, image-compression, image-size

Converting an image from Data and then Data back from the image increases the data size


I am getting my image from a PHAsset.

I request the image data using PHAsset's requestImageData function.

After getting the data from that function, I create an image from it with scale = 1.

When I then convert that newly created image back to data, the resulting data is larger than the data I originally received.

Below is the code I use to get the data from the PHAsset:

imageManager.requestImageData(for: image.asset, options: nil) { [weak self] (data, str, orientation, info) in
    guard let strongSelf = self else { return }
    if let dt = data {
        // Here I get data.count as 4993397
        if let image = UIImage(data: dt, scale: 1) {
            strongSelf.image = image
            print("New data : \(image.jpegData(compressionQuality: 1)?.count)")
            // The result printed here is 12107879
        }
    }
}

I have no idea why the data size increases here.

Thanks!!


Solution

  • This line

    let image = UIImage(data: dt, scale: 1)

    decodes the compressed data into an uncompressed bitmap (the scale parameter only affects how the image is sized for display, not its quality). The original data you receive from Photos is a JPEG that was encoded at some quality below 1.0. When you then call image.jpegData(compressionQuality: 1), you re-encode that bitmap at the highest quality setting (q = 1.0), which produces a larger file than the original. That's why the data is suddenly so big.

    So the moment you turn your Data into a UIImage and re-encode it, the decompression/re-compression round trip changes the size of the data; the new bytes will not match the original.
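If you just need the bytes at their original size, keep the data you received instead of re-encoding, and only decode/re-encode when you actually need to transform the image. A minimal sketch of both paths, assuming you already have a PHAsset; the handleData callback is hypothetical, standing in for whatever you do with the bytes:

    import UIKit
    import Photos

    func fetchData(for asset: PHAsset, handleData: @escaping (Data) -> Void) {
        PHImageManager.default().requestImageData(for: asset, options: nil) { data, _, _, _ in
            guard let originalData = data else { return }

            // Path 1: keep the original bytes as-is; this preserves
            // the original size (4993397 in the question).
            handleData(originalData)

            // Path 2: decode and re-encode only if you must modify the
            // image. A lower compressionQuality keeps the result closer
            // to (or smaller than) the original file size.
            if let image = UIImage(data: originalData, scale: 1),
               let recompressed = image.jpegData(compressionQuality: 0.7) {
                print("q=1.0 size: \(image.jpegData(compressionQuality: 1)?.count ?? 0)")
                print("q=0.7 size: \(recompressed.count)")
            }
        }
    }

Note that on iOS 13 and later, requestImageData(for:options:resultHandler:) is deprecated in favor of requestImageDataAndOrientation(for:options:resultHandler:), which has the same shape apart from reporting a CGImagePropertyOrientation.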