I am using PhotoKit to fetch photos from the system album and put them in a UICollectionView.
For the UICollectionViewCell, I set it up like this:
cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
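Since aspect-fill can draw outside the cell bounds, I assume the image view also needs clipping; the only line below that is not already in my code is the clipsToBounds one:

cell.imageView.contentMode = UIViewContentModeScaleAspectFill;
cell.imageView.clipsToBounds = YES; // keep the aspect-filled image inside the cell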
When initializing my UICollectionView, I fetch the PHAsset objects from the Camera Roll collection only:
PHFetchResult *fetchResult = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeAlbumRegular options:nil];
[fetchResult enumerateObjectsUsingBlock:^(PHAssetCollection *collection, NSUInteger idx, BOOL *stop) {
    NSLog(@"ALBUM NAME: %@", collection.localizedTitle);
    if ([collection.localizedTitle isEqualToString:@"Camera Roll"]) {
        _fetchedPhotos = [PHAsset fetchAssetsInAssetCollection:collection options:nil];
        _pickedStatuses = @[].mutableCopy;
        NSMutableArray *assets = @[].mutableCopy;

        _manager = [[PHCachingImageManager alloc] init];
        _options = [[PHImageRequestOptions alloc] init];
        _options.resizeMode = PHImageRequestOptionsResizeModeExact;
        _options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

        CGFloat scale = [UIScreen mainScreen].scale;
        CGSize targetSize = CGSizeMake(layout.itemSize.width * scale, layout.itemSize.height * scale);

        // I'm not sure if this API should be called here
        [_manager startCachingImagesForAssets:assets targetSize:targetSize contentMode:PHImageContentModeAspectFill options:_options];
    }
}];
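One thing I am not sure about is matching the album by localizedTitle, since the title changes with the device locale; a sketch of fetching the user library by its subtype instead (it fills the same _fetchedPhotos, just without the string comparison):

// Fetch the user's photo library ("Camera Roll") directly by subtype instead of by name.
PHFetchResult *smartAlbums = [PHAssetCollection fetchAssetCollectionsWithType:PHAssetCollectionTypeSmartAlbum subtype:PHAssetCollectionSubtypeSmartAlbumUserLibrary options:nil];
PHAssetCollection *userLibrary = smartAlbums.firstObject;
if (userLibrary) {
    _fetchedPhotos = [PHAsset fetchAssetsInAssetCollection:userLibrary options:nil];
}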
Then I request a UIImage for each PHAsset from the PHFetchResult above, like this:
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    MYCell *cell = (MYCell *)[collectionView dequeueReusableCellWithReuseIdentifier:@"reuseCell" forIndexPath:indexPath];

    CGFloat scale = [UIScreen mainScreen].scale;
    CGSize targetSize = CGSizeMake(_layout.itemSize.width * scale, _layout.itemSize.height * scale);

    PHAsset *asset = _fetchedPhotos[indexPath.item];
    [_manager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFill options:_options resultHandler:^(UIImage *result, NSDictionary *info) {
        cell.imageView.image = result;
    }];

    return cell;
}
But when I run it and scroll the UICollectionView fast enough, memory use climbs steeply, like this:
How can I reduce it so the app doesn't crash when memory runs low?
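One thing I suspect is that with PHImageRequestOptionsDeliveryModeOpportunistic the result handler can be called more than once per request, and it can still deliver to a cell that has already been reused. A sketch of cancelling the pending request whenever a cell is dequeued again (this assumes an imageRequestID property added to MYCell, which my cell does not have yet):

// Assumed addition to MYCell: @property (nonatomic, assign) PHImageRequestID imageRequestID;
MYCell *cell = (MYCell *)[collectionView dequeueReusableCellWithReuseIdentifier:@"reuseCell" forIndexPath:indexPath];

// Cancel whatever the reused cell was still waiting for before starting a new request.
if (cell.imageRequestID != PHInvalidImageRequestID) {
    [_manager cancelImageRequest:cell.imageRequestID];
}

CGFloat scale = [UIScreen mainScreen].scale;
CGSize targetSize = CGSizeMake(_layout.itemSize.width * scale, _layout.itemSize.height * scale);
PHAsset *asset = _fetchedPhotos[indexPath.item];

cell.imageRequestID = [_manager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFill options:_options resultHandler:^(UIImage *result, NSDictionary *info) {
    cell.imageView.image = result;
}];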
What I have done now is work only with the fetched PHAsset objects and try to adopt caching. My code is like below:
CGFloat scale = 1.5; // or an even smaller one
if (!_manager) {
    _manager = [[PHCachingImageManager alloc] init];
}
if (!_options) {
    _options = [[PHImageRequestOptions alloc] init];
}
_options.resizeMode = PHImageRequestOptionsResizeModeExact;
_options.deliveryMode = PHImageRequestOptionsDeliveryModeOpportunistic;

NSRange range = NSMakeRange(0, _fetchedPhotos.count);
NSIndexSet *set = [NSIndexSet indexSetWithIndexesInRange:range];
NSArray *assets = [_fetchedPhotos objectsAtIndexes:set];

CGSize targetSize = CGSizeMake(_layout.itemSize.width * scale, _layout.itemSize.height * scale);
[_manager startCachingImagesForAssets:assets targetSize:targetSize contentMode:PHImageContentModeAspectFill options:_options];
Now even if I scroll the UICollectionView quickly, peak memory stays below 50 MB, and thanks to the caching (I assume it is working, judging from the behaviour) it no longer fluctuates as much. Memory use looks like this:
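Caching every asset up front probably keeps a lot pinned in memory for a large library. As far as I can tell, Apple's PhotoKit sample instead moves the cached range along as the user scrolls; a rough sketch of that idea (indexPathsToPreheat is a hypothetical helper I would still have to write, e.g. the visible index paths plus a margin):

// Called from -scrollViewDidScroll: to keep the cache focused on what is on screen.
- (void)updateCachedAssets
{
    NSArray *preheatPaths = [self indexPathsToPreheat]; // hypothetical helper
    NSMutableArray *preheatAssets = [NSMutableArray array];
    for (NSIndexPath *path in preheatPaths) {
        [preheatAssets addObject:_fetchedPhotos[path.item]];
    }

    CGFloat scale = [UIScreen mainScreen].scale;
    CGSize targetSize = CGSizeMake(_layout.itemSize.width * scale, _layout.itemSize.height * scale);

    // Crude version: drop everything cached so far, then cache only the preheat window.
    [_manager stopCachingImagesForAllAssets];
    [_manager startCachingImagesForAssets:preheatAssets targetSize:targetSize contentMode:PHImageContentModeAspectFill options:_options];
}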
UPDATE
According to another post here, it is recommended not to specify a PHImageRequestOptions object at all. Instead, you can pass nil and let iOS decide how to deliver the photos with the best quality in the least time.
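As I understand it, that simply means passing nil for the options, for example:

[_manager requestImageForAsset:asset targetSize:targetSize contentMode:PHImageContentModeAspectFill options:nil resultHandler:^(UIImage *result, NSDictionary *info) {
    cell.imageView.image = result;
}];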