PHImageManager
Yesterday’s article described various techniques for resizing images using APIs from the UIKit, Core Graphics, Core Image, and Image I/O frameworks. However, that article failed to mention some rather extraordinary functionality baked into the new Photos framework which takes care of all of this for you.
For anyone developing apps that manage photos or videos, meet your new best friend: PHImageManager.
New in iOS 8, the Photos framework is something of a triumph for the platform. Photography is one of the key verticals for the iPhone: in addition to being the most popular camera in the world, photo & video apps are regularly featured on the App Store. This framework goes a long way to empower apps to do even more, with a shared set of tools and primitives.
A great example of this is PHImageManager, which acts as a centralized coordinator for image assets. Previously, each app was responsible for creating and caching its own image thumbnails. In addition to requiring extra work on the part of developers, redundant image caches could potentially add up to gigabytes of data across the system. But with PHImageManager, apps don’t have to worry about resizing or caching logistics, and can instead focus on building out features.
Requesting Asset Images
PHImageManager provides several APIs for asynchronously fetching image and video data for assets. For a given asset, a PHImageManager can request an image at a particular size and content mode, with a high degree of configurability in terms of quality and delivery options.
But first, here’s a simple example of how a table view might asynchronously load cell images with asset thumbnails:
import Photos

var assets: [PHAsset]

func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCellWithIdentifier("Cell", forIndexPath: indexPath)

    let manager = PHImageManager.defaultManager()

    // Cancel any in-flight image request from this cell's previous use.
    if cell.tag != 0 {
        manager.cancelImageRequest(PHImageRequestID(cell.tag))
    }

    let asset = assets[indexPath.row]

    if let creationDate = asset.creationDate {
        cell.textLabel?.text = NSDateFormatter.localizedStringFromDate(creationDate,
            dateStyle: .MediumStyle,
            timeStyle: .MediumStyle
        )
    } else {
        cell.textLabel?.text = nil
    }

    // Store the request ID in the cell's tag so the request can be cancelled on reuse.
    cell.tag = Int(manager.requestImageForAsset(asset,
        targetSize: CGSize(width: 100.0, height: 100.0),
        contentMode: .AspectFill,
        options: nil) { (result, _) in
            cell.imageView?.image = result
    })

    return cell
}
@import Photos;

@property (nonatomic, strong) NSArray<PHAsset *> *assets;

- (UITableViewCell *)tableView:(UITableView *)tableView cellForRowAtIndexPath:(NSIndexPath *)indexPath {
    UITableViewCell *cell = [tableView dequeueReusableCellWithIdentifier:@"Cell"
                                                            forIndexPath:indexPath];

    PHImageManager *manager = [PHImageManager defaultManager];

    // Cancel any in-flight image request from this cell's previous use.
    if (cell.tag) {
        [manager cancelImageRequest:(PHImageRequestID)cell.tag];
    }

    PHAsset *asset = self.assets[indexPath.row];

    if (asset.creationDate) {
        cell.textLabel.text = [NSDateFormatter localizedStringFromDate:asset.creationDate
                                                             dateStyle:NSDateFormatterMediumStyle
                                                             timeStyle:NSDateFormatterMediumStyle];
    } else {
        cell.textLabel.text = nil;
    }

    // Store the request ID in the cell's tag so the request can be cancelled on reuse.
    cell.tag = [manager requestImageForAsset:asset
                                  targetSize:CGSizeMake(100.0, 100.0)
                                 contentMode:PHImageContentModeAspectFill
                                     options:nil
                               resultHandler:^(UIImage * _Nullable result, NSDictionary * _Nullable info) {
                                   cell.imageView.image = result;
                               }];

    return cell;
}
API usage is pretty straightforward: the default PHImageManager asynchronously requests an image for the asset corresponding to the cell at a particular index path, and the cell's image view is set whenever the result comes back. The only tricky part is handling cell reuse: (1) when a cell is dequeued, any pending image request from its previous use is cancelled, and (2) the cell's tag keeps track of the PHImageRequestID so that the right request can be cancelled when the cell is reused.
Batch Pre-Caching Asset Images
If there’s reasonable assurance that most of a set of assets will be viewed at some point, it may make sense to pre-cache those images. PHCachingImageManager is a subclass of PHImageManager designed to do just that.
For example, here’s how the results of a PHAsset fetch operation can be pre-cached in order to optimize image availability:
let cachingImageManager = PHCachingImageManager()

let options = PHFetchOptions()
options.predicate = NSPredicate(format: "favorite == YES")
options.sortDescriptors = [
    NSSortDescriptor(key: "creationDate", ascending: true)
]

let results = PHAsset.fetchAssetsWithMediaType(.Image, options: options)

var assets: [PHAsset] = []
results.enumerateObjectsUsingBlock { (object, _, _) in
    if let asset = object as? PHAsset {
        assets.append(asset)
    }
}

cachingImageManager.startCachingImagesForAssets(assets,
    targetSize: PHImageManagerMaximumSize,
    contentMode: .AspectFit,
    options: nil
)
PHCachingImageManager *cachingImageManager = [[PHCachingImageManager alloc] init];

PHFetchOptions *options = [[PHFetchOptions alloc] init];
options.predicate = [NSPredicate predicateWithFormat:@"favorite == YES"];
options.sortDescriptors = @[[NSSortDescriptor sortDescriptorWithKey:@"creationDate" ascending:YES]];

PHFetchResult *results = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage
                                                   options:options];

NSMutableArray<PHAsset *> *assets = [[NSMutableArray alloc] init];
[results enumerateObjectsUsingBlock:^(id _Nonnull object, NSUInteger idx, BOOL * _Nonnull stop) {
    if ([object isKindOfClass:[PHAsset class]]) {
        [assets addObject:object];
    }
}];

[cachingImageManager startCachingImagesForAssets:assets
                                       targetSize:PHImageManagerMaximumSize
                                      contentMode:PHImageContentModeAspectFit
                                          options:nil];
Alternatively, Swift willSet / didSet hooks offer a convenient way to automatically start pre-caching assets as they are loaded:
let cachingImageManager = PHCachingImageManager()
var assets: [PHAsset] = [] {
    willSet {
        cachingImageManager.stopCachingImagesForAllAssets()
    }
    didSet {
        cachingImageManager.startCachingImagesForAssets(self.assets,
            targetSize: PHImageManagerMaximumSize,
            contentMode: .AspectFit,
            options: nil
        )
    }
}
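Since PHCachingImageManager is itself a subclass of PHImageManager, the same request API can then be used to retrieve the images that were just cached, requesting with the same target size, content mode, and options that were passed when caching. Here's a minimal sketch along those lines (the imageView in the result handler is a hypothetical outlet, not something defined in the examples above):

// A minimal sketch: retrieving an image through the same caching manager,
// using the same parameters that were passed to startCachingImagesForAssets.
// `imageView` is a hypothetical outlet used purely for illustration.
if let asset = assets.first {
    cachingImageManager.requestImageForAsset(asset,
        targetSize: PHImageManagerMaximumSize,
        contentMode: .AspectFit,
        options: nil) { (result, _) in
            imageView.image = result
    }
}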
PHImageRequestOptions
In the previous examples, the options parameter of requestImageForAsset and startCachingImagesForAssets has been set to nil. Passing an instance of PHImageRequestOptions allows for fine-grained control over what gets loaded and how.
PHImageRequestOptions has the following properties:

- deliveryMode (PHImageRequestOptionsDeliveryMode): Described below.
- networkAccessAllowed (Bool): Will download the image from iCloud, if necessary.
- normalizedCropRect (CGRect): Specify a crop rectangle in unit coordinates of the original image.
- progressHandler (PHAssetImageProgressHandler): Provide the caller a way to be told how much progress has been made prior to delivering the data when it comes from iCloud. Defaults to nil; shall be set by the caller.
- resizeMode (PHImageRequestOptionsResizeMode): .None, .Fast, or .Exact. Does not apply when the size is PHImageManagerMaximumSize.
- synchronous (Bool): Return only a single result, blocking until available (or failure). Defaults to NO.
- version (PHImageRequestOptionsVersion): .Current, .Unadjusted, or .Original.
Several of these properties take a specific enum type, all of which are pretty self-explanatory, save for PHImageRequestOptionsDeliveryMode, which encapsulates some pretty complex behavior (a short configuration sketch follows the list below):
PHImageRequestOptionsDeliveryMode
- .Opportunistic: Photos automatically provides one or more results in order to balance image quality and responsiveness. Photos may call the resultHandler block more than once, such as to provide a low-quality image suitable for displaying temporarily while it prepares a high-quality image. If the image manager has already cached the requested image, Photos calls your result handler only once. This option is not available if the synchronous property is true.
- .HighQualityFormat: Photos provides only the highest-quality image available, regardless of how much time it takes to load. If the synchronous property is true or if using the requestImageDataForAsset:options:resultHandler: method, this behavior is the default and only option.
- .FastFormat: Photos provides only a fast-loading image, possibly sacrificing image quality. If a high-quality image cannot be loaded quickly, the result handler provides a low-quality image. Check the PHImageResultIsDegradedKey key in the info dictionary to determine the quality of image provided to the result handler.
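To tie these options together, here's a sketch of how they might be combined for a single request: fetching the full-quality, current version of an asset that may live in iCloud, while reporting download progress. The asset and imageView variables, and the chosen target size, are assumptions for the sake of illustration.

// A sketch of configuring PHImageRequestOptions; `asset` and `imageView`
// are assumed to exist in the surrounding scope.
let options = PHImageRequestOptions()
options.networkAccessAllowed = true       // download from iCloud if needed
options.deliveryMode = .HighQualityFormat // wait for the best available image
options.version = .Current                // include any edits
options.resizeMode = .Exact               // resize exactly to the target size
options.progressHandler = { (progress, _, _, _) in
    // Called periodically (on a background queue) while data downloads from iCloud.
    print("Download progress: \(progress)")
}

PHImageManager.defaultManager().requestImageForAsset(asset,
    targetSize: CGSize(width: 600.0, height: 600.0),
    contentMode: .AspectFit,
    options: options) { (result, info) in
        // With .Opportunistic delivery, PHImageResultIsDegradedKey in `info`
        // would indicate whether this result is a temporary, low-quality image.
        imageView.image = result
}

Because the delivery mode here is .HighQualityFormat, the result handler fires exactly once; with .Opportunistic, it could fire multiple times, as described above.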
Cropping an Asset to Detected Faces Using a 2-Phase Image Request
Using PHImageManager and PHImageRequestOptions to their full capacity allows for rather sophisticated functionality. One could, for example, use successive image requests to crop full-quality assets to any faces detected in the image.
let asset: PHAsset

@IBOutlet weak var imageView: UIImageView!
@IBOutlet weak var progressView: UIProgressView!

override func viewDidLoad() {
    super.viewDidLoad()

    let manager = PHImageManager.defaultManager()

    let initialRequestOptions = PHImageRequestOptions()
    initialRequestOptions.synchronous = true
    initialRequestOptions.resizeMode = .Fast
    initialRequestOptions.deliveryMode = .FastFormat

    manager.requestImageForAsset(asset,
        targetSize: CGSize(width: 250.0, height: 250.0),
        contentMode: .AspectFit,
        options: initialRequestOptions) { (initialResult, _) in
            // UIImage.CIImage is nil for CGImage-backed results,
            // so create a CIImage for the detector explicitly.
            guard let initialResult = initialResult,
                      ciImage = CIImage(image: initialResult) else {
                return
            }

            let finalRequestOptions = PHImageRequestOptions()
            finalRequestOptions.progressHandler = { (progress, _, _, _) in
                // The progress handler is called on a background queue.
                dispatch_async(dispatch_get_main_queue()) {
                    self.progressView.progress = Float(progress)
                }
            }

            let detector = CIDetector(
                ofType: CIDetectorTypeFace,
                context: nil,
                options: [CIDetectorAccuracy: CIDetectorAccuracyLow]
            )

            let features = detector.featuresInImage(ciImage)
            if features.count > 0 {
                // Union the bounds of all detected faces
                // (CGRectNull is the identity for union).
                var rect = CGRectNull
                features.forEach {
                    rect.unionInPlace($0.bounds)
                }

                // normalizedCropRect is specified in unit coordinates
                // and requires the .Exact resize mode.
                let transform = CGAffineTransformMakeScale(1.0 / initialResult.size.width, 1.0 / initialResult.size.height)
                finalRequestOptions.normalizedCropRect = CGRectApplyAffineTransform(rect, transform)
                finalRequestOptions.resizeMode = .Exact
            }

            manager.requestImageForAsset(self.asset,
                targetSize: PHImageManagerMaximumSize,
                contentMode: .AspectFit,
                options: finalRequestOptions) { (finalResult, _) in
                    self.imageView.image = finalResult
            }
    }
}
@property (nonatomic, strong) PHAsset *asset;

@property (nonatomic, weak) IBOutlet UIImageView *imageView;
@property (nonatomic, weak) IBOutlet UIProgressView *progressView;

- (void)viewDidLoad {
    [super viewDidLoad];

    PHImageManager *manager = [PHImageManager defaultManager];

    PHImageRequestOptions *initialRequestOptions = [[PHImageRequestOptions alloc] init];
    initialRequestOptions.synchronous = YES;
    initialRequestOptions.resizeMode = PHImageRequestOptionsResizeModeFast;
    initialRequestOptions.deliveryMode = PHImageRequestOptionsDeliveryModeFastFormat;

    void (^resultHandler)(UIImage *, NSDictionary *) = ^(UIImage * _Nullable initialResult, NSDictionary * _Nullable info) {
        // UIImage.CIImage is nil for CGImage-backed results,
        // so create a CIImage for the detector explicitly.
        CIImage *ciImage = initialResult ? [[CIImage alloc] initWithImage:initialResult] : nil;
        if (!ciImage) {
            return;
        }

        PHImageRequestOptions *finalRequestOptions = [[PHImageRequestOptions alloc] init];
        finalRequestOptions.progressHandler = ^(double progress, NSError *error, BOOL *stop, NSDictionary *info) {
            // The progress handler is called on a background queue.
            dispatch_async(dispatch_get_main_queue(), ^{
                self.progressView.progress = progress;
            });
        };

        CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeFace
                                                   context:nil
                                                   options:@{CIDetectorAccuracy : CIDetectorAccuracyLow}];

        NSArray<CIFeature *> *features = [detector featuresInImage:ciImage];
        if (features.count) {
            // Union the bounds of all detected faces
            // (CGRectNull is the identity for union).
            CGRect rect = CGRectNull;
            for (CIFeature *feature in features) {
                rect = CGRectUnion(rect, feature.bounds);
            }

            // normalizedCropRect is specified in unit coordinates
            // and requires the PHImageRequestOptionsResizeModeExact resize mode.
            CGAffineTransform transform = CGAffineTransformMakeScale(1.0 / initialResult.size.width, 1.0 / initialResult.size.height);
            finalRequestOptions.normalizedCropRect = CGRectApplyAffineTransform(rect, transform);
            finalRequestOptions.resizeMode = PHImageRequestOptionsResizeModeExact;
        }

        [manager requestImageForAsset:self.asset
                           targetSize:PHImageManagerMaximumSize
                          contentMode:PHImageContentModeAspectFit
                              options:finalRequestOptions
                        resultHandler:^(UIImage * _Nullable finalResult, NSDictionary * _Nullable info) {
                            self.imageView.image = finalResult;
                        }];
    };

    // typedef void (^PHAssetImageProgressHandler)(double progress, NSError *__nullable error, BOOL *stop, NSDictionary *__nullable info) NS_AVAILABLE_IOS(8_0);

    [manager requestImageForAsset:self.asset
                       targetSize:CGSizeMake(250.0, 250.0)
                      contentMode:PHImageContentModeAspectFit
                          options:initialRequestOptions
                    resultHandler:resultHandler];
}
The initial request attempts to get the most readily available representation of an asset to pass into a CIDetector for facial recognition. If any features were detected, the final request would be cropped to them, by specifying the normalizedCropRect on the final PHImageRequestOptions.
normalizedCropRect is normalized, with origin and size components within the inclusive range 0.0 to 1.0. An affine transformation that scales by the inverse of the original frame makes for an easy calculation.
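For example, a face rect of (100, 50, 200, 200) inside a 500 × 400 point image normalizes to (0.2, 0.125, 0.4, 0.5). Here's a small, hypothetical helper that performs the same conversion as the affine transform used above:

// A hypothetical helper: convert a rect in image coordinates to the
// unit-coordinate space expected by normalizedCropRect.
func normalizedCropRect(cropRect: CGRect, inImageOfSize size: CGSize) -> CGRect {
    let transform = CGAffineTransformMakeScale(1.0 / size.width, 1.0 / size.height)
    return CGRectApplyAffineTransform(cropRect, transform)
}

let faceRect = CGRect(x: 100.0, y: 50.0, width: 200.0, height: 200.0)
let cropRect = normalizedCropRect(faceRect, inImageOfSize: CGSize(width: 500.0, height: 400.0))
// cropRect is {0.2, 0.125, 0.4, 0.5}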
From its very inception, iOS has been a balancing act between functionality and integrity. And with every release, a combination of thoughtful affordances and powerful APIs has managed to expand the role of third-party applications without compromising security or performance.
By unifying functionality for fetching, managing, and manipulating photos, the Photos framework will dramatically raise the standard for existing apps, while simultaneously lowering the barrier to entry for new ones. It's a stunning example of why developers tend to prefer iOS as a platform.