iOS 7
With the NDA finally lifted, we can at last talk about all of the amazing new APIs in iOS 7. And there are a lot of them: “1500 new APIs”, by Apple’s count during the WWDC Keynote. (Granted, a good portion of that could just be all of the changes from id to instancetype, but that’s a huge number, regardless.)
We’ll be going over many of the new features of iOS 7 in depth over the coming weeks, but with all of the excitement around this major release, this week’s issue will hit on some of the gems hiding in plain sight: NSData Base64 encoding, NSURLComponents, NSProgress, CIDetectorSmile, CIDetectorEyeBlink, SSReadingList, AVCaptureMetadataOutput, AVSpeechSynthesizer, and MKDistanceFormatter.
NSData (NSDataBase64Encoding)
Base64 is a general term for encoding binary data as ASCII text. This is used all over the place on the web, since many core technologies are designed to support text, but not raw binary. For instance, CSS can embed images with inline data: URIs, which are often Base64-encoded. Another example is Basic Authentication headers, which Base64-encode the username/password pair; this is marginally better than having the credentials completely in the clear.
For the longest time, this boringly essential function was completely MIA, leaving thousands of developers to copy-paste random code snippets from forum threads. It was an omission as conspicuous and annoying as JSON pre-iOS 5.
But no longer! iOS 7 finally bakes in Base64:
let string = "Lorem ipsum dolor sit amet."
if let data = string.data Using Encoding(NSUTF8String Encoding) {
let base64Encoded String = data.base64Encoded String With Options([])
print(base64Encoded String) // TG9y ZW0ga XBzd W0g ZG9s YXIgc2l0IGFt ZXQu
}
NSString *string = @"Lorem ipsum dolor sit amet.";
NSString *base64Encoded String = [[string data Using Encoding:NSUTF8String Encoding] base64Encoded String With Options:0];
NSLog(@"%@", base64Encoded String); // TG9y ZW0ga XBzd W0g ZG9s YXIgc2l0IGFt ZXQu
NSURLComponents & NSCharacterSet (NSURLUtilities)
Foundation is blessed with a wealth of functionality for working with URIs. Unfortunately, many of the APIs for manipulating URLs are strewn across NSString, since NSURL is immutable.
NSURLComponents dramatically improves this situation. Think of it as NSMutableURL:
if let components = NSURLComponents(string: "https://nshipster.com") {
    components.path = "/iOS7"
    components.query = "foo=bar"
    print(components.scheme!) // https
    print(components.URL!) // https://nshipster.com/iOS7?foo=bar
}
NSURLComponents *components = [NSURLComponents componentsWithString:@"https://nshipster.com"];
components.path = @"/iOS7";
components.query = @"foo=bar";
NSLog(@"%@", components.scheme); // https
NSLog(@"%@", [components URL]); // https://nshipster.com/iOS7?foo=bar
Each property for URL components also has a percentEncoded variation (e.g. user & percentEncodedUser), which forgoes any additional URI percent encoding of special characters.
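To see the distinction in action, here's a minimal sketch (per the documented iOS 7 behavior, the plain setters percent-encode for you, while the percentEncoded getters hand back the raw encoded string):
NSURLComponents *components = [NSURLComponents componentsWithString:@"https://nshipster.com"];
components.query = @"q=für elise";            // the plain setter percent-encodes as needed
NSLog(@"%@", components.query);               // q=für elise
NSLog(@"%@", components.percentEncodedQuery); // q=f%C3%BCr%20elise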
Which characters are special, you ask? Well, it depends on what part of the URL you’re talking about. Good thing that NSCharacterSet adds a new category for allowed URL characters in iOS 7:
+ (id)URLUserAllowedCharacterSet
+ (id)URLPasswordAllowedCharacterSet
+ (id)URLHostAllowedCharacterSet
+ (id)URLPathAllowedCharacterSet
+ (id)URLQueryAllowedCharacterSet
+ (id)URLFragmentAllowedCharacterSet
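These pair naturally with NSString's new -stringByAddingPercentEncodingWithAllowedCharacters:, also introduced in iOS 7. A quick sketch:
NSString *value = @"Johnny Appleseed";
NSString *encoded = [value stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];
NSLog(@"%@", encoded); // Johnny%20Appleseed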
NSProgress
NSProgress is a tough class to describe. It acts as both an observer and a delegate/coordinator, serving as a handle for reporting and monitoring progress. It integrates with system-level processes on OS X, but can also be plugged into user-facing UI. It can specify handlers for pausing and canceling, which then forward onto the operation actually doing the work.
Anything with a notion of completed and total units is a candidate for NSProgress, whether it’s the bytes written to a file, the number of frames in a large render job, or the files downloaded from a server.
NSProgress can be used to simply report overall progress in a localized way:
let progress = NSProgress(totalUnitCount: 100)
progress.completedUnitCount = 42
print(progress.localizedDescription) // 42% completed
NSProgress *progress = [NSProgress progressWithTotalUnitCount:100];
progress.completedUnitCount = 42;
NSLog(@"%@", [progress localizedDescription]); // 42% completed
…or it can be given a handler for stopping work entirely:
let timer = NSTimer(timeInterval: 1.0, target: self, selector: "incrementCompletedUnitCount:",
                    userInfo: nil, repeats: true)

progress.cancellationHandler = {
    timer.invalidate()
}

progress.cancel()
NSTimer *timer = [NSTimer timerWithTimeInterval:1.0
                                         target:self
                                       selector:@selector(incrementCompletedUnitCount:)
                                       userInfo:nil
                                        repeats:YES];

progress.cancellationHandler = ^{
    [timer invalidate];
};

[progress cancel];
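NSProgress is also KVO-compliant, so observing fractionCompleted is the natural way to plug it into user-facing UI. A hedged sketch (self.progressView here is a hypothetical UIProgressView outlet):
[progress addObserver:self
           forKeyPath:@"fractionCompleted"
              options:NSKeyValueObservingOptionNew
              context:NULL];

- (void)observeValueForKeyPath:(NSString *)keyPath
                      ofObject:(id)object
                        change:(NSDictionary *)change
                       context:(void *)context
{
    if ([keyPath isEqualToString:@"fractionCompleted"]) {
        NSProgress *progress = (NSProgress *)object;
        // KVO notifications may arrive on a background thread; hop to main before touching UI
        dispatch_async(dispatch_get_main_queue(), ^{
            self.progressView.progress = progress.fractionCompleted;
        });
    } else {
        [super observeValueForKeyPath:keyPath ofObject:object change:change context:context];
    }
}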
NSProgress makes a lot more sense in the context of OS X Mavericks, but for now, it remains a useful class for encapsulating the shared patterns of work units.
NSArray -firstObject
Rejoice! The NSRange-dodging convenience of -lastObject has finally been extended to the first member of an NSArray. (Well, it has been there as a private API since ~iOS 4, but that’s water under the bridge now.)
Behold!
let array = [1, 2, 3] as NSArray
print("First Object: \(array.firstObject)") // First Object: Optional(1)
print("Last Object: \(array.lastObject)") // Last Object: Optional(3)
NSArray *array = @[@1, @2, @3];
NSLog(@"First Object: %@", [array firstObject]); // First Object: 1
NSLog(@"Last Object: %@", [array lastObject]); // Last Object: 3
Refreshing!
CIDetectorSmile & CIDetectorEyeBlink
As a random aside, shouldn’t it be a cause for concern that the device most capable of taking embarrassing photos of ourselves is also the device most capable of distributing them to millions of people? Just a thought.
Since iOS 5, the Core Image framework has provided facial detection and recognition functionality through the CIDetector class. If it wasn’t insaneballs enough that we could detect faces in photos, in iOS 7 we can even tell if that face is smiling or has its eyes closed. *shudder*
In yet another free app idea, here’s a snippet that might be used by a camera that only saves pictures of smiling faces:
import CoreImage

let smileDetector = CIDetector(ofType: CIDetectorTypeFace, context: context,
    options: [CIDetectorTracking: true, CIDetectorAccuracy: CIDetectorAccuracyLow])
var features = smileDetector.featuresInImage(ciImage, options: [CIDetectorSmile: true])
if let feature = features.first as? CIFaceFeature where feature.hasSmile {
    UIImageWriteToSavedPhotosAlbum(UIImage(CIImage: ciImage), self, "didFinishWritingImage", &features)
} else {
    label.text = "Say Cheese!"
}
@import CoreImage;

CIDetector *smileDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                               context:context
                                               options:@{CIDetectorTracking: @YES,
                                                         CIDetectorAccuracy: CIDetectorAccuracyLow}];
NSArray *features = [smileDetector featuresInImage:ciImage options:@{CIDetectorSmile: @YES}];
if (([features count] > 0) && (((CIFaceFeature *)features[0]).hasSmile)) {
    UIImageWriteToSavedPhotosAlbum([UIImage imageWithCIImage:ciImage], self, @selector(didFinishWritingImage), (__bridge void *)features);
} else {
    self.label.text = @"Say Cheese!";
}
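CIDetectorEyeBlink works the same way: pass it in the features request and check the corresponding properties on CIFaceFeature. A sketch along the same lines, reusing the detector from above:
NSArray *features = [smileDetector featuresInImage:ciImage options:@{CIDetectorEyeBlink: @YES}];
for (CIFaceFeature *feature in features) {
    if (feature.leftEyeClosed || feature.rightEyeClosed) {
        self.label.text = @"Open your eyes!";
        break;
    }
}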
AVCaptureMetadataOutput
Scan UPCs, QR codes, and barcodes of all varieties with AVCaptureMetadataOutput, new to iOS 7. All you need to do is set it up as the output of an AVCaptureSession, and implement the captureOutput:didOutputMetadataObjects:fromConnection: delegate method accordingly:
import AVFoundation

let session = AVCaptureSession()
let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

do {
    let input = try AVCaptureDeviceInput(device: device)
    session.addInput(input)
} catch let error {
    print("Error: \(error)")
}

let output = AVCaptureMetadataOutput()
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
session.addOutput(output)
output.metadataObjectTypes = [AVMetadataObjectTypeQRCode]

session.startRunning()
// MARK: - AVCaptureMetadataOutputObjectsDelegate

func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputMetadataObjects metadataObjects: [AnyObject]!,
    fromConnection connection: AVCaptureConnection!) {
    var QRCode: String?
    for metadata in metadataObjects as! [AVMetadataObject] {
        if metadata.type == AVMetadataObjectTypeQRCode {
            // This will never happen; nobody has ever scanned a QR code... ever
            QRCode = (metadata as! AVMetadataMachineReadableCodeObject).stringValue
        }
    }
    print("QRCode: \(QRCode)")
}
@import AVFoundation;

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;

AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
if (input) {
    [session addInput:input];
} else {
    NSLog(@"Error: %@", error);
}

AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];

[session startRunning];
#pragma mark - AVCaptureMetadataOutputObjectsDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *QRCode = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            // This will never happen; nobody has ever scanned a QR code... ever
            QRCode = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }

    NSLog(@"QR Code: %@", QRCode);
}
AVFoundation supports every code you’ve heard of (and probably a few that you haven’t); a sketch of scanning for several at once follows the list:
AVMetadataObjectTypeUPCECode
AVMetadataObjectTypeCode39Code
AVMetadataObjectTypeCode39Mod43Code
AVMetadataObjectTypeEAN13Code
AVMetadataObjectTypeEAN8Code
AVMetadataObjectTypeCode93Code
AVMetadataObjectTypeCode128Code
AVMetadataObjectTypePDF417Code
AVMetadataObjectTypeQRCode
AVMetadataObjectTypeAztecCode
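To scan for several symbologies at once, set metadataObjectTypes to more than one constant. A minimal sketch, with the caveat that requesting a type the output doesn't support can raise an exception, so it's worth intersecting with availableMetadataObjectTypes first:
NSArray *desiredTypes = @[AVMetadataObjectTypeQRCode,
                          AVMetadataObjectTypeEAN13Code,
                          AVMetadataObjectTypePDF417Code];

// availableMetadataObjectTypes is only meaningful once the output
// has been added to a session with a video input
NSMutableSet *types = [NSMutableSet setWithArray:desiredTypes];
[types intersectSet:[NSSet setWithArray:output.availableMetadataObjectTypes]];
output.metadataObjectTypes = [types allObjects];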
If nothing else, AVCaptureMetadataOutput makes it possible to easily create a Passbook pass reader for the iPhone and iPad. There’s still a lot of unrealized potential in Passbook, so here’s to hoping that this API will be a factor in more widespread adoption.
SSReadingList
Even though the number of people who have actually read something saved for later is only marginally greater than the number of people who have ever used a QR code, it’s nice that iOS 7 gives apps a way to add items to the Safari Reading List with the new Safari Services framework.
import SafariServices

let url = NSURL(string: "https://nshipster.com/ios7")!
try? SSReadingList.defaultReadingList()?.addReadingListItemWithURL(url, title: "NSHipster", previewText: "...")
@import SafariServices;

NSURL *URL = [NSURL URLWithString:@"https://nshipster.com/ios7"];
[[SSReadingList defaultReadingList] addReadingListItemWithURL:URL
                                                        title:@"NSHipster"
                                                  previewText:@"..."
                                                        error:nil];
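Only certain URL schemes are eligible for Reading List; the +supportsURL: class method lets you check before offering the option (and the error parameter reports failures, which the snippet above cheerfully ignores):
if ([SSReadingList supportsURL:URL]) {
    NSError *error = nil;
    if (![[SSReadingList defaultReadingList] addReadingListItemWithURL:URL
                                                                 title:@"NSHipster"
                                                           previewText:@"..."
                                                                 error:&error]) {
        NSLog(@"Error: %@", error);
    }
}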
AVSpeechSynthesizer
Text-to-Speech has been the killer feature of computers for accessibility and pranking enthusiasts since its inception in the late 1960s.
iOS 7 brings the power of Siri with the convenience of a Speak & Spell in a new class, AVSpeechSynthesizer:
import AVFoundation

let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Just what do you think you're doing, Dave?")
utterance.rate = AVSpeechUtteranceMinimumSpeechRate // Tell it to me slowly
synthesizer.speakUtterance(utterance)
@import AVFoundation;

AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Just what do you think you're doing, Dave?"];
utterance.rate = AVSpeechUtteranceMinimumSpeechRate; // Tell it to me slowly
[synthesizer speakUtterance:utterance];
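Utterances can also specify a voice by BCP-47 language tag, along with pitch and volume. A brief sketch (the values here are arbitrary):
utterance.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"]; // nil if unavailable
utterance.pitchMultiplier = 1.25f; // range 0.5 to 2.0
utterance.volume = 0.75f;          // range 0.0 to 1.0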
MKDistanceFormatter
Finally, we end our showcase of iOS 7’s new and noteworthy APIs with another class that has NSHipsters crying out “finally!”: MKDistanceFormatter.
As advertised, MKDistanceFormatter provides a way to convert distances into localized strings using either imperial or metric units:
import CoreLocation
import MapKit

let sanFrancisco = CLLocation(latitude: 37.775, longitude: -122.4183333)
let portland = CLLocation(latitude: 45.5236111, longitude: -122.675)
let distance = portland.distanceFromLocation(sanFrancisco)

let formatter = MKDistanceFormatter()
formatter.units = .Imperial
print(formatter.stringFromDistance(distance)) // 535 miles
@import CoreLocation;
@import MapKit;

CLLocation *sanFrancisco = [[CLLocation alloc] initWithLatitude:37.775 longitude:-122.4183333];
CLLocation *portland = [[CLLocation alloc] initWithLatitude:45.5236111 longitude:-122.675];
CLLocationDistance distance = [portland distanceFromLocation:sanFrancisco];

MKDistanceFormatter *formatter = [[MKDistanceFormatter alloc] init];
formatter.units = MKDistanceFormatterUnitsImperial;
NSLog(@"%@", [formatter stringFromDistance:distance]); // 535 miles
So there you have it! This was just a small sample of the great new features of iOS 7. Still craving more? Check out Apple’s “What’s New in iOS 7” guide on the Developer Center.