iOS 7

With the NDA finally lifted, we can finally talk about all of the amazing new APIs in iOS 7. And there are a lot of them. “1500 new APIs”, by Apple’s count during the WWDC Keynote. (Granted, a good portion of that could just be all of the changes from id to instancetype, but that’s a huge number, regardless).

We’ll be going over many of the new features of iOS 7 in depth over the coming weeks, but with all of the excitement around this major release, this week’s issue will hit on some of the gems hiding in plain sight: NSData Base64 encoding, NSURLComponents, NSProgress, CIDetectorSmile, CIDetectorEyeBlink, SSReadingList, AVCaptureMetadataOutput, AVSpeechSynthesizer, and MKDistanceFormatter.


NSData (NSDataBase64Encoding)

Base64 is a general term for encoding binary data as ASCII text. This is used all over the place on the web, since many core technologies are designed to support text, but not raw binary. For instance, CSS can embed images with inline data: URIs, which are often Base64-encoded. Another example is the Basic Authentication header, which Base64-encodes its username/password pair, a scheme only marginally better than sending credentials completely in the clear.

For the longest time, this boringly essential function was completely MIA, leaving thousands of developers to copy-paste random code snippets from forum threads. It was an omission as conspicuous and annoying as JSON pre-iOS 5.

But no longer! iOS 7 finally bakes in Base64:

let string = "Lorem ipsum dolor sit amet."
if let data = string.dataUsingEncoding(NSUTF8StringEncoding) {
    let base64EncodedString = data.base64EncodedStringWithOptions([])

    print(base64EncodedString)    // TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQu
}
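
Going the other direction works just as well. Here’s a minimal sketch of the round trip, using the companion NSData(base64EncodedString:options:) initializer that iOS 7 adds alongside it:

let base64EncodedString = "TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQu"

// Decoding returns nil if the input isn't valid Base64
if let decodedData = NSData(base64EncodedString: base64EncodedString, options: []),
    decodedString = NSString(data: decodedData, encoding: NSUTF8StringEncoding)
{
    print(decodedString)    // Lorem ipsum dolor sit amet.
}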

NSURLComponents & NSCharacterSet (NSURLUtilities)

Foundation is blessed with a wealth of functionality for working with URIs. Unfortunately, many of the APIs for manipulating URLs are strewn across NSString, since NSURL is immutable.

NSURLComponents dramatically improves this situation. Think of it as NSMutableURL:

if let components = NSURLComponents(string: "https://nshipster.com") {
    components.path = "/iOS7"
    components.query = "foo=bar"
    
    print(components.scheme!)     // https
    print(components.URL!)        // https://nshipster.com/iOS7?foo=bar
}

Each property for URL components also has a percentEncoded* variation (e.g. user & percentEncodedUser), which forgoes any additional URI percent encoding of special characters.

Which characters are special, you ask? Well, it depends on what part of the URL you’re talking about. Good thing that NSCharacterSet adds a new category for allowed URL characters in iOS 7:

  • + (id)URLUserAllowedCharacterSet
  • + (id)URLPasswordAllowedCharacterSet
  • + (id)URLHostAllowedCharacterSet
  • + (id)URLPathAllowedCharacterSet
  • + (id)URLQueryAllowedCharacterSet
  • + (id)URLFragmentAllowedCharacterSet
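
These pair nicely with -stringByAddingPercentEncodingWithAllowedCharacters:, which NSString also picks up in iOS 7. A quick sketch, escaping a value bound for a query string:

let query = "q=lorem ipsum"

// Characters outside the query-allowed set (here, the space) get percent-escaped
let escapedQuery = query.stringByAddingPercentEncodingWithAllowedCharacters(
    NSCharacterSet.URLQueryAllowedCharacterSet())

print(escapedQuery!)    // q=lorem%20ipsum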

NSProgress

NSProgress is a tough class to describe. It acts as both an observer and a delegate / coordinator, serving as a handle for reporting and monitoring progress. It integrates with system-level processes on OS X, but can also be plugged into user-facing UI. It can specify handlers for pausing and canceling, which then forward onto the operation actually doing the work.

Anything with a notion of completed and total units is a candidate for NSProgress, whether it’s the bytes written to a file, the number of frames in a large render job, or the files downloaded from a server.

NSProgress can be used to simply report overall progress in a localized way:

let progress = NSProgress(totalUnitCount: 100)
progress.completedUnitCount = 42

print(progress.localizedDescription) // 42% completed

…or it can be given a handler for stopping work entirely:

let timer = NSTimer.scheduledTimerWithTimeInterval(1.0, target: self, selector: "incrementCompletedUnitCount:",
    userInfo: nil, repeats: true)

progress.cancellationHandler = {
    timer.invalidate()
}

progress.cancel()

NSProgress makes a lot more sense in the context of OS X Mavericks, but for now, it remains a useful class for encapsulating the shared patterns of work units.
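
In the meantime, one handy pattern is to drive a progress bar by key-value observing the fractionCompleted property. A minimal sketch, assuming a hypothetical view controller with a progressView outlet:

import UIKit

class DownloadViewController: UIViewController {
    @IBOutlet weak var progressView: UIProgressView!    // assumed outlet
    let progress = NSProgress(totalUnitCount: 100)

    override func viewDidLoad() {
        super.viewDidLoad()
        progress.addObserver(self, forKeyPath: "fractionCompleted", options: [.New], context: nil)
    }

    override func observeValueForKeyPath(keyPath: String?, ofObject object: AnyObject?,
        change: [String : AnyObject]?, context: UnsafeMutablePointer<Void>)
    {
        if let progress = object as? NSProgress where keyPath == "fractionCompleted" {
            // KVO notifications may arrive on a background queue
            dispatch_async(dispatch_get_main_queue()) {
                self.progressView.progress = Float(progress.fractionCompleted)
            }
        }
    }

    deinit {
        progress.removeObserver(self, forKeyPath: "fractionCompleted")
    }
}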

NSArray -firstObject

Rejoice! The NSRangeException-dodging convenience of -lastObject has finally been extended to the first member of an NSArray. (Well, it has been there as a private API since ~iOS 4, but that’s water under the bridge now).

Behold!

let array = [1, 2, 3] as NSArray

print("First Object: \(array.firstObject)")   // First Object: Optional(1)
print("Last Object: \(array.lastObject)")     // Last Object: Optional(3)

Refreshing!

CIDetectorSmile & CIDetectorEyeBlink

As a random aside, shouldn’t it be a cause for concern that the device most capable of taking embarrassing photos of ourselves is also the device most capable of distributing them to millions of people? Just a thought.

Since iOS 5, the Core Image framework has provided facial detection and recognition functionality through the CIDetector class. If it wasn’t insaneballs enough that we could detect faces in photos, in iOS 7 we can even tell if that face is smiling or has its eyes closed. *shudder*

In yet another free app idea, here’s a snippet that might be used by a camera that only saves pictures of smiling faces:

import CoreImage
import UIKit

// `context` (a CIContext), `ciImage` (a CIImage), and `label` (a UILabel)
// are assumed to come from the surrounding camera code.
let smileDetector = CIDetector(ofType: CIDetectorTypeFace, context: context,
    options: [CIDetectorTracking: true, CIDetectorAccuracy: CIDetectorAccuracyLow])

let features = smileDetector.featuresInImage(ciImage, options: [CIDetectorSmile: true])

if let feature = features.first as? CIFaceFeature where feature.hasSmile {
    // The completion selector must match the form image:didFinishSavingWithError:contextInfo:
    UIImageWriteToSavedPhotosAlbum(UIImage(CIImage: ciImage), self,
        "image:didFinishSavingWithError:contextInfo:", nil)
} else {
    label.text = "Say Cheese!"
}
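
CIDetectorEyeBlink works the same way. A quick sketch, reusing the detector and image from above:

// Passing CIDetectorEyeBlink populates the eye-closed properties of each face feature
let blinkFeatures = smileDetector.featuresInImage(ciImage, options: [CIDetectorEyeBlink: true])

if let face = blinkFeatures.first as? CIFaceFeature where face.leftEyeClosed || face.rightEyeClosed {
    label.text = "No blinking!"
}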

AVCaptureMetadataOutput

Scan UPCs, QR codes, and barcodes of all varieties with AVCaptureMetadataOutput, new to iOS 7. All you need to do is set it up as the output of an AVCaptureSession, and implement the captureOutput:didOutputMetadataObjects:fromConnection: delegate method accordingly:

import AVFoundation

let session = AVCaptureSession()
let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)

do {
    let input = try AVCaptureDeviceInput(device: device)
    session.addInput(input)
} catch let error {
    print("Error: \(error)")
}

let output = AVCaptureMetadataOutput()
output.setMetadataObjectsDelegate(self, queue: dispatch_get_main_queue())
session.addOutput(output)
output.metadataObjectTypes = [AVMetadataObjectTypeQRCode]

session.startRunning()

// MARK: - AVCaptureMetadataOutputObjectsDelegate

func captureOutput(
    captureOutput: AVCaptureOutput!,
    didOutputMetadataObjects metadataObjects: [AnyObject]!,
    fromConnection connection: AVCaptureConnection!) {
        
        var QRCode: String?
        for metadata in metadataObjects as! [AVMetadataObject] {
            if metadata.type == AVMetadataObjectTypeQRCode {
                // This will never happen; nobody has ever scanned a QR code... ever
                QRCode = (metadata as! AVMetadataMachineReadableCodeObject).stringValue
            }
        }
        
        print("QRCode: \(QRCode)")
}

AVFoundation supports every code you’ve heard of (and probably a few that you haven’t):

  • AVMetadataObjectTypeUPCECode
  • AVMetadataObjectTypeCode39Code
  • AVMetadataObjectTypeCode39Mod43Code
  • AVMetadataObjectTypeEAN13Code
  • AVMetadataObjectTypeEAN8Code
  • AVMetadataObjectTypeCode93Code
  • AVMetadataObjectTypeCode128Code
  • AVMetadataObjectTypePDF417Code
  • AVMetadataObjectTypeQRCode
  • AVMetadataObjectTypeAztecCode

If nothing else, AVCaptureMetadataOutput makes it possible to easily create a Passbook pass reader for the iPhone and iPad. There’s still a lot of unrealized potential in Passbook, so here’s to hoping that this API will be a factor in more widespread adoption.

SSReadingList

Even though the number of people who have actually read something saved for later is only marginally greater than the number of people who have ever used a QR code, it’s nice that iOS 7 adds a way to add items to the Safari reading list with the new Safari Services framework.

import SafariServices

let url = NSURL(string: "https://nshipster.com/ios7")!
try? SSReadingList.defaultReadingList()?.addReadingListItemWithURL(url, title: "NSHipster", previewText: "...")

AVSpeechSynthesizer

Text-to-Speech has been the killer feature of computers for accessibility and pranking enthusiasts since its inception in the late 1960s.

iOS 7 brings the power of Siri with the convenience of a Speak & Spell in a new class AVSpeechSynthesizer:

import AVFoundation

let synthesizer = AVSpeechSynthesizer()
let utterance = AVSpeechUtterance(string: "Just what do you think you're doing, Dave?")
utterance.rate = AVSpeechUtteranceMinimumSpeechRate   // Tell it to me slowly
synthesizer.speakUtterance(utterance)
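
Utterances can also be given a specific voice. A small sketch, assuming the requested language is available on the device (AVSpeechSynthesisVoice(language:) returns nil otherwise):

let localizedUtterance = AVSpeechUtterance(string: "Fancy a cuppa?")
if let voice = AVSpeechSynthesisVoice(language: "en-GB") {
    localizedUtterance.voice = voice
}

synthesizer.speakUtterance(localizedUtterance)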

MKDistanceFormatter

Finally, we end our showcase of iOS 7’s new and noteworthy APIs with another class that has NSHipsters crying out “finally!”: MKDistanceFormatter.

As advertised, MKDistanceFormatter provides a way to convert distances into localized strings using either imperial or metric units:

import CoreLocation
import MapKit

let sanFrancisco = CLLocation(latitude: 37.775, longitude: -122.4183333)
let portland = CLLocation(latitude: 45.5236111, longitude: -122.675)
let distance = portland.distanceFromLocation(sanFrancisco)

let formatter = MKDistanceFormatter()
formatter.units = .Imperial
print(formatter.stringFromDistance(distance)) // 535 miles
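
Switching the units property gets you metric output, and unitStyle controls whether units are spelled out or abbreviated:

formatter.units = .Metric
formatter.unitStyle = .Abbreviated
print(formatter.stringFromDistance(distance)) // roughly 861 km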

So there you have it! This was just a small sample of the great new features of iOS 7. Still craving more? Check out Apple’s “What’s New in iOS 7” guide on the Developer Center.


This article uses Swift version 2.0 and was last reviewed on Sep 12, 2015.

Written by Mattt (@mattt), a writer and developer in Portland, Oregon.
