---
layout: post
title: iOS 7
ref: ""
category: ""
description: "With the NDA finally lifted, we can finally talk about all of the amazing new APIs in iOS 7."
---

With the NDA finally lifted, we can finally talk about all of the amazing new APIs in iOS 7. And there are a lot of them. "1500 new APIs", by Apple's count during the WWDC Keynote. (Granted, a good portion of that could just be all of the changes from id to instancetype, but that's a huge number, regardless).

We'll be going over many of the new features of iOS 7 in depth over the coming weeks, but with all of the excitement around this major release, this week's issue will hit on some of the gems hiding in plain sight: NSData Base64 encoding, NSURLComponents, NSProgress, CIDetectorSmile, CIDetectorEyeBlink, SSReadingList, AVCaptureMetadataOutput, AVSpeechSynthesizer, and MKDistanceFormatter.


## NSData (NSDataBase64Encoding)

Base64 is a general term for encoding binary data as ASCII text. This is used all over the place on the web, since many core technologies are designed to support text, but not raw binary. For instance, CSS can embed images with inline `data:` URIs, which are often Base64-encoded. Another example is the Basic Authentication header, which Base64-encodes its username/password pair (marginally better than sending them completely in the clear).

For the longest time, this boringly essential function was completely MIA, leaving thousands of developers to copy-paste random code snippets from forum threads. It was an omission as conspicuous and annoying as JSON pre-iOS 5.

But no longer! iOS 7 finally bakes in Base64:

```objc
NSString *string = @"Lorem ipsum dolor sit amet.";
NSString *base64EncodedString = [[string dataUsingEncoding:NSUTF8StringEncoding] base64EncodedStringWithOptions:0];

NSLog(@"%@", base64EncodedString); // @"TG9yZW0gaXBzdW0gZG9sb3Igc2l0IGFtZXQu"
```

## NSURLComponents & NSCharacterSet (NSURLUtilities)

Foundation is blessed with a wealth of functionality for working with URIs. Unfortunately, many of the APIs for manipulating URLs are strewn across NSString, since NSURL is immutable.

NSURLComponents dramatically improves this situation. Think of it as NSMutableURL:

```objc
NSURLComponents *components = [NSURLComponents componentsWithString:@"http://nshipster.com"];
components.path = @"/iOS7";
components.query = @"foo=bar";

NSLog(@"%@", components.scheme); // @"http"
NSLog(@"%@", [components URL]); // @"http://nshipster.com/iOS7?foo=bar"
```

Each property for URL components also has a percentEncoded* variation (e.g. user & percentEncodedUser), which forgoes any additional URI percent encoding of special characters.
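
For example, setting the plain user property and then reading back its percentEncodedUser counterpart shows the encoding being applied for you (the value here is purely for illustration):

```objc
NSURLComponents *loginComponents = [NSURLComponents componentsWithString:@"http://nshipster.com"];
loginComponents.user = @"mattt thompson"; // set the raw, un-encoded value...

NSLog(@"%@", loginComponents.percentEncodedUser); // ...and read it back encoded: @"mattt%20thompson"
```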

Which characters are special, you ask? Well, it depends on what part of the URL you're talking about. Good thing that NSCharacterSet adds a new category for allowed URL characters in iOS 7:

- + (id)URLUserAllowedCharacterSet
- + (id)URLPasswordAllowedCharacterSet
- + (id)URLHostAllowedCharacterSet
- + (id)URLPathAllowedCharacterSet
- + (id)URLQueryAllowedCharacterSet
- + (id)URLFragmentAllowedCharacterSet
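
These pair nicely with NSString's new -stringByAddingPercentEncodingWithAllowedCharacters: (also added in iOS 7). A minimal sketch, escaping a query string:

```objc
NSString *query = @"q=flat design&page=1";
NSString *escapedQuery = [query stringByAddingPercentEncodingWithAllowedCharacters:[NSCharacterSet URLQueryAllowedCharacterSet]];

NSLog(@"%@", escapedQuery); // @"q=flat%20design&page=1" (only the space needs escaping in a query)
```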

## NSProgress

NSProgress is a tough class to describe. It acts as both an observer and a delegate/coordinator, serving as a handle for reporting and monitoring progress. It integrates with system-level processes on OS X, but can also be plugged into user-facing UI. It can specify handlers for pausing and canceling, which then forward onto the operation actually doing the work.

Anything with a notion of completed and total units is a candidate for NSProgress, whether it's the bytes written to a file, the number of frames in a large render job, or the files downloaded from a server.

NSProgress can be used to simply report overall progress in a localized way:

```objc
NSProgress *progress = [NSProgress progressWithTotalUnitCount:100];
progress.completedUnitCount = 42;

NSLog(@"%@", [progress localizedDescription]); // 42% completed
```

...or it can be given a handler for stopping work entirely:

```objc
// Schedule a timer that periodically bumps progress.completedUnitCount
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:1.0
                                                  target:self
                                                selector:@selector(incrementCompletedUnitCount:)
                                                userInfo:nil
                                                 repeats:YES];

progress.cancellationHandler = ^{
    [timer invalidate];
};

[progress cancel];
```

NSProgress makes a lot more sense in the context of Mac OS X 10.9 Mavericks, but for now, it remains a useful class for encapsulating the shared patterns of work units.
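
Those shared patterns also cover composition: while an NSProgress is "current", any progress created on the same thread implicitly attaches itself as a child and rolls up into the parent's fraction. A minimal sketch of that behavior (the unit counts here are arbitrary):

```objc
NSProgress *overallProgress = [NSProgress progressWithTotalUnitCount:100];

// Any NSProgress created while overallProgress is current
// becomes a child worth 50 of the parent's 100 units.
[overallProgress becomeCurrentWithPendingUnitCount:50];
NSProgress *downloadProgress = [NSProgress progressWithTotalUnitCount:10];
[overallProgress resignCurrent];

downloadProgress.completedUnitCount = 5; // half of the child's work...

NSLog(@"%f", overallProgress.fractionCompleted); // ...contributes 0.25 to the parent
```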

## NSArray -firstObject

Rejoice! The NSRangeException-dodging convenience of -lastObject has finally been extended to the first member of an NSArray. (Well, it has been there as a private API since ~iOS 4, but that's water under the bridge now).

Behold!

```objc
NSArray *array = @[@1, @2, @3];

NSLog(@"First Object: %@", [array firstObject]); // 1
NSLog(@"Last Object: %@", [array lastObject]); // 3
```

Refreshing!

## CIDetectorSmile & CIDetectorEyeBlink

As a random aside, shouldn't it be a cause for concern that the device most capable of taking embarrassing photos of ourselves is also the device most capable of distributing them to millions of people? Just a thought.

Since iOS 5, the Core Image framework has provided facial detection and recognition functionality through the CIDetector class. If it wasn't insaneballs enough that we could detect faces in photos, in iOS 7 we can even tell if that face is smiling or has its eyes closed. *shudder*

In yet another free app idea, here's a snippet that might be used by a camera that only saves pictures of smiling faces:

```objc
@import CoreImage;

CIDetector *smileDetector = [CIDetector detectorOfType:CIDetectorTypeFace
                                               context:context
                                               options:@{CIDetectorTracking: @YES,
                                                         CIDetectorAccuracy: CIDetectorAccuracyLow}];

NSArray *features = [smileDetector featuresInImage:image options:@{CIDetectorSmile: @YES}];
if ([features count] > 0 && ((CIFaceFeature *)features[0]).hasSmile) {
    // The completion selector must match UIKit's expected signature:
    // image:didFinishSavingWithError:contextInfo:
    UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);
} else {
    self.label.text = @"Say Cheese!";
}
```

## AVCaptureMetadataOutput

Scan UPCs, QR codes, and barcodes of all varieties with AVCaptureMetadataOutput, new to iOS 7. All you need to do is set it up as the output of an AVCaptureSession, and implement the captureOutput:didOutputMetadataObjects:fromConnection: delegate method accordingly:

```objc
@import AVFoundation;

AVCaptureSession *session = [[AVCaptureSession alloc] init];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;

AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                                    error:&error];
if (input) {
    [session addInput:input];
} else {
    NSLog(@"Error: %@", error);
}

AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
[output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
[session addOutput:output];
[output setMetadataObjectTypes:@[AVMetadataObjectTypeQRCode]];

[session startRunning];

#pragma mark - AVCaptureMetadataOutputObjectsDelegate

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputMetadataObjects:(NSArray *)metadataObjects
       fromConnection:(AVCaptureConnection *)connection
{
    NSString *QRCode = nil;
    for (AVMetadataObject *metadata in metadataObjects) {
        if ([metadata.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            // This will never happen; nobody has ever scanned a QR code... ever
            QRCode = [(AVMetadataMachineReadableCodeObject *)metadata stringValue];
            break;
        }
    }

    NSLog(@"QR Code: %@", QRCode);
}
```

AVFoundation supports every code you've heard of (and probably a few that you haven't):

- AVMetadataObjectTypeUPCECode
- AVMetadataObjectTypeCode39Code
- AVMetadataObjectTypeCode39Mod43Code
- AVMetadataObjectTypeEAN13Code
- AVMetadataObjectTypeEAN8Code
- AVMetadataObjectTypeCode93Code
- AVMetadataObjectTypeCode128Code
- AVMetadataObjectTypePDF417Code
- AVMetadataObjectTypeQRCode
- AVMetadataObjectTypeAztecCode

If nothing else, AVCaptureMetadataOutput makes it possible to easily create a Passbook pass reader for the iPhone and iPad. There's still a lot of unrealized potential in Passbook, so here's hoping that this API will be a factor in more widespread adoption.

## SSReadingList

Even though the number of people who have actually read something saved for later is only marginally greater than the number of people who have ever used a QR code, it's nice that iOS 7 adds a way to add items to the Safari reading list with the new Safari Services framework.

```objc
@import SafariServices;

NSURL *URL = [NSURL URLWithString:@"http://nshipster.com/ios7"];
[[SSReadingList defaultReadingList] addReadingListItemWithURL:URL
                                                        title:@"NSHipster"
                                                  previewText:@"..."
                                                        error:nil];
```
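
If you'd rather not fail silently, SSReadingList can also tell you up front whether a URL is eligible, and the add method returns NO and fills in an NSError when something goes wrong. A small sketch:

```objc
NSError *error = nil;
if ([SSReadingList supportsURL:URL] &&
    ![[SSReadingList defaultReadingList] addReadingListItemWithURL:URL
                                                              title:@"NSHipster"
                                                        previewText:@"..."
                                                              error:&error]) {
    NSLog(@"Could not add item to Reading List: %@", error);
}
```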

## AVSpeechSynthesizer

Text-to-Speech has been the killer feature of computers for accessibility and pranking enthusiasts since its inception in the late 1960s.

iOS 7 brings the power of Siri with the convenience of a Speak & Spell in a new class, AVSpeechSynthesizer:

```objc
AVSpeechSynthesizer *synthesizer = [[AVSpeechSynthesizer alloc] init];
AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Just what do you think you're doing, Dave?"];
utterance.rate = AVSpeechUtteranceMinimumSpeechRate; // Tell it to me slowly
[synthesizer speakUtterance:utterance];
```
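
Utterances can also be given a specific voice with AVSpeechSynthesisVoice (assuming the requested language is available on the device):

```objc
AVSpeechUtterance *reply = [AVSpeechUtterance speechUtteranceWithString:@"I'm sorry, Dave. I'm afraid I can't do that."];
reply.voice = [AVSpeechSynthesisVoice voiceWithLanguage:@"en-GB"]; // BCP-47 language code
[synthesizer speakUtterance:reply];
```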

## MKDistanceFormatter

Finally, we end our showcase of iOS 7's new and noteworthy APIs with another class that has NSHipsters crying out "finally!": MKDistanceFormatter.

As advertised, MKDistanceFormatter provides a way to convert distances into localized strings using either imperial or metric units:

```objc
@import CoreLocation;
@import MapKit;

CLLocation *sanFrancisco = [[CLLocation alloc] initWithLatitude:37.775 longitude:-122.4183333];
CLLocation *portland = [[CLLocation alloc] initWithLatitude:45.5236111 longitude:-122.675];
CLLocationDistance distance = [portland distanceFromLocation:sanFrancisco];

MKDistanceFormatter *formatter = [[MKDistanceFormatter alloc] init];
formatter.units = MKDistanceFormatterUnitsImperial;
NSLog(@"%@", [formatter stringFromDistance:distance]); // 535 miles
```

So there you have it! This was just a small sample of the great new features of iOS 7. Still craving more? Check out Apple's "What's New in iOS 7" guide on the Developer Center.