In the last couple of months, Apple has released new features as part of iOS 9 that allow a deeper integration between apps and the operating system. Among those features are Spotlight Search integration, Universal Links, and 3D Touch for iPhone 6S and iPhone 6S Plus.
Here at Flickr, we have added support for these new features and we have learned a few lessons that we would love to share.
Spotlight Search
There are two different kinds of content that can be searched through Spotlight: the kind that you explicitly index, and the kind that gets indexed based on the state your app is in. To explicitly index content, you use Core Spotlight, which lets you index multiple items at once. To index content related to your app’s current state, you use NSUserActivity: when a piece of content becomes visible, you start an activity to make iOS aware of this fact. iOS can then determine which pieces of content are more frequently visited, and thus more relevant to the user. NSUserActivity also allows us to mark certain items as public, which means that they might be shown to other iOS users as well.
For a better user experience, we index as much useful information as we can right off the bat. We prefetch all the user’s albums, groups, and people they follow, and add them to the search index using Core Spotlight. Indexing an item looks like this:
// Create the attribute set, which encapsulates the metadata of the item we're indexing
CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithItemContentType:(NSString *)kUTTypeImage];
attributeSet.title = photo.title;
attributeSet.contentDescription = photo.searchableDescription;
attributeSet.keywords = photo.keywords;
attributeSet.thumbnailData = UIImageJPEGRepresentation(photo.thumbnail, 0.98);
// Create the searchable item and index it.
CSSearchableItem *searchableItem = [[CSSearchableItem alloc] initWithUniqueIdentifier:[NSString stringWithFormat:@"%@/%@", photo.identifier, photo.searchContentType] domainIdentifier:@"FLKCurrentUserSearchDomain" attributeSet:attributeSet];
[[CSSearchableIndex defaultSearchableIndex] indexSearchableItems:@[ searchableItem ] completionHandler:^(NSError * _Nullable error) {
    if (error) {
        // Handle failures.
    }
}];
Since we have multiple kinds of data – photos, albums, and groups – we had to create an identifier that combines the item’s type with its actual model ID.
Many users will have a large amount of data to be fetched, so it’s important that we take measures to ensure the app still performs well. Since searching is unlikely to happen right after the user opens the app (that’s when we start prefetching this data, if needed), all this work is performed on a low-priority NSOperationQueue. If we ever need to fetch images to be used as thumbnails, we request them with low-priority NSURLSessionDownloadTasks. These measures ensure that we don’t affect the performance of any operation or network request triggered by user actions, such as fetching new images and pages when scrolling through content.
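For illustration, here’s a sketch of how that deprioritization might look. The queue setup and the photo.thumbnailURL property are illustrative, not our exact implementation:

```objc
// Low-priority queue for background indexing work (setup illustrative).
NSOperationQueue *indexingQueue = [[NSOperationQueue alloc] init];
indexingQueue.qualityOfService = NSQualityOfServiceBackground;
indexingQueue.maxConcurrentOperationCount = 1;

[indexingQueue addOperationWithBlock:^{
    // Fetch albums, groups, and contacts, then index them with Core Spotlight.
}];

// Download thumbnails at low network priority so they never compete
// with user-initiated requests.
NSURLSessionDownloadTask *task =
    [[NSURLSession sharedSession] downloadTaskWithURL:photo.thumbnailURL // illustrative property
                                    completionHandler:^(NSURL *location, NSURLResponse *response, NSError *error) {
        // Read the file at `location` and attach it as thumbnailData.
    }];
task.priority = NSURLSessionTaskPriorityLow;
[task resume];
```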
Flickr provides a huge amount of public content, including many amazing photos. If anybody searches for “Northern Lights” in Spotlight, shouldn’t we show them our best Aurora Borealis photos? For this public content – photos, public groups, tags and so on – we leverage NSUserActivity, with its new search APIs, to make it all searchable when viewed. Here’s an example:
CSSearchableItemAttributeSet *attributeSet = [[CSSearchableItemAttributeSet alloc] initWithItemContentType:(NSString *)kUTTypeImage];
// Setup attributeSet the same way we did before...
// Set the related unique identifier, so it matches to any existing item indexed with Core Spotlight.
attributeSet.relatedUniqueIdentifier = [NSString stringWithFormat:@"%@/%@", photo.identifier, photo.searchContentType];
self.userActivity = [[NSUserActivity alloc] initWithActivityType:@"FLKSearchableUserActivityType"];
self.userActivity.title = photo.title;
self.userActivity.keywords = [NSSet setWithArray:photo.keywords];
self.userActivity.webpageURL = photo.photoPageURL;
self.userActivity.contentAttributeSet = attributeSet;
self.userActivity.eligibleForSearch = YES;
self.userActivity.eligibleForPublicIndexing = photo.isPublic;
self.userActivity.requiredUserInfoKeys = [NSSet setWithArray:self.userActivity.userInfo.allKeys];
[self.userActivity becomeCurrent];
Every time a user opens a photo, public group, location page, etc., we create a new NSUserActivity and make it current. The more often a specific activity is made current, the more relevant iOS considers it. In fact, the more often an activity is made current by any number of different users, the more relevant Apple considers it globally, and the more likely it will show up for other iOS users as well (provided it’s public).
Until now we’ve only seen half the picture. We’ve seen how to index things for Spotlight search; when a user finally does search and taps on a result, how do we take them to the right place in our app? We’ll get to this a bit later, but for now suffice it to say that iOS will call application:continueUserActivity:restorationHandler: on your application delegate.
It’s important to note that if we want to make use of the userInfo in the NSUserActivity, iOS won’t give it back to us for free in this method. To get it, we have to make sure that we assigned an NSSet to the requiredUserInfoKeys property of our NSUserActivity when we created it. Apple’s documentation also notes that if you set the webpageURL property when eligibleForSearch is YES, you need to make sure it points to the web URL corresponding to your content; otherwise you might end up with duplicate results in Spotlight (Apple crawls your site for content to surface in Spotlight, and if it finds the same content at a different URL, it will treat it as a different piece of content).
Universal Links
In order to support Universal Links, Apple requires that every domain supported by the app host an “apple-app-site-association” file at its root. This is a JSON file that describes which relative paths in your domains can be handled by the app. When a user taps a link from another app in iOS, if your app is able to handle that domain for a specific path, it will open your app and call application:continueUserActivity:restorationHandler:. Otherwise, your application won’t be opened; Safari will handle the URL instead. Here is a simplified example of the file:
{
    "applinks": {
        "apps": [],
        "details": {
            "XXXXXXXXXX.com.some.flickr.domain": {
                "paths": [
                    "/",
                    "/photos/*",
                    "/people/*",
                    "/groups/*"
                ]
            }
        }
    }
}
This file has to be hosted over HTTPS with a valid certificate, its MIME type needs to be “application/pkcs7-mime,” and no redirects are allowed when requesting it. If the only intent is to support Universal Links, no further steps are required. But if you’re also using this file to support Handoff (introduced in iOS 8), then the file has to be CMS-signed with a valid TLS certificate.
At Flickr, we have a few different domains. That means that each one of flickr.com, www.flickr.com, m.flickr.com, and flic.kr must provide its own JSON association file, whether or not they differ. In our case, the flic.kr domain actually does support different paths, since it’s only used for short URLs; hence, its “apple-app-site-association” is different from the others.
On the client side, only a few steps are required to support Universal Links. First, “Associated Domains” must be enabled under the Capabilities tab of the app’s target settings. Then, for each supported domain, an “applinks:” entry must be added.
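The resulting entitlements file contains one entry per domain. A sketch of what it might contain, using the domains listed above:

```xml
<key>com.apple.developer.associated-domains</key>
<array>
    <string>applinks:flickr.com</string>
    <string>applinks:www.flickr.com</string>
    <string>applinks:m.flickr.com</string>
    <string>applinks:flic.kr</string>
</array>
```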
That is it. Now if someone receives a text message with a Flickr link, she will jump right to the Flickr app when she taps on it.
Deep linking into the app
Great! We have Flickr photos showing up as search results and Flickr URLs opening directly in our app. Now we just have to get the user to the proper place within the app. There are different entry points into our app, and we need to make the implementation consistent and avoid code duplication.
iOS has supported deep linking for a while, and so has Flickr. To support deep linking, apps can register to handle custom URLs (meaning a custom scheme, such as myscheme://mydata/123), and the website corresponding to the app can then publish links that open directly in the app. For every custom URL published on the Flickr website, our app translates it into a representation of the data to be shown. This representation looks like this:
@interface FLKRoute : NSObject
@property (nonatomic) FLKRouteType type;
@property (nonatomic, copy) NSString *identifier;
@end
It describes the type of data to present and a unique identifier for that type of data. Our router then navigates to the corresponding screen for a given route:
- (void)navigateToRoute:(FLKRoute *)route
{
    switch (route.type) {
        case FLKRouteTypePhoto:
            // Navigate to photo screen
            break;
        case FLKRouteTypeAlbum:
            // Navigate to album screen
            break;
        case FLKRouteTypeGroup:
            // Navigate to group screen
            break;
        // ...
        default:
            break;
    }
}
Now, all we have to do is to make sure we are able to translate both NSURLs and NSUserActivity objects into FLKRoute instances. For NSURLs, this translation is straightforward. Our custom URLs follow the same pattern as the corresponding website URLs; their paths correspond exactly. So translating both website URLs and custom URLs is a matter of using NSURLComponents to extract the necessary information to create the FLKRoute object.
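A simplified sketch of that translation follows; the path matching shown here is illustrative, and the real parser covers many more routes:

```objc
FLKRoute * FLKRouteFromURL(NSURL *URL)
{
    NSURLComponents *components = [NSURLComponents componentsWithURL:URL resolvingAgainstBaseURL:NO];
    // For "https://www.flickr.com/photos/123", pathComponents is @[@"/", @"photos", @"123"].
    NSArray *pathComponents = [components.path pathComponents];
    if ([pathComponents count] < 3) {
        return nil;
    }
    FLKRoute *route = [FLKRoute new];
    if ([pathComponents[1] isEqualToString:@"photos"]) {
        route.type = FLKRouteTypePhoto;
    } else if ([pathComponents[1] isEqualToString:@"groups"]) {
        route.type = FLKRouteTypeGroup;
    } else {
        return nil; // Unrecognized path; let Safari handle it.
    }
    route.identifier = pathComponents[2];
    return route;
}
```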
As for NSUserActivity objects passed into application:continueUserActivity:restorationHandler:, there are two cases. One arises when the NSUserActivity instance was used to index a public item in the app. Remember that when we created the NSUserActivity object, we also assigned its webpageURL? This is really handy, because it not only uniquely identifies the data we want to present, but also gives us an NSURL object, which we can handle the same way we handle deep links or Universal Links.
The other case is when the NSUserActivity originated from a CSSearchableItem; we have some more work to do in this case. We need to parse the identifier we created for the item and translate it into a FLKRoute. Remember that our item’s identifier is a combination of its type and the model ID. We can decompose it and then create our route object. Its simplified implementation looks like this:
FLKRoute * FLKRouteFromSearchableItemIdentifier(NSString *searchableItemIdentifier)
{
    NSArray *routeComponents = [searchableItemIdentifier componentsSeparatedByString:@"/"];
    if ([routeComponents count] != 2) { // type + id
        return nil;
    }
    // Handle the route type
    NSString *searchableItemContentType = [routeComponents firstObject];
    FLKRouteType type = FLKRouteTypeFromSearchableItemContentType(searchableItemContentType);
    // Get the item identifier
    NSString *itemIdentifier = [routeComponents lastObject];
    // Build the route object
    FLKRoute *route = [FLKRoute new];
    route.type = type;
    route.identifier = itemIdentifier;
    return route;
}
Now we have all our bases covered and we’re sure that we’ll drop the user in the right place when she lands in our app. The final application delegate method looks like this:
- (BOOL)application:(nonnull UIApplication *)application continueUserActivity:(nonnull NSUserActivity *)userActivity restorationHandler:(nonnull void (^)(NSArray * __nullable))restorationHandler
{
    FLKRoute *route;
    NSString *activityType = [userActivity activityType];
    NSURL *url;
    if ([activityType isEqualToString:CSSearchableItemActionType]) {
        // Searchable item from Core Spotlight
        NSString *itemIdentifier = [userActivity.userInfo objectForKey:CSSearchableItemActivityIdentifier];
        route = FLKRouteFromSearchableItemIdentifier(itemIdentifier);
    } else if ([activityType isEqualToString:@"FLKSearchableUserActivityType"] ||
               [activityType isEqualToString:NSUserActivityTypeBrowsingWeb]) {
        // Searchable item from NSUserActivity or Universal Link
        url = userActivity.webpageURL;
        route = [url flk_route];
    }
    if (route) {
        [self.router navigateToRoute:route];
        return YES;
    } else if (url) {
        [[UIApplication sharedApplication] openURL:url]; // Fail gracefully
        return YES;
    } else {
        return NO;
    }
}
3D Touch
With the release of iPhone 6S and iPhone 6S Plus, Apple introduced a new gesture that can be used with your iOS app: 3D Touch. One of the coolest features it has brought is the ability to preview content before pushing it onto the navigation stack. This is also known as “peek and pop.”
You can easily see how this feature is implemented in the native Mail app. But you won’t always have a simple UIView hierarchy like Mail’s UITableView, where a tap anywhere on a cell opens a UIViewController. Take Flickr’s notifications screen, for example:
If the user taps on a photo in one of these cells, it will open the photo view. But if the user taps on another user’s name, it will open that user’s profile view. Previews of these UIViewControllers should be shown accordingly. But the “peek and pop” mechanism requires you to register a delegate on your UIViewController with registerForPreviewingWithDelegate:sourceView:, which means that you’re working at a much higher layer: your UIViewController’s view might not even know about its subviews’ structures.
To solve this problem, we used UIView’s hitTest:withEvent: method. As the documentation describes, it gives us the “farthest descendant of the receiver in the view hierarchy.” But not every hit test will necessarily return the UIView that we want, so we defined a protocol, FLKPeekAndPopTargetView, that must be implemented by any UIView subclass that wants to support peeking and popping. That view is then responsible for returning the model used to populate the UIViewController that the user is trying to preview. If the view doesn’t implement this protocol, we query its superview, and keep walking up the hierarchy until we find a conforming view or run out of superviews. This is how this logic looks:
+ (id)modelAtLocation:(CGPoint)location inSourceView:(UIView *)sourceView
{
    // Walk up the hit-test tree until we find a peek-and-pop target.
    UIView *testView = [sourceView hitTest:location withEvent:nil];
    id model = nil;
    while (testView && !model) {
        // Check if the current testView conforms to the protocol.
        if ([testView conformsToProtocol:@protocol(FLKPeekAndPopTargetView)]) {
            // Translate the location to view coordinates.
            CGPoint locationInView = [testView convertPoint:location fromView:sourceView];
            // Get the model from the peek-and-pop target.
            model = [((id<FLKPeekAndPopTargetView>)testView) flk_peekAndPopModelAtLocation:locationInView];
        } else {
            // Move up the view tree to the next view.
            testView = testView.superview;
        }
    }
    return model;
}
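For reference, the protocol itself can be as small as a single method; a minimal sketch consistent with the call above:

```objc
@protocol FLKPeekAndPopTargetView <NSObject>
// Returns the model object (photo, album, user, etc.) rendered at the given
// point, in the receiver's coordinate space, or nil if there is none.
- (id)flk_peekAndPopModelAtLocation:(CGPoint)location;
@end
```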
With this code in place, all we have to do is implement the UIViewControllerPreviewingDelegate methods in our delegate, perform the hit test, and get the model from the FLKPeekAndPopTargetView implementor. Here’s the final implementation:
- (UIViewController *)previewingContext:(id<UIViewControllerPreviewing>)previewingContext
              viewControllerForLocation:(CGPoint)location {
    id model = [[self class] modelAtLocation:location inSourceView:previewingContext.sourceView];
    UIViewController *viewController = nil;
    if ([model isKindOfClass:[FLKPhoto class]]) {
        viewController = // ... UIViewController that displays a photo.
    } else if ([model isKindOfClass:[FLKAlbum class]]) {
        viewController = // ... UIViewController that displays an album.
    } else if ([model isKindOfClass:[FLKGroup class]]) {
        viewController = // ... UIViewController that displays a group.
    } // ...
    return viewController;
}

- (void)previewingContext:(id<UIViewControllerPreviewing>)previewingContext
     commitViewController:(UIViewController *)viewControllerToCommit {
    [self.navigationController pushViewController:viewControllerToCommit animated:YES];
}
Last but not least, we added support for Quick Actions. Now the user has the ability to quickly jump into a specific section of the app just by pressing down on the app icon. Defining these Quick Actions statically in the Info.plist file is an easy way to implement this feature, but we decided to go one step further and define these options dynamically. One of the options we provide is “Upload Photo,” which takes the user to the asset picker screen. But if the user has Auto Uploadr turned on, this option isn’t that relevant, so instead we provide a different app icon menu option in its place.
Here’s how you can create Quick Actions:
NSMutableArray<UIApplicationShortcutItem *> *items = [NSMutableArray array];
[items addObject:[[UIApplicationShortcutItem alloc] initWithType:@"FLKShortcutItemFeed"
                                                  localizedTitle:NSLocalizedString(@"Feed", nil)]];
[items addObject:[[UIApplicationShortcutItem alloc] initWithType:@"FLKShortcutItemTakePhoto"
                                                  localizedTitle:NSLocalizedString(@"Upload Photo", nil)]];
[items addObject:[[UIApplicationShortcutItem alloc] initWithType:@"FLKShortcutItemNotifications"
                                                  localizedTitle:NSLocalizedString(@"Notifications", nil)]];
[items addObject:[[UIApplicationShortcutItem alloc] initWithType:@"FLKShortcutItemSearch"
                                                  localizedTitle:NSLocalizedString(@"Search", nil)]];
[[UIApplication sharedApplication] setShortcutItems:items];
And this is how it looks when the user presses down on the app icon:
Finally, we have to handle where to take the user after she selects one of these options. This is yet another place where we can make use of our FLKRoute object. To handle the app opening from a Quick Action, we need to implement application:performActionForShortcutItem:completionHandler: in the app delegate.
- (void)application:(UIApplication *)application performActionForShortcutItem:(UIApplicationShortcutItem *)shortcutItem completionHandler:(void (^)(BOOL))completionHandler {
    FLKRoute *route = [shortcutItem flk_route];
    [self.router navigateToRoute:route];
    completionHandler(YES);
}
Conclusion
There is a lot more to consider when shipping these features with an app. For example, with Flickr, there are various platforms the user could be using. It is important to keep the Spotlight index up to date, reflecting changes made anywhere. If the user has created a new album or left a group from their desktop browser, we need to make sure that those changes are reflected in the app, so the newly created album can be found through Spotlight, but the newly departed group cannot.
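Core Spotlight also provides an API for removing stale items. A sketch of de-indexing a departed group, assuming a group exposes identifier and searchContentType properties the same way our photos do:

```objc
// Remove a departed group from the local index so stale results
// no longer appear in Spotlight.
NSString *identifier = [NSString stringWithFormat:@"%@/%@", group.identifier, group.searchContentType];
[[CSSearchableIndex defaultSearchableIndex] deleteSearchableItemsWithIdentifiers:@[ identifier ]
                                                               completionHandler:^(NSError * _Nullable error) {
    if (error) {
        // Handle failures, e.g. retry later.
    }
}];
```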
All of this work should be totally opaque to the user, without hogging the device’s resources or degrading the overall user experience. That requires some care around threading and network priorities: network requests for UI-relevant data should not be blocked just because we have other network requests happening at the same time. With some careful prioritizing, using NSOperationQueue and NSURLSession, we managed to accomplish this with no major problems.
Finally, we had to consider privacy, one of the pillars of Flickr. We had to be extremely careful not to violate any of the user’s settings. We’re careful to never publicly index private content, such as photos and albums. Also, photos marked “restricted” are not publicly indexed since they might expose content that some users might consider offensive.
In this blog post we went into the basics of integrating iOS 9 Search, Universal Links, and 3D Touch in Flickr for iOS. In order to focus on those features, we simplified some of our examples to demonstrate how you could get started with them in your own app, and to show what challenges we faced.
Like this post? Have a love of online photography? Want to work with us? Flickr is hiring mobile, back-end and front-end engineers, in our San Francisco office. Find out more at flickr.com/jobs.