iOS 10.0

This article summarizes the key developer-related features introduced in iOS 10, which runs on currently shipping iOS devices. The article also lists the documents that describe new features in more detail.

For late-breaking news and information about known issues, see iOS 10.0 Release Notes. For the complete list of new APIs added in iOS 10, see iOS 10.0 API Diffs. For more information on new devices, see iOS Device Compatibility Reference. To learn about what’s new in Swift, see Swift Language and The Swift Programming Language (Swift 3).

SiriKit
Apps that provide services in specific domains can use SiriKit to make those services available from Siri on iOS. Making your services available requires creating one or more app extensions using the Intents and Intents UI frameworks. SiriKit supports services in the following domains:
  1. Audio or video calling
  2. Messaging
  3. Sending or receiving payments
  4. Searching photos
  5. Booking a ride
  6. Managing workouts
When the user makes a request involving your service, SiriKit sends your extension an intent object, which describes the user’s request and provides any data related to that request. You use the intent object to provide an appropriate response object, which includes details of how you can handle the user’s request. Siri typically handles all user interactions, but you can use an extension to provide custom UI that incorporates branding or additional information from your app.
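
For example, an Intents extension in the messaging domain might handle a send-message intent along the lines of the following minimal sketch; the class name and the success-only handling are illustrative, not taken from a real app:

import Intents

// A minimal sketch of an Intents extension handler for the messaging domain.
class IntentHandler: INExtension, INSendMessageIntentHandling {

    // Called when Siri has collected everything needed to send the message.
    func handle(sendMessage intent: INSendMessageIntent,
                completion: @escaping (INSendMessageIntentResponse) -> Void) {
        // Hand the message content to your own messaging service here,
        // then report success (or an appropriate failure code) to Siri.
        completion(INSendMessageIntentResponse(code: .success, userActivity: nil))
    }
}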

SiriKit also provides a mechanism you can use to tell the system about the interactions and activities that occur within your app. When you tell the system about these interactions, the system can determine if your app can handle the user’s current request and, if it can, pass the request to your app. In addition to the intent, SiriKit defines an interaction object, which combines an intent with information about the intent-handling process, including details such as the start time and duration of a specific occurrence of the process. If your app is registered as capable of handling an activity that has the same name as an intent, the system can launch your app with an interaction object containing that intent even if you don’t provide an Intents app extension.
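
For instance, when the user sends a message from within your app, you might donate the corresponding interaction so the system learns what your app can handle; this is a minimal sketch, and the message content is illustrative:

import Intents

// Describe what just happened as an intent, then donate the interaction.
let intent = INSendMessageIntent(recipients: nil, content: "On my way!",
                                 groupName: nil, serviceName: nil, sender: nil)
let interaction = INInteraction(intent: intent, response: nil)
interaction.donate { error in
    if let error = error {
        print("Interaction donation failed: \(error)")
    }
}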

Ride booking is supported by both Maps and Siri, and users can also make restaurant reservations with Maps. Your Intents extension handles interactions that originate from the Maps app in the same way that it handles requests coming from Siri. If you customize the user interface, your Intents UI extension can also configure itself differently, depending on whether the request came from Siri or Maps.

To learn how to support SiriKit and give users new ways to access your services, read SiriKit Programming Guide. When you’re ready to implement the app extensions that handle various intents, see Intents Framework Reference and Intents UI Framework Reference.

Proactive Suggestions

iOS 10 introduces new ways to increase engagement with your app by helping the system suggest your app to users at appropriate times. If you adopted app search in your iOS 9 app, you gave users access to activities and content deep within your app through Spotlight and Safari search results, Handoff, and Siri suggestions. In iOS 10 and later, you can provide information about what users do in your app, which helps the system promote your app in additional places, such as the keyboard with QuickType suggestions, Maps and CarPlay, the app switcher, Siri interactions, and (for media playing apps) the lock screen. These opportunities for enhanced integration with the system are supported by a collection of technologies, such as NSUserActivity, web markup defined by Schema.org, and APIs defined in the Core Spotlight, MapKit, UIKit, and Media Player frameworks.

In iOS 10, the NSUserActivity object includes the mapItem property, which lets you provide location information that can be used in other contexts. For example, if your app displays hotel reviews, you can use the mapItem property to hold the location of the hotel the user is viewing so that when the user switches to a travel planning app, that hotel’s location is automatically available. And if you support app search, you can use the new text-based address component properties in CSSearchableItemAttributeSet, such as thoroughfare and postalCode, to fully specify locations to which the user may want to go. Note that when you use the mapItem property, the system automatically populates the contentAttributeSet property, too.

To share a location with the system, be sure to specify latitude and longitude values, in addition to values for the address component properties in CSSearchableItemAttributeSet. It’s also recommended that you supply a value for the namedLocation property, so that users can view the name of the location, and the phoneNumbers property, so that users can use Siri to initiate a call to the location.
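
Put together, a travel app’s hotel-detail screen might describe its location like this; the activity type, names, coordinates, and address values are all illustrative:

import CoreLocation
import CoreSpotlight
import MapKit
import MobileCoreServices

let activity = NSUserActivity(activityType: "com.example.travel.hotel-viewing")

// Attach the location so other apps (such as a travel planner) can use it.
let coordinate = CLLocationCoordinate2D(latitude: 37.33, longitude: -122.03)
activity.mapItem = MKMapItem(placemark: MKPlacemark(coordinate: coordinate,
                                                    addressDictionary: nil))

// Spell out the address components and contact details for the system.
let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeItem as String)
attributes.namedLocation = "Example Hotel"
attributes.thoroughfare = "1 Main Street"
attributes.postalCode = "95014"
attributes.latitude = 37.33
attributes.longitude = -122.03
attributes.phoneNumbers = ["+1-555-555-0100"]
activity.contentAttributeSet = attributes
activity.becomeCurrent()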

In iOS 9, adding markup to the structured data on your website enriched the content that users see in Spotlight and Safari search results. In iOS 10, you can use location-related vocabulary defined at Schema.org, such as PostalAddress, to further enhance the user’s experience. For example, if users view a location described on your website, the system can suggest the same location when users switch to Maps. Note that Safari supports both JSON-LD and Microdata encodings of Schema.org vocabularies.
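
For example, a page describing a venue might embed a JSON-LD block like the following sketch; the venue and address values are illustrative:

<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Hotel",
  "name": "Example Hotel",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Main Street",
    "addressLocality": "Cupertino",
    "addressRegion": "CA",
    "postalCode": "95014"
  }
}
</script>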

UIKit introduces the textContentType property in the UITextInputTraits protocol so that you can specify the semantic meaning of the content you expect users to enter in a text area. When you provide this information, the system can in some cases automatically select an appropriate keyboard and improve keyboard corrections and proactive integration with information supplied from other apps and websites. For example, if you use UITextContentTypeFullStreetAddress to tell the system that you expect users to enter a complete address in a text field, the system can suggest the address of a location the user was recently viewing.
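
In code, that might look like the following minimal sketch; the field itself is illustrative:

import UIKit

let destinationField = UITextField()
// The Swift form of the UITextContentTypeFullStreetAddress constant.
destinationField.textContentType = .fullStreetAddress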

If your app plays media and you use the MPPlayableContentManager APIs, iOS 10 helps you let users view album art and play media through your app on the lock screen.

If your ride-sharing app uses the MKDirectionsRequest API, iOS 10 can display your app in the app switcher when the user is likely to want a ride. To register as a ride-share provider, specify the MKDirectionsModeRideShare value for the MKDirectionsApplicationSupportedModes key in your Info.plist file. If your app supports only ride sharing, the system suggests your app with text that begins “Get a ride to...”; if your app supports both ride sharing and another routing type (such as Automobile or Bike), the system uses the text “Get directions to...”. Note that the MKMapItem object you receive may not include latitude and longitude information and would require geocoding.
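
The following minimal sketch shows how such an app might accept the handoff from Maps; the startRide(to:) helper is hypothetical, and the Info.plist registration described above must also be in place:

import MapKit
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Maps launches the app with a directions-request URL.
    func application(_ app: UIApplication, open url: URL,
                     options: [UIApplicationOpenURLOptionsKey: Any] = [:]) -> Bool {
        guard MKDirectionsRequest.isDirectionsRequest(url) else { return false }
        let request = MKDirectionsRequest(contentsOf: url)
        // The destination map item may lack coordinates and need geocoding.
        startRide(to: request.destination)
        return true
    }

    // Hypothetical entry point into the app's ride-booking flow.
    func startRide(to destination: MKMapItem?) {
        // Begin booking a ride to the destination here.
    }
}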

Integrating with the Messages App

In iOS 10, you can create app extensions that interact with the Messages app and let users send text, stickers, media files, and interactive messages. You can also support interactive messages that update as each recipient responds to the message. You can create two types of app extensions:

  1. A Sticker pack provides a set of stickers that users can add to their Messages content.
  2. An iMessage app lets you present a custom user interface within the Messages app, create a sticker browser, include text, stickers, and media files within a conversation, and create, send, and update interactive messages. An iMessage app can also help users search images that you host on your app’s related website while they’re in the Messages app.

You can create a Sticker pack without writing any code: Simply drag images into the Sticker Pack folder inside the Stickers asset catalog in Xcode.

To develop an iMessage app, you use the APIs in the Messages framework (Messages.framework). To learn about the Messages framework, see Messages Framework Reference. For general information about creating app extensions, see App Extension Programming Guide.
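
As a starting point, the principal view controller of an iMessage app is a subclass of MSMessagesAppViewController; in this minimal sketch, the inserted greeting is illustrative:

import Messages

class MessagesViewController: MSMessagesAppViewController {

    // Called when the extension becomes active in a conversation.
    override func willBecomeActive(with conversation: MSConversation) {
        super.willBecomeActive(with: conversation)
        // Stage some text in the conversation's input field.
        conversation.insertText("Hello from my iMessage app!") { error in
            if let error = error {
                print("Could not insert text: \(error)")
            }
        }
    }
}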

If your app provides images for sharing in Messages and you want users to be able to use the Spotlight popular image search (that is, “#images”) to search these images without leaving the Messages app, first create an iMessage app. Then follow these steps:

  1. Add the com.apple.developer.associated-domains key to your app’s entitlements. Include a list of the web domains that host the images you want to make searchable. For each domain, specify the spotlight-image-search service.
  2. Add an apple-app-site-association file to your website. Add a dictionary for the spotlight-image-search service and include your app ID, which is the team ID or app ID prefix, followed by the bundle ID. You can also specify up to 500 paths and patterns that should be included for indexing by the Spotlight popular image search (for some examples of website paths, see the universal links examples in Creating and Uploading the Association File). A sketch of such a file appears after these steps.
  3. Allow crawling by Applebot (to learn more, see About Applebot).
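
The spotlight-image-search entry in the association file might look like the following sketch; note that the exact dictionary layout here is an assumption modeled on the universal-links association format, and the app ID and paths are illustrative:

{
  "spotlight-image-search": {
    "apps": ["ABCDE12345.com.example.messaging"],
    "paths": ["/images/*", "/stickers/*"]
  }
}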

User Notifications
iOS 10 introduces the User Notifications framework (UserNotifications.framework), which supports the delivery and handling of local and remote notifications. You use the classes of this framework to schedule the delivery of local notifications based on specific conditions, such as time or location. Apps and app extensions can use this framework to receive and potentially modify local and remote notifications when they are delivered to the user’s device.
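
For example, scheduling a simple time-based local notification might look like this minimal sketch, assuming the app has already requested and been granted notification authorization; the identifier and content are illustrative:

import UserNotifications

let content = UNMutableNotificationContent()
content.title = "Lunch time"
content.body = "Your table is ready."

// Deliver once, 60 seconds from now.
let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 60, repeats: false)
let request = UNNotificationRequest(identifier: "lunch-reminder",
                                    content: content, trigger: trigger)
UNUserNotificationCenter.current().add(request) { error in
    if let error = error {
        print("Scheduling failed: \(error)")
    }
}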

Also introduced in iOS 10, the User Notifications UI framework (UserNotificationsUI.framework) lets you customize the appearance of local and remote notifications when they appear on the user’s device. You use this framework to define an app extension that receives the notification data and provides the corresponding visual representation. Your extension can also respond to custom actions associated with those notifications.
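
A content extension’s view controller adopts the UNNotificationContentExtension protocol; in this minimal sketch, the storyboard-connected label is illustrative:

import UIKit
import UserNotifications
import UserNotificationsUI

class NotificationViewController: UIViewController, UNNotificationContentExtension {

    @IBOutlet var bodyLabel: UILabel!

    // Called when a notification arrives for this extension to display.
    func didReceive(_ notification: UNNotification) {
        bodyLabel.text = notification.request.content.body
    }
}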

Speech Recognition

iOS 10 introduces a new API that supports continuous speech recognition and helps you build apps that can recognize speech and transcribe it into text. Using the APIs in the Speech framework (Speech.framework), you can perform speech transcription of both real-time and recorded audio. For example, you can get a speech recognizer and start simple speech recognition using code like this:

import Speech

// Create a recognizer for the device locale; it may be nil if unsupported.
let recognizer = SFSpeechRecognizer()
let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
recognizer?.recognitionTask(with: request) { (result, error) in
    // Print the best transcription found so far.
    print(result?.bestTranscription.formattedString ?? "")
}
As with accessing other types of protected data, such as Calendar and Photos data, performing speech recognition requires the user’s permission (for more information about accessing protected data classes, see Security and Privacy Enhancements). In the case of speech recognition, permission is required because data is transmitted and temporarily stored on Apple’s servers to increase the accuracy of speech recognition. To request the user’s permission, you must add the NSSpeechRecognitionUsageDescription key to your app’s Info.plist file.
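
A minimal sketch of requesting that permission before starting any recognition tasks:

import Speech

SFSpeechRecognizer.requestAuthorization { status in
    switch status {
    case .authorized:
        print("Speech recognition authorized")
    case .denied, .restricted, .notDetermined:
        print("Speech recognition is not available")
    }
}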

When you adopt speech recognition in your app, be sure to indicate to users that their speech is being recognized, and that they should not make sensitive utterances at that time.

Wide Color
Most graphics frameworks throughout the system, including Core Graphics, Core Image, Metal, and AVFoundation, have substantially improved support for extended-range pixel formats and wide-gamut color spaces. By extending this behavior throughout the entire graphics stack, it is easier than ever to support devices with a wide color display. In addition, UIKit standardizes on working in a new extended sRGB color space, making it easy to mix sRGB colors with colors in other, wider color gamuts without a significant performance penalty.

Here are some best practices to adopt as you start working with wide color.

  1. In iOS 10, the UIColor class uses the extended sRGB color space and its initializers no longer clamp raw component values to between 0.0 and 1.0. If your app relies on UIKit to clamp component values (whether you’re creating a color or asking a color for its component values), you need to change your app’s behavior when you link against iOS 10.
  2. When performing custom drawing in a UIView on an iPad Pro (9.7 inch), the underlying drawing environment is configured with an extended sRGB color space.
  3. If your app renders custom image objects, use the new UIGraphicsImageRenderer class to control whether the destination bitmap is created using an extended-range or standard-range format (see the sketch after this list).
  4. If you are performing your own image processing on wide-gamut devices using a lower-level API, such as Core Graphics or Metal, you should use an extended-range color space and a pixel format that supports 16-bit floating-point component values. When clamping of color values is necessary, you should do so explicitly.
  5. Core Graphics, Core Image, and Metal Performance Shaders provide new options for easily converting colors and images between color spaces.
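
For example, rendering into an extended-range bitmap with UIGraphicsImageRenderer might look like this minimal sketch; the size and fill color are illustrative:

import UIKit

let format = UIGraphicsImageRendererFormat()
format.prefersExtendedRange = true  // request an extended-range destination bitmap

let renderer = UIGraphicsImageRenderer(size: CGSize(width: 100, height: 100), format: format)
let image = renderer.image { context in
    // A red outside standard sRGB; extended sRGB no longer clamps it to 1.0.
    UIColor(red: 1.2, green: 0.0, blue: 0.0, alpha: 1.0).setFill()
    context.fill(CGRect(x: 0, y: 0, width: 100, height: 100))
}
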
Adapting to the True Tone Display

The True Tone display uses ambient light sensors to automatically adjust the color and intensity of the display to match the lighting conditions of the current environment. To ensure that your app works well with the standard color shift provided by True Tone, add the new UIWhitePointAdaptivityStyle key to your Info.plist file to describe your app’s primary visual content. For example:
  1. If your app is a photo-editing app, color fidelity is more important than automatic adjustment to the environmental white point. In this case, you can use the UIWhitePointAdaptivityStylePhoto style to reduce the strength of the True Tone shift applied by the system.
  2. If your app is a reading app, conformance with the environmental white point is helpful to users. In this case, you can use the UIWhitePointAdaptivityStyleReading style to increase the strength of the True Tone shift applied by the system.
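
For example, a photo-editing app might declare the photo style in its Info.plist with an entry like this sketch, which assumes the style constants are used as string values:

<key>UIWhitePointAdaptivityStyle</key>
<string>UIWhitePointAdaptivityStylePhoto</string>
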
App Search Enhancements