Adding Configuration Options to a WidgetKit Widget

The WidgetDemo app created in the preceding chapters is currently only able to display weather information for a single geographical location. Through the use of configuration intents, it is possible to make aspects of the widget user configurable. In this chapter we will enhance the widget extension so that the user can choose to view the weather for different cities. This will involve some minor changes to the weather data, the modification of the SiriKit intent definition and updates to the widget implementation.

Modifying the Weather Data

Before adding configuration support to the widget, an additional structure needs to be added to the widget data to provide a way to associate cities with weather timelines. Add this structure by modifying the WeatherData.swift file as follows:

import Foundation
import WidgetKit
 
struct LocationData: Identifiable {
    
    let city: String
    let timeline: [WeatherEntry]
    
    var id: String {
        city
    }
    
    static let london = LocationData(city: "London", 
                                 timeline: londonTimeline)
    static let miami = LocationData(city: "Miami", 
                                timeline: miamiTimeline)
    
    func hash(into hasher: inout Hasher) {
        hasher.combine(city)
    }
}
.
.
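Although this chapter only uses the two static properties directly, the Identifiable conformance also makes LocationData values usable in SwiftUI list constructs. The following is a hypothetical sketch, not part of the project code, showing how the city string serves as the stable id:

```swift
import SwiftUI

// Hypothetical view, for illustration only: Identifiable conformance lets
// LocationData values drive a List, with the city string acting as the id.
struct LocationListView: View {
    let locations: [LocationData] = [.london, .miami]

    var body: some View {
        List(locations) { location in
            Text(location.city)
        }
    }
}
```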

Configuring the Intent Definition

The next step is to configure the intent definition which will be used to present the user with widget configuration choices. When the WeatherWidget extension was added to the project, the “Include Configuration Intent” option was enabled, causing Xcode to generate a definition file named WeatherWidget.intentdefinition located in the WeatherWidget project folder. Select this file to load it into the intent definition editor where it will appear as shown in Figure 51-1:

Figure 51-1

Begin by making sure that the Configuration intent (marked A in Figure 51-1 above) is selected. This is the intent that was created by Xcode and will be referenced as ConfigurationIntent in the WeatherWidget.swift file. Additional intents may be added to the definition by clicking on the ‘+’ button (D) and selecting New Intent from the menu.

The Category menu (B) must be set to View to allow the intent to display a dialog to the user containing the widget configuration options. Also ensure that the Intent is eligible for widgets option (B) is enabled.

Before we add a parameter to the intent, an enumeration needs to be added to the definition file to contain the available city names. Add this now by clicking on the ‘+’ button (D) and selecting the New Enum option from the menu.

After the enumeration has been added, change both the enumeration name and Display Name to Locations as highlighted in Figure 51-2 below:

Figure 51-2

With the Locations entry selected, refer to the main editor panel and click on the ‘+’ button beneath the Cases section to add a new value. Change the new case entry name to londonUK and, in the settings area, change the display name to London so that the settings resemble Figure 51-3:

Figure 51-3

Repeat the above steps to add an additional case named miamiFL with the display name set to Miami.

In the left-hand panel, select the Configuration option located under the Custom Intents heading. In the custom intent panel, locate the Parameters section and click on the ‘+’ button highlighted in Figure 51-4 to add a new parameter:

Figure 51-4

Name the parameter locations and change the Display Name setting to Locations. From the Type menu select Locations listed under Enums as shown in Figure 51-5 (note that this is not the same as the Location entry listed under System Types):

Figure 51-5

Once completed, the parameter settings should match those shown in Figure 51-6 below:

Figure 51-6

Modifying the Widget

With the intent configured, all that remains is to adapt the widget so that it responds to location configuration changes made by the user. When WidgetKit requests a timeline from the provider it will pass to the getTimeline() method a ConfigurationIntent object containing the current configuration settings from the intent. To return the timeline for the currently selected city, the getTimeline() method needs to be modified to extract the location from the intent and use it to return the matching timeline.

Edit the WeatherWidget.swift file, locate the getTimeline() method within the provider declaration and modify it so that it reads as follows:

func getTimeline(for configuration: ConfigurationIntent, in context: Context, 
              completion: @escaping (Timeline<Entry>) -> ()) {
    
    var chosenLocation: LocationData
        
    if configuration.locations == .londonUK {
        chosenLocation = .london
    } else {
        chosenLocation = .miami
    }
 
    var entries: [WeatherEntry] = []
    var currentDate = Date()
    let halfMinute: TimeInterval = 30
 
    for var entry in chosenLocation.timeline {
        entry.date = currentDate
        currentDate += halfMinute
        entries.append(entry)
    }
    let timeline = Timeline(entries: entries, policy: .never)
    completion(timeline)
}

In the above code, if the intent object passed to the method has London set as the location, then the london entry within the LocationData instance is used to provide the timeline for WidgetKit. If any of the above changes result in syntax errors within the editor, try rebuilding the project to trigger the generation of the files associated with the intent definition file.
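As written, the if/else statement treats any non-London selection as Miami. If more cases are later added to the Locations enumeration, a switch statement is a clearer way to select the timeline. The following is an optional sketch, assuming the case names created above and a default branch to cover Xcode's generated unknown case:

```swift
// Sketch of an alternative to the if/else selection: a switch scales to
// additional Locations cases. The default branch handles the unknown case
// assumed to be generated by Xcode for intent enumerations.
switch configuration.locations {
case .londonUK:
    chosenLocation = .london
case .miamiFL:
    chosenLocation = .miami
default:
    chosenLocation = .london    // fall back to a default city
}
```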

Testing Widget Configuration

Run the widget extension on a device or simulator and wait for it to load. Once it is running, perform a long press on the widget to display the menu shown in Figure 51-7 below:

Figure 51-7

Select the Edit Widget menu option to display the configuration intent dialog as shown in Figure 51-8:

Figure 51-8

Select the Miami location before tapping on any screen area outside of the dialog. On returning to the home screen, the widget should now be displaying entries from the Miami timeline.

Note that the intent does all of the work involved in presenting the user with the configuration options, automatically adjusting to reflect the type and quantity of options available. If more cities are included in the enumeration, for example, the intent will provide a Choose button which, when tapped, will display a scrollable list of cities from which to choose:

Figure 51-9

Customizing the Configuration Intent UI

The final task in this tutorial is to change the accent colors of the intent UI to match those used by the widget. Since we already have the widget background color declared in the widget extension’s Assets.xcassets file from the steps in an earlier chapter, this can be used for the background of the intent UI.

The color settings for the intent UI are located in the build settings screen for the widget extension. To find these settings, select the WidgetDemo entry located at the top of the project navigator panel (marked A in Figure 51-10 below), followed by the WeatherWidgetExtension entry (B) in the Targets list:

Figure 51-10

In the toolbar, select Build Settings (C), then the Basic filter option (D) before scrolling down to the Asset Catalog Compiler – Options section (E).

Click on the WidgetBackground value (F) and change it to weatherBackgroundColor. If required, the foreground color used within the intent UI is defined by the Global Accent Color Name value. Note that these values must be named colors declared within the Assets.xcassets file.

Test the widget to verify that the intent UI now uses the widget background color:

Figure 51-11

Summary

When a widget is constructed using the intent configuration type (as opposed to static configuration), configuration options can be made available to the user by setting up intents and parameters within the SiriKit intent definition file. Each time the provider getTimeline() method is called, WidgetKit passes it a copy of the configuration intent object, the parameters of which can be inspected and used to tailor the resulting timeline to match the user’s preferences.

A SwiftUI WidgetKit Deep Link Tutorial

WidgetKit deep links allow the individual views that make up the widget entry view to open different screens within the companion app when tapped. In addition to the main home screen, the WidgetDemo app created in the preceding chapters contains a detail screen to provide the user with information about different weather systems. As currently implemented, however, tapping the widget always launches the home screen of the companion app, regardless of the current weather conditions.

The purpose of this chapter is to implement deep linking on the widget so that tapping the widget opens the appropriate weather detail screen within the app. This will involve some changes to both the app and widget extension.

Adding Deep Link Support to the Widget

Deep links allow specific areas of an app to be presented to the user based on the opening of a URL. The WidgetDemo app used in the previous chapters consists of a list of severe storm types. When a list item is selected, the app navigates to a details screen where additional information about the selected storm is displayed. In this tutorial, changes will be made to both the app and widget to add deep link support. This means, for example, that when the widget indicates that a thunder storm is in effect, tapping the widget will launch the app and navigate to the thunder storm detail screen.

The first step in adding deep link support is to modify the WeatherEntry structure to include a URL for each timeline entry. Edit the WeatherData.swift file and modify the structure so that it reads as follows:

.
.
struct WeatherEntry: TimelineEntry {
    var date: Date
    let city: String
    let temperature: Int
    let description: String
    let icon: String
    let image: String
    let url: URL?
}
.
.

Next, add some constants containing the URLs which will be used to identify the storm types that the app knows about:

.
.
let hailUrl = URL(string: "weatherwidget://hail")
let thunderUrl = URL(string: "weatherwidget://thunder")
let tropicalUrl = URL(string: "weatherwidget://tropical")
.
.
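Because URL conforms to Equatable, these constants can later be matched against the URL used to launch the app by simple value comparison. A self-contained illustration (the thunder constant is repeated here for clarity):

```swift
import Foundation

// Illustration only: two URL values built from the same string compare
// equal, which is what allows an incoming deep link to be matched against
// the constants declared above.
let thunderUrl = URL(string: "weatherwidget://thunder")
let incoming = URL(string: "weatherwidget://thunder")
print(incoming == thunderUrl)   // prints "true"
```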

The last remaining change to the weather data is to include the URL within the sample timeline entries:

.
.
let londonTimeline = [
    WeatherEntry(date: Date(), city: "London", temperature: 87, 
          description: "Hail Storm", icon: "cloud.hail", 
                image: "hail", url: hailUrl),
    WeatherEntry(date: Date(), city: "London", temperature: 92, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder", url: thunderUrl),
    WeatherEntry(date: Date(), city: "London", temperature: 95,   
          description: "Hail Storm", icon: "cloud.hail", 
                image: "hail", url: hailUrl)
]
 
let miamiTimeline = [
    WeatherEntry(date: Date(), city: "Miami", temperature: 81, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder", url: thunderUrl),
    WeatherEntry(date: Date(), city: "Miami", temperature: 74,
          description: "Tropical Storm", icon: "tropicalstorm", 
                image: "tropical", url: tropicalUrl),
    WeatherEntry(date: Date(), city: "Miami", temperature: 72, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder", url: thunderUrl)
]
.
.

With the data modified to include deep link URLs, the widget declaration now needs to be modified to match the widget entry structure. First, the placeholder() and getSnapshot() methods of the provider will need to return an entry which includes the URL. Edit the WeatherWidget.swift file, locate these methods within the IntentTimelineProvider structure and modify them as follows:

struct Provider: IntentTimelineProvider {
   func placeholder(in context: Context) -> WeatherEntry {
       
        WeatherEntry(date: Date(), city: "London",
                           temperature: 89, description: "Thunder Storm",
                                icon: "cloud.bolt.rain", image: "thunder", 
                                    url: thunderUrl)
    }
 
    func getSnapshot(for configuration: ConfigurationIntent, in context: Context, completion: @escaping (WeatherEntry) -> ()) {
       
        let entry = WeatherEntry(date: Date(), city: "London", 
                      temperature: 89, description: "Thunder Storm", 
                        icon: "cloud.bolt.rain", image: "thunder", 
                         url: thunderUrl)
        completion(entry)
    }
.
.

Repeat this step for both declarations in the preview provider:

struct WeatherWidget_Previews: PreviewProvider {
    static var previews: some View {
       
        Group {
            WeatherWidgetEntryView(entry: WeatherEntry(date: Date(), 
                        city: "London", temperature: 89, 
                 description: "Thunder Storm", icon: "cloud.bolt.rain", 
                       image: "thunder", url: thunderUrl))
                .previewContext(WidgetPreviewContext(
                                      family: .systemSmall))
        
            WeatherWidgetEntryView(entry: WeatherEntry(date: Date(), 
                        city: "London", temperature: 89, 
                 description: "Thunder Storm", icon: "cloud.bolt.rain", 
                       image: "thunder", url: thunderUrl))
                 .previewContext(WidgetPreviewContext(
                                      family: .systemMedium))
        }
    }
}

The final task within the widget code is to assign a URL action to the widget entry view. This is achieved using the widgetURL() modifier, passing through the URL from the widget entry. Remaining in the WeatherWidget.swift file, locate the WeatherWidgetEntryView declaration and add the modifier to the top level ZStack as follows:

struct WeatherWidgetEntryView : View {
    var entry: Provider.Entry
 
    @Environment(\.widgetFamily) var widgetFamily
    
    var body: some View {
 
        ZStack {
            Color("weatherBackgroundColor")
   
            HStack {
                WeatherSubView(entry: entry)
                if widgetFamily == .systemMedium {
                    ZStack {
                        Image(entry.image)
                            .resizable()
                    }
                }
            }
        }
        .widgetURL(entry.url)
    }
}

With deep link support added to the widget the next step is to add support to the app.

Adding Deep Link Support to the App

When an app is launched via a deep link, it is passed a URL object which may be accessed via the top level view in the main content view. This URL can then be used to present different content to the user than would normally be displayed.

The first step in adding deep link support to the WidgetDemo app is to modify the ContentView.swift file to add some state properties. These variables will be used to control which weather detail view instance is displayed when the app is opened by a URL:

import SwiftUI
 
struct ContentView: View {
    
    @State private var hailActive: Bool = false
    @State private var thunderActive: Bool = false
    @State private var tropicalActive: Bool = false
    
    var body: some View {
        NavigationView {
            List {

The above state variables now need to be referenced in the navigation links within the List view:

var body: some View {
    
    NavigationView {
        List {
            NavigationLink(destination: WeatherDetailView(
                   name: "Hail Storms", icon: "cloud.hail"), 
                       isActive: $hailActive) {
                Label("Hail Storm", systemImage: "cloud.hail")
            }
 
            NavigationLink(destination: WeatherDetailView(
                   name: "Thunder Storms", icon: "cloud.bolt.rain"), 
                      isActive: $thunderActive) {
                Label("Thunder Storm", systemImage: "cloud.bolt.rain")
            }
            
            NavigationLink(destination: WeatherDetailView(
                   name: "Tropical Storms", icon: "tropicalstorm"), 
                      isActive: $tropicalActive) {
                Label("Tropical Storm", systemImage: "tropicalstorm")
            }
        }
        .navigationTitle("Severe Weather")
    }
}

The isActive argument to the NavigationLink view allows the link to be controlled programmatically. For example, the first link will navigate to the WeatherDetailView screen configured for hail storms when manually selected by the user. With the addition of the isActive argument, the navigation will also occur if the hailActive state property is changed to true as the result of some other action within the code.
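The pattern can be seen in isolation in the following hypothetical view, which is separate from the project code: setting the bound state property to true performs the navigation without any user interaction.

```swift
import SwiftUI

// Hypothetical example, for illustration only: setting the bound state
// property to true activates the NavigationLink programmatically.
struct ProgrammaticNavDemo: View {
    @State private var showDetail = false

    var body: some View {
        NavigationView {
            NavigationLink(destination: Text("Detail screen"),
                           isActive: $showDetail) {
                Text("Open detail")
            }
            .onAppear {
                showDetail = true   // navigates without a tap
            }
        }
    }
}
```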

When a view is displayed as the result of a deep link, the URL used to launch the app can be identified using the onOpenURL() modifier on the parent view. By applying this modifier to the List within the NavigationView we can write code to modify the state properties based on the URL, thereby programmatically triggering navigation to an appropriately configured detail view.

Modify the ContentView declaration to add the onOpenURL() modifier as follows:

struct ContentView: View {
    
    @State private var hailActive: Bool = false
    @State private var thunderActive: Bool = false
    @State private var tropicalActive: Bool = false
    
    var body: some View {
        
        NavigationView {
            List {
.
.                
                NavigationLink(destination: WeatherDetailView(name: "Tropical Storms", icon: "tropicalstorm"), isActive: $tropicalActive) {
                    Label("Tropical Storm", systemImage: "tropicalstorm")
                }
                
            }
            .navigationTitle("Severe Weather")
            .onOpenURL(perform: { (url) in
                self.hailActive = url == hailUrl
                self.thunderActive = url == thunderUrl
                self.tropicalActive = url == tropicalUrl
            })
        }
    }
}

The added code performs a comparison of the URL used to launch the app with each of the custom URLs supported by the widget. The result of each comparison (i.e. true or false) is then assigned to the corresponding state property. If the URL matches the thunder URL, for example, then the thunderActive state will be set to true causing the view to navigate to the detail view configured for thunder storms.

Testing the Widget

After making the changes, run the app on a device or simulator and make sure that tapping the widget opens the app and displays the detail screen correctly configured for the current weather.

Figure 50-1

Summary

By default, a widget will launch the main view of the companion app when tapped by the user. This behavior can be enhanced by establishing deep links that take the user to specific areas of the app. This involves using the widgetURL() modifier to assign destination URLs to the views in a widget entry layout. Within the app, the onOpenURL() modifier is then used to identify the URL used to launch the app and initiate navigation to the corresponding view.

Supporting WidgetKit Size Families in SwiftUI

In the chapter titled Building Widgets with SwiftUI and WidgetKit, we learned that a widget is able to appear in small, medium and large sizes. The project created in the previous chapter included a widget view designed to fit within the small size format. Since the widget did not specify the supported sizes, it would still be possible to select a large or medium sized widget from the gallery and place it on the home screen. In those larger formats, however, the widget content would have filled only a fraction of the available widget space. If larger widget sizes are to be supported, the widget should be designed to make full use of the available space.

In this chapter, the WidgetDemo project created in the previous chapter will be modified to add support for the medium widget size.

Supporting Multiple Size Families

Begin by launching Xcode and loading the WidgetDemo project from the previous chapter. As outlined above, this phase of the project will add support for the medium widget size (though these steps apply equally to adding support for the large widget size).

In the absence of specific size configuration, widgets are, by default, configured to support all three size families. To restrict a widget to specific sizes, the supportedFamilies() modifier must be applied to the widget configuration.

To restrict the widget to only small and medium sizes for the WidgetDemo project, edit the WeatherWidget.swift file and modify the WeatherWidget declaration to add the modifier. Also take this opportunity to modify the widget display name and description:

@main
struct WeatherWidget: Widget {
    private let kind: String = "WeatherWidget"
 
    public var body: some WidgetConfiguration {
        IntentConfiguration(kind: kind, intent: ConfigurationIntent.self, provider: Provider()) { entry in
            WeatherWidgetEntryView(entry: entry)
        }
        .configurationDisplayName("My Weather Widget")
        .description("A demo weather widget.")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}

To preview the widget in medium format, edit the preview provider to add an additional preview, embedding both in a Group:

struct WeatherWidget_Previews: PreviewProvider {
    static var previews: some View {
       
        Group {
            WeatherWidgetEntryView(entry: WeatherEntry(date: Date(), 
                  city: "London", temperature: 89, 
                  description: "Thunder Storm", icon: "cloud.bolt.rain", 
                        image: "thunder"))
                .previewContext(WidgetPreviewContext(
                                         family: .systemSmall))
        
            WeatherWidgetEntryView(entry: WeatherEntry(date: Date(), 
                  city: "London", temperature: 89, 
                  description: "Thunder Storm", icon: "cloud.bolt.rain", 
                        image: "thunder"))
                .previewContext(WidgetPreviewContext(
                                         family: .systemMedium))
        }
    }
}

When the preview canvas updates, it will now include the widget rendered in medium size as shown in Figure 49-1:

Figure 49-1

Clearly the widget is not taking advantage of the additional space offered by the medium size. To address this shortcoming, some changes to the widget view need to be made.

Adding Size Support to the Widget View

The changes made to the widget configuration mean that the widget can be displayed in either small or medium size. To make the widget adaptive, the widget view needs to identify the size in which it is currently being displayed. This can be achieved by accessing the widgetFamily property of the SwiftUI environment. Remaining in the WeatherWidget.swift file, locate and edit the WeatherWidgetEntryView declaration to obtain the widget family setting from the environment:

struct WeatherWidgetEntryView: View {
    var entry: Provider.Entry
    
    @Environment(\.widgetFamily) var widgetFamily
.
.

Next, embed the subview in a horizontal stack and conditionally display the image for the entry if the size is medium:

struct WeatherWidgetEntryView : View {
    var entry: Provider.Entry
 
    @Environment(\.widgetFamily) var widgetFamily
    
    var body: some View {
 
        ZStack {
            Color("weatherBackgroundColor")
   
            HStack {
                WeatherSubView(entry: entry)
                if widgetFamily == .systemMedium {
                    Image(entry.image)
                        .resizable()
                }
            }
        }
    }
}

When previewed, the medium sized version of the widget should appear as shown in Figure 49-2:

Figure 49-2

To test the widget on a device or simulator, run the extension as before and, once the widget is installed and running, perform a long press on the home screen background. After a few seconds have elapsed, the screen will change as shown in Figure 49-3:

Figure 49-3

Click on the ‘+’ button indicated by the arrow in the above figure to display the widget gallery and scroll down the list of widgets until the WidgetDemo entry appears:

Figure 49-4

Select the WidgetDemo entry to display the widget size options. Swipe to the left to display the medium widget size as shown in Figure 49-5 before tapping on the Add Widget button:

Figure 49-5

On returning to the home screen, click on the Done button located in the top right-hand corner of the home screen to commit the change. The widget will appear as illustrated in Figure 49-6 and update as the timeline progresses:

Figure 49-6

Summary

WidgetKit supports small, medium and large widget size families and, by default, a widget is assumed to support all three formats. In the event that a widget only supports specific sizes, WidgetKit needs to be notified using a widget configuration modifier.

To fully support a size format, a widget should take steps to detect the current size and provide a widget entry layout which makes use of the available space allocated to the widget on the device screen. This involves accessing the SwiftUI environment widgetFamily property and using it as the basis for conditional layout declarations within the widget view.

Now that widget size family support has been added to the project, the next chapter will add some interactive support to the widget in the form of deep linking into the companion app and widget configuration.

A SwiftUI WidgetKit Tutorial

From the previous chapter we now understand the elements that make up a widget and the steps involved in creating one. In this, the first of a series of tutorial chapters dedicated to WidgetKit, we will begin the process of creating an app which includes a widget extension. On completion of these tutorials, a functioning widget will have been created, including widget design and the use of timelines, support for different size families, deep links, configuration using intents and basic intelligence using SiriKit donations and relevance.

About the WidgetDemo Project

The project created in this tutorial can be thought of as the early prototype of a weather app designed to teach children about weather storms. The objective is to provide the user with a list of severe weather systems (tropical storms, thunderstorms etc.) and, when a storm type is selected, display a second screen providing a description of the weather system.

A second part of the app is intended to provide real-time updates on severe weather occurring in different locations around the world. When a storm is reported, a widget will be updated with information about the type and location of the storm, together with the prevailing temperature. When the widget is tapped by the user, the app will open the screen containing information about that storm category.

Since this app is an early prototype, however, it will only provide weather updates from two cities, and that data will be simulated rather than obtained from a real weather service. The app will be functional enough, however, to demonstrate how to implement the key features of WidgetKit.

Creating the WidgetDemo Project

Launch Xcode and select the option to create a new Multiplatform App project named WidgetDemo.

Building the App

Before adding the widget extension to the project, the first step is to build the basic structure of the app. This will consist of a List view populated with some storm categories which, when selected, will appear in a detail screen.

The detail screen will be declared in a new SwiftUI View file named WeatherDetailView.swift. Within the project navigator panel, right-click on the Shared folder and select the New File… menu option. In the resulting dialog, select the SwiftUI View template option and click on the Next button. Name the file WeatherDetailView.swift before creating the file.

With the WeatherDetailView.swift file selected, modify the view declaration so that it reads as follows:

import SwiftUI
 
struct WeatherDetailView: View {
    
    var name: String
    var icon: String
    
    var body: some View {
        VStack {
            Image(systemName: icon)
                .resizable()
                    .scaledToFit()
                    .frame(width: 150.0, height: 150.0)
            Text(name)
                .padding()
                .font(.title)
            Text("If this were a real weather app, a description of \(name) would appear here.")
                .padding()
            Spacer()
        }
    }
}
 
struct WeatherDetailView_Previews: PreviewProvider {
    static var previews: some View {
        WeatherDetailView(name: "Thunder Storms", icon: "cloud.bolt")
    }
}

When rendered, the above view should appear in the preview canvas as shown in Figure 48-1 below:

Figure 48-1

Next, select the ContentView.swift file and modify it to add a List view embedded in a NavigationView as follows:

import SwiftUI
 
struct ContentView: View {
    var body: some View {
        
        NavigationView {
            List {      
                NavigationLink(destination: WeatherDetailView(
                                      name: "Hail Storms", 
                                      icon: "cloud.hail")) {
                    Label("Hail Storm", systemImage: "cloud.hail")
                }
 
                NavigationLink(destination: WeatherDetailView(
                                      name: "Thunder Storms", 
                                      icon: "cloud.bolt.rain")) {
                    Label("Thunder Storm", 
                               systemImage: "cloud.bolt.rain")
                }
                
                NavigationLink(destination: WeatherDetailView(
                                      name: "Tropical Storms", 
                                      icon: "tropicalstorm")) {
                    Label("Tropical Storm", systemImage: "tropicalstorm")
                }
                
            }
            .navigationTitle("Severe Weather")
        }
    }
}
.
.

Once the changes are complete, make sure that the layout matches that shown in Figure 48-2:

Figure 48-2

Using Live Preview, make sure that selecting a weather type displays the detail screen populated with the correct storm name and image.

Adding the Widget Extension

The next step in the project is to add the widget extension by selecting the File -> New -> Target… menu option. From within the target template panel, select the Widget Extension option as shown in Figure 48-3 before clicking on the Next button:

Figure 48-3

On the subsequent screen, enter WeatherWidget into the product name field. When the widget is completed, the user will be able to select the geographical location for which weather updates are to be displayed. To make this possible the widget will need to use the intent configuration type. Before clicking on the Finish button, therefore, make sure that the Include Configuration Intent option is selected as shown in Figure 48-4:

Figure 48-4

When prompted, click on the Activate button to activate the extension within the project scheme. This will ensure that the widget is included in the project build process:

Figure 48-5

Once the extension has been added, refer to the project navigator panel, where a new folder containing the widget extension will have been added as shown in Figure 48-6:

Figure 48-6

Adding the Widget Data

Now that the widget extension has been added to the project, the next step is to add some data and data structures that will provide the basis for the widget timeline. Begin by right-clicking on the Shared folder in the project navigator and selecting the New File… menu option.

From the template selection panel, select the Swift File entry, click on the Next button and name the file WeatherData.swift. Before clicking on the Create button, make sure that the WeatherWidgetExtension entry is enabled in the Targets section of the panel as shown in Figure 48-7 so that the file will be accessible to the extension:

Figure 48-7

As outlined in the previous chapter, each point in the widget timeline is represented by a widget timeline entry instance. Instances of this structure contain the date and time that the entry is to be presented by the widget, together with the data to be displayed. Within the WeatherData.swift file, add a TimelineEntry structure as follows (noting that the WidgetKit framework also needs to be imported):

import Foundation
import WidgetKit
 
struct WeatherEntry: TimelineEntry {
    var date: Date
    let city: String
    let temperature: Int
    let description: String
    let icon: String
    let image: String
}

Creating Sample Timelines

Since this prototype app does not have access to live weather data, the timelines used to drive the widget content will contain sample weather entries for two cities. Remaining within the WeatherData.swift file, add these timeline declarations as follows:

.
.
let londonTimeline = [
    WeatherEntry(date: Date(), city: "London", temperature: 87, 
          description: "Hail Storm", icon: "cloud.hail", 
                image: "hail"),
    WeatherEntry(date: Date(), city: "London", temperature: 92, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder"),
    WeatherEntry(date: Date(), city: "London", temperature: 95,   
          description: "Hail Storm", icon: "cloud.hail", 
                image: "hail")
]
 
let miamiTimeline = [
    WeatherEntry(date: Date(), city: "Miami", temperature: 81, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder"),
    WeatherEntry(date: Date(), city: "Miami", temperature: 74,
          description: "Tropical Storm", icon: "tropicalstorm", 
                image: "tropical"),
    WeatherEntry(date: Date(), city: "Miami", temperature: 72, 
          description: "Thunder Storm", icon: "cloud.bolt.rain", 
                image: "thunder")
]

Note that the timeline entries are populated with the current date and time via the Swift Date() initializer. These values will be replaced with more appropriate values by the provider when the timeline is requested by WidgetKit.

Adding Image and Color Assets

Before moving to the next step of the tutorial, some image and color assets need to be added to the asset catalog of the widget extension.

Begin by selecting the Assets.xcassets file located in the WeatherWidget folder in the project navigator panel as highlighted in Figure 48-8:

Figure 48-8

Add a new entry to the catalog by clicking on the button indicated by the arrow in Figure 48-8 above. In the resulting menu, select the Color Set option. Click on the new Color entry and change the name to weatherBackgroundColor. With this new entry selected, click on the Any Appearance block in the main panel as shown in Figure 48-9:

Figure 48-9

Referring to the Color section of the attributes inspector panel, set Content to Display P3, Input Method to 8-bit Hexadecimal and the Hex field to #4C5057:

Figure 48-10

Select the Dark Appearance and make the same attribute changes, this time setting the Hex value to #3A4150.

Next, add a second Color Set asset, name it weatherInsetColor and use #4E7194 for the Any Appearance color value and #7E848F for the Dark Appearance.

The images used by this project can be found in the weather_images folder of the sample code download available from the following URL:

https://www.ebookfrenzy.com/code/SwiftUI-iOS14-CodeSamples.zip

Once the source archive has been downloaded and unpacked, open a Finder window, navigate to the weather_images folder and select, drag and drop the images on to the left-hand panel of the Xcode asset catalog screen as shown in Figure 48-11:

Figure 48-11

Designing the Widget View

Now that the widget entry has been created and used as the basis for some sample timeline data, the widget view needs to be designed. When the widget extension was added to the project, a template widget entry view was included in the WeatherWidget.swift file which reads as follows:

struct WeatherWidgetEntryView : View {
    var entry: Provider.Entry
 
    var body: some View {
        Text(entry.date, style: .time)
    }
}

As currently implemented, the view is passed a widget entry from which the date value is extracted and displayed on a Text view.

Modify the view structure so that it reads as follows, keeping in mind that it will result in syntax errors appearing in the editor. These will be resolved later in the tutorial:

struct WeatherWidgetEntryView : View {
    var entry: Provider.Entry
        
    var body: some View {
        ZStack {
            Color("weatherBackgroundColor")
            WeatherSubView(entry: entry)
        }
    }
}
 
struct WeatherSubView: View {
    
    var entry: WeatherEntry
    
    var body: some View {
        
        VStack {
            VStack {
                Text("\(entry.city)")
                    .font(.title)
                Image(systemName: entry.icon)
                    .font(.largeTitle)
                Text("\(entry.description)")
                    .frame(minWidth: 125, minHeight: nil)
            }
            .padding(.bottom, 2)
            .background(ContainerRelativeShape()
                       .fill(Color("weatherInsetColor")))
            Label("\(entry.temperature)°F", systemImage: "thermometer")
        }
        .foregroundColor(.white)
        .padding()
    }
}

Since we have changed the view, the preview provider declaration will also need to be changed as follows:

struct WeatherWidget_Previews: PreviewProvider {
    static var previews: some View {
       
        WeatherWidgetEntryView(entry: WeatherEntry(date: Date(), 
                     city: "London", temperature: 89, 
              description: "Thunder Storm", 
                     icon: "cloud.bolt.rain", image: "thunder"))
            .previewContext(WidgetPreviewContext(family: .systemSmall))
    }
}

Once all of the necessary changes have been made to the WeatherWidget.swift file, the above preview provider will display a preview canvas configured for the small widget family size.

Modifying the Widget Provider

When the widget extension was added to the project, Xcode added a widget provider to the WeatherWidget.swift file. This declaration now needs to be modified to make use of the WeatherEntry structure declared in the WeatherData.swift file. The first step is to modify the getSnapshot() method to use WeatherEntry and to return an instance populated with sample data:

.
.
struct Provider: IntentTimelineProvider {
    func getSnapshot(for configuration: ConfigurationIntent, with context: Context, completion: @escaping (WeatherEntry) -> ()) {
       
        let entry = WeatherEntry(date: Date(), city: "London", 
                    temperature: 89, description: "Thunder Storm", 
                         icon: "cloud.bolt.rain", image: "thunder")
        completion(entry)
    }
.
.

Next, the getTimeline() method needs to be modified to return an array of timeline entry objects together with a reload policy value. Since user configuration has not yet been added to the widget, the getTimeline() method will be configured initially to return the timeline for London:

struct Provider: IntentTimelineProvider {
.
.
    func getTimeline(for configuration: ConfigurationIntent, with context: Context, completion: @escaping (Timeline<Entry>) -> ()) {
        
        var entries: [WeatherEntry] = []
        var eventDate = Date()
        let halfMinute: TimeInterval = 30
    
        for var entry in londonTimeline {
            entry.date = eventDate
            eventDate += halfMinute
            entries.append(entry)
        }
        let timeline = Timeline(entries: entries, policy: .never)
        completion(timeline)
    }
}

The above code begins by declaring an array to contain the WeatherEntry instances before creating variables designed to represent the current event time and a 30 second time interval respectively.

A loop then iterates through the London timeline declared in the WeatherData.swift file, setting the eventDate value as the date and time at which the event is to be displayed by the widget. A 30 second interval is then added to the eventDate ready for the next event. Finally, the modified event is appended to the entries array. Once all of the events have been added to the array, it is used to create a Timeline instance with a reload policy of never (in other words WidgetKit will not ask for a new timeline when the first timeline ends). The timeline is then returned to WidgetKit via the completion handler.

This implementation of the getTimeline() method will result in the widget changing content every 30 seconds until the final entry in the London timeline array is reached.
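The date-remapping logic at the heart of getTimeline() can be exercised in isolation using plain Foundation types. The following sketch (the DemoEntry type and schedule() function are illustrative stand-ins, not part of the tutorial project) confirms that each entry is stamped 30 seconds after its predecessor:

```swift
import Foundation

// Minimal stand-in for the WeatherEntry timeline entry used in this chapter
struct DemoEntry {
    var date: Date
    let city: String
}

// Remap a list of template entries onto a schedule starting at startDate,
// spacing each entry a fixed interval apart (30 seconds in this tutorial)
func schedule(_ template: [DemoEntry], from startDate: Date,
              interval: TimeInterval = 30) -> [DemoEntry] {
    var eventDate = startDate
    var result: [DemoEntry] = []
    for var entry in template {
        entry.date = eventDate      // overwrite the placeholder date
        eventDate += interval       // advance 30 seconds for the next entry
        result.append(entry)
    }
    return result
}

let start = Date()
let template = [DemoEntry(date: Date(), city: "London"),
                DemoEntry(date: Date(), city: "London"),
                DemoEntry(date: Date(), city: "London")]
let entries = schedule(template, from: start)
```

The same pattern generalizes to any fixed-interval timeline: the placeholder dates stored in the sample data are simply overwritten as the array is assembled.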

Configuring the Placeholder View

The final task before previewing the widget is to make sure that the placeholder view has been implemented. Xcode will have already created a placeholder() method for this purpose within the WeatherWidget.swift file which reads as follows:

func placeholder(in context: Context) -> SimpleEntry {
   SimpleEntry(date: Date(), configuration: ConfigurationIntent())
}

This method now needs to be modified so that it returns a WeatherEntry instance populated with some sample data as follows:

func placeholder(in context: Context) -> WeatherEntry {    
    WeatherEntry(date: Date(), city: "London",
                       temperature: 89, description: "Thunder Storm",
                            icon: "cloud.bolt.rain", image: "thunder")
}

Previewing the Widget

Using the preview canvas, verify that the widget appears as shown in Figure 48-12 below:

Figure 48-12

Next, test the widget on a device or simulator by changing the active scheme in the Xcode toolbar to the WeatherWidgetExtension scheme before clicking on the run button:

Figure 48-13

After a short delay, the widget will appear on the home screen and cycle through the different weather events at 30 second intervals:

Figure 48-14

Summary

The example project created in this chapter has demonstrated how to use WidgetKit to create a widget extension for an iOS app. This included the addition of the extension to the project, the design of the widget view and entry together with the implementation of a sample timeline. The widget created in this chapter, however, has not been designed to make use of the different widget size families supported by WidgetKit, a topic which will be covered in the next chapter.

Building Widgets with SwiftUI and WidgetKit

Introduced in iOS 14, widgets allow small amounts of app content to be displayed alongside the app icons that appear on the device home screen pages, the Today view and the macOS Notification Center. Widgets are built using SwiftUI in conjunction with the WidgetKit Framework.

The focus of this chapter is to provide a high level outline of the various components that make up a widget before exploring widget creation in practical terms in the chapters that follow.

An Overview of Widgets

Widgets are intended to provide users with “at a glance” views of important, time sensitive information relating to your app. When a widget is tapped by the user, the corresponding app is launched, taking the user to a specific screen where more detailed information may be presented. Widgets are intended to display information which updates based on a timeline, ensuring that only the latest information is displayed to the user. A single app can have multiple widgets displaying different information.
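For reference, WidgetKit provides the widgetURL() modifier to control which screen the app opens when the widget is tapped. The sketch below is a minimal illustration; the view name and the "weatherdemo" URL scheme are hypothetical examples, not something defined elsewhere in this book:

```swift
import SwiftUI
import WidgetKit

// Attaching a deep-link URL to a widget's entry view. When the user taps
// the widget, the app is launched and receives the URL (for example via
// the onOpenURL() modifier). The "weatherdemo" scheme is hypothetical.
struct DeepLinkWidgetView: View {
    var city: String

    var body: some View {
        Text(city)
            .widgetURL(URL(string: "weatherdemo://city/\(city)"))
    }
}
```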

Widgets are available in three size families (small, medium and large), of which at least one size must be supported by the widget, and can be implemented such that the information displayed is customizable by the user.

Widgets are selected from the widget gallery and positioned by the user on the device home screens. To conserve screen space, iOS allows widgets to be stacked, providing the user the ability to flip through each widget in the stack with a swiping gesture. A widget can increase the probability of moving automatically to the top of the stack by assigning a relevancy score to specific timeline entries. The widget for a weather app might, for example, assign a high relevancy to a severe weather warning in the hope that WidgetKit will move it to the top of the stack, thereby increasing the likelihood that the user will see the information.

The Widget Extension

A widget is created by adding a widget extension to an existing app. A widget extension consists of a Swift file, an optional intent definition file (required if the widget is to be user configurable), an asset catalog and an Info.plist file.

The widget itself is declared as a structure conforming to the Widget protocol, and it is within this declaration that the basic configuration of the widget is declared. The body of a typical widget declaration will include the following items:

  • Widget kind – Identifies the widget within the project. This can be any String value that uniquely identifies the widget within the project.
  • Widget Configuration – A declaration which conforms to the WidgetConfiguration protocol. This includes a reference to the provider of the timeline containing the information to be displayed, the widget display name and description, and the size families supported by the widget. WidgetKit supports two types of widget configuration referred to as static configuration and intent configuration.
  • Entry View – A reference to the SwiftUI View containing the layout that is to be presented to the user when the widget is displayed. This layout is populated with content from individual timeline entries at specific points in the widget timeline.

In addition to the widget declaration, the extension must also include a placeholder View defining the layout to be displayed to the user while the widget is loading and gathering data. This may either be declared manually, or configured to be generated automatically by WidgetKit based on the entry view included in the Widget view declaration outlined above.

Widget Configuration Types

When creating a widget, the choice needs to be made as to whether it should be created using the static or intent configuration model. These two options can be summarized as follows:

  • Intent Configuration – Used when it makes sense for the user to be able to configure aspects of the widget. For example, allowing the user to select the news publications from which headlines are to be displayed within the widget.
  • Static Configuration – Used when the widget does not have any user configurable properties.

When the Intent Configuration option is used, the configuration options to be presented to the user are declared within a SiriKit intent definition file.

The following is an example widget entry containing a static configuration designed to support both small and medium size families:

@main
struct SimpleWidget: Widget {
    private let kind: String = "SimpleWidget"
 
    public var body: some WidgetConfiguration {
        StaticConfiguration(kind: kind, provider: Provider(), 
                 placeholder: PlaceholderView()) { entry in
            SimpleWidgetEntryView(entry: entry)
        }
        .configurationDisplayName("A Simple Weather Widget")
        .description("This is an example widget.")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}

The following listing, on the other hand, declares a widget using the intent configuration:

@main
struct SimpleWidget: Widget {
    private let kind: String = "SimpleWidget"
 
    public var body: some WidgetConfiguration {
        IntentConfiguration(kind: kind, 
             intent: LocationSelectionIntent.self, provider: Provider(), 
               placeholder: PlaceholderView()) { entry in
            SimpleWidgetEntryView(entry: entry)
        }
        .configurationDisplayName("Weather Fun")
        .description("Learning about weather in real-time.")
    }
}

The primary difference in the above declaration is that it uses IntentConfiguration instead of StaticConfiguration which, in turn, includes a reference to a SiriKit intent definition. The absence of the supported families modifier in the above example indicates to WidgetKit that the widget supports all three sizes. Both examples include a widget entry view.

Widget Entry View

The widget entry view is simply a SwiftUI View declaration containing the layout to be displayed by the widget. Conditional logic (for example if or switch statements based on the widgetFamily environment property) can be used to present different layouts subject to the prevailing size family.

With the exception of tapping to open the corresponding app, widgets are non-interactive. As such, the entry view will typically consist of display-only views (in other words no buttons, sliders or toggles).
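As a sketch of the conditional layout technique mentioned above (the view and property names here are illustrative, not part of the tutorial project), an entry view might branch on the widgetFamily environment value as follows:

```swift
import SwiftUI
import WidgetKit

// A family-aware entry view: the widgetFamily environment value reports
// the size at which the widget is currently being displayed, allowing
// the larger formats to show more detail.
struct FamilyAwareView: View {
    @Environment(\.widgetFamily) var family
    var city: String
    var temperature: Int

    var body: some View {
        switch family {
        case .systemSmall:
            Text(city)                     // smallest format: city only
        case .systemMedium:
            HStack {                       // medium: city and temperature
                Text(city)
                Text("\(temperature)°F")
            }
        default:
            VStack {                       // large (and any future sizes)
                Text(city)
                Text("\(temperature)°F")
            }
        }
    }
}
```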

When WidgetKit creates an instance of the entry view, it passes it a widget timeline entry containing the data to be displayed on the views that make up the layout. The following view declaration is designed to display city name and temperature values:

struct SimpleWidgetEntryView : View {
    var entry: Provider.Entry
 
    var body: some View {
        VStack {
            Text("City: \(entry.city)")
            Text("Temperature: \(entry.temperature)")
        }
    }
}

The purpose of a widget is to display different information at specific points in time. The widget for a calendar app, for example, might change throughout the day to display the user’s next upcoming appointment. The content to be displayed at each point in the timeline is contained within widget entry objects conforming to the TimelineEntry protocol. Each entry must, at a minimum, include a Date object defining the point in the timeline at which the data in the entry is to be displayed, together with any data that is needed to fully populate the widget entry view at the specified time. The following is an example of a timeline entry declaration designed for use with the above entry view:

struct WeatherEntry: TimelineEntry {
    var date: Date
    let city: String
    let temperature: Int
}

If necessary, the Date object can simply be a placeholder to be updated with the actual time and date when the entry is added to a timeline.

Widget Timeline

The widget timeline is simply an array of widget entries that defines the points in time that the widget is to be updated, together with the content to be displayed at each time point. Timelines are constructed and returned to WidgetKit by a widget provider.

Widget Provider

The widget provider is responsible for providing the content that is to be displayed on the widget and must be implemented to conform to the TimelineProvider protocol. At a minimum, it must implement the following methods:

  • getSnapshot() – The getSnapshot() method of the provider will be called by WidgetKit when a single, populated widget timeline entry is required. This snapshot is used within the widget gallery to show an example of how the widget would appear if the user added it to the device. Since real data may not be available at the point that the user is browsing the widget gallery, the entry returned should typically be populated with sample data.
  • getTimeline() – This method is responsible for assembling and returning a Timeline instance containing the array of widget timeline entries that define how and when the widget content is to be updated together with an optional reload policy value.

The following code excerpt declares an example timeline provider:

struct Provider: TimelineProvider {
    public typealias Entry = SimpleEntry
 
    public func getSnapshot(with context: Context, 
              completion: @escaping (SimpleEntry) -> ()) {
 
        // Construct a single entry using sample content
        let entry = SimpleEntry(date: Date(), city: "London", 
                               temperature: 89)
        completion(entry)
    }
 
    public func getTimeline(with context: Context, 
                   completion: @escaping (Timeline<Entry>) -> ()) {
        var entries: [SimpleEntry] = []
 
        // Construct timeline array here
 
        let timeline = Timeline(entries: entries, policy: .atEnd)
        completion(timeline)
    }
}

Reload Policy

When a widget is displaying entries from a timeline, WidgetKit needs to know what action to take when it reaches the end of the timeline. The following predefined reload policy options are available for use when the provider returns a timeline:

  • atEnd – At the end of the current timeline, WidgetKit will request a new timeline from the provider. This is the default behavior if no reload policy is specified.
  • after(Date) – WidgetKit will request a new timeline after the specified date and time.
  • never – A new timeline will not be requested when the end of the current timeline is reached.
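To illustrate the after() policy, a provider that wants WidgetKit to request a replacement timeline one hour after delivering the current one might return the following (the one hour interval and the helper function are arbitrary examples, assuming the WeatherEntry structure declared earlier in this chapter):

```swift
import WidgetKit
import Foundation

// Hypothetical helper illustrating the after() reload policy. WidgetKit
// will request a new timeline once the specified date has passed.
func makeTimeline(entries: [WeatherEntry]) -> Timeline<WeatherEntry> {
    let reloadDate = Date().addingTimeInterval(60 * 60)  // one hour from now
    return Timeline(entries: entries, policy: .after(reloadDate))
}
```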

Relevance

As previously mentioned, iOS allows widgets to be placed in a stack in which only the uppermost widget is visible. While the user can scroll through the stacked widgets to decide which is to occupy the topmost position, this presents the risk that an important update may not be seen by the user in time to act on the information.

To address this issue, WidgetKit is allowed to move a widget to the top of the stack if the information it contains is considered to be of relevance to the user. This decision is based on a variety of factors such as previous behavior of the user (for example checking a bus schedule widget at the same time every day) together with a relevance score assigned by the widget to a particular timeline entry.

Relevance is declared using a TimelineEntryRelevance structure. This contains a relevancy score and a time duration for which the entry is relevant. The score can be any floating point value and is measured relative to all other timeline entries generated by the widget. For example, if most of the relevancy scores in the timeline entries range between 0.0 and 10.0, a relevancy score of 20.0 assigned to an entry may cause the widget to move to the top of the stack. The following code declares two relevancy entries:

let lowScore = TimelineEntryRelevance(score: 0.0, duration: 0)
let highScore = TimelineEntryRelevance(score: 10.0, duration: 0)

If a relevance value is to be included in an entry, it is declared as an additional property of the entry structure, for example:

struct WeatherEntry: TimelineEntry {
    var date: Date
    var relevance: TimelineEntryRelevance?
    let city: String
    let temperature: Int
}
.
.
let entry1 = WeatherEntry(date: Date(), relevance: lowScore, city: "London", temperature: 87)
 
let entry2 = WeatherEntry(date: Date(), relevance: highScore, city: "London", temperature: 87)

Forcing a Timeline Reload

When a widget is launched, WidgetKit requests a timeline containing the timepoints and content to display to the user. Under normal conditions, WidgetKit will not request another timeline update until the end of the timeline is reached, and then only if required by the reload policy.

Situations may commonly arise, however, where the information in a timeline needs to be updated. A user might, for example, add a new appointment to a calendar app which requires a timeline update. Fortunately, the widget can be forced to request an updated timeline by making a call to the reloadTimelines() method of the WidgetKit WidgetCenter instance, passing through the widget’s kind string value (defined in the widget configuration as outlined earlier in the chapter). For example:

WidgetCenter.shared.reloadTimelines(ofKind: "My Kind")

Alternatively, it is also possible to trigger a timeline reload for all the active widgets associated with an app as follows:

WidgetCenter.shared.reloadAllTimelines()

Widget Sizes

As previously discussed, widgets can be displayed in small, medium and large sizes. The widget declares which sizes it supports by applying the supportedFamilies() modifier to the widget configuration as follows:

@main
struct SimpleWidget: Widget {
    private let kind: String = "SimpleWidget"
 
    public var body: some WidgetConfiguration {
        IntentConfiguration(kind: kind, 
             intent: LocationSelectionIntent.self, provider: Provider(), 
               placeholder: PlaceholderView()) { entry in
            SimpleWidgetEntryView(entry: entry)
        }
        .configurationDisplayName("Weather Fun")
        .description("Learning about weather in real-time.")
        .supportedFamilies([.systemSmall, .systemMedium])
    }
}

The following figure shows the built-in iOS Calendar widget in small, medium and large formats:

Figure 47-1

Widget Placeholder

As previously mentioned, the widget extension must provide a placeholder. This is the view which is displayed to the user while the widget is initializing and takes the form of the widget entry view without any data or information. Consider the following example widget:

Figure 47-2

The above example, of course, shows the widget running after it has received timeline data. During initialization, however, a placeholder view resembling Figure 47-3 would be displayed:

Figure 47-3

Fortunately, SwiftUI includes the redacted(reason:) modifier which may be applied to an instance of the widget entry view to act as a placeholder. The following is an example of a placeholder view declaration for a widget extension using the redacted() modifier (note that the reason is set to placeholder):

struct PlaceholderView : View {
    var body: some View {
        SimpleWidgetEntryView(entry: SimpleEntry(date: Date(),
                                city: "London", temperature: 89))
            .redacted(reason: .placeholder)
    }
}

Summary

Introduced in iOS 14, widgets allow apps to present important information to the user directly on the device home screen without the need to launch the app. Widgets are implemented using the WidgetKit Framework and take the form of extensions added to the main app. Widgets are driven by timelines which control the information to be displayed to the user and when it is to appear. Widgets can support small, medium and large formats and may be designed to be configurable by the user. When adding widgets to the home screen, the user has the option to place them into a stack. By adjusting the relevance of a timeline entry, a widget can increase the chances of being moved to the top of the stack.

A SwiftUI Siri Shortcut Tutorial

As previously discussed, the purpose of Siri Shortcuts is to allow key features of an app to be invoked by the user via Siri by speaking custom phrases. This chapter will demonstrate how to integrate shortcut support into an existing iOS app, including the creation of a custom intent and intent UI, the configuration of a SiriKit Intent Definition file and outline the code necessary to handle the intents, provide responses and donate shortcuts to Siri.

About the Example App

The project used as the basis for this tutorial is an app which simulates the purchasing of financial stocks and shares. The app is named ShortcutDemo and can be found in the sample code download available at the following URL:

https://www.ebookfrenzy.com/code/SwiftUI-iOS14-CodeSamples.zip

The app consists of a “Buy” screen into which the stock symbol and quantity are entered and the purchase initiated, and a “History” screen consisting of a List view listing all previous transactions. Selecting an entry in the transaction history displays a third screen containing details about the corresponding stock purchase.

App Groups and UserDefaults

Much about the way in which the app has been constructed will be familiar from techniques outlined in previous chapters of the book. The project also makes use of app storage in the form of UserDefaults and the @AppStorage property wrapper, concepts which were introduced in the chapter entitled SwiftUI Data Persistence using AppStorage and SceneStorage. The ShortcutDemo app uses app storage to store an array of objects containing the symbol, quantity and time stamp data for all stock purchase transactions. Since this is a test app that needs to store minimal amounts of data, this storage is more than adequate. In a real-world environment, however, a storage system capable of handling larger volumes of data such as SQLite, CoreData or iCloud storage would need to be used.
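Though the actual ShortcutDemo implementation is included in the project download, the general pattern of persisting an array as the Data value used by @AppStorage can be sketched with Foundation alone. The Purchase type and its fields below are assumptions for illustration, not the project's real code:

```swift
import Foundation

// Illustrative record type; the real project's stored type may differ
struct Purchase: Codable {
    let symbol: String
    let quantity: Int
    let timestamp: Date
}

// Encode the transaction history to Data for storage via @AppStorage
func encode(_ purchases: [Purchase]) -> Data {
    (try? JSONEncoder().encode(purchases)) ?? Data()
}

// Decode stored Data back into an array, returning an empty history
// if the data is missing or corrupt
func decode(_ data: Data) -> [Purchase] {
    (try? JSONDecoder().decode([Purchase].self, from: data)) ?? []
}

let history = [Purchase(symbol: "TSLA", quantity: 100, timestamp: Date())]
let data = encode(history)
let restored = decode(data)
```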

In order to share the UserDefaults data between the app and the SiriKit intents extension, the project also makes use of App Groups. App Groups allow apps to share data with other apps and targets within the same app group. App Groups are assigned a name (typically similar to group.com.yourdomain.myappname) and are enabled and configured within the Xcode project Signing & Capabilities screen.

Preparing the Project

Once the ShortcutDemo project has been downloaded and opened within Xcode, some configuration changes need to be made before the app can be compiled and run. Begin by selecting the ShortcutDemo target at the top of the project navigator panel (marked A in Figure 46-1) followed by the ShortcutDemo (iOS) entry in the Targets list (B). Select the Signing & Capabilities tab (C) and choose your developer ID from the Team menu in the Signing section (D):

Figure 46-1

Next, click the “+ Capability” button (E) and double-click on the App Groups entry in the resulting dialog to add the capability to the project. Once added, click on the ‘+’ button located beneath the list of App Groups (as indicated in Figure 46-2):

Figure 46-2

In the resulting panel, provide a name for the app group container that will be unique to your project (for example group.com.<your domain name>.shortcutdemo). Once a name has been entered, click on the OK button to add it to the project entitlements file (ShortcutDemo.entitlements) and make sure that its toggle button is enabled.

Now that the App Group container has been created, the name needs to be referenced in the project code. Edit the PurchaseStore.swift file and replace the placeholder text in the following line of code with your own App Group name:

@AppStorage("demostorage", store: UserDefaults(
    suiteName: "YOUR APP GROUP NAME HERE")) var store: Data = Data()

Running the App

Run the app on a device or simulator and purchase some stocks (for example 100 TSLA shares followed by 20 GE shares), clicking on the Purchase button for each transaction. Assuming the transactions are successful, select the History tab at the bottom of the screen and confirm that the transactions appear in the list as shown in Figure 46-3:

Figure 46-3

If the purchased stocks do not appear in the list, switch between the Buy and History screens once more at which point the items should appear (this is a bug in SwiftUI which has been reported to Apple but not yet fixed). Select a transaction from the list to display the Detail screen for that purchase:

Figure 46-4

With the app installed, configured and running, the next step is to begin integrating shortcut support into the project.

Enabling Siri Support

To add the Siri entitlement, return to the Signing & Capabilities screen, click on the “+ Capability” button to display the capability selection dialog, enter Siri into the filter bar and double-click on the result to add the capability to the project.

Seeking Siri Authorization

In addition to enabling the Siri entitlement, the app must also seek authorization from the user to integrate the app with Siri. This is a two-step process which begins with the addition of an entry to the Info.plist file of the iOS app target for the NSSiriUsageDescription key with a corresponding string value explaining how the app makes use of Siri.

Select the Info.plist file located within the iOS folder in the project navigator panel as shown in Figure 46-5:

Figure 46-5

Once the file is loaded into the editor, locate the bottom entry in the list of properties and hover the mouse pointer over the item. When the plus button appears, click on it to add a new entry to the list. From within the drop-down list of available keys, locate and select the Privacy – Siri Usage Description option as shown in Figure 46-6:

Figure 46-6

Within the value field for the property, enter a message to display to the user when requesting permission to use Siri. For example:

Siri support is used to suggest shortcuts
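If preferred, the Info.plist file can also be edited as raw XML (right-click on the file in the project navigator and select Open As -> Source Code), in which case the entry appears as follows:

```xml
<key>NSSiriUsageDescription</key>
<string>Siri support is used to suggest shortcuts</string>
```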

In addition to adding the Siri usage description key, a call also needs to be made to the requestSiriAuthorization class method of the INPreferences class. Ideally, this call should be made the first time that the app runs, not only so that authorization can be obtained, but also so that the user learns that the app includes Siri support. For the purposes of this project, the call will be made within the onChange() modifier based on the scenePhase changes within the app declaration located in the ShortcutDemoApp.swift file as follows:

import SwiftUI
import Intents
 
@main
struct ShortcutDemoApp: App {
    
    @Environment(\.scenePhase) private var scenePhase
    
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .onChange(of: scenePhase) { phase in
            INPreferences.requestSiriAuthorization({ status in
                // Handle errors here
            })
        }
    }
}

Before proceeding, compile and run the app on an iOS device or simulator. When the app loads, a dialog will appear requesting authorization to use Siri. Select the OK button in the dialog to provide authorization.

Adding the Intents Extension

To add shortcut support, an intents extension will be needed for the Siri shortcuts associated with this app. Select the File -> New -> Target… menu option, followed by the Intents Extension option and click on the Next button. On the options screen, enter ShortcutDemoIntent into the product name field, change the Starting Point to None and make sure that the Include UI Extension option is enabled before clicking on the Finish button:

Figure 46-7

If prompted to do so, activate the ShortcutDemoIntent target scheme.

Adding the SiriKit Intent Definition File

Now that the intent extension has been added to the project, the SiriKit Intent Definition file needs to be added so that the intent can be configured. Right-click on the Shared folder in the project navigator panel and select New File… from the menu. In the template selection dialog scroll down to the Resource section and select the SiriKit Intent Definition File template followed by the Next button:

Figure 46-8

Keep the default name of Intents in the Save As: field, but make sure that the file is available to all of the targets in the project by enabling all of the options in the Targets section of the dialog before clicking on the Create button:

Figure 46-9

Adding the Intent to the App Group

The purchase history data will be shared between the main app and the intent using app storage. This requires that the App Group capability be added to the ShortcutDemoIntent target and enabled for the same container name as that used by the ShortcutDemo target. To achieve this, select the ShortcutDemo item at the top of the project navigator panel, switch to the Signing & Capabilities panel, select the ShortcutDemoIntent entry in the list of targets and add the App Group capability. Once added, make sure that the App Group name used by the ShortcutDemo target is selected:

Figure 46-10

Configuring the SiriKit Intent Definition File

Locate the Intents.intentdefinition file and select it to load it into the editor. The file is currently empty, so add a new intent by clicking on the ‘+’ button in the lower left-hand corner of the editor panel and selecting the New Intent menu option:

Figure 46-11

In the Custom Intents panel, rename the new intent to BuyStock as shown in Figure 46-12:

Figure 46-12

Next, change the Category setting in the Custom Intent section of the editor from “Do” to “Buy”, enter “ShortcutDemo” and “Buy stocks and shares” into the Title and Description fields respectively, and enable both the Configurable in Shortcuts and Siri Suggestions options. Since this is a buy category intent, the User confirmation required option is enabled by default and cannot be disabled:

Figure 46-13

Adding Intent Parameters

In order to complete a purchase, the intent is going to need two parameters in the form of the stock symbol and quantity. Remaining within the Intent Definition editor, use the ‘+’ button located beneath the Parameters section to add a parameter named symbol with the type set to String, the display name set to “Symbol”, and both the Configurable and Resolvable options enabled. Within the Siri Dialog section, enter “Specify a stock symbol” into the Prompt field:

Figure 46-14

Repeat the above steps to add a quantity parameter to the intent, setting the prompt to “Specify a quantity to purchase”.

Declaring Shortcut Combinations

A shortcut intent can be configured to handle different combinations of intent parameters. Each unique combination of parameters defines a shortcut combination. For each combination, the intent needs to know the phrase that Siri will speak when interacting with the user, which can contain embedded parameters. These need to be configured both for shortcut suggestions and for use within the Shortcuts app so that the shortcuts can be selected manually by the user. For this example, the only combination required involves both the symbol and quantity, which will have been added automatically within the Supported Combinations panel of the Shortcuts app section of the intents configuration editor screen.

Within the Supported Combinations panel, select the symbol, quantity parameter combination and begin typing into the Summary field. Type the word “Buy” followed by the first few letters of the word “quantity”. Xcode will notice that this could be a reference to the quantity parameter name and suggests it as an option to embed the parameter into the phrase as shown in Figure 46-15:

Figure 46-15

Select the parameter from the suggestion to embed it into the phrase, then continue typing so that the message reads “Buy quantity shares of symbol” where “symbol” is also an embedded parameter:

Figure 46-16

These combination settings will have been automatically duplicated under the Suggestions heading. The Supports background execution for suggestions option defines whether or not the app can handle the shortcut type in the background without having to be presented to the user for additional input. Make sure this option is enabled for this shortcut combination.

Configuring the Intent Response

The final area to be addressed within the Intent Definition file is the response handling. To view these settings, select the Response entry located beneath the BuyStock intent in the Custom Intents panel:

Figure 46-17

The first task is to declare the parameters that will be included in the response phrases. As with the intent configuration, add both the symbol and quantity parameters configured as Strings.

Next, select the success entry in the response templates code list:

Figure 46-18

Enter the following message into the Voice-Only Dialog field (making sure to insert the parameters for the symbol and quantity using the same technique used above for the combination summary settings):

Successfully purchased quantity symbol shares

Repeat this step to add the following template text to the failure code:

Sorry, could not purchase quantity shares of symbol

Behind the scenes, Xcode will take the information provided within the Intent Definition file and automatically generate new classes named BuyStockIntentHandling, BuyStockIntent and BuyStockIntentResponse, all of which will be used in the intent handling code. To make sure these files are generated before editing the code, select the Product -> Clean Build Folder menu option followed by Product -> Build.

Configuring Target Membership

Many of the classes and targets in the project are interdependent and need to be accessible to each other during both compilation and execution. To allow this access, a number of classes and files within the project need specific target membership settings. While some of these settings will have been set up correctly by default, others may need to be set up manually before testing the app. Begin by selecting the IntentHandler.swift file (located in the ShortcutDemoIntent folder) in the project navigator panel and display the File Inspector (View -> Inspectors -> File). In the file inspector panel, locate the Target Membership section and make sure that all targets are enabled as shown in Figure 46-19:

Figure 46-19

Repeat these steps for the Purchase.swift, PurchaseData.swift and PurchaseStore.swift files located in the Shared folder.

Modifying the Intent Handler Code

Now that the intent definition is complete and the classes have been auto-generated by Xcode, the intent handler needs to be modified to implement the BuyStockIntentHandling protocol. Edit the IntentHandler.swift file and make the following changes:

import Intents
 
class IntentHandler: INExtension, BuyStockIntentHandling {
    
    override func handler(for intent: INIntent) -> Any {
 
        guard intent is BuyStockIntent else {
            fatalError("Unknown intent type: \(intent)")
        }
 
        return self
    }
 
    func handle(intent: BuyStockIntent, 
       completion: @escaping (BuyStockIntentResponse) -> Void) {
        
    } 
}

The handler() method simply checks that the intent type is recognized and, if so, returns itself as the intent handler.

Next, add the resolution methods for the two supported parameters:

.
.
    func resolveSymbol(for intent: BuyStockIntent, with completion: @escaping (INStringResolutionResult) -> Void) {
        
        if let symbol = intent.symbol {
            completion(INStringResolutionResult.success(with: symbol))
        } else {
            completion(INStringResolutionResult.needsValue())
        }
    }
    
    func resolveQuantity(for intent: BuyStockIntent, with completion: @escaping (INStringResolutionResult) -> Void) {
        if let quantity = intent.quantity {
            completion(INStringResolutionResult.success(with: quantity))
        } else {
            completion(INStringResolutionResult.needsValue())
        }
    }
.
.

Code now needs to be added to the handle() method to perform the stock purchase. Since this will need access to the user defaults app storage, begin by making the following changes (replacing the placeholder text with your app group name):

import Intents
import SwiftUI
 
class IntentHandler: INExtension, BuyStockIntentHandling {
    
    @AppStorage("demostorage", store: UserDefaults(
      suiteName: "YOUR APP GROUP NAME HERE")) var store: Data = Data()
 
    var purchaseData = PurchaseData()
.
.

Before modifying the handle() method, add the following method to the IntentHandler.swift file which will be called to save the latest purchase to the app storage:

func makePurchase(symbol: String, quantity: String) -> Bool {
    
    var result: Bool = false
    let decoder = JSONDecoder()
    
    if let history = try? decoder.decode(PurchaseData.self, 
                                                 from: store) {
        purchaseData = history   
        result = purchaseData.saveTransaction(symbol: symbol, 
                                            quantity: quantity)
    }
    return result
}

The above method uses a JSON decoder to decode the data contained within the app storage (for a reminder about encoding and decoding app storage data, refer to the chapter entitled “SwiftUI Data Persistence using AppStorage and SceneStorage”). The result of this decoding is a PurchaseData instance, the saveTransaction() method of which is called to save the current purchase. Next, modify the handle() method as follows:

func handle(intent: BuyStockIntent,
   completion: @escaping (BuyStockIntentResponse) -> Void) {
 
    guard let symbol = intent.symbol,
            let quantity = intent.quantity
       else {
            completion(BuyStockIntentResponse(code: .failure,
                    userActivity: nil))
            return
    }
        
    let result = makePurchase(symbol: symbol, quantity: quantity)
        
    if result {
        completion(BuyStockIntentResponse.success(quantity: quantity,
                                symbol: symbol))
    } else {
        completion(BuyStockIntentResponse.failure(quantity: quantity,
                                symbol: symbol))
    }    
}

When called, the method is passed a BuyStockIntent intent instance and a completion handler to be called when the purchase is completed. The method begins by extracting the symbol and quantity parameter values from the intent object:

guard let symbol = intent.symbol,
       let quantity = intent.quantity
   else {
       completion(BuyStockIntentResponse(code: .failure,
                    userActivity: nil))
       return
}

These values are then passed through to the makePurchase() method to perform the purchase transaction. Finally, the result returned by the makePurchase() method is used to select the appropriate response to be passed to the completion handler. In each case, the appropriate parameters are passed to the completion handler for inclusion in the response template:

let result = makePurchase(symbol: symbol, quantity: quantity)
    
if result {
    completion(BuyStockIntentResponse.success(quantity: quantity,
                            symbol: symbol))
} else {
    completion(BuyStockIntentResponse.failure(quantity: quantity,
                            symbol: symbol))
}

Adding the Confirm Method

To fully conform with the BuyStockIntentHandling protocol, the IntentHandler class also needs to contain a confirm() method. As outlined in the SiriKit introductory chapter, this method is called by Siri to check that the handler is ready to handle the intent. All that is needed for this example is for the confirm() method to provide Siri with a ready status as follows:

public func confirm(intent: BuyStockIntent, 
    completion: @escaping (BuyStockIntentResponse) -> Void) {
    
    completion(BuyStockIntentResponse(code: .ready, userActivity: nil))
}

Donating Shortcuts to Siri

Each time the user successfully completes a stock purchase within the main app the action needs to be donated to Siri as a potential shortcut. The code to make these donations should now be added to the PurchaseView.swift file in a method named makeDonation(), which also requires that the Intents framework be imported:

import SwiftUI
import Intents
 
struct PurchaseView: View {
.
.
 
        .onAppear() {
            purchaseData.refresh()
          
        }
    }
    
    func makeDonation(symbol: String, quantity: String) {
        let intent = BuyStockIntent()
        
        intent.quantity = quantity
        intent.symbol = symbol
        intent.suggestedInvocationPhrase = "Buy \(quantity) \(symbol)"
        
        let interaction = INInteraction(intent: intent, response: nil)
        
        interaction.donate { (error) in
            if let error = error {
                print("Donation failed: \(error.localizedDescription)")
            } else {
                print("Successfully donated interaction")
            }
        }
    }
.
.
}

The method is passed string values representing the stock and quantity of the purchase. A new BuyStockIntent instance is then created and populated with both these values and a suggested activation phrase containing both the quantity and symbol. Next, an INInteraction object is created using the BuyStockIntent instance and the donate() method of the object called to make the donation. The success or otherwise of the donation is then output to the console for diagnostic purposes.

The donation will only be made after a successful purchase has been completed, so add the call to makeDonation() after the saveTransaction() call in the buyStock() method:

private func buyStock() {
    if (symbol == "" || quantity == "") {
        status = "Please enter a symbol and quantity"
    } else {
        if purchaseData.saveTransaction(symbol: symbol, 
                                           quantity: quantity) {
            status = "Purchase completed"
            makeDonation(symbol: symbol, quantity: quantity)
        }
    }
}

Testing the Shortcuts

Before running and testing the app, some settings on the target device or simulator need to be changed in order to be able to fully test the shortcut functionality. To enable these settings, open the Settings app on the device or simulator on which you intend to test the app, select the Developer option and locate and enable the Display Recent Shortcuts and Display Donations on Lock Screen options as shown in Figure 46-20:

Figure 46-20

These settings will ensure that newly donated shortcuts always appear in Siri search and on the lock screen rather than relying on Siri to predict when the shortcuts should be suggested to the user.

With the settings changed, run the ShortcutDemo app target and make a stock purchase (for example buy 75 IBM shares). After the purchase is complete, check the Xcode console to verify that the “Successfully donated interaction” message appeared.

Next, locate the built-in iOS Shortcuts app on the device home screen as highlighted in Figure 46-21 below and tap to launch it:

Figure 46-21

Within the Shortcuts app, select the Gallery tab where the donated shortcut should appear as shown in Figure 46-22 below:

Figure 46-22

Click on the ‘+’ button to the right of the shortcut title to display the Add to Siri screen (Figure 46-23). Note that “Buy 75 IBM” is suggested as the activation phrase, as configured when the donation was made in the makeDonation() method. To change the phrase, delete the current “When I say” setting and enter a different phrase:

Figure 46-23

Click the Add to Siri button to add the shortcut and return to the Gallery screen. Select the My Shortcuts tab and verify that the new shortcut has been added:

Figure 46-24

Press and hold the Home button to launch Siri and speak the shortcut phrase. Siri will seek confirmation that the purchase is to be made. After completing the purchase, Siri will use the success response template declared in the Intent Definition file to confirm that the transaction was successful.

After making a purchase using the shortcut, open the ShortcutDemo app and verify that the transaction appears in the transaction history (keeping in mind that it may be necessary to switch between the Buy and History screens before the purchase appears due to the previously mentioned SwiftUI bug).

Designing the Intent UI

When the shortcut was tested, the intent UI will have appeared as a large empty space. Clearly some additional steps are required before the shortcut is complete. Begin by selecting the MainInterface.storyboard file located in the ShortcutDemoIntentUI folder in the project navigator so that it loads into Interface Builder.

Add a Label to the layout by clicking on the button marked A in Figure 46-25 below and dragging and dropping a Label object from the Library (B) onto the layout canvas as indicated by the arrow:

Figure 46-25

Next, the Label needs to be constrained so that it has a 5-point margin between the leading, trailing and top edges of the parent view. With the Label selected in the canvas, click on the Add New Constraints button located in the bottom right-hand corner of the editor to display the menu shown in Figure 46-26 below:

Figure 46-26

Enter 5 into the top, left and right boxes and click on the I-beam icons next to each value so that they are displayed in solid red instead of dashed lines before clicking on the Add 3 Constraints button.

Before proceeding to the next step, establish an outlet connection from the Label component to a variable in the IntentViewController.swift file named contentLabel. This will allow the view controller to change the text displayed on the Label to reflect the intent content parameter. This is achieved using the Assistant Editor which is displayed by selecting the Xcode Editor -> Assistant menu option. Once displayed, Ctrl-click on the Label in the canvas and drag the resulting line to a position in the Assistant Editor immediately above the viewDidLoad() method declaration:

Figure 46-27

On releasing the line, the dialog shown in Figure 46-28 will appear. Enter contentLabel into the Name field before clicking on Connect to establish the outlet.

Figure 46-28

On completion of these steps, the outlets should appear in the IntentViewController.swift file as follows:

class IntentViewController: UIViewController, INUIHostedViewControlling {
    
    @IBOutlet weak var contentLabel: UILabel!
.
.

Edit the IntentViewController.swift file and modify the configureView() method and the desiredSize variable so that the code reads as follows:

func configureView(for parameters: Set<INParameter>, 
  of interaction: INInteraction, 
  interactiveBehavior: INUIInteractiveBehavior, 
  context: INUIHostedViewContext, 
  completion: @escaping (Bool, Set<INParameter>, CGSize) -> Void) {
    
    guard let intent = interaction.intent as? BuyStockIntent else {
        completion(false, Set(), .zero)
        return
    }
    
    if let symbol = intent.symbol, let quantity = intent.quantity {
        self.contentLabel.text = "Buy \(quantity) \(symbol) shares?"
    }
    
    completion(true, parameters, self.desiredSize)
}
 
var desiredSize: CGSize {
    return CGSize(width: 10, height: 100)
}

Re-build and run the app, then use Siri to trigger the shortcut. This time the intent UI will contain text describing the purchase to be made:

Figure 46-29

Summary

This chapter has provided a practical demonstration of how to integrate Siri shortcut support into a SwiftUI app. This included the creation and configuration of an Intent Definition file, the addition of a custom intent extension and the implementation of intent handling code.

An Overview of SwiftUI Siri Shortcut Integration

When SiriKit was first introduced with iOS 10, an app had to fit neatly into one of the SiriKit domains covered in the chapter entitled An Introduction to SiriKit in order to integrate with Siri. In iOS 12, however, SiriKit was extended to allow an app of any type to make key features available for access via Siri voice commands, otherwise known as Siri Shortcuts. This chapter will provide a high level overview of Siri Shortcuts and the steps involved in turning app features into Siri Shortcuts.

An Overview of Siri Shortcuts

A Siri shortcut is essentially a commonly used feature of an app that can be invoked by the user via a chosen phrase. The app for a fast-food restaurant might, for example, allow the user to order a favorite lunch item by simply using the phrase “Order Lunch” from within Siri. Once a shortcut has been configured, iOS learns the usage patterns of the shortcut and will begin to place that shortcut in the Siri Suggestions area on the device at appropriate times of the day. If the user uses the lunch ordering shortcut at lunch times on weekdays, the system will suggest the shortcut at that time of day.

A shortcut can be configured within an app by providing the user with an Add to Siri button at appropriate places in the app. Our hypothetical restaurant app might, for example, include an Add to Siri button on the order confirmation page which, when selected, will allow the user to add that order as a shortcut and provide a phrase to Siri with which to launch the shortcut (and order the same lunch) in future.

An app may also suggest (a concept referred to as donating) a feature or user activity as a possible shortcut candidate which the user can then turn into a shortcut via the iOS Shortcuts app.

As with the SiriKit domains, the key element of a Siri Shortcut is an intent extension. Unlike domain-based extensions such as messaging and photo search, which strictly define the parameters and behavior of the intent, Siri shortcut intents are customizable to meet the specific requirements of the app. A restaurant app, for example, would include a custom shortcut intent designed to handle information such as the menu items ordered, the method of payment and the pick-up or delivery location. Unlike SiriKit domain extensions, however, the parameters for the intent are stored when the shortcut is created and are not requested from the user by Siri when the shortcut is invoked. A shortcut to order the user’s usual morning coffee, for example, will already be configured with the coffee item and pickup location, both of which will be passed to the intent handler by Siri when the shortcut is triggered.

The intent will also need to be configured with custom responses such as notifying the user through Siri that the shortcut was successfully completed, a particular item in an order is currently unavailable or that the pickup location is currently closed.

The key component of a Siri Shortcut in terms of specifying the parameters and responses is the SiriKit Intent Definition file.

An Introduction to the Intent Definition File

When a SiriKit extension such as a Messaging Extension is added to an iOS project, it makes use of a pre-defined system intent provided by SiriKit (in the case of a Messaging extension, for example, this might involve the INSendMessageIntent). In the case of Siri shortcuts, however, a custom intent is created and configured to fit with the requirements of the app. The key to creating a custom intent lies within the Intent Definition file.

An Intent Definition file can be added to an Xcode project by selecting the Xcode File -> New -> File… menu option and choosing the SiriKit Intent Definition File option from the Resource section of the template selection dialog.

Once created, Xcode provides an editor specifically for adding and configuring custom intents (in fact the editor may also be used to create customized variations of the system intents such as the Messaging, Workout and Payment intents). New intents are added by clicking on the ‘+’ button in the lower left corner of the editor window and making a selection from the menu:

Figure 45-1

Selecting the Customize System Intent option will provide a list of system intents from which to choose while the New Intent option creates an entirely new intent with no pre-existing configuration settings. Once a new custom intent has been created, it appears in the editor as shown in Figure 45-2:

Figure 45-2

The section marked A in the above figure lists the custom intents contained within the file. The Custom Intent section (B) is where the category of the intent is selected from a list of options including categories such as order, buy, do, open, search etc. Since this example is an intent for ordering food, the category is set to Order. This section also includes the title of the intent, a description and an optional image to include with the shortcut. A checkbox is also provided to require Siri to seek confirmation from the user before completing the shortcut. For some intent categories such as buying and ordering this option is mandatory.

The parameters section (C) allows the parameters that will be passed to the intent to be declared. If the custom intent is based on an existing SiriKit system intent, this area will be populated with all of the parameters the intent is already configured to handle and the settings cannot be changed. For a new custom intent, as many parameters as necessary may be declared and configured in terms of name, type and whether or not the parameter is an array.

Finally, the Shortcut Types section (D) allows different variations of the shortcut to be configured. A shortcut type is defined by the combination of parameters used within the shortcut. For each combination of parameters that the shortcut needs to support, settings are configured including the title of the shortcut and whether or not the host app can perform the associated task in the background without presenting a user interface. Each intent can have as many parameter combinations as necessary to provide a flexible user shortcut experience.

The Custom Intents panel (A) also lists a Response entry under the custom intent name. When selected, the editor displays the screen shown in Figure 45-3:

Figure 45-3

Keep in mind that a Siri shortcut involves a two-way interaction between Siri and the user and that to maintain this interaction Siri needs to know how to respond to the user based on the results of the intent execution. This is defined through a range of response templates which define the phrases to be spoken by Siri when the intent returns specific codes. These response settings also allow parameters to be configured for inclusion in the response phrases. A restaurant app might, for example, have a response code named failureStoreClosed, which results in Siri responding with the phrase “We are sorry, we cannot complete your cheeseburger order because the store is closed.”, and a success response code with the phrase “Your order of two coffees will be ready for pickup in 20 minutes”.

Once the Intent Definition file has been configured, Xcode will automatically generate a set of class files ready for use within the intent handler code of the extension.

Automatically Generated Classes

The purpose of the Intent Definition file is to allow Xcode to generate a set of classes that will be used when implementing the shortcut extension. Assuming that a custom intent named OrderFood was added to the definition file, the following classes would be automatically generated by Xcode:

  • OrderFoodIntent – The intent object that encapsulates all of the parameters declared in the definition file. An instance of this class will be passed by Siri to the handler(), handle() and confirm() methods of the extension intent handler configured with the appropriate parameter values for the current shortcut.
  • OrderFoodIntentHandling – Defines the protocol to which the intent handler must conform in order to be able to fully handle food ordering intents.
  • OrderFoodIntentResponse – A class encapsulating the response codes, templates and parameters declared for the intent in the Intent Definition file. The intent handler will use this class to return response codes and parameter values to Siri so that the appropriate response can be communicated to the user.

Use of these classes within the intent handler will be covered in the next chapter entitled A SwiftUI Siri Shortcut Tutorial.
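For illustration, the shape of the generated code is broadly along the following lines. This is a simplified approximation only; the actual Xcode-generated classes contain additional initializers, parameter resolution methods and response case enumerations:

```swift
import Intents

// Simplified approximation of Xcode's generated code for an
// OrderFood custom intent (the real output contains much more).
public class OrderFoodIntent: INIntent {
    @NSManaged public var menuItem: String?
    @NSManaged public var quantity: NSNumber?
}

public protocol OrderFoodIntentHandling: NSObjectProtocol {
    // Called to perform the food order
    func handle(intent: OrderFoodIntent,
                completion: @escaping (OrderFoodIntentResponse) -> Void)
    // Called by Siri to check the handler is ready to proceed
    func confirm(intent: OrderFoodIntent,
                 completion: @escaping (OrderFoodIntentResponse) -> Void)
}

public class OrderFoodIntentResponse: INIntentResponse {
    // The response codes declared in the Intent Definition file
    // (for example success or failureStoreClosed) are exposed via
    // convenience initializers and class methods on this class.
}
```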

Donating Shortcuts

An app typically donates a shortcut when a common action is performed by the user. This donation does not automatically turn the activity into a shortcut, but includes it as a suggested shortcut within the iOS Shortcuts app. A donation is made by calling the donate() method of an INInteraction instance which has been initialized with a shortcut intent object. For example:

let intent = OrderFoodIntent()
 
intent.menuItem = "Cheeseburger"
intent.quantity = 1
intent.suggestedInvocationPhrase = "Order Lunch"
 
let interaction = INInteraction(intent: intent, response: nil)
 
interaction.donate { (error) in
    if error != nil {
        // Handle donation failure
    }
}

The Add to Siri Button

The Add to Siri button allows a shortcut to be added to Siri directly from within an app. This involves writing code to create an INUIAddVoiceShortcutButton instance, initializing it with a shortcut intent object with shortcut parameters configured and then adding it to a user interface view. A target method is then added to the button to be called when the button is clicked.

As of iOS 14, the Add to Siri button has not been integrated directly into SwiftUI, requiring the integration of UIKit into the SwiftUI project.
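The following sketch shows one way the button might be wrapped for use in SwiftUI. This is an illustration only: the AddToSiriButton name is an assumption, and presenting the add-shortcut flow when the button is tapped additionally requires an INUIAddVoiceShortcutButtonDelegate, which is omitted here for brevity:

```swift
import SwiftUI
import Intents
import IntentsUI

// Sketch: a SwiftUI wrapper around the UIKit Add to Siri button.
struct AddToSiriButton: UIViewRepresentable {

    // The configured shortcut intent to be added to Siri
    let intent: INIntent

    func makeUIView(context: Context) -> INUIAddVoiceShortcutButton {
        let button = INUIAddVoiceShortcutButton(style: .black)
        // Associate the button with a shortcut built from the intent
        button.shortcut = INShortcut(intent: intent)
        return button
    }

    func updateUIView(_ uiView: INUIAddVoiceShortcutButton,
                      context: Context) {
        // No dynamic updates required for this sketch
    }
}
```

A view of this type could then be embedded in a SwiftUI layout, for example as AddToSiriButton(intent: OrderFoodIntent()) using a configured intent instance.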

Summary

Siri shortcuts allow commonly performed tasks within apps to be invoked using Siri via a user provided phrase. When added, a shortcut contains all of the parameter values needed to complete the task within the app together with templates defining how Siri should respond based on the status reported by the intent handler. A shortcut is implemented by creating a custom intent extension and configuring an Intent Definition file containing all of the information regarding the intent, parameters and responses. From this information, Xcode generates all of the class files necessary to implement the shortcut. Shortcuts can be added to Siri either via an Add to Siri button within the host app, or by the app donating suggested shortcuts. A list of donated shortcuts can be found in the iOS Shortcuts app.

A SwiftUI SiriKit NSUserActivity Tutorial

In this chapter, an example project will be created that uses the Photo domain of SiriKit to allow the user, via Siri voice commands, to search for and display a photo taken on a specified date. In the process of designing this app, the tutorial will also demonstrate the use of the NSUserActivity class to allow processing of the intent to be transferred from the Intents Extension to the main iOS app.

About the SiriKit Photo Search Project

The project created in this tutorial is going to take the form of an app that uses the SiriKit Photo Search domain to locate photos in the Photo library. Specifically, the app will allow the user to use Siri to search for photos taken on a specific date. In the event that photos matching the date criteria are found, the main app will be launched and used to display the first photo taken on the chosen day.

Creating the SiriPhoto Project

Begin this tutorial by launching Xcode and selecting the options to create a new Multiplatform App project named SiriPhoto.

Enabling the Siri Entitlement

Once the main project has been created the Siri entitlement must be enabled for the project. Select the SiriPhoto target located at the top of the Project Navigator panel (marked A in Figure 44-1) so that the main panel displays the project settings. From within this panel, select the Signing & Capabilities tab (B) followed by the SiriPhoto target entry (C):

Figure 44-1

Click on the “+ Capability” button (D) to display the dialog shown in Figure 44-2 below. Enter Siri into the filter bar, select the result and press the keyboard enter key to add the capability to the project:

Figure 44-2

Seeking Siri Authorization

In addition to enabling the Siri entitlement, the app must also seek authorization from the user to integrate the app with Siri. This is a two-step process which begins with the addition of an entry to the Info.plist file of the iOS app target for the NSSiriUsageDescription key with a corresponding string value explaining how the app makes use of Siri.

Select the Info.plist file located within the iOS folder in the project navigator panel as shown in Figure 44-3:

Figure 44-3

Once the file is loaded into the editor, locate the bottom entry in the list of properties and hover the mouse pointer over the item. When the ‘+’ button appears, click on it to add a new entry to the list. From within the drop-down list of available keys, locate and select the Privacy – Siri Usage Description option as shown in Figure 44-4:

Figure 44-4

Within the value field for the property, enter a message to display to the user when requesting permission to use Siri. For example:

Siri support is used to search for and display photo library images.

Repeat the above steps to add a Privacy – Photo Library Usage Description entry set to the following so that the app is able to request photo library access permission from the user:

This app accesses your photo library to search and display photos.

In addition to adding the Siri usage description key, a call also needs to be made to the requestSiriAuthorization class method of the INPreferences class. Ideally, this call should be made the first time that the app runs, not only so that authorization can be obtained, but also so that the user learns that the app includes Siri support. For the purposes of this project, the call will be made within the onChange() modifier based on the scenePhase changes within the app declaration located in the SiriPhotoApp.swift file as follows:

import SwiftUI
import Intents
 
@main
struct SiriPhotoApp: App {
    
    @Environment(\.scenePhase) private var scenePhase
    
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .onChange(of: scenePhase) { phase in
            INPreferences.requestSiriAuthorization({status in
                // Handle errors here
            })
        }
    }
}

Before proceeding, compile and run the app on an iOS device or simulator. When the app loads, a dialog will appear requesting authorization to use Siri. Select the OK button in the dialog to provide authorization.

Adding an Image Asset

The completed app will need an image to display when no matching photo is found for the search criteria. This image is named image-missing.png and can be found in the project_images folder of the source code download archive available from the following URL:

https://www.ebookfrenzy.com/code/SwiftUI-iOS14-CodeSamples.zip

Within the Xcode project navigator, locate and select the Assets.xcassets file located in the Shared folder. In a separate Finder window, locate the project_images folder from the sample code and drag and drop the image into the asset catalog as shown in Figure 44-5 below:

Figure 44-5

Adding the Intents Extension to the Project

With some of the initial work on the iOS app complete, it is now time to add the Intents Extension to the project. Select Xcode’s File -> New -> Target… menu option to display the template selection screen. From the range of available templates, select the Intents Extension option as shown in Figure 44-6:

Figure 44-6

With the Intents Extension template selected, click on the Next button and enter SiriPhotoIntent into the Product Name field. Before clicking on the Finish button, turn off the Include UI Extension option and make sure that the Starting Point is set to None since this extension will not be based on the Messaging domain. When prompted to do so, enable the build scheme for the Intents Extension by clicking on the Activate button in the resulting panel.

Reviewing the Default Intents Extension

The files for the Intents Extension are located in the SiriPhotoIntent folder which will now be accessible from within the Project Navigator panel. Within this folder are an Info.plist file and a file named IntentHandler.swift.

The IntentHandler.swift file contains the IntentHandler class declaration which currently only contains a stub handler() method.

Modifying the Supported Intents

Currently we have an app which is intended to search for photos but for which no supported intents have been declared. Clearly some changes need to be made to implement the required functionality.

The first step is to configure the Info.plist file for the SiriPhotoIntent extension. Select this file and unfold the NSExtension settings until the IntentsSupported array is visible:

Figure 44-7

Currently the array does not contain any supported intents. Add a photo search intent to the array by clicking on the + button indicated by the arrow in the above figure and entering INSearchForPhotosIntent into the newly created Item 0 value field. On completion of these steps the array should match that shown in Figure 44-8:

Figure 44-8

Modifying the IntentHandler Implementation

The IntentHandler class now needs to be updated to add support for Siri photo search intents. Edit the IntentHandler.swift file and change the class declaration so it reads as follows:

import Intents
import Photos
 
class IntentHandler: INExtension, INSearchForPhotosIntentHandling {
 
    override func handler(for intent: INIntent) -> Any {
        
        return self
    }
}

The only method currently implemented within the IntentHandler.swift file is the handler method. This method is the entry point into the extension and is called by SiriKit when the user indicates that the SiriPhoto app is to be used to perform a task. When calling this method, SiriKit expects in return a reference to the object responsible for handling the intent. Since this will be the responsibility of the IntentHandler class, the handler method simply returns a reference to itself.

Implementing the Resolve Methods

SiriKit is aware of a range of parameters which can be used to specify photo search criteria. These parameters consist of the photo creation date, the geographical location where the photo was taken, the people in the photo and album in which it resides. For each of these parameters, SiriKit will call a specific resolve method on the IntentHandler instance. Each method is passed the current intent object and is required to notify Siri whether or not the parameter is required and, if so, whether the intent contains a valid property value. The methods are also passed a completion handler reference which must be called to notify Siri of the response.

The first method called by Siri is the resolveDateCreated method which should now be implemented in the IntentHandler.swift file as follows:

func resolveDateCreated(for
    intent: INSearchForPhotosIntent,
    with completion: @escaping
        (INDateComponentsRangeResolutionResult) -> Void) {
 
    if let dateCreated = intent.dateCreated {
        completion(INDateComponentsRangeResolutionResult.success(
            with: dateCreated))
    } else {
        completion(INDateComponentsRangeResolutionResult.needsValue())
    }
}

The method verifies that the dateCreated property of the intent object contains a value. In the event that it does, the completion handler is called indicating to Siri that the date requirement has been successfully met within the intent. In this situation, Siri will call the next resolve method in the sequence.

If no date has been provided the completion handler is called indicating the property is still needed. On receiving this response, Siri will ask the user to provide a date for the photo search. This process will repeat until either a date is provided or the user abandons the Siri session.

The SiriPhoto app is only able to search for photos by date. The remaining resolver methods can, therefore, be implemented simply to return notRequired results to Siri. This will let Siri know that values for these parameters do not need to be obtained from the user. Remaining within the IntentHandler.swift file, implement these methods as follows:

func resolveAlbumName(for intent: INSearchForPhotosIntent, 
    with completion: @escaping (INStringResolutionResult) -> Void) {
    completion(INStringResolutionResult.notRequired())
}
 
func resolvePeopleInPhoto(for intent: 
     INSearchForPhotosIntent, with completion: @escaping ([INPersonResolutionResult]) -> Void) {
    completion([INPersonResolutionResult.notRequired()])
}
 
func resolveLocationCreated(for intent: 
    INSearchForPhotosIntent, with completion: @escaping (INPlacemarkResolutionResult) -> Void) {
        completion(INPlacemarkResolutionResult.notRequired())
}

With these methods implemented, the resolution phase of the intent handling process is now complete.

Implementing the Confirmation Method

When Siri has gathered the necessary information from the user, a call is made to the confirm method of the intent handler instance. The purpose of this call is to provide the handler with an opportunity to check that everything is ready to handle the intent. In the case of the SiriPhoto app, there are no special requirements so the method can be implemented to reply with a ready status:

func confirm(intent: INSearchForPhotosIntent, 
    completion: @escaping (INSearchForPhotosIntentResponse) -> Void)
{
    let response = INSearchForPhotosIntentResponse(code: .ready, 
        userActivity: nil)
    completion(response)
}

Handling the Intent

The last step in implementing the extension is to handle the intent. After the confirm method indicates that the extension is ready, Siri calls the handle method. This method is, once again, passed the intent object and a completion handler to be called when the intent has been handled by the extension. Implement this method now so that it reads as follows:

func handle(intent: INSearchForPhotosIntent, completion: @escaping
    (INSearchForPhotosIntentResponse) -> Void) {
    
    let activityType = "com.ebookfrenzy.siriphotointent"
    let activity = NSUserActivity(activityType: activityType)
    
    let response = INSearchForPhotosIntentResponse(code:
        INSearchForPhotosIntentResponseCode.continueInApp,
                                             userActivity: activity)
    
    if intent.dateCreated != nil {
        let calendar = Calendar(identifier: .gregorian)
        
        if let startComponents = intent.dateCreated?.startDateComponents,
            let endComponents = intent.dateCreated?.endDateComponents {
            
            if let startDate = calendar.date(from:
                startComponents),
                let endDate = calendar.date(from:
                    endComponents) {
                
                response.searchResultsCount = 
                   photoSearchFrom(startDate, to: endDate)
            }
        }
    }
    completion(response)
}

The above code requires some explanation. The method is responsible for constructing the intent response object containing the NSUserActivity object which will be handed off to the SiriPhoto app. The method begins by creating a new NSUserActivity instance configured with a type as follows:

let activityType = "com.ebookfrenzy.siriphotointent"
let activity = NSUserActivity(activityType: activityType)

The activity type can be any string value but generally takes the form of the app or extension name and company reverse domain name. Later in the chapter, this type name will need to be added as a supported activity type to the Info.plist file for the SiriPhoto app and referenced in the App declaration so that SiriPhoto knows which intent triggered the app launch.

Next, the method creates a new intent response instance and configures it with a code to let Siri know that the intent handling will be continued within the main SiriPhoto app. The intent response is also initialized with the NSUserActivity instance created previously:

let response = INSearchForPhotosIntentResponse(code:
                    INSearchForPhotosIntentResponseCode.continueInApp,
                               userActivity: activity)

The code then converts the start and end dates from DateComponents objects to Date objects and calls a method named photoSearchFrom(to:) to confirm that photo matches are available for the specified date range. The photoSearchFrom(to:) method (which will be implemented next) returns a count of the matching photos. This count is then assigned to the searchResultsCount property of the response object, which is then returned to Siri via the completion handler:

    if let startComponents = intent.dateCreated?.startDateComponents,
        let endComponents = intent.dateCreated?.endDateComponents {
 
        if let startDate = calendar.date(from: startComponents),
            let endDate = calendar.date(from: endComponents) {
 
            response.searchResultsCount = photoSearchFrom(startDate,
                                to: endDate)
        }
    }
 
    completion(response)

If the extension returns a zero count via the searchResultsCount property of the response object, Siri will notify the user that no photos matched the search criteria. If one or more photo matches were found, Siri will launch the main SiriPhoto app and pass it the NSUserActivity object.

The final step in implementing the extension is to add the photoSearchFrom(to:) method to the IntentHandler.swift file:

func photoSearchFrom(_ startDate: Date, to endDate: Date) -> Int {
 
    let fetchOptions = PHFetchOptions()
 
    fetchOptions.predicate = NSPredicate(format: "creationDate > %@ AND creationDate < %@", startDate as CVarArg, endDate as CVarArg)
    let fetchResult = PHAsset.fetchAssets(with: PHAssetMediaType.image, 
                           options: fetchOptions)
    return fetchResult.count
}

The method makes use of the standard iOS Photos Framework to perform a search of the Photo library. It begins by creating a PHFetchOptions object. A predicate is then initialized and assigned to the fetchOptions instance specifying that the search is looking for photos taken between the start and end dates. Finally, the search for matching images is initiated, and the resulting count of matches returned.

Testing the App

Though there is still some work to be completed for the main SiriPhoto app, the Siri extension functionality is now ready to be tested. Within Xcode, make sure that SiriPhotoIntent is selected as the current target and click on the run button. When prompted for a host app, select Siri and click the run button. When Siri has started listening, say the following:

“Find a photo with SiriPhoto”

Siri will respond by seeking the day for which you would like to find a photo. After you specify a date, Siri will either launch the SiriPhoto app if photos exist for that day, or state that no photos could be found. Note that the first time a photo is requested the privacy dialog will appear seeking permission to access the photo library.

Provide permission when prompted and then repeat the photo search request.

Adding a Data Class to SiriPhoto

When SiriKit launches the SiriPhoto app in response to a successful photo search, it will pass the app an NSUserActivity instance. The app will need to handle this activity and use the intent response it contains to extract the matching photo from the library. The photo image will, in turn, need to be stored as a published observable property so that the content view is always displaying the latest photo. These tasks will be performed in a new Swift class declaration named PhotoHandler.

Add this new class to the project by right-clicking on the Shared folder in the project navigator panel and selecting the New File… menu option. In the template selection panel, choose the Swift File option before clicking on the Next button. Name the new class PhotoHandler and click on the Create button. With the PhotoHandler.swift file loaded into the code editor, modify it as follows:

import SwiftUI
import Combine
import Intents
import Photos
 
class PhotoHandler: ObservableObject {
    
    @Published var image: UIImage?
    var userActivity: NSUserActivity
    
    init (userActivity: NSUserActivity) {
        
        self.userActivity = userActivity
        self.image = UIImage(named: "image-missing")
        
    }
}

The above changes declare an observable class containing UIImage and NSUserActivity properties. The image property is declared as being published and will be observed by the content view later in the tutorial.

The class initializer stores the NSUserActivity instance passed through when the class is instantiated and assigns the missing image icon to the image property so that it will be displayed if there is no matching image from SiriKit.

Next, the class needs a method which can be called by the app to extract the photo from the library. Remaining in the PhotoHandler.swift file, add this method to the class as follows:

func handleActivity() {
    
    let intent = userActivity.interaction?.intent
        as! INSearchForPhotosIntent
    
    if (intent.dateCreated?.startDateComponents) != nil {
        let calendar = Calendar(identifier: .gregorian)
        let startDate = calendar.date(from:
            (intent.dateCreated?.startDateComponents)!)
        let endDate = calendar.date(from:
            (intent.dateCreated?.endDateComponents)!)
        getPhoto(startDate!, endDate!)
    }
}

The handleActivity() method extracts the intent from the user activity object and then converts the start and end dates to Date objects. These dates are then passed to the getPhoto() method which now also needs to be added to the class:

func getPhoto(_ startDate: Date, _ endDate: Date){
    
    let fetchOptions = PHFetchOptions()
    
    fetchOptions.predicate = NSPredicate(
         format: "creationDate > %@ AND creationDate < %@", 
                  startDate as CVarArg, endDate as CVarArg)
    let fetchResult = PHAsset.fetchAssets(with:
        PHAssetMediaType.image, options: fetchOptions)
    
    let imgManager = PHImageManager.default()
    
    if let firstObject = fetchResult.firstObject {
        imgManager.requestImage(for: firstObject as PHAsset,
                                targetSize: CGSize(width: 500, 
                                                    height: 500),
                                contentMode: 
                                     PHImageContentMode.aspectFill,
                                options: nil,
                                resultHandler: { (image, _) in
                                    self.image = image
        })
    }
}

The getPhoto() method performs the same steps used by the intent handler to search the Photo library based on the search date parameters. Once the search results have returned, however, the PHImageManager instance is used to retrieve the image from the library and assign it to the published image variable.

Designing the Content View

The user interface for the app is going to consist of a single Image view on which will be displayed the first photo taken on the day chosen by the user via Siri voice commands. Edit the ContentView.swift file and modify it so that it reads as follows:

import SwiftUI
 
struct ContentView: View {
 
    @StateObject var photoHandler: PhotoHandler
    
    var body: some View {
        Image(uiImage: photoHandler.image!)
            .resizable()
            .aspectRatio(contentMode: .fit)
            .padding()
    }
}
 
struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ContentView(photoHandler: PhotoHandler(userActivity: 
              NSUserActivity(activityType: "Placeholder")))
    }
}

The changes simply add a PhotoHandler state object variable declaration, the image property of which is used to display an image on an Image view. The preview declaration is then adapted to pass a PhotoHandler instance to the content view, initialized with a placeholder NSUserActivity object. Steps also need to be taken to pass a placeholder PhotoHandler instance to the content view within the SiriPhotoApp.swift file as follows:

import SwiftUI
import Intents
 
@main
struct SiriPhotoApp: App {
 
    @Environment(\.scenePhase) private var scenePhase
    var photoHandler: PhotoHandler = 
        PhotoHandler(userActivity: NSUserActivity(activityType: "Placeholder"))
    
    var body: some Scene {
        WindowGroup {
            ContentView(photoHandler: photoHandler)
        }
        .onChange(of: scenePhase) { phase in
            INPreferences.requestSiriAuthorization({status in
                // Handle errors here
            })
        }
    }
}

When previewed, the ContentView layout should be rendered as shown in the figure below:

Figure 44-9

Adding Supported Activity Types to SiriPhoto

When the intent handler was implemented earlier in the chapter, the NSUserActivity object containing the photo search information was configured with an activity type string. In order for the SiriPhoto app to receive the activity, the type must be declared using the NSUserActivityTypes property in the app’s iOS Info.plist file. Within the project navigator panel, select the Info.plist file located in the iOS folder. Hover the mouse pointer over the last entry in the property list and click on the ‘+’ button to add a new property. In the Key field, enter NSUserActivityTypes and change the Type setting to Array as shown in Figure 44-10:

Figure 44-10

Click on the ‘+’ button indicated by the arrow above to add a new item to the array. Set the value for Item 0 to com.ebookfrenzy.siriphotointent so that it matches the type assigned to the user activity instance:

Figure 44-11

Handling the NSUserActivity Object

The intent handler in the extension has instructed Siri to continue the intent handling process by launching the main SiriPhoto app. When the app is launched by Siri it will be provided the NSUserActivity object for the session containing the intent object. When an app is launched and passed an NSUserActivity object it can be accessed from within the App declaration by adding the onContinueUserActivity() modifier to the ContentView, passing through the activity type and defining the actions to be performed. Within the SiriPhotoApp.swift file, implement these changes as follows:

import SwiftUI
 
@main
struct SiriPhotoApp: App {
    
    var photoHandler: PhotoHandler = PhotoHandler(userActivity: 
        NSUserActivity(activityType: "Placeholder"))
    
    var body: some Scene {
        WindowGroup {
            ContentView(photoHandler: photoHandler)
                .onContinueUserActivity(
                       "com.ebookfrenzy.siriphotointent", 
                perform: { userActivity in
                    photoHandler.userActivity = userActivity
                    photoHandler.handleActivity()
                })
        }
.
.

The declaration begins by creating a placeholder PhotoHandler instance which can be passed to the ContentView in the event that the app is not launched by a supported activity type, or is launched by the user tapping the app icon on the device home screen.

Next, the onContinueUserActivity() modifier is configured to only detect the activity type associated with the SiriPhotoIntent. If the type is detected, the NSUserActivity object passed to the app is assigned to the placeholder PhotoHandler instance and the handleActivity() method called to fetch the photo from the library. Because the content view is observing the image property, the Image view will update to display the extracted photo image.

Testing the Completed App

Run the SiriPhotoIntent extension, perform a photo search and, assuming photos are available for the selected day, wait for the main SiriPhoto app to load. When the app has loaded, the first photo taken on the specified date should appear within the Image view:

Figure 44-12

Summary

This chapter has worked through the creation of a simple app designed to use SiriKit to locate a photo taken on a particular date. The example has demonstrated the creation of an Intents Extension and the implementation of the intent handler methods necessary to interact with the Siri environment, including resolving missing parameters in the Siri intent. The project also explored the use of the NSUserActivity class to transfer the intent from the extension to the main iOS app.

Customizing the SiriKit Intent User Interface

Each SiriKit domain will default to a standard user interface layout to present information to the user during the Siri session. In the previous chapter, for example, the standard user interface was used by SiriKit to display the message recipients and content to the user before the message was sent. The default appearance can, however, be customized by making use of an Intent UI app extension. This UI Extension provides a way to control the appearance of information when it is displayed within the Siri interface. It also allows an extension to present additional information that would not normally be displayed by Siri or to present information using a visual style that reflects the design theme of the main app.

Adding the Intents UI Extension

When the Intents Extension was added to the SiriDemo project in the previous chapter, the option to include an Intents UI Extension was disabled. Now that we are ready to create a customized user interface for the intent, select the Xcode File -> New -> Target… menu option and add an Intents UI Extension to the project. Name the product SiriDemoIntentUI and, when prompted to do so, activate the build scheme for the new extension.

Modifying the UI Extension

SiriKit provides two mechanisms for performing this customization, each of which involves implementing a method in the intent UI view controller class file. A simpler and less flexible option involves the use of the configure method. For greater control, the configureView method is available.

Using the configure Method

The files for this Intent UI Extension added above can be found within the Project navigator panel under the SiriDemoIntentUI folder.

Included within the SiriDemoIntentUI extension is a storyboard file named MainInterface.storyboard. For those unfamiliar with how user interfaces were built prior to the introduction of SwiftUI, this is an Interface Builder file. When the configure method is used to customize the user interface, this scene is used to display additional content which will appear directly above the standard SiriKit provided UI content. This layout is sometimes referred to as the Siri Snippet.

Although not visible by default, at the top of the message panel presented by Siri is the area represented by the UI Extension. Specifically, this displays the scene defined in the MainInterface.storyboard file of the SiriDemoIntentUI extension folder. The lower section of the panel is the default user interface provided by Siri for this particular SiriKit domain.

To provide a custom user interface using the UI Extension, the user interface needs to be implemented in the MainInterface.storyboard file and the configure method added to the IntentViewController.swift file. The IntentViewController class in this file is a subclass of UIViewController and configured such that it implements the INUIHostedViewControlling protocol.

The UI Extension is only used when information is being presented to the user in relation to an intent type that has been declared as supported in the UI Extension’s Info.plist file. When the extension is used, the configure method of the IntentViewController is called and passed an INInteraction object containing both the NSUserActivity and intent objects associated with the current Siri session. This allows context information about the session to be extracted and displayed to the user via the custom user interface defined in the MainInterface.storyboard file.

To add content above the “To:” line, therefore, we just need to implement the configure method and add some views to the UIView instance in the storyboard file. These views can be added either via Interface Builder or programmatically with the configure method.
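As a hedged sketch of this approach, a configure method for the SiriDemoIntentUI view controller might extract the message content and display it in a label. The contentLabel outlet is illustrative and assumes a UILabel has been added to the scene in MainInterface.storyboard:

```swift
// A sketch of the configure method for the IntentViewController.
// contentLabel is an assumed outlet connected to a UILabel in the
// MainInterface.storyboard scene.
func configure(with interaction: INInteraction,
               context: INUIHostedViewContext,
               completion: @escaping (CGSize) -> Void) {

    if let intent = interaction.intent as? INSendMessageIntent {
        // Display the message content in the custom snippet area
        // shown above the standard Siri user interface.
        contentLabel.text = intent.content
    }

    // Report the size needed by the snippet, constrained to the
    // maximum size that Siri allows for hosted views.
    let width = extensionContext?.hostedViewMaximumAllowedSize.width ?? 320
    completion(CGSize(width: width, height: 100))
}
```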

For more advanced configuration, however, the configureView() approach provides far greater flexibility, and is the focus of this chapter.

Using the configureView Method

Unlike the configure method, the configureView method allows each section of the default user interface to be replaced with custom content and view layout.

SiriKit considers the default layout to be a vertical stack in which each row is represented by a parameter. For each layer of the stack (starting at the top and finishing at the bottom of the layout) the configureView method is called, passed information about the corresponding parameters and given the opportunity to provide a custom layout to be displayed within the corresponding stack row of the Siri user interface. The method is also passed a completion handler to be called with the appropriate configuration information to be passed back to Siri.

The parameters passed to the method take the form of INParameter instances. It is the responsibility of the configureView method to find out if a parameter is one for which it wants to provide a custom layout. It does this by creating local INParameter instances of the type it is interested in and comparing these to the parameters passed to the method. Parameter instances are created by combining the intent class type with a specific key path representing the parameter (each type of intent has its own set of path keys which can be found in the documentation for that class). If the method needs to confirm that the passed parameter relates to the content of a send message intent, for example, the code would read as follows:

func configureView(for parameters: Set<INParameter>, of interaction: 
    INInteraction, interactiveBehavior: INUIInteractiveBehavior, context: 
    INUIHostedViewContext, completion: @escaping (Bool, Set<INParameter>, 
    CGSize) -> Void) {
 
    let content = INParameter(for: INSendMessageIntent.self, 
               keyPath: #keyPath(INSendMessageIntent.content))
 
    if parameters == [content] {
        // Configure ViewController before calling completion handler
    }
.
.
}

When creating a custom layout, it is likely that the method will need to access the data contained within the parameter. In the above code, for example, it might be useful to extract the message content from the parameter and incorporate it into the custom layout. This is achieved by calling the parameterValue method of the INInteraction object, which is also passed to the configureView method. Each parameter type has a set of associated properties. In this case, the property for the message content is named, appropriately enough, content, and can be accessed as follows:

.
.
let content = INParameter(for: INSendMessageIntent.self, 
               keyPath: #keyPath(INSendMessageIntent.content))
 
if parameters == [content] {
   let contentString = interaction.parameterValue(for: content)
}
.
.

When the configureView method is ready to provide Siri with a custom layout, it calls the provided completion handler, passing through a Boolean true value, the original parameters and a CGSize object defining the size of the layout as it is to appear in the corresponding row of the Siri user interface stack, for example:

completion(true, parameters, size)

If the default Siri content is to be displayed for the specified parameters instead of a custom user interface, the completion handler is called with a false value and a zero CGSize object:

completion(false, parameters, CGSize.zero)

In addition to calling the configureView method for each parameter, Siri will first make a call to the method to request a configuration for no parameters. By default, the method should check for this condition and call the completion handler as follows:

if parameters.isEmpty {
    completion(false, [], CGSize.zero)
}

The foundation for the custom user interface for each parameter is the View contained within the intent UI MainInterface.storyboard file. Once the configureView method has identified the parameters it can dynamically add views to the layout, or make changes to existing views contained within the scene.
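As a sketch of the dynamic approach, the following hypothetical fragment adds a UILabel to the snippet's root view from within configureView once a parameter of interest has been matched (the label text and constraint values are illustrative only, not part of the Xcode template):

```swift
// Illustrative only: dynamically add a label to the snippet's
// root view rather than modifying views placed in the storyboard.
let dynamicLabel = UILabel()
dynamicLabel.text = "Custom snippet content"
dynamicLabel.translatesAutoresizingMaskIntoConstraints = false
view.addSubview(dynamicLabel)

// Pin the label to the top of the snippet with 5-point margins
NSLayoutConstraint.activate([
    dynamicLabel.topAnchor.constraint(
        equalTo: view.topAnchor, constant: 5),
    dynamicLabel.leadingAnchor.constraint(
        equalTo: view.leadingAnchor, constant: 5),
    dynamicLabel.trailingAnchor.constraint(
        equalTo: view.trailingAnchor, constant: -5)
])
```

Whether to add views dynamically or modify storyboard views via outlets is a design choice; the example in this chapter takes the outlet approach.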

Designing the Siri Snippet

The previous section covered a considerable amount of information, much of which will become clearer by working through an example.

Begin by selecting the MainInterface.storyboard file belonging to the SiriDemoIntentUI extension. While future releases of Xcode will hopefully allow the snippet to be declared using SwiftUI, this currently involves working with Interface Builder to add components, configure layout constraints and set up outlets.

The first step is to add a Label to the layout canvas. Display the Library by clicking on the button marked A in Figure 43-1 below and drag and drop a Label object from the Library (B) onto the layout canvas as indicated by the arrow:

Figure 43-1

Next, the Label needs to be constrained so that it has a 5-point margin between the leading, trailing and top edges of the parent view. With the Label selected in the canvas, click on the Add New Constraints button located in the bottom right-hand corner of the editor to display the menu shown in Figure 43-2 below:

Figure 43-2

Enter 5 into the top, left and right boxes and click on the I-beam icons next to each value so that they are displayed in solid red instead of dashed lines before clicking on the Add 3 Constraints button.

Before proceeding to the next step, establish an outlet connection from the Label component to a variable in the IntentViewController.swift file named contentLabel. This will allow the view controller to change the text displayed on the Label to reflect the intent content parameter. This is achieved using the Assistant Editor which is displayed by selecting the Xcode Editor -> Assistant menu option. Once displayed, Ctrl-click on the Label in the canvas and drag the resulting line to a position in the Assistant Editor immediately above the viewDidLoad() declaration:

Figure 43-3

On releasing the line, the dialog shown in Figure 43-4 will appear. Enter contentLabel into the Name field and click on Connect to establish the outlet.

Figure 43-4

Ctrl-click on the snippet background view and drag to immediately beneath the newly declared contentLabel outlet, this time creating an outlet named contentView:

Figure 43-5

On completion of these steps, the outlets should appear in the IntentViewController.swift file as follows:

class IntentViewController: UIViewController, INUIHostedViewControlling {
    
    @IBOutlet weak var contentLabel: UILabel!
    @IBOutlet weak var contentView: UIView!
.
.

Implementing a configureView Method

Next, edit the configureView method located in the IntentViewController.swift file to extract the content and recipients from the intent, and to modify the Siri snippet for the content parameter as follows:

func configureView(for parameters: Set<INParameter>, of interaction: 
    INInteraction, interactiveBehavior: INUIInteractiveBehavior, context: 
    INUIHostedViewContext, completion: @escaping (Bool, Set<INParameter>, 
     CGSize) -> Void) {
 
    var size = CGSize.zero
    
    let content = INParameter(for: INSendMessageIntent.self, keyPath:
        #keyPath(INSendMessageIntent.content))
 
    let recipients = INParameter(for: INSendMessageIntent.self,
                        keyPath: #keyPath(INSendMessageIntent.recipients))
    
    let recipientsValue = interaction.parameterValue(
           for: recipients) as? [INPerson] ?? []
 
    if parameters == [content] {
        let contentValue = interaction.parameterValue(for: content)
        
        self.contentLabel.text = contentValue as? String
        self.contentLabel.textColor = UIColor.white
        self.contentView.backgroundColor = UIColor.brown
        size = CGSize(width: 100, height: 70)
    }
    completion(true, parameters, size)
}

The code begins by declaring a variable in which to contain the required size of the Siri snippet before the content and recipients are extracted from the intent parameters. If the parameters include message content, it is applied to the Label view in the snippet. The background of the snippet view is set to brown, the text color to white, and the size to 100 x 70 points.

The recipients parameter takes the form of an array of INPerson objects, from which the recipients’ display names can be extracted. Code now needs to be added to iterate through each recipient in the array, adding each name to a string to be displayed on the contentLabel view. Code will also be added to use a different font and text color on the label and to change the background color of the view. Since the recipients list requires less space, the height of the view is set to 30 points:

.
.
    if parameters == [content] {
        let contentValue = interaction.parameterValue(for: content)
        self.contentLabel.text = contentValue as? String
        self.contentLabel.textColor = UIColor.white
        self.contentView.backgroundColor = UIColor.brown
        size = CGSize(width: 100, height: 70)      
    } else if recipientsValue.count > 0 {
        var recipientStr = "To:"
        var first = true
            
        for recipient in recipientsValue {
            let separator = first ? " " : ", "
            first = false
            recipientStr += separator + recipient.displayName
        }
            
        self.contentLabel.font = UIFont(name: "Arial-BoldItalicMT", size: 20.0)
        self.contentLabel.text = recipientStr
        self.contentLabel.textColor = UIColor.white
        self.contentView.backgroundColor = UIColor.blue
        size = CGSize(width: 100, height: 30)
    } else if parameters.isEmpty {
        // Nothing to display, so return here to avoid also calling
        // the completion handler below
        completion(false, [], CGSize.zero)
        return
    }
    completion(true, parameters, size)
.
.

Note that the above additions to the configureView() method also include a check for empty parameters, in which case a false value is returned together with a zeroed CGSize object indicating that there is nothing to display.

Testing the Extension

To test the extension, begin by changing the run target menu to the SiriDemoIntentUI target as shown in Figure 43-6 below:

Figure 43-6

Next, display the menu again, this time selecting the Edit Scheme… menu option:

Figure 43-7

In the resulting dialog select the Run option from the left-hand panel and enter the following into the Siri Intent Query box before clicking on the Close button:

Use SiriDemo to tell John and Kate I’ll be 10 minutes late.

Compile and run the Intents UI Extension and verify that the recipient row now appears with a blue background, a 30-point height and a larger italic font, while the content appears with a brown background and a 70-point height:

Figure 43-8

Summary

While the default user interface provided by SiriKit for the various domains will be adequate for some apps, most intent extensions will need to be customized to present information in a way that matches the style and theme of the associated app, or to provide additional information not supported by the default layout. The default UI can be replaced by adding an Intent UI extension to the app project. The UI extension provides two options for configuring the user interface presented by Siri. The simpler of the two involves the use of the configure method to present a custom view above the default Siri user interface layout. A more flexible approach involves the implementation of the configureView method. SiriKit associates each line of information displayed in the default layout with a parameter. When implemented, the configureView method will be called for each of these parameters and provided with the option to return a custom View containing the layout and information to be used in place of the default user interface element.

A SwiftUI SiriKit Tutorial

The previous chapter covered much of the theory associated with integrating Siri into an iOS app. This chapter will review the example Siri messaging extension that is created by Xcode when a new Intents Extension is added to a project. This will not only show a practical implementation of the topics covered in the previous chapter, but will also provide some more detail on how the integration works. The next chapter will cover the steps required to make use of a UI Extension within an app project.

Creating the Example Project

Begin by launching Xcode and creating a new Multiplatform App project named SiriDemo.

Enabling the Siri Entitlement

Once the main project has been created the Siri entitlement must be enabled for the project. Select the SiriDemo target located at the top of the Project Navigator panel (marked A in Figure 42-1) so that the main panel displays the project settings. From within this panel, select the Signing & Capabilities tab (B) followed by the SiriDemo target entry (C):

Figure 42-1

Click on the “+ Capability” button (D) to display the dialog shown in Figure 42-2. Enter Siri into the filter bar, select the result and press the keyboard enter key to add the capability to the project:

Figure 42-2

If Siri is not listed as an option, you will need to pay to join the Apple Developer program as outlined in the chapter entitled “Joining the Apple Developer Program”.

Seeking Siri Authorization

In addition to enabling the Siri entitlement, the app must also seek authorization from the user to integrate the app with Siri. This is a two-step process which begins with the addition of an entry to the Info.plist file of the iOS app target for the NSSiriUsageDescription key with a corresponding string value explaining how the app makes use of Siri.

Select the Info.plist file located within the iOS folder in the project navigator panel as shown in Figure 42-3:

Figure 42-3

Once the file is loaded into the editor, locate the bottom entry in the list of properties and hover the mouse pointer over the item. When the plus button appears, click on it to add a new entry to the list. From within the drop-down list of available keys, locate and select the Privacy – Siri Usage Description option as shown in Figure 42-4:

Figure 42-4

Within the value field for the property, enter a message to display to the user when requesting permission to use speech recognition. For example:

Siri support is used to send and review messages.
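Viewed as source code rather than in the property list editor, the resulting Info.plist entry would take a form similar to the following fragment (the surrounding keys of the file are omitted here):

```xml
<key>NSSiriUsageDescription</key>
<string>Siri support is used to send and review messages.</string>
```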

In addition to adding the Siri usage description key, a call also needs to be made to the requestSiriAuthorization() class method of the INPreferences class. Ideally, this call should be made the first time that the app runs, not only so that authorization can be obtained, but also so that the user learns that the app includes Siri support. For the purposes of this project, the call will be made within the onChange() modifier based on the scenePhase changes within the app declaration located in the SiriDemoApp.swift file as follows:

import SwiftUI
import Intents
 
@main
struct SiriDemoApp: App {
    
    @Environment(\.scenePhase) private var scenePhase
    
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .onChange(of: scenePhase) { phase in
            INPreferences.requestSiriAuthorization({status in
                // Handle the resulting authorization status here
            })
        }
    }
}

Before proceeding, compile and run the app on an iOS device or simulator. When the app loads, a dialog will appear requesting authorization to use Siri. Select the OK button in the dialog to provide authorization.
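If needed, the current authorization state can also be checked before making the request so that the user is not prompted unnecessarily. The following sketch (the placement of this check within the app is a matter of choice) only issues the request when authorization has not yet been determined:

```swift
// Only request Siri authorization if the user has not
// already granted or denied it
if INPreferences.siriAuthorizationStatus() == .notDetermined {
    INPreferences.requestSiriAuthorization { status in
        if status == .authorized {
            // Siri integration is available to the app
        }
    }
}
```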

Adding the Intents Extension

The next step is to add the Intents Extension to the project ready to begin the SiriKit integration. Select the Xcode File -> New -> Target… menu option and add an Intents Extension to the project. Name the product SiriDemoIntent, set the Starting Point menu to Messaging and make sure that the Include UI Extension option is turned off (this will be added in the next chapter) before clicking on the Finish button. When prompted to do so, activate the build scheme for the Intents Extension.

Supported Intents

In order to work with Siri, an extension must specify the intent types it is able to support. These declarations are made in the Info.plist files of the extension folders. Within the Project Navigator panel, select the Info.plist file located in the SiriDemoIntent folder and unfold the NSExtension -> NSExtensionAttributes section. This will show that the IntentsSupported key has been assigned an array of intent class names:

Figure 42-5

Note that entries are available for intents that are supported and for intents that are supported but restricted when the lock screen is enabled. It might be wise, for example, for a payment-based intent to be restricted when the screen is locked. As currently configured, the extension supports all of the messaging intent types without restrictions. To support a different domain, change these intents or add additional intents accordingly. For example, a photo search extension might only need to specify INSearchForPhotosIntent as a supported intent. When the Intents UI Extension is added in the next chapter, it too will contain an Info.plist file with these supported intent value declarations. Note that the intents supported by the Intents UI Extension can be a subset of those declared by the Intents Extension. This allows the UI Extension to be used only for certain intent types.
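As an illustration, the IntentsSupported entry appears similar to the following fragment when the Info.plist file is viewed as source code (the exact array contents will match the intents listed in the editor):

```xml
<key>IntentsSupported</key>
<array>
    <string>INSendMessageIntent</string>
    <string>INSearchForMessagesIntent</string>
    <string>INSetMessageAttributeIntent</string>
</array>
```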

Trying the Example

Before exploring the structure of the project it is worth running the app and experiencing the Siri integration. The example simulates searching for and sending messages, so can be safely used without any messages actually being sent.

Make sure that the SiriDemoIntent option is selected as the run target in the toolbar as illustrated in Figure 42-6 and click on the run button.

Figure 42-6

When prompted, select Siri as the app within which the extension is to run. When Siri launches experiment with phrases such as the following:

“Send a message with SiriDemo.”

“Send a message to John with SiriDemo.”

“Use SiriDemo to say Hello to John and Kate.”

“Find Messages with SiriDemo.”

If Siri indicates that SiriDemo has not yet been set up, tap the button located on the Siri screen to open the SiriDemo app. Once the app has launched, press and hold the home button to relaunch Siri and try the above phrases again.

In each case, all of the work involved in understanding the phrases and converting them into structured representations of the request is performed by Siri. All the intent handler needs to do is work with the resulting intent object.

Specifying a Default Phrase

A useful option when repeatedly testing SiriKit behavior is to configure a phrase to be passed to Siri each time the app is launched from within Xcode. This avoids having to repeatedly speak to Siri each time the app is relaunched. To specify the test phrase, select the SiriDemoIntent run target in the Xcode toolbar and select Edit scheme… from the resulting menu as illustrated in Figure 42-7:

Figure 42-7

In the scheme panel, select the Run entry in the left-hand panel followed by the Info tab in the main panel. Within the Info settings, enter a query phrase into the Siri Intent Query text box before closing the panel:

Figure 42-8

Run the extension once again and note that the phrase is automatically passed to Siri to be handled:

Figure 42-9

Reviewing the Intent Handler

The Intent Handler is declared in the IntentHandler.swift file in the SiriDemoIntent folder. Load the file into the editor and note that the class declares that it supports a range of intent handling protocols for the messaging domain:

class IntentHandler: INExtension, INSendMessageIntentHandling, 
  INSearchForMessagesIntentHandling, INSetMessageAttributeIntentHandling {
.
.
}

The above declaration identifies the class as supporting all three of the intents available in the messaging domain.

As an alternative to listing all of the protocol names individually, the above code could have achieved the same result by referencing the INMessagesDomainHandling protocol which encapsulates all three protocols.
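Expressed as a sketch, the equivalent declaration using the combined protocol would read as follows:

```swift
class IntentHandler: INExtension, INMessagesDomainHandling {
    // Handler, resolution and confirm methods as before
}
```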

If this template were to be re-purposed for a different domain, these protocol declarations would need to be replaced. For a payment extension, for example, the declaration might read as follows:

class IntentHandler: INExtension, INSendPaymentIntentHandling, 
    INRequestPaymentIntentHandling {
.
.
}

The class also contains the handler method, resolution methods for the intent parameters and the confirm method. The resolveRecipients method is of particular interest since it demonstrates the use of the resolution process to provide the user with a range of options from which to choose when a parameter is ambiguous.

The implementation also contains multiple handle methods for performing tasks for message search, message send and message attribute change intents. Take some time to review these methods before proceeding.
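While the template's own code should be consulted directly, the general shape of a resolution method that offers the user disambiguation choices can be sketched as follows (the contact-matching logic here is illustrative only and not the template's implementation):

```swift
func resolveRecipients(for intent: INSendMessageIntent,
    with completion: @escaping ([INSendMessageRecipientResolutionResult]) -> Void) {

    guard let recipients = intent.recipients, !recipients.isEmpty else {
        // No recipient provided, ask Siri to prompt the user for one
        completion([INSendMessageRecipientResolutionResult.needsValue()])
        return
    }

    var results: [INSendMessageRecipientResolutionResult] = []

    for recipient in recipients {
        // Illustrative matching step: a real implementation would
        // search the app's own contact store for candidate matches
        let matches: [INPerson] = [recipient]

        switch matches.count {
        case 0:
            // No match found for this recipient
            results.append(.unsupported())
        case 1:
            // Exactly one match, resolution succeeds
            results.append(.success(with: matches[0]))
        default:
            // Multiple matches, ask the user to choose one
            results.append(.disambiguation(with: matches))
        }
    }
    completion(results)
}
```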

Summary

This chapter has provided a walk-through of the sample messaging-based extension provided by Xcode when creating a new Intents Extension. This has highlighted the steps involved in adding both Intents and UI Extensions to an existing project, and enabling and seeking SiriKit integration authorization for the project. The chapter also outlined the steps necessary for the extensions to declare supported intents and provided an opportunity to gain familiarity with the methods that make up a typical intent handler. The next chapter will outline the mechanism for implementing and configuring a UI Extension.