August 13, 2019
by Dan Segars

Hey Siri, tell me about Simple

Siri + Simple

As a financial technology company, Simple’s mission is to help you feel confident with your money. We want to accomplish this mission by providing practical financial software tools for our customers. As an engineer, I also want to keep our sights on what is up-and-coming within the tech industry.

“Voice first” is a concept that has been around for a while, and using our voice is becoming an increasingly popular way to input and access small pieces of data. As an iOS Engineer, my bias is towards using voice with Siri, but you can clearly see the use of voice in products from other companies, like Amazon’s Alexa, Google Assistant, and Microsoft’s Cortana.

With Siri you can use your voice to input and request information whether you’re using an iPhone®, iPad®, Apple Watch®, CarPlay®, HomePod®, AirPods®, or Mac®. In 2016, Apple added SiriKit, which allowed third-party developers to hook into Siri, but this was limited to only a few types of apps. While it initially provided a couple of options that we could have used at Simple, those options didn’t quite fit our brand or our built-in tools like Safe-to-Spend™ or Goals.

Fast forward to 2018. In iOS 12, Apple announced an expansion of SiriKit whose customization options caught my eye as I considered Simple’s growing feature set of Shared Accounts, the Emergency Fund, Expenses, the Protected Goal Account, and the Shared Protected Goal Account. Before we dive into technical bits, why are we doing this?

Siri + Simple = More Convenience

Using your voice for banking may not yet be mainstream, but I wanted to begin exploring how an integration with Siri could help the banking experience feel easier, more personal, and more relevant to your context, and help you save and spend money more confidently. As a first step, we integrated Siri into Safe-to-Spend™, Goals, and Expenses.

“Hey Siri, Safe-to-Spend”

Some people use Expenses to set aside money for things like recurring bills, and Goals to stash money away for longer-term savings. In this scenario, Safe-to-Spend™ is a great reflection of the lump of money that has not been allocated and is safe for a customer to use as they see fit. So, when a Simple customer needs to get a quick read on their discretionary spending, all they have to say is, “Hey Siri, Safe-to-Spend.”

“Hey Siri, Summer Vacation”

Goals are Simple’s trademark way of helping you automate saving money for bigger things like a trip to Hawaii, a new computer, or an engagement ring. Using Siri you can quickly find out how much you have saved in any of your Goals. So, if a customer wants to know their progress on a savings goal they can simply say, “Hey Siri, Vacation goal.”

“Hey Siri, Coffee Expense”

In addition to recurring bills, many Simple customers use Expenses for groceries or to better manage discretionary spending on things like eating out, grabbing coffee, or going to happy hour with friends. Discretionary spending usually happens when we’re on the go, so why not enable quick access to Expenses using Siri before arriving at your destination? The next time a customer is driving to the grocery store or their hands are full, they can say, “Hey Siri, Grocery expense,” and Siri will tell them how much is left in that Expense. Or, the next time they are walking to the coffee shop, a customer can say, “Hey Siri, Coffee expense,” and quickly know how much they have to spend.

Adding SiriKit to Simple, an abbreviated how-to

By adding Siri to your app, you expand your app’s visibility: you will now show up in the Siri section of the Settings app, the Shortcuts app, and the Siri window. This means being mindful of new customer experiences outside of your app.

NSUserActivity

By implementing NSUserActivity you enable a customer to use Siri to navigate to a particular place and state within your app, with absolutely no need to turn on the Siri Capability in your project. If you haven’t been introduced to NSUserActivity, I highly recommend reading about it. Apple has been beating this drum for about four years now, and they highly encourage using this API. Beyond its roles in state restoration and Handoff, setting an NSUserActivity object on a view controller is a very fast way to create a shortcut for your app. For example, on a UIViewController you could do something like the following.
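Here is a minimal sketch of that idea, assuming a hypothetical goal-detail screen; the class name and activity type string are illustrative stand-ins, not Simple’s actual identifiers:

```swift
import UIKit

// A hypothetical goal-detail screen; the activity type string is illustrative
// and must match an entry in Info.plist's NSUserActivityTypes (more on that below).
class GoalDetailViewController: UIViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        let activity = NSUserActivity(activityType: "com.example.simple.viewGoal")
        activity.title = "Show 🏖 Vacation details"
        activity.isEligibleForSearch = true
        activity.isEligibleForPrediction = true // iOS 12+: lets Siri suggest this as a shortcut
        activity.persistentIdentifier = "com.example.simple.viewGoal.vacation"

        // UIResponder's userActivity property; UIKit marks it current while this screen is visible.
        userActivity = activity
    }
}
```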

After running the app with this code, if a customer navigates to this screen, a new shortcut will appear in the Siri section of the Settings app with the title, “Show 🏖 Vacation details.”


Tapping this shortcut will present a screen to add a voice shortcut, and invoking that voice shortcut will launch your app, where you handle restoring the customer to this screen and state. To help define and better organize user activities in Simple, we created a subclass of NSUserActivity for each kind of activity a user might take. For example, such a subclass might look like the following.
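Here is a hedged sketch of that pattern; the class name, activity type, and userInfo key are illustrative assumptions:

```swift
import Foundation

// An illustrative NSUserActivity subclass for viewing a Goal. The type string
// and userInfo key are assumptions, not Simple's actual values.
final class ViewGoalActivity: NSUserActivity {

    static let activityTypeIdentifier = "com.example.simple.viewGoal"

    init(goalName: String) {
        super.init(activityType: ViewGoalActivity.activityTypeIdentifier)
        title = "Show \(goalName) details"
        isEligibleForSearch = true
        isEligibleForPrediction = true
        userInfo = ["goalName": goalName]
    }
}
```

A screen can then assign userActivity = ViewGoalActivity(goalName: "🏖 Vacation") instead of configuring a bare NSUserActivity inline, which keeps the activity details in one place per feature.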

Of note, you must add any activities you create to the NSUserActivityTypes array in your Info.plist. Unfortunately (unless you would like to go the route of using something like a linter), you will need to manually ensure that the activityType string you use to instantiate the NSUserActivity is the same string you put in the Info.plist’s NSUserActivityTypes array. If you do not have a linter or a more bullet-proof solution, this is a great time for inline code documentation. 😀
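If you want something a bit more bullet-proof than a comment but lighter than a linter, one option is a small debug-time check; this helper is our illustrative sketch, not part of SiriKit:

```swift
import Foundation

// A hypothetical debug-time safety net: verify an activity type is actually
// registered in Info.plist's NSUserActivityTypes before you rely on it.
func assertActivityTypeRegistered(_ activityType: String, in bundle: Bundle = .main) {
    let registered = bundle.object(forInfoDictionaryKey: "NSUserActivityTypes") as? [String] ?? []
    assert(registered.contains(activityType),
           "\(activityType) is missing from NSUserActivityTypes in Info.plist")
}
```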

INIntent

Whereas Siri can use an NSUserActivity object to help customers get into your app, an INIntent enables Siri to perform a more complex task on your customer’s behalf outside of your app. Fortunately, you don’t simply turn on the Siri capability for your app and call it done. I say “fortunately” because I cannot imagine having to predict all of the ways a customer might want to use Siri to get information from outside of the app! An INIntent, as Apple has defined it, is simply “a request for you to fulfill using your Intents extension.” With SiriKit, you determine what can be asked of your app via an INIntent and how you will respond to that request. Having control over requests and the responses helps simplify things quite a bit for development and for customer experience.


After adding an Intents extension (and optionally an IntentsUI extension) to your project, you will need to add an *.intentdefinition file where you define the INIntent(s) you want to support. For example, one of our INIntents is called Expense. Pro Tip: Do not include the word “Intent” in your custom intent names in the *.intentdefinition file. Xcode generates code based on this file, and will have already appended the word “Intent” to the generated class names.

For each intent you define in this file, you will need to determine the requirements for fulfilling a customer’s request. These are all defined as Parameters in the *.intentdefinition file. For example, if a customer wants to know their Safe-to-Spend in Simple, parameters must be set for everything needed to fulfill that request outside of the Simple app.

One thing we discovered on iOS 12 when configuring a custom intent: if you don’t have at least one Shortcut Type that uses all of the Parameters you have defined, the IntentHandler never gets called when you invoke the voice shortcut for that intent. You will stare at your screen sadly, wondering why in the world you can’t see the fruits of your labor. So, make sure each intent has one parameter combination that uses all of the intent’s parameters.

INUIAddVoiceShortcutButton

When you originally defined your Intent in the *.intentdefinition file, you probably had a good idea of where in the UI you would inform the customer that they could add a voice shortcut to Siri. For example, in the Simple app we wanted to let customers know they can create a voice shortcut to find out how much money is available in an Expense, without launching the Simple app. I would primarily recommend using the system-provided INUIAddVoiceShortcutButton. However, there may be times when you want a custom solution; see the Human Interface Guidelines for Shortcuts for when this is actually encouraged by Apple.

Using the INUIAddVoiceShortcutButton comes with many benefits. First, you get a familiar button provided by the system that customers have probably seen in other apps. The system manages the appearance of this button for you based on the existence of a voice shortcut (changing from “Add to Siri” to “Added to Siri”), even between app launches. Lastly, the system manages the behavior of this button (whether it presents the INUIAddVoiceShortcutViewController or the INUIEditVoiceShortcutViewController). Now for a couple of gotchas with this button.

The first: Apple’s documentation suggests adding a custom selector to the INUIAddVoiceShortcutButton that handles presenting the INUIAddVoiceShortcutViewController. Unfortunately, this is not what you want to do. Looking at the API for INUIAddVoiceShortcutButton, you will see it has a delegate. You want to be the delegate for this button; doing so lets the system tell the button to change its appearance and behavior. I have filed a documentation bug with Apple.

The second gotcha we discovered is a bug: apps whose minimum deployment target is iOS 11 do not see correct behavior from this button on a device running iOS 12. I filed a bug with the SiriKit team to hopefully get a fix. That bug aside, here’s how we created the button and what it looks like in the app.
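In sketch form it looks something like this; the view controller, its expenseName property, and the button style are illustrative assumptions, and createIntent(for:) is the helper described next:

```swift
import IntentsUI
import UIKit

// An illustrative screen that offers an "Add to Siri" button for an Expense.
class ExpenseViewController: UIViewController {

    var expenseName = "Coffee" // illustrative

    func makeAddToSiriButton() -> INUIAddVoiceShortcutButton {
        let button = INUIAddVoiceShortcutButton(style: .blackOutline)
        // See the shortcut gotcha below: without this, the delegate methods never fire.
        button.shortcut = INShortcut(intent: createIntent(for: .expense(name: expenseName)))
        button.delegate = self // be the delegate; the system manages appearance and behavior
        return button
    }
}

// The system hands its delegate the correct view controller to present.
extension ExpenseViewController: INUIAddVoiceShortcutButtonDelegate {

    func present(_ addVoiceShortcutViewController: INUIAddVoiceShortcutViewController,
                 for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        addVoiceShortcutViewController.delegate = self
        present(addVoiceShortcutViewController, animated: true)
    }

    func present(_ editVoiceShortcutViewController: INUIEditVoiceShortcutViewController,
                 for addVoiceShortcutButton: INUIAddVoiceShortcutButton) {
        editVoiceShortcutViewController.delegate = self
        present(editVoiceShortcutViewController, animated: true)
    }
}

// Dismiss the system view controllers when the customer finishes or cancels.
extension ExpenseViewController: INUIAddVoiceShortcutViewControllerDelegate {

    func addVoiceShortcutViewController(_ controller: INUIAddVoiceShortcutViewController,
                                        didFinishWith voiceShortcut: INVoiceShortcut?,
                                        error: Error?) {
        controller.dismiss(animated: true)
    }

    func addVoiceShortcutViewControllerDidCancel(_ controller: INUIAddVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}

extension ExpenseViewController: INUIEditVoiceShortcutViewControllerDelegate {

    func editVoiceShortcutViewController(_ controller: INUIEditVoiceShortcutViewController,
                                         didUpdate voiceShortcut: INVoiceShortcut?,
                                         error: Error?) {
        controller.dismiss(animated: true)
    }

    func editVoiceShortcutViewController(_ controller: INUIEditVoiceShortcutViewController,
                                         didDeleteVoiceShortcutWithIdentifier deletedVoiceShortcutIdentifier: UUID) {
        controller.dismiss(animated: true)
    }

    func editVoiceShortcutViewControllerDidCancel(_ controller: INUIEditVoiceShortcutViewController) {
        controller.dismiss(animated: true)
    }
}
```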


Breaking down how this button is created, you’ll see a method called createIntent. We knew we would want to donate different types of intents depending on the activity of a customer, so to make this easier and to consolidate code we created an extension on UIViewController. This extension has a couple of methods: one creates and returns the kind of intent the caller asks for, and another handles donating the intent to Siri.

Creating and donating an intent with our extension looks something like the following.
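Here is a hedged sketch; ExpenseIntent is the class Xcode generates from our *.intentdefinition file, while the parameter name and the IntentKind enum are illustrative:

```swift
import Intents
import UIKit

extension UIViewController {

    // An illustrative way to name the kinds of intents a caller can ask for.
    enum IntentKind {
        case expense(name: String)
        // The real app would add cases for Safe-to-Spend and Goals here.
    }

    /// Creates and returns the kind of intent the caller asks for.
    func createIntent(for kind: IntentKind) -> INIntent {
        switch kind {
        case .expense(let name):
            let intent = ExpenseIntent()
            intent.expenseName = name // assumed parameter from the .intentdefinition file
            intent.suggestedInvocationPhrase = "\(name) expense"
            return intent
        }
    }

    /// Donates an intent so Siri can learn when to suggest it as a shortcut.
    func donate(_ intent: INIntent) {
        let interaction = INInteraction(intent: intent, response: nil)
        interaction.donate { error in
            if let error = error {
                // Donation failures are non-fatal; a shipping app would log this.
                print("Intent donation failed: \(error)")
            }
        }
    }
}
```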

The last building block for the INUIAddVoiceShortcutButton is setting its shortcut property. This is fairly straightforward, but here lies a gotcha: the shortcut property, although documented as optional in the API, seems to be required. In our experience, if you do not set a shortcut on this button, the INUIAddVoiceShortcutButtonDelegate methods will not be called, even if the delegate has been set.

Handling an Intent

When you create an Intents extension, Xcode generates an IntentHandler class. This is where you return an object that will handle requests from a customer. I mentioned before that Xcode generates code based on your *.intentdefinition file. For each intent type you defined, a protocol is generated that declares how you will handle that intent. For our ExpenseIntent, this generated protocol was ExpenseIntentHandling. So, we created an object that conforms to this protocol, and in the IntentHandler class we return that object if the intent being handled is an ExpenseIntent. It looks like the following.
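In sketch form (ExpenseIntentHandler is our assumed name for the conforming object):

```swift
import Intents

// The Xcode-generated entry point for the Intents extension, adapted to hand
// ExpenseIntents to a dedicated handler object.
class IntentHandler: INExtension {

    override func handler(for intent: INIntent) -> Any {
        if intent is ExpenseIntent {
            return ExpenseIntentHandler()
        }
        // Fall back to the default for any other intent types.
        return self
    }
}
```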

In addition to enabling fairly easy shortcut creation, NSUserActivity is also very helpful when responding to a customer’s more complex requests with your intents. For example, after a customer invokes a voice shortcut, we display the result in the Siri window. At this point, they may want to tap the UI to be taken straight to that piece of information in the app.
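To support that tap-through, the intent handler attaches an NSUserActivity to its response. Here is a hedged sketch; the activity type, userInfo key, and response code are assumptions, and ExpenseIntentResponse is the response class Xcode generates:

```swift
import Intents

// An illustrative handler conforming to the generated ExpenseIntentHandling protocol.
class ExpenseIntentHandler: NSObject, ExpenseIntentHandling {

    func handle(intent: ExpenseIntent, completion: @escaping (ExpenseIntentResponse) -> Void) {
        // Attach an activity so a tap on the Siri window deep-links into the app.
        let activity = NSUserActivity(activityType: "com.example.simple.viewExpense")
        activity.userInfo = ["expenseName": intent.expenseName ?? ""]

        // Look up the expense amount here, then respond.
        completion(ExpenseIntentResponse(code: .success, userActivity: activity))
    }
}
```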

In the AppDelegate, we then route the customer to where they need to go using the following delegate method.
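A sketch of that routing, with illustrative activity types and a placeholder for the real navigation code:

```swift
import UIKit

// Abbreviated AppDelegate showing only the continuation handler.
class AppDelegate: UIResponder, UIApplicationDelegate {

    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        switch userActivity.activityType {
        case "com.example.simple.viewExpense":
            if let expenseName = userActivity.userInfo?["expenseName"] as? String {
                // Navigate to the Expense detail screen for this expense.
                print("Routing to expense: \(expenseName)") // placeholder for real navigation
            }
            return true
        default:
            return false
        }
    }
}
```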

Conclusion

In technology, there is a drive to make hardware smaller. You see this with the internals of computers like processors and storage. You also see this in consumer devices, like phones and watches. Moving into these smaller form factors is clearly pushing us towards other means of input and other means of retrieving data. As I consider voice as one of the means of interacting with smaller devices, it makes me ponder how different people with different abilities will experience voice interactions and what other layers of our financial technology service might change to accommodate vocal requests from a customer. Only time will tell as we continue to play and build in this area and explore newer features in Siri, but I have a few ideas… 😃

Disclaimer: Hey! Welcome to our disclaimer. Here’s what you need to know to safely consume this blog post: Any outbound links in this post will take you away from Simple.com, to external sites in the wilds of the internet; neither Simple nor our partner bank, BBVA USA, endorse any linked-to websites; and we didn’t pay/barter with/bribe anyone to appear in this post. And as much as we wish we could control the cost of things, any prices in this article are just estimates. Actual prices are up to retailers, manufacturers, and other people who’ve been granted magical powers over digits and dollar signs.