Prerequisites

The prerequisites for this project are:

  • Xcode 9.3 or later

  • iOS 11 or later

  • Swift 4.1

The code for this guide can be found on GitHub: snipsco-samples/quickstart-swift.

Step 1: Create a new iOS project

Open Xcode and create a new iOS project using the "Single View App" template. Let's name it "HelloSnips". Save the project in a location of your choice.

Step 2: Add the Snips framework

The Snips framework can be added via CocoaPods, Carthage, or a manual setup; pick the option that fits your workflow.

CocoaPods

For this step, we assume that you have CocoaPods version 1.2.x or higher installed. If not, please follow the installation instructions at cocoapods.org.

In order to set up your project for CocoaPods, close the current Xcode project. Then open a terminal window, change to your Xcode project directory, and type:

pod init

This will generate a file named Podfile, which is used to manage the external dependencies of the project. Open this file with a text editor, and add the following lines at the top of the file:

source 'https://github.com/Snipsco/Specs.git'
platform :ios, '11.0'

Within the target 'HelloSnips' block, look for the section that starts with # Pods for HelloSnips and add the following line:

pod 'SnipsPlatform'

Save the file, and run the following command from the terminal:

pod install --repo-update

This will install the Snips framework as a project dependency. It will also create an Xcode workspace named HelloSnips.xcworkspace. From now on, open this workspace (instead of the previously used HelloSnips.xcodeproj file) so that the CocoaPods dependencies are picked up.
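For reference, after these edits your Podfile should look roughly like this (assuming the contents generated by pod init, with the template's comments trimmed):

source 'https://github.com/Snipsco/Specs.git'
platform :ios, '11.0'

target 'HelloSnips' do
  use_frameworks!

  # Pods for HelloSnips
  pod 'SnipsPlatform'
end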

Carthage

If you prefer Carthage, make sure you have Carthage version 0.17.x or higher installed.

Create a file named Cartfile in your Xcode project directory, and add the following line:

github "snipsco/snips-platform-swift"

Save your changes and then download and build the packages with this command in the terminal window:

carthage bootstrap

This process can take around 10 minutes to complete. Depending on your installed version of Xcode, you may see warnings about compiler incompatibilities; these can be ignored.

After Carthage builds the Snips framework, drag SnipsPlatform.framework from the Carthage/Build/iOS directory to the “Linked Frameworks and Libraries” section of your Xcode project’s “General” settings tab. Click OK when prompted to copy the framework files.

Manual Setup

If you haven't already, set up the project as a Git repository. Open a terminal window, change to your Xcode project directory, and type:

git init

Add the snips-platform-swift repository as a submodule:

git submodule add https://github.com/snipsco/snips-platform-swift

Finally, drag SnipsPlatform.xcodeproj from the snips-platform-swift folder to the Xcode project. In the “Embedded Binaries” section of your Xcode project’s “General” settings tab, add SnipsPlatform.framework.

Step 3: Allow the app to access the microphone

  1. In Xcode, open Info.plist for your target (located under the HelloSnips group in the file navigator)

  2. Select Editor > Add Item

  3. Select Privacy - Microphone Usage Description (the NSMicrophoneUsageDescription key) as the entry's key

  4. Enter an adequate description, for instance "HelloSnips requires access to your microphone in order to handle voice commands". The raw plist entry is shown after this list.
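If you prefer editing the plist as text (right-click Info.plist and choose Open As > Source Code), the added entry corresponds to the NSMicrophoneUsageDescription key:

<key>NSMicrophoneUsageDescription</key>
<string>HelloSnips requires access to your microphone in order to handle voice commands</string>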

Step 4: Import the assistant model from the Console

If you haven't done so already, head over to console.snips.ai and create your assistant. If this is the first time you are creating an assistant, make sure to follow the Quick Start Console guide first.

Once ready, download the assistant file, unzip it, and drag the folder to your project in Xcode. Make sure Copy items if needed is checked and Create folder references is selected.

Step 5: Prepare the Xcode project

The Single View App template used in Step 1 generates some files which are not needed in this guide. Let's clean this up:

  1. In the Xcode file navigator, expand the HelloSnips group

  2. Locate Main.storyboard and delete it

  3. Open Info.plist, and find the entry named Main storyboard file base name. Remove it by selecting it and clicking the - button that appears next to it

  4. Open AppDelegate.swift, find the method func application(_:didFinishLaunchingWithOptions:), and replace it with the following block, which creates the window and root view controller in code now that the storyboard is gone:

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    window = UIWindow(frame: UIScreen.main.bounds)
    window?.makeKeyAndVisible()
    window?.rootViewController = UINavigationController(rootViewController: ViewController())
    return true
}

Step 6: Set up ViewController

Open ViewController.swift. At the top of the file, below the import UIKit statement, import the SnipsPlatform and AVFoundation frameworks:

import SnipsPlatform
import AVFoundation

Next, we will set up the Snips platform and an audio engine, which will give Snips access to the microphone data. Within the ViewController class, declare the following two instance variables:

class ViewController: UIViewController {
    fileprivate let snips: SnipsPlatform
    fileprivate let audioEngine: AVAudioEngine
    // ...
}

Then, initialize the variables in the view controller's init() method. Copy the following code and place it after the variable declarations above:

init() {
    let url = Bundle.main.url(forResource: "assistant", withExtension: nil)!
    snips = try! SnipsPlatform(assistantURL: url)
    audioEngine = try! ViewController.createAudioEngine(with: snips)
    super.init(nibName: nil, bundle: nil)
}

required init?(coder aDecoder: NSCoder) {
    fatalError("init(coder:) has not been implemented")
}

The "assistant" key used in Bundle.main.url() should match the name of the folder that was imported in Step 4.

After the initializers, add the following method, which creates the audio engine and binds its output to the Snips instance:

fileprivate class func createAudioEngine(with snips: SnipsPlatform) throws -> AVAudioEngine {
    let audioEngine = AVAudioEngine()
    let audioSession = AVAudioSession.sharedInstance()
    try audioSession.setCategory(.playAndRecord, mode: .measurement, options: [.mixWithOthers, .allowBluetoothA2DP, .allowBluetooth])
    try audioSession.setPreferredSampleRate(16_000)
    try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
    let recordingFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 16_000, channels: 1, interleaved: true)
    let input = audioEngine.inputNode
    let downMixer = AVAudioMixerNode()
    audioEngine.attach(downMixer)
    audioEngine.connect(input, to: downMixer, format: nil)
    downMixer.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, time) in
        snips.appendBuffer(buffer)
    }
    audioEngine.prepare()
    return audioEngine
}
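A note on the format: the tap asks for 16 kHz, mono, 16-bit integer audio, which is the format the Snips platform consumes here, and the intermediate mixer node converts whatever the device microphone actually delivers into that recording format before each buffer is passed to snips.appendBuffer().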

Next, add the following method, which will bind to events from the Snips platform and display them on the screen accordingly:

fileprivate func setupHandlers() {
    snips.onIntentDetected = { [weak self] intent in
        DispatchQueue.main.sync {
            self?.title = "Intent detected"
            self?.logView.text = String(format:
                "Query: %@\n" +
                "Intent: %@\n" +
                "Probability: %.3f\n" +
                "Slots:\n\t%@",
                intent.input,
                intent.intent?.intentName ?? "",
                intent.intent?.probability ?? 0,
                intent.slots.map { "\($0.slotName): \($0.value)" }.joined(separator: "\n\t")
            )
        }
    }
    snips.onHotwordDetected = { [weak self] in
        DispatchQueue.main.sync {
            self?.title = "🔔"
        }
    }
    snips.onListeningStateChanged = { [weak self] listening in
        DispatchQueue.main.sync {
            self?.title = listening ? "Listening..." : "Say the wake word"
        }
    }
    snips.snipsWatchHandler = { message in
        DispatchQueue.main.sync {
            NSLog("Snips log: \(message)")
        }
    }
}

We are binding to the following events:

  • onIntentDetected: Triggered when an intent has been detected. It provides an intent object which contains information such as the intent name and the slot values (see the sketch after this list)

  • onHotwordDetected: Triggered when a wake word has been detected

  • onListeningStateChanged: Triggered when the listening state has changed. Its value is either true or false, depending on whether Snips is in active listening mode (e.g. after a hotword has been detected) or not (e.g. when a short silence has been detected)

  • snipsWatchHandler: Triggered each time Snips produces a new log message. Useful for debugging purposes.
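As a sketch of how you could react to one specific intent instead of logging everything (the intent name "turnOnLights" is a placeholder; use a name from your own assistant):

snips.onIntentDetected = { [weak self] intent in
    // "turnOnLights" is a placeholder intent name; replace it with one from your assistant
    guard intent.intent?.intentName == "turnOnLights" else { return }
    DispatchQueue.main.async {
        self?.title = "Lights on!"
    }
}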

Finally, we bind everything together in the viewDidLoad method. We first create an instance variable logView, which is a simple UITextView used to display log messages. Copy the following line and paste it after the declarations of the snips and audioEngine variables:

fileprivate lazy var logView = UITextView()

Copy the following method after the init methods. It will set up the log view, call the handler setup method defined above, and start Snips and the audio engine:

override func viewDidLoad() {
    super.viewDidLoad()
    title = "Say the wake word"
    logView.font = UIFont.systemFont(ofSize: 16)
    logView.frame = view.frame
    logView.isUserInteractionEnabled = false
    view.addSubview(logView)
    setupHandlers()
    do {
        try snips.start()
        try audioEngine.start()
    } catch let e as SnipsPlatformError {
        print("Snips error: \(e)")
    } catch {
        print("Unexpected error: \(error)")
    }
}

To sum up, our ViewController file should look as follows:

//
//  ViewController.swift
//  HelloSnips
//

import UIKit
import SnipsPlatform
import AVFoundation

class ViewController: UIViewController {
    fileprivate let snips: SnipsPlatform
    fileprivate let audioEngine: AVAudioEngine
    fileprivate lazy var logView = UITextView()

    init() {
        let url = Bundle.main.url(forResource: "assistant", withExtension: nil)!
        snips = try! SnipsPlatform(assistantURL: url)
        audioEngine = try! ViewController.createAudioEngine(with: snips)
        super.init(nibName: nil, bundle: nil)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        title = "Say the wake word"
        logView.font = UIFont.systemFont(ofSize: 16)
        logView.frame = view.frame
        logView.isUserInteractionEnabled = false
        view.addSubview(logView)
        setupHandlers()
        do {
            try snips.start()
            try audioEngine.start()
        } catch let e as SnipsPlatformError {
            print("Snips error: \(e)")
        } catch {
            print("Unexpected error: \(error)")
        }
    }

    fileprivate class func createAudioEngine(with snips: SnipsPlatform) throws -> AVAudioEngine {
        let audioEngine = AVAudioEngine()
        let audioSession = AVAudioSession.sharedInstance()
        try audioSession.setCategory(.playAndRecord, mode: .measurement, options: [.mixWithOthers, .allowBluetoothA2DP, .allowBluetooth])
        try audioSession.setPreferredSampleRate(16_000)
        try audioSession.setActive(true, options: .notifyOthersOnDeactivation)
        let recordingFormat = AVAudioFormat(commonFormat: .pcmFormatInt16, sampleRate: 16_000, channels: 1, interleaved: true)
        let input = audioEngine.inputNode
        let downMixer = AVAudioMixerNode()
        audioEngine.attach(downMixer)
        audioEngine.connect(input, to: downMixer, format: nil)
        downMixer.installTap(onBus: 0, bufferSize: 1024, format: recordingFormat) { (buffer, time) in
            snips.appendBuffer(buffer)
        }
        audioEngine.prepare()
        return audioEngine
    }

    fileprivate func setupHandlers() {
        snips.onIntentDetected = { [weak self] intent in
            DispatchQueue.main.sync {
                self?.title = "Intent detected"
                self?.logView.text = String(format:
                    "Query: %@\n" +
                    "Intent: %@\n" +
                    "Probability: %.3f\n" +
                    "Slots:\n\t%@",
                    intent.input,
                    intent.intent?.intentName ?? "",
                    intent.intent?.probability ?? 0,
                    intent.slots.map { "\($0.slotName): \($0.value)" }.joined(separator: "\n\t")
                )
            }
        }
        snips.onHotwordDetected = { [weak self] in
            DispatchQueue.main.sync {
                self?.title = "🔔"
            }
        }
        snips.onListeningStateChanged = { [weak self] listening in
            DispatchQueue.main.sync {
                self?.title = listening ? "Listening..." : "Say the wake word"
            }
        }
        snips.snipsWatchHandler = { message in
            DispatchQueue.main.sync {
                NSLog("Snips log: \(message)")
            }
        }
    }
}

Step 7: Run the app

On the first run, you will be prompted for access to the microphone, which you should grant. The app is now listening, waiting for the wake word (for instance, "Hey Snips", or whatever wake word you defined when you created your assistant on the Snips Console). Say the wake word, followed by a query. You should see the logs appearing on the screen!

If you run into the following error while compiling:

'/path/to/SnipsPlatform' does not contain bitcode. You must rebuild it with bitcode enabled (Xcode setting ENABLE_BITCODE), obtain an updated library from the vendor, or disable bitcode for this target.

you may have to disable bitcode for your project. In order to do so, head over to your project's Build Settings, make sure All is selected in the filters, and find the entry named Enable Bitcode. Set it to No. The project should now build.

Next steps

Congratulations on creating your first entirely private voice app using Snips! It's now time to take it a step further: we've created guides for more advanced topics, so check them out.