VietMapSpeech 2.0.1

Maintained by nhatpv.
Mapbox Speech

Mapbox Speech connects your iOS application to the Mapbox Voice API. Take turn instructions from the Mapbox Directions API and read them aloud naturally in multiple languages. This library is specifically designed to work with MapboxDirections.swift as part of the Mapbox Navigation SDK for iOS.

Getting started

Specify the following dependency in your Carthage Cartfile:

github "mapbox/mapbox-speech-swift" ~> 0.0.1

Or in your CocoaPods Podfile:

pod 'MapboxSpeech', '~> 0.0.1'

Then import the module with import MapboxSpeech (Swift) or @import MapboxSpeech; (Objective-C).

Usage

You’ll need a Mapbox access token in order to use the API. If you’re already using the Mapbox Maps SDK for iOS or macOS SDK, Mapbox Speech automatically recognizes your access token, as long as you’ve placed it in the MGLMapboxAccessToken key of your application’s Info.plist file.
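For reference, the corresponding Info.plist entry looks like the following (the token value is an Xcode placeholder; substitute your own access token):

```xml
<key>MGLMapboxAccessToken</key>
<string><#your access token#></string>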

The examples below are each provided in Swift (denoted with main.swift) and Objective-C (main.m).

Basics

The main speech synthesis class is SpeechSynthesizer (in Swift) or MBSpeechSynthesizer (in Objective-C). Create a speech synthesizer object using your access token:

// main.swift
import MapboxSpeech

let speechSynthesizer = SpeechSynthesizer(accessToken: "<#your access token#>")
// main.m
@import MapboxSpeech;

MBSpeechSynthesizer *speechSynthesizer = [[MBSpeechSynthesizer alloc] initWithAccessToken:@"<#your access token#>"];

Alternatively, you can place your access token in the MGLMapboxAccessToken key of your application’s Info.plist file, then use the shared speech synthesizer object:

// main.swift
let speechSynthesizer = SpeechSynthesizer.shared
// main.m
MBSpeechSynthesizer *speechSynthesizer = [MBSpeechSynthesizer sharedSpeechSynthesizer];

With the speech synthesizer object in hand, construct a SpeechOptions object (in Swift) or an MBSpeechOptions object (in Objective-C) and pass it into the SpeechSynthesizer.audioData(with:completionHandler:) method.

// main.swift

let options = SpeechOptions(text: "hello, my name is Bobby")
speechSynthesizer.audioData(with: options) { (data: Data?, error: NSError?) in
    guard error == nil else {
        print("Error synthesizing speech: \(error!)")
        return
    }
    
    // Do something with the audio!
}
// main.m

MBSpeechOptions *options = [[MBSpeechOptions alloc] initWithText:@"hello, my name is Bobby"];
[speechSynthesizer audioDataWithOptions:options completionHandler:^(NSData * _Nullable data,
                                                                    NSError * _Nullable error) {
    if (error) {
        NSLog(@"Error synthesizing speech: %@", error);
        return;
    }
    
    // Do something with the audio!
}];
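The completion handler hands you raw audio data. As a sketch of what "do something with the audio" might look like, the data can be played back with AVFoundation's AVAudioPlayer, assuming the returned data is in a format AVAudioPlayer can decode (such as MP3):

```swift
// main.swift
import MapboxSpeech
import AVFoundation

let speechSynthesizer = SpeechSynthesizer(accessToken: "<#your access token#>")

// Keep a strong reference to the player outside the completion handler;
// otherwise it is deallocated and playback stops immediately.
var audioPlayer: AVAudioPlayer?

let options = SpeechOptions(text: "hello, my name is Bobby")
speechSynthesizer.audioData(with: options) { (data: Data?, error: NSError?) in
    if let error = error {
        print("Error synthesizing speech: \(error)")
        return
    }
    guard let data = data else { return }
    
    do {
        audioPlayer = try AVAudioPlayer(data: data)
        audioPlayer?.play()
    } catch {
        print("Error playing audio: \(error)")
    }
}
```

Note that on iOS you may also need to configure an AVAudioSession category appropriate for playback before calling play().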