A drop-in iOS widget library for integrating Telnyx AI Assistant functionality into your applications.
- 🎯 Drop-in Solution: Easy integration with minimal setup using SwiftUI
- 🎨 Multiple UI States: Collapsed, loading, expanded, and transcript views
- 🎵 Audio Visualizer: Real-time audio visualization during conversations
- 🌓 Theme Support: Light and dark theme compatibility
- 📱 Responsive Design: Optimized for various screen sizes
- 🔊 Voice Controls: Mute/unmute and call management
- 💬 Transcript View: Full conversation history with text input
- 🎨 Icon-Only Mode: Floating action button for minimal space usage
Add the following to your `Package.swift` file:

```swift
dependencies: [
    .package(url: "https://github.com/team-telnyx/ios-telnyx-voice-ai-widget.git", from: "1.0.0")
]
```

Or add it through Xcode:
- File → Add Package Dependencies
- Enter the repository URL: `https://github.com/team-telnyx/ios-telnyx-voice-ai-widget.git`
- Select the version and add it to your target
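If your app is itself an SPM package, also declare the library product on the target that uses it. The product name below is an assumption; check the package manifest for the exact name:

```swift
// In Package.swift — "TelnyxVoiceAIWidget" as a product name is an assumption;
// verify it against the package's own manifest.
.target(
    name: "MyApp",
    dependencies: [
        .product(name: "TelnyxVoiceAIWidget", package: "ios-telnyx-voice-ai-widget")
    ]
)
```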
Add the following to your Podfile:

```ruby
pod 'TelnyxVoiceAIWidget', '~> 1.0.0'
```

Then run:

```shell
pod install
```

Add these permissions to your `Info.plist`:
```xml
<key>NSMicrophoneUsageDescription</key>
<string>We need access to your microphone to enable voice conversations with the AI assistant</string>
```

```swift
import SwiftUI
import TelnyxVoiceAIWidget

struct ContentView: View {
    @State private var showWidget = false

    var body: some View {
        VStack {
            AIAssistantWidget(
                assistantId: "your-assistant-id",
                shouldInitialize: showWidget
            )
        }
    }
}
```

The widget supports an icon-only mode that displays as a floating action button:
```swift
AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true,
    iconOnly: true // Enables floating action button mode
)
```

- Compact Design: Displays only the icon in a circular floating action button
- Direct Access: Tapping starts the call and opens directly into the full-screen transcript view
- No Expanded State: Skips the intermediate expanded widget state
- Error Handling: Shows a red error icon when there are connection issues
- Background Color: Uses the theme's primary color for the button background
| Feature | Regular Mode | Icon-Only Mode |
|---|---|---|
| Collapsed State | Button with text and icon | Circular floating button with icon only |
| Tap Behavior | Opens to expanded widget | Starts call and opens transcript view directly |
| Expanded State | Shows audio visualizer and controls | Skipped - goes directly to transcript |
| Error State | Shows detailed error card | Shows red error icon in floating button |
| Use Case | Full-featured integration | Minimal, space-efficient integration |
The `shouldInitialize` parameter controls when the widget establishes its network connection to Telnyx servers. This lets you control:
- Network Usage: Prevents unnecessary connections until needed
- User Consent: Initialize only after user grants permissions
- Performance: Defer connection for better app startup performance
- Conditional Loading: Connect based on user subscription, feature flags, etc.
- `false`: Widget remains dormant with no network activity or UI display
- `true`: Triggers the socket connection and loads the assistant configuration
- State Change: Changing from `false` to `true` initializes the connection
- Active Sessions: Changing from `true` to `false` does NOT disconnect active calls
- `shouldInitialize = false`: No socket connection; widget stays in the idle state
- `shouldInitialize = true`: Socket connects to Telnyx and widget settings load
- User initiates call: WebRTC connection established for audio
- Call ends: WebRTC disconnects; socket remains open for future calls
- Setting `shouldInitialize = false` again: Does NOT affect an active socket (by design)
```swift
// 1. Initialize immediately (default behavior)
AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true // Can be omitted as it defaults to true
)

// 2. Conditional initialization based on user action
struct ConditionalWidget: View {
    @State private var userWantsAssistant = false

    var body: some View {
        VStack {
            Button("Enable AI Assistant") {
                userWantsAssistant = true
            }
            AIAssistantWidget(
                assistantId: "your-assistant-id",
                shouldInitialize: userWantsAssistant
            )
        }
    }
}

// 3. Initialize after permissions are granted
struct PermissionAwareWidget: View {
    @State private var hasPermissions = false

    var body: some View {
        AIAssistantWidget(
            assistantId: "your-assistant-id",
            shouldInitialize: hasPermissions
        )
        .onAppear {
            checkAudioPermissions { granted in
                hasPermissions = granted
            }
        }
    }
}
```
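The `checkAudioPermissions` helper used in pattern 3 is not provided by the SDK; its name and callback shape here are assumptions. A minimal sketch using AVFoundation might look like this:

```swift
import AVFoundation

/// Requests microphone access and reports the result on the main queue,
/// so the completion handler can safely update SwiftUI @State.
func checkAudioPermissions(completion: @escaping (Bool) -> Void) {
    switch AVAudioSession.sharedInstance().recordPermission {
    case .granted:
        completion(true)
    case .denied:
        completion(false)
    case .undetermined:
        // Prompts the user; the NSMicrophoneUsageDescription string is shown here.
        AVAudioSession.sharedInstance().requestRecordPermission { granted in
            DispatchQueue.main.async { completion(granted) }
        }
    @unknown default:
        completion(false)
    }
}
```

Dispatching back to the main queue matters because `requestRecordPermission` may invoke its handler on an arbitrary queue.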
```swift
// 4. Deferred initialization for performance
struct DeferredWidget: View {
    @State private var initializeWidget = false

    var body: some View {
        AIAssistantWidget(
            assistantId: "your-assistant-id",
            shouldInitialize: initializeWidget
        )
        .onAppear {
            // Initialize after a delay
            DispatchQueue.main.asyncAfter(deadline: .now() + 2) {
                initializeWidget = true
            }
        }
    }
}
```

The widget supports extensive color customization through the `WidgetCustomization` struct:
```swift
let customization = WidgetCustomization(
    audioVisualizerColor: "twilight",        // Gradient preset name
    transcriptBackgroundColor: .white,       // Transcript background
    userBubbleBackgroundColor: .blue,        // User message bubbles
    agentBubbleBackgroundColor: .gray,       // Agent message bubbles
    userBubbleTextColor: .white,             // User message text
    agentBubbleTextColor: .black,            // Agent message text
    muteButtonBackgroundColor: .blue,        // Mute button default
    muteButtonActiveBackgroundColor: .red,   // Mute button when active
    muteButtonIconColor: .white,             // Mute button icon
    widgetSurfaceColor: .white,              // Widget background
    primaryTextColor: .black,                // Primary text
    secondaryTextColor: .gray,               // Secondary text
    inputBackgroundColor: .lightGray         // Input field background
)

AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true,
    customization: customization
)
```

Available gradient presets for the audio visualizer:
"verdant"- Green gradient"twilight"- Purple/blue gradient"bloom"- Pink/orange gradient"mystic"- Teal gradient"flare"- Red/orange gradient"glacier"- Blue/cyan gradient
For advanced UI customization, you can provide custom views that completely replace the default button components. These parameters accept `AnyView`, so you must wrap your custom views in `AnyView()`:
```swift
AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true,
    iconOnly: false,
    customization: customization,
    widgetButtonModifier: AnyView(
        // This view completely replaces the button container's styling
        RoundedRectangle(cornerRadius: 16)
            .fill(Color.blue.opacity(0.1))
            .overlay(
                RoundedRectangle(cornerRadius: 16)
                    .stroke(Color.blue, lineWidth: 2)
            )
    ),
    buttonTextModifier: AnyView(
        // This view completely replaces the default button text
        Text("Start Conversation")
            .font(.headline)
            .foregroundColor(.blue)
    ),
    buttonImageModifier: AnyView(
        // This view completely replaces the default icon/image
        Image(systemName: "mic.fill")
            .resizable()
            .frame(width: 32, height: 32)
            .foregroundColor(.blue)
    )
)
```

Important: These parameters replace the entire view component, not just modify it. For example:

- `buttonTextModifier` replaces the entire text view (ignoring `settings.startCallText`)
- `buttonImageModifier` replaces the entire icon/logo view (ignoring `settings.logoIconUrl`)
- `widgetButtonModifier` replaces the button's container styling
- `expandedWidgetModifier` replaces the expanded widget's container styling
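`expandedWidgetModifier` is not shown in the example above; assuming it follows the same `AnyView` replacement pattern as the other parameters, a sketch might look like:

```swift
AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true,
    expandedWidgetModifier: AnyView(
        // Replaces the expanded widget's container styling entirely
        RoundedRectangle(cornerRadius: 24)
            .fill(Color(.systemBackground))
            .shadow(radius: 8)
    )
)
```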
| Parameter | Type | Description |
|---|---|---|
| `assistantId` | `String` | Your Telnyx Assistant ID (required) |
| `shouldInitialize` | `Bool` | Controls network connection lifecycle (default: `true`) |
| `iconOnly` | `Bool` | Enable floating action button mode (default: `false`) |
| `customization` | `WidgetCustomization?` | Custom color overrides (default: `nil`) |
| `widgetButtonModifier` | `AnyView?` | Custom view replacing the collapsed button's container styling (default: `nil`) |
| `expandedWidgetModifier` | `AnyView?` | Custom view replacing the expanded widget's container styling (default: `nil`) |
| `buttonTextModifier` | `AnyView?` | Custom view completely replacing the button's text (default: `nil`) |
| `buttonImageModifier` | `AnyView?` | Custom view completely replacing the button's icon/logo (default: `nil`) |
Note: All parameters except `assistantId` are optional. The view customization parameters completely replace their respective components and require wrapping your custom views with `AnyView()` - see the View Customization section above for examples.
The widget automatically transitions between different states:

Collapsed State:
- Regular Mode: Shows a compact button with customizable text and logo icon
- Icon-Only Mode: Shows a circular floating action button with only the icon
- Tap to initiate a call

Loading State:
- Shows a loading indicator during initialization and connection
- Same behavior in both regular and icon-only modes

Expanded State:
- Regular Mode: Audio visualizer, mute/unmute controls, and agent status indicators
- Icon-Only Mode: This state is skipped; the widget goes directly to the transcript view
- Tap to open the full transcript view (regular mode only)

Transcript View:
- Full conversation history
- Text input for typing messages
- Audio controls and visualizer
- Regular Mode: Collapsible back to the expanded view
- Icon-Only Mode: Primary interface for interaction

Error State:
- Regular Mode: Shows a detailed error card with a retry button
- Icon-Only Mode: Shows a red error icon in the floating button
The widget automatically fetches configuration from your Telnyx Assistant settings, including:
- Custom button text
- Logo/icon URLs
- Theme preferences
- Audio visualizer settings
- Status messages
Check out the included example app in the SampleApp folder for a complete implementation:
```shell
cd SampleApp
open SampleApp.xcodeproj
```

The sample app demonstrates:
- Basic widget integration
- Permission handling
- Icon-only vs regular mode
- Assistant ID configuration
- Real-time widget state management
The SDK is organized into the following main components:
The main entry point for integrating the widget into your application.
Location: Views/AIAssistantWidget.swift
Purpose: Provides the complete UI and lifecycle management for AI Assistant interactions.
Key Parameters:
- `assistantId: String` - Your Telnyx AI Assistant ID (required)
- `shouldInitialize: Bool` - Controls network connection lifecycle
- `iconOnly: Bool` - Toggle between full widget and floating action button mode
- `customization: WidgetCustomization?` - Custom color overrides
Example:
```swift
AIAssistantWidget(
    assistantId: "your-assistant-id",
    shouldInitialize: true,
    iconOnly: false,
    customization: WidgetCustomization(
        audioVisualizerColor: "twilight",
        userBubbleBackgroundColor: .blue
    )
)
```

Manages the widget's state, business logic, and WebRTC connections.
Location: ViewModels/WidgetViewModel.swift
Purpose: Handles socket connections, call management, transcript updates, and state transitions.
Key Methods:
- `initialize(assistantId:iconOnly:customization:)` - Initialize the widget with configuration
- `startCall()` - Initiate a call to the AI assistant
- `endCall()` - Terminate the active call
- `toggleMute()` - Toggle microphone mute state
- `sendTextMessage(_:)` - Send a text message during conversation
Published Properties:
- `widgetState: WidgetState` - Current widget state
- `widgetSettings: WidgetSettings` - Configuration from Telnyx
- `transcriptItems: [TranscriptItem]` - Conversation history
- `audioLevels: [Float]` - Real-time audio visualization data
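Because these are `@Published` properties, they can be observed with Combine. A sketch, assuming your integration has direct access to the `WidgetViewModel` instance (the SDK may not expose it publicly):

```swift
import Combine

final class TranscriptLogger {
    private var cancellables = Set<AnyCancellable>()

    // `viewModel` is assumed to be the WidgetViewModel driving the widget.
    func attach(to viewModel: WidgetViewModel) {
        // React to new transcript messages as they arrive.
        viewModel.$transcriptItems
            .sink { items in
                print("Transcript now has \(items.count) messages")
            }
            .store(in: &cancellables)

        // Log every state transition for debugging.
        viewModel.$widgetState
            .sink { state in
                print("Widget state changed: \(state)")
            }
            .store(in: &cancellables)
    }
}
```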
An enum representing all possible widget states.
Location: State/WidgetState.swift
States:
- `.idle` - Initial state before initialization
- `.loading` - Loading during connection
- `.collapsed(settings)` - Collapsed button state
- `.connecting(settings)` - Initiating a call
- `.expanded(settings, isConnected, isMuted, agentStatus)` - Active call with visualizer
- `.transcriptView(settings, isConnected, isMuted, agentStatus)` - Full transcript view
- `.error(message, type)` - Error state with details
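To illustrate how a host app might react to these transitions, here is a minimal stand-in enum covering a subset of the documented cases (the real `WidgetState` carries `settings` and richer associated values):

```swift
// Simplified stand-in for the SDK's WidgetState, for illustration only.
enum DemoWidgetState {
    case idle
    case loading
    case collapsed
    case expanded(isConnected: Bool, isMuted: Bool)
    case error(message: String)
}

/// Maps a state to a short status line, the way a host app might
/// pick an accessibility label or debug description.
func statusLine(for state: DemoWidgetState) -> String {
    switch state {
    case .idle:
        return "Not initialized"
    case .loading:
        return "Connecting"
    case .collapsed:
        return "Tap to start a call"
    case .expanded(let isConnected, let isMuted):
        return isConnected ? (isMuted ? "On call (muted)" : "On call") : "Call setup"
    case .error(let message):
        return "Error: \(message)"
    }
}
```

Exhaustive `switch` statements like this are the natural way to consume the state machine; the compiler flags any case you forget to handle.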
Configuration struct for custom color theming.
Location: Models/WidgetCustomization.swift
Customizable Colors:
```swift
WidgetCustomization(
    audioVisualizerColor: "twilight",        // Gradient preset name
    transcriptBackgroundColor: .white,       // Transcript background
    userBubbleBackgroundColor: .blue,        // User message bubbles
    agentBubbleBackgroundColor: .gray,       // Agent message bubbles
    userBubbleTextColor: .white,             // User message text
    agentBubbleTextColor: .black,            // Agent message text
    muteButtonBackgroundColor: .blue,        // Mute button default
    muteButtonActiveBackgroundColor: .red,   // Mute button when active
    muteButtonIconColor: .white,             // Mute button icon
    widgetSurfaceColor: .white,              // Widget background
    primaryTextColor: .black,                // Primary text
    secondaryTextColor: .gray,               // Secondary text
    inputBackgroundColor: .lightGray         // Input field background
)
```

Audio Visualizer Presets:
"verdant"- Green gradient"twilight"- Purple/blue gradient"bloom"- Pink/orange gradient"mystic"- Teal gradient"flare"- Red/orange gradient"glacier"- Blue/cyan gradient
AudioVisualizer (Views/Components/AudioVisualizer.swift)
- Real-time audio visualization with configurable gradient colors
- Responds to audio levels from WebRTC stream
TranscriptView (Views/Components/TranscriptView.swift)
- Full conversation history display
- Text input for typing messages
- Auto-scroll to latest messages
ExpandedWidget (Views/Components/ExpandedWidget.swift)
- Active call interface with audio visualizer
- Mute/unmute controls
- Agent status indicators
FloatingButton (Views/Components/FloatingButton.swift)
- Circular floating action button for icon-only mode
- Error state visualization
Configuration received from Telnyx AI Assistant settings:
```swift
struct WidgetSettings {
    let agentThinkingText: String?     // Text shown when agent is processing
    let audioVisualizerConfig: AudioVisualizerConfig?
    let defaultState: String?          // Initial widget state
    let logoIconUrl: String?           // Custom logo URL
    let startCallText: String?         // Custom button text
    let speakToInterruptText: String?  // Interrupt instruction text
    let theme: String?                 // Theme preference ("light"/"dark")
}
```

Represents a single message in the conversation:
```swift
struct TranscriptItem {
    let id: String       // Unique identifier
    let text: String     // Message content
    let isUser: Bool     // true if from user, false if from agent
    let timestamp: Date  // When the message was sent
}
```

Current state of the AI agent:
```swift
enum AgentStatus {
    case idle     // No active conversation
    case thinking // Processing user input
    case waiting  // Ready and can be interrupted
}
```

Basic usage:

```swift
import TelnyxVoiceAIWidget

struct MyView: View {
    var body: some View {
        AIAssistantWidget(
            assistantId: "your-assistant-id",
            shouldInitialize: true
        )
    }
}
```

With custom colors:

```swift
struct AdvancedView: View {
    let customization = WidgetCustomization(
        audioVisualizerColor: "twilight",
        userBubbleBackgroundColor: Color(hex: "#007AFF"),
        agentBubbleBackgroundColor: Color(hex: "#E5E5EA"),
        widgetSurfaceColor: .white
    )

    var body: some View {
        AIAssistantWidget(
            assistantId: "your-assistant-id",
            shouldInitialize: true,
            customization: customization
        )
    }
}
```

Conditional initialization:

```swift
struct ConditionalView: View {
    @State private var hasPermissions = false
    @State private var isEnabled = false

    var body: some View {
        AIAssistantWidget(
            assistantId: "your-assistant-id",
            shouldInitialize: hasPermissions && isEnabled
        )
        .onAppear {
            checkMicrophonePermission { granted in
                hasPermissions = granted
            }
        }
    }
}
```

The widget is built using:
- SwiftUI for modern UI
- Combine for reactive state management
- ObservableObject and @Published for state updates
- Telnyx WebRTC SDK for voice communication
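`Color(hex:)`, used in the advanced customization example above, is not part of SwiftUI; you need to define it yourself. A minimal extension sketch supporting 6-digit `#RRGGBB` strings:

```swift
import SwiftUI

extension Color {
    /// Creates a Color from a "#RRGGBB" hex string; falls back to clear on bad input.
    init(hex: String) {
        let cleaned = hex.trimmingCharacters(in: CharacterSet(charactersIn: "#"))
        var value: UInt64 = 0
        guard cleaned.count == 6, Scanner(string: cleaned).scanHexInt64(&value) else {
            self = .clear
            return
        }
        self.init(
            red: Double((value >> 16) & 0xFF) / 255.0,
            green: Double((value >> 8) & 0xFF) / 255.0,
            blue: Double(value & 0xFF) / 255.0
        )
    }
}
```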
Architecture Diagram:
```
┌─────────────────────────┐
│    AIAssistantWidget    │  ← Main SwiftUI View
└───────────┬─────────────┘
            │
            ↓
┌─────────────────────────┐
│    WidgetViewModel      │  ← State & Business Logic
│                         │
│  • TxClient (WebRTC)    │  ← Telnyx SDK Integration
│  • Socket Connection    │
│  • Call Management      │
│  • Transcript Updates   │
└───────────┬─────────────┘
            │
            ↓
┌─────────────────────────┐
│      WidgetState        │  ← State Machine
│                         │
│  • idle → loading       │
│  • collapsed → expanded │
│  • transcriptView       │
│  • error                │
└─────────────────────────┘
```
- iOS 13.0+
- Xcode 14.0+
- Swift 5.9+
```shell
swift build
swift test
```

To run the example app and test the widget:
- Open the workspace:

  ```shell
  open TelnyxVoiceAIWidget.xcworkspace
  ```

- Select the `SampleApp` scheme
- Build and run (⌘R)
This project includes:
- Automated Testing: Unit tests and build validation on every PR via Fastlane
- GitHub Actions: Continuous integration pipeline
- Test Reports: Generated test coverage and results in the `reports` directory
To run tests locally via Fastlane:
```shell
bundle install
bundle exec fastlane test
```

To contribute:

- Fork the repository
- Create a feature branch
- Make your changes
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
Widget not initializing:
- Verify your Assistant ID is correct
- Check network connectivity
- Ensure microphone permissions are granted
Audio not working:
- Check microphone permissions in Settings
- Verify device is not in silent mode
- Test with different audio routes (speaker/earpiece)
Build errors:
- Ensure iOS 13.0+ deployment target
- Verify Swift 5.9+ compatibility (matching the Requirements above)
- Check that all dependencies are properly resolved
Widget not responding:
- Verify `shouldInitialize` is set to `true`
- Check console logs for WebRTC connection errors
- Ensure Assistant ID is valid and active
This project is licensed under the MIT License - see the LICENSE file for details.
For technical support and questions:
- 📧 Email: [email protected]
- 📖 Documentation: Telnyx Developer Portal
- 🐛 Issues: GitHub Issues