Brytecam is an SDK for building video-conferencing-enabled iOS apps using the Brytecam service.
Requirements
- iOS 10.0+
- Xcode 11+
- Swift 5.1+
Installation
CocoaPods
CocoaPods is a dependency manager for Cocoa projects. For usage and installation instructions, visit their website. To integrate the Brytecam SDK into your Xcode project using CocoaPods, specify it in your Podfile:
pod 'Brytecam', '~> 0.2.0'
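A minimal Podfile built around that pod declaration might look like the following sketch. The target name `MyApp` is a placeholder for your own app target:

```ruby
platform :ios, '10.0'
use_frameworks!

target 'MyApp' do
  # Brytecam SDK, pinned to the 0.2.x series
  pod 'Brytecam', '~> 0.2.0'
end
```

After editing the Podfile, run `pod install` and open the generated `.xcworkspace` instead of the `.xcodeproj`.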
Getting Started
Check out the sample app located in the Example folder for a quick dive-in. This guide gives you an overview of the key objects you'll use in the Live Interactive Video API to build your application with the Brytecam Video SDK.
Concepts
- Room - A room represents a real-time audio, data, video and/or screenshare session, the basic building block of the Brytecam Video SDK
- Track - A track represents a real-time audio, video, or data media stream shared to a room
- Peer - A peer represents a remote participant connected to a room (i.e., any participant other than the local one)
- Publish - A local participant can share its audio, video and data tracks by "publishing" its tracks to the room
- Subscribe - A local participant can stream any peer's audio, video and data tracks by "subscribing" to their tracks
- Broadcast - A local participant can send any message/data to all peers in the room
Import modules & instantiate Brytecam Client
import Brytecam
let peer = BRTPeer(name: userName)
let room = BRTRoom(roomId: roomName)
let config = BRTClientConfig()
config.authToken = "INSERT TOKEN HERE"
let client = BRTClient(peer: peer, config: config)
Setup handlers
Add handler functions to listen for peers joining, the transport opening (establishing a connection to the server), peers publishing their streams, and so on.
client.onStreamAdd = { (room, peer, streamInfo) in
// A peer has published a stream. To subscribe to it, we make a `subscribe` call
DispatchQueue.main.async {
client.subscribe(streamInfo, completion: { [weak self] (stream, error) in
// Query the stream for tracks and create a view to render the video track
})
}
}
client.onStreamRemove = { [weak self] (room, streamInfo) in
DispatchQueue.main.async {
// The stream was unpublished; remove the corresponding video view
}
}
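As a sketch of what the subscribe completion above might do, the handler below takes the first video track from the stream and renders it with a `BRTVideoView` (see "Render a Video Track" below). This assumes the subscribed stream exposes `videoTracks` the same way the published stream does, and `videoContainer` is a hypothetical view in your own layout:

```swift
client.subscribe(streamInfo, completion: { [weak self] (stream, error) in
    guard let stream = stream, error == nil else { return }
    // Query the stream for its first video track
    guard let track = stream.videoTracks?.first else { return }
    DispatchQueue.main.async {
        // Create a view to render the video track
        let videoView = BRTVideoView()
        videoView.setVideoTrack(track)
        // `videoContainer` is a placeholder for a view in your hierarchy
        self?.videoContainer.addSubview(videoView)
    }
})
```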
Join a room
client.connect { (status, error) in
// handle error if any
client.join(room, completion: {(success, error) in
// this is a good place to publish local audio/video stream
})
}
Publish a local stream
Please Note
iOS requires that your app provide static messages to display to the user when the system asks for camera or microphone permission:
If your app uses device cameras, include the NSCameraUsageDescription key in your app’s Info.plist file.
If your app uses device microphones, include the NSMicrophoneUsageDescription key in your app’s Info.plist file.
For each key, provide a message that explains to the user why your app needs to capture media, so that the user can feel confident granting permission to your app.
Important
If the appropriate key is not present in your app's Info.plist file when your app requests authorization or attempts to use a capture device, the system terminates your app.
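For example, the relevant Info.plist entries might look like this (the message strings are placeholders; use wording appropriate to your app):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to share your video in a call.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to share your audio in a call.</string>
```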
To publish audio and video streams with default options:
let options = BRTPublishOptions()
options.shouldPublishAudio = true
options.shouldPublishVideo = true
// Additionally, you can configure the video resolution and codec
client.publish(options, completion: { [weak self] (stream, error) in
guard let stream = stream else { return }
// get a video capturer for the stream
self?.videoCapturer = stream.videoCapturer
// save a reference to local tracks for later
self?.localAudioTrack = stream.audioTracks?.first
self?.localVideoTrack = stream.videoTracks?.first
DispatchQueue.main.async {
// initiate video capture
self?.videoCapturer?.startCapture()
}
})
Render a Video Track
let videoView = BRTVideoView()
videoView.setVideoTrack(track)
// Add the view to the hierarchy as usual