AgoraARKit 1.0.5

Maintained by AgoraARKit.



 
Depends on:
AgoraRtcEngine_iOS >= 0
ARVideoKit ~> 1.51
 

AgoraARKit

AgoraARKit enables apps to live-stream AR video. ARKit uses the device's camera and motion sensors to project virtual content into a user's world. Agora.io provides a video SDK for building real-time video and audio communications applications. By combining the Agora.io Video SDK with ARKit, developers can build a wide range of applications across many different use cases.

This library provides three classes with managed user interfaces:

  • Lobby: the pre-channel UIView; provides a text input for users to set their channel name and choose their role (broadcaster or audience)
  • ARBroadcaster: the user broadcasting their AR view in the live stream
  • ARAudience: the user viewing a remote user's AR session

Device Requirements

AgoraARKit requires a minimum version of iOS 12.2, and supports the following devices:

  • iPhone 6S or newer
  • iPhone SE
  • iPad (2017)
  • All iPad Pro models

iOS 12.2 can be downloaded from Apple's Developer website.

Dependencies

AgoraARKit relies on the Agora.io Video SDK and ARVideoKit.

Support

Quick start guide

To get started with AgoraARKit, follow the steps below.

Set up using CocoaPods (coming soon)

NOTE: CocoaPods is not currently set up (use Manual Setup below)

  1. Add to your podfile:

pod 'AgoraARKit'

  2. In Terminal, navigate to your project folder, then run:

pod update

pod install

  3. Add NSCameraUsageDescription and NSMicrophoneUsageDescription to your Info.plist with a brief explanation for each (see the demo project for an example)
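Put together, the Podfile entry from step 1 might sit in a minimal Podfile like this (the platform version and target name here are placeholders for illustration):

```ruby
# Podfile — minimal sketch; replace 'MyARApp' with your app target
platform :ios, '12.2'

target 'MyARApp' do
  use_frameworks!

  pod 'AgoraARKit'
end
```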

Set up manually

  1. Add all files from the AgoraARKit directory to your project.
  2. Import the ARVideoKit and Agora.io Video SDK frameworks
  3. Add NSCameraUsageDescription and NSMicrophoneUsageDescription to your Info.plist with a brief explanation for each (see the demo project for an example)
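For reference, the corresponding Info.plist entries could look like the snippet below (the description strings are placeholders; use wording appropriate to your app):

```xml
<key>NSCameraUsageDescription</key>
<string>The camera is used to capture the live AR scene.</string>
<key>NSMicrophoneUsageDescription</key>
<string>The microphone is used to capture audio for the broadcast.</string>
```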

Implementation

  1. Once you have imported AgoraARKit and its dependencies, open your ViewController.swift and add:
import AgoraARKit
  2. Next, set your ViewController class to inherit from AgoraLobbyVC, and set your Agora App ID within the loadView method. If you want to set a custom image for the Lobby view, set it using the bannerImage property.
override func loadView() {
    super.loadView()
    
    AgoraARKit.agoraAppId = "" // insert your Agora App ID here

    
    // set the banner image within the initial view
    if let agoraLogo = UIImage(named: "ar-support-icon") {
        self.bannerImage = agoraLogo
    }
}

Customization

The AgoraARKit classes are extendable, so you can subclass them to customize them as needed.

AgoraLobbyVC

Since we are already inheriting from the AgoraLobbyVC, let's override the joinSession and createSession methods within our ViewController to set the images for the audience and broadcaster views.

Custom images in Audience view

@IBAction override func joinSession() {
    if let channelName = self.userInput.text {
        if channelName != "" {
            let arAudienceVC = ARAudience()
            if let exitBtnImage = UIImage(named: "exit") {
                arAudienceVC.backBtnImage = exitBtnImage
            }
            arAudienceVC.channelName = channelName
            arAudienceVC.modalPresentationStyle = .fullScreen
            self.present(arAudienceVC, animated: true, completion: nil)
        } else {
            // TODO: add visible msg to user
            print("unable to join a broadcast without a channel name")
        }
    }
}

Custom images in Broadcaster view

@IBAction override func createSession() {
    if let channelName = self.userInput.text {
        if channelName != "" {
            let arBroadcastVC = ARBroadcaster()
            if let exitBtnImage = UIImage(named: "exit") {
                arBroadcastVC.backBtnImage = exitBtnImage
            }
            if let micBtnImage = UIImage(named: "mic"),
                let muteBtnImage = UIImage(named: "mute"),
                let watermarkImage = UIImage(named: "agora-logo") {
                arBroadcastVC.micBtnImage = micBtnImage
                arBroadcastVC.muteBtnImage = muteBtnImage
                arBroadcastVC.watermarkImage = watermarkImage
            }
            
            arBroadcastVC.channelName = channelName
            arBroadcastVC.modalPresentationStyle = .fullScreen
            self.present(arBroadcastVC, animated: true, completion: nil)
        } else {
            // TODO: add visible msg to user
            print("unable to launch a broadcast without a channel name")
        }
    }
}

ARBroadcaster

The ARBroadcaster is a UIViewController that implements the ARKit session and renderer delegates along with the Agora RTC engine delegate methods. For a full list of these methods, please see the ARKit and Agora documentation.

The current ARBroadcaster class is set up for world tracking, but it can easily be updated for front-facing tracking. Below is an example of ARBroadcaster extended for ARKit face tracking, with added support for multiple broadcasters.

import ARKit

class FaceBroadcaster : ARBroadcaster {
    
    // placements dictionary
    var faceNodes: [UUID:SCNNode] = [:]           // Dictionary of faces
    
    override func viewDidLoad() {
        super.viewDidLoad() 
    }
    
    override func setARConfiguration() {
        print("setARConfiguration")
        // Configure the ARKit session for face tracking
        let configuration = ARFaceTrackingConfiguration()
        configuration.isLightEstimationEnabled = true
        // run the config to start the ARSession
        self.sceneView.session.run(configuration)
        self.arvkRenderer?.prepare(configuration)
    }
    
    // anchor detection
    override func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        super.renderer(renderer, didAdd: node, for: anchor)
        guard let sceneView = renderer as? ARSCNView, anchor is ARFaceAnchor else { return }
        /*
         Write depth but not color and render before other objects.
         This causes the geometry to occlude other SceneKit content
         while showing the camera view beneath, creating the illusion
         that real-world faces are obscuring virtual 3D objects.
         */
        let faceGeometry = ARSCNFaceGeometry(device: sceneView.device!)!
        faceGeometry.firstMaterial!.colorBufferWriteMask = []
        let occlusionNode = SCNNode(geometry: faceGeometry)
        occlusionNode.renderingOrder = -1
        
        let contentNode = SCNNode()
        contentNode.addChildNode(occlusionNode)
        node.addChildNode(contentNode)
        faceNodes[anchor.identifier] = node
    }
}
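To launch the face-tracking subclass, the createSession override shown earlier can instantiate FaceBroadcaster in place of ARBroadcaster. A minimal sketch, reusing the property names from the examples above:

```swift
// Inside the AgoraLobbyVC subclass: present the face-tracking broadcaster
@IBAction override func createSession() {
    guard let channelName = self.userInput.text, !channelName.isEmpty else {
        // TODO: add visible msg to user
        print("unable to launch a broadcast without a channel name")
        return
    }
    let faceBroadcastVC = FaceBroadcaster()
    faceBroadcastVC.channelName = channelName
    faceBroadcastVC.modalPresentationStyle = .fullScreen
    self.present(faceBroadcastVC, animated: true, completion: nil)
}
```

Any of the ARBroadcaster image properties (backBtnImage, micBtnImage, and so on) can be set on the FaceBroadcaster instance in the same way, since it inherits them.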

...