Tested | ✗ |
Language | Objective-C |
License | MIT |
Last Release | May 2017 |
Maintained by Kjartan Vestvik.
Depends on: | |
AFNetworking | ~> 3.0 |
SynqHttpLib | ~> 0.3 |
This is the SYNQ mobile SDK for iOS. It lets you easily integrate your mobile app with the SYNQ platform and the SYNQ Video API.
To run the example project, clone the repo and run pod install
from the Example directory first. The example project contains an app that exercises the features of the SDK and shows how the different parts of the SDK are meant to be used.
SynqObjC is available through CocoaPods. If you do not have CocoaPods installed, you can install it with the following command:
$ gem install cocoapods
To integrate SynqObjC into your Xcode project, specify it in your Podfile:
pod "SynqObjC"
Then run the following command to install:
$ pod install
If you get an error saying
[!] Unable to find a specification for <name of pod>
, try running pod repo update and then running pod install again. This should fix it.
The SDK consists of two parts: SynqUploader, for uploading videos to SYNQ, and SynqStreamer, for streaming live video.
This part consists of classes for fetching videos from the Photos library, exporting them, and uploading them to SYNQ. The SDK uses AFNetworking 3 for communicating with the server and manages video uploads with a background-configured NSURLSession, so uploads continue regardless of whether the app is in the foreground or background.
#import <SynqUploader/SynqUploader.h>
// Set the delegate to receive upload status callbacks
[[SynqUploader sharedInstance] setDelegate:self];
// Create an SQVideoUpload object for each PHAsset you want to upload
SQVideoUpload *video = [[SQVideoUpload alloc] initWithPHAsset:asset];
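For example, here is a minimal sketch of collecting videos from the Photos library and wrapping each asset in an SQVideoUpload object. The fetch options and the videoArray name are illustrative, and photo library authorization is assumed to be granted already:
#import <Photos/Photos.h>

// Fetch all video assets from the Photos library
PHFetchOptions *options = [[PHFetchOptions alloc] init];
PHFetchResult *assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeVideo options:options];

// Wrap each asset in an SQVideoUpload and collect them in an array
NSMutableArray *videoArray = [NSMutableArray array];
[assets enumerateObjectsUsingBlock:^(id obj, NSUInteger idx, BOOL *stop) {
    PHAsset *asset = (PHAsset *)obj;
    SQVideoUpload *video = [[SQVideoUpload alloc] initWithPHAsset:asset];
    [videoArray addObject:video];
}];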
Before uploading, each video needs a corresponding video object in the SYNQ API and a set of upload parameters. To do this, you must do two things: create the video object through the SYNQ API and fetch the upload parameters for it. In the example project, the SynqHttpLib pod and the example server (SYNQ-Nodejs-example-server) perform these two functions in one step via the function createVideoAndGetParamsWithSuccess:. When the parameters are received, assign them to the SQVideoUpload object:
[video setUploadParameters:parameters];
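In the example project the parameters come from the SynqHttpLib client; a hedged sketch of that flow is shown here. The exact block signature of createVideoAndGetParamsWithSuccess: is an assumption, modeled on the createVideoAndGetStreamParamsWithSuccess: call shown in the streaming section below, so check SynqHttpLib.h for the real one.
// client is the SynqHttpLib client used by the example app (assumed set up elsewhere)
[client createVideoAndGetParamsWithSuccess:^(NSDictionary *parameters)
{
    // Assign the received upload parameters to the video object
    [video setUploadParameters:parameters];
}
httpFailureBlock:^(NSURLSessionDataTask *task, NSError *error)
{
    // Creating the video or fetching parameters failed; handle the error
}];
When every video in the array has its upload parameters set, start the upload: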
[[SynqUploader sharedInstance] uploadVideoArray:videoArray
exportProgressBlock:^(double exportProgress)
{
// Report progress to UI
[self.progressView setProgress:exportProgress];
}
uploadProgressBlock:^(double uploadProgress)
{
// uploadProgress is between 0.0 and 100.0
// Report progress to UI (scaled to 0.0–1.0 for a UIProgressView)
[self.progressView setProgress:uploadProgress / 100.0];
}];
The outcome of each upload is reported through the SQVideoUploadDelegate methods. These are the available methods and how they should be used:
- (void) videoUploadCompleteForVideo:(SQVideoUpload *)video;
Called when a video has been successfully uploaded.
- (void) videoUploadFailedForVideo:(SQVideoUpload *)video;
Called when an upload fails for a video.
- (void) allVideosUploadedSuccessfully;
Called when all videos have been uploaded successfully.
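A minimal sketch of a delegate implementation could look like this (the logging and progress handling are illustrative):
- (void) videoUploadCompleteForVideo:(SQVideoUpload *)video
{
    // A single video finished uploading; update the UI for that item
    NSLog(@"Upload complete for video");
}

- (void) videoUploadFailedForVideo:(SQVideoUpload *)video
{
    // Uploading this video failed; consider retrying or informing the user
    NSLog(@"Upload failed for video");
}

- (void) allVideosUploadedSuccessfully
{
    // Every video in the array has been uploaded; reset the progress UI
    [self.progressView setProgress:0.0];
}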
The example project included in this repo contains functionality for playing back your uploaded videos. This consists of a table view that lists all uploaded videos. Selecting one of the videos will open a new view controller (an instance of AVPlayerViewController) with a video player configured to play the selected video. This example uses the HLS output format as the source for video playback. The various sources for video playback can be found under the "outputs" field of the video object:
"outputs": {
"hls": {
"url": "https://multicdn.synq.fm/projects/fb/ec/fbec62099ed94d7ba7692c7353d20435/derivatives/videos/0c/19/0c19b46991ae49be994cec9f3909329a/hls/0c19b46991ae49be994cec9f3909329a_hls.m3u8",
"state": "complete"
},
"mp4_360": {
"url": "https://multicdn.synq.fm/projects/fb/ec/fbec62099ed94d7ba7692c7353d20435/derivatives/videos/0c/19/0c19b46991ae49be994cec9f3909329a/mp4_360/0c19b46991ae49be994cec9f3909329a_mp4_360.mp4",
"state": "complete"
},
"mp4_720": {
"url": "https://multicdn.synq.fm/projects/fb/ec/fbec62099ed94d7ba7692c7353d20435/derivatives/videos/0c/19/0c19b46991ae49be994cec9f3909329a/mp4_720/0c19b46991ae49be994cec9f3909329a_mp4_720.mp4",
"state": "complete"
},
"mp4_1080": {
"url": "https://multicdn.synq.fm/projects/fb/ec/fbec62099ed94d7ba7692c7353d20435/derivatives/videos/0c/19/0c19b46991ae49be994cec9f3909329a/mp4_1080/0c19b46991ae49be994cec9f3909329a_mp4_1080.mp4",
"state": "complete"
},
"webm_720": {
"url": "https://multicdn.synq.fm/projects/fb/ec/fbec62099ed94d7ba7692c7353d20435/derivatives/videos/0c/19/0c19b46991ae49be994cec9f3909329a/webm_720/0c19b46991ae49be994cec9f3909329a_webm_720.webm",
"state": "complete"
}
}
Please note: the "url" field is only present when the state is "complete", i.e. when transcoding of the video file has finished. The state might also read "submitted" or "progressing", meaning that transcoding is not yet complete and hence there is no output URL.
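Given the JSON above, here is a small sketch of selecting the HLS output and checking its state before using the URL (the outputs dictionary is assumed to have been parsed from the video object JSON):
// outputs is the "outputs" dictionary from the video object JSON
NSDictionary *hlsOutput = [outputs objectForKey:@"hls"];
NSString *urlString = nil;
if ([[hlsOutput objectForKey:@"state"] isEqualToString:@"complete"]) {
    // The url field is only present once transcoding has finished
    urlString = [hlsOutput objectForKey:@"url"];
}
else {
    // State is "submitted" or "progressing"; there is no playable output yet
    NSLog(@"HLS output not ready yet");
}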
Once you have obtained the URL for the desired output format, present the video player by configuring an instance of AVPlayerViewController:
// Convert url string to URL
NSString *urlString; // the url string fetched from the video object JSON
NSURL *videoUrl = [NSURL URLWithString:urlString];
// Configure AVPlayerViewController with an AVPlayer
AVPlayerViewController *avPlayerViewController = [[AVPlayerViewController alloc] init];
avPlayerViewController.player = [[AVPlayer alloc] initWithURL:videoUrl];
// Present the player view controller
[self presentViewController:avPlayerViewController animated:YES completion:^{
[avPlayerViewController.player play];
}];
This part of the SDK features a framework with the core streamer functionality and a resource bundle containing a compiled storyboard with a fully configured view controller for video streaming. Functions for configuring the video stream and displaying the streamer view are exposed through the SynqStreamer.h header file.
Create an instance of SynqStreamer:
SynqStreamer *streamer = [[SynqStreamer alloc] init];
These are the steps needed to set up the video streamer:
1. Create a video object in the SYNQ API and obtain the stream URL (see the example below)
2. - (void) setStreamUrl:(NSString *)streamUrl
3. - (AppNavigationController *) getStreamerViewWithNavigationController
4. presentViewController: animated: completion:
5. - (void) setStreamButtonEnabled:(BOOL)enabled
The example app included in this repo shows how you can create the video object and get the stream URL (step 1) using SynqHttpLib in connection with our NodeJS example server. We simply call this function in SynqHttpLib:
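// client is an instance of the SynqHttpLib HTTP client, assumed to be created and configured elsewhere in the example app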
[client createVideoAndGetStreamParamsWithSuccess:^(NSDictionary *jsonResponse)
{
// Get stream URL from parameters
NSString *streamUrl = [jsonResponse objectForKey:@"stream_url"];
}
httpFailureBlock:^(NSURLSessionDataTask *task, NSError *error)
{
// An error occurred, handle error
}];
Set the parameters to perform steps 2 and 5:
[streamer setStreamUrl:streamUrl];
[streamer setStreamButtonEnabled:YES];
Configure and present the streamer view (steps 3 and 4):
AppNavigationController *navController = [streamer getStreamerViewWithNavigationController];
[self presentViewController:navController animated:YES completion:nil];
Now you can start and stop the live video stream as you wish in the streamer view. There is also a settings view (press the cog icon) where you can configure video parameters like resolution, sample rate, audio channel count, etc.
The SDK depends on access to the SYNQ API to create a video object and to fetch the upload parameters needed when calling the upload function. The SYNQ API is intended to be accessed from a server, which means you should have your own server that authenticates requests from the mobile client and makes the HTTP calls to the SYNQ API. You will need to get an API key from the SYNQ admin panel and use that key when making calls to the SYNQ API. To get you started, you can use our NodeJS example server to see how the requests can be made. This also lets you try out the functionality of the SDK.
For more info, please read the projects and API keys section in the docs.
This SDK requires iOS 9 or above.
Kjartan Vestvik, [email protected]
SynqObjC is available under the MIT license. See the LICENSE file for more info.