iOS Swift / Objective-C adapter
Follow this step-by-step tutorial to implement the Watch Together video chat sample application.
While the client-side application will take care of most of the functionality, in order to make this sample application work, you will need to get an access token from the Cluster Authentication Server (CAS).
- To better understand the Watch Together architecture, have a look at this guide - Watch Together overview
- The full code samples can be found here
To complete this guide successfully the following prerequisites are required:
- A Sceenic account
An Access Token is needed in order to allow a client to connect to a Session.
Note: It is important that the client application does not request an Access Token directly from the backend. By doing that you risk exposing the API_TOKEN and API_SECRET.
- To learn how to acquire an Access Token please look at the Cluster Authentication Server (CAS) reference
- To simplify the tutorial, in the section below you can see an example of getting an Access Token
cURL (Bash)
curl -iL --request GET --url https://YOUR_CAS_URL/stream/token/v2/ --header 'auth-api-key: API_KEY' --header 'auth-api-secret: API_SECRET'
A successful response will look like this:
{
"token": "eyJ0eXAiOiJKV1QiLCJhbGciOiJIUzI1NiJ9...."
}
Note: Every Streaming Token corresponds to one specific Session only. To allow two different clients to connect to the same Session, the clients need to use the same Access Token.
To go to production you will need to implement your own authentication server. The server will share the Access Token with your various clients (Web, Android, and iOS). With this valid Access Token you will be able to use the service.
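The server-side token flow described above can be sketched in Python. This is a minimal sketch, assuming the placeholder YOUR_CAS_URL, API_KEY, and API_SECRET from the cURL example; the function names are illustrative and not part of any SDK:

```python
import json
import urllib.request

# Placeholder endpoint from the cURL example above
CAS_URL = "https://YOUR_CAS_URL/stream/token/v2/"

def parse_token(body: str) -> str:
    """Extract the "token" field from the CAS JSON response."""
    return json.loads(body)["token"]

def fetch_access_token(api_key: str, api_secret: str) -> str:
    """Request an Access Token from CAS.

    Run this on YOUR backend only - never ship API_KEY/API_SECRET
    inside the client application.
    """
    req = urllib.request.Request(
        CAS_URL,
        headers={"auth-api-key": api_key, "auth-api-secret": api_secret},
    )
    with urllib.request.urlopen(req) as resp:
        return parse_token(resp.read().decode("utf-8"))
```

Your own authentication server would call `fetch_access_token` once per Session and hand the same token to every client that should join that Session.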
To create the project:
- Open Xcode and create a new project
- Choose a Single View application
- Configure your product organization, bundle, and team names
- Set the application location on your computer and press "create"
If you need to use an Objective-C project, add the WatchTogetherAdapter to your project as well.
- Create a folder with the name "WatchTogether" in the root of the project
- Copy all the files unpacked from the SDK ("WatchTogether.framework", "WatchTogether.podspec") to the created folder
- [When using the Objective-C adapter] - do the following extra steps
- Create a folder named "WatchTogetherAdapter" in the root of the project
- Copy all the files unpacked from the SDK ("WatchTogetherAdapter.framework","WatchTogetherAdapter.podspec") to "WatchTogetherAdapter" folder
- Open the project folder in Terminal
- Run the command "pod init"
- Run the command "open Podfile"
- In the opened file, paste this line after "target 'Your project name' do" and save the file
pod 'WatchTogether', :path => './WatchTogether'
- [When using the Objective-C adapter] - do the following extra step
- In the opened file, paste the following lines instead
pod 'WatchTogetherAdapter', :path => './WatchTogetherAdapter'
pod 'WatchTogether', :path => './WatchTogether'
- In the terminal, run the command "pod install"
- Disable application Bitcode.
- Build Settings → Build Options → Enable Bitcode → set to "NO"
- To mock a video for the simulator you will need to add an mp4 video file to your main bundle and rename it to simulator_mock.mp4

- Go to the Info.plist file and add the following properties:
- Add the property "Privacy - Camera Usage Description" with the value "$(PRODUCT_NAME) uses the camera"
- Add the property "Privacy - Microphone Usage Description" with the value "$(PRODUCT_NAME) uses the microphone"
- Import the library to the view controller
Swift
Objective-C adapter
import WatchTogether
@import WatchTogetherAdapter;
- Set the minimal log level
Swift
Objective-C adapter
Session.setMinLogLevel(.debug)
[SessionAdapter setMinLogLevel:WTALogLevelDebug];
- Create a session instance
- username - a display name
- delegate - an object conforming to the SessionDelegate protocol
Swift
Objective-C adapter
let session = SessionBuilder()
    .withUsername(username)
    .withVideoCodec(.H264)
    .withVideoRenderer(.Metal)
    .withDelegate(self)
    .build()
SessionAdapter* session = [[SessionAdapter alloc] initWith:username];
[session buildSession];
session.delegate = self;
- Get the local participant instance
Swift
Objective-C adapter
let localParticipant = session.localParticipant
ParticipantAdapter* localParticipant = session.localParticipant;
- Connect to the session
- Access token - an authentication token received from the Authentication Server
Swift
Objective-C adapter
do {
    try session.connect(with: token) // method can throw
} catch {
    print(error) // error is WTError
    // fallback logic
}
NSError* err = nil;
[_session connectWith:_token error:&err];
if (err != nil) {
    // fallback logic
}
- Disconnect from the session
Swift
Objective-C adapter
session.disconnect()
[session disconnect];
Swift
Objective-C adapter
extension YourClass: SessionDelegate {
    func onSessionConnected(sessionId: String, participants: [Participant]) {
        print("Connected to \(sessionId)")
    }
    func onSessionDisconnect() {
        print("Disconnected from session")
    }
    func onRemoteParticipantJoined(participant: Participant) {
        print("New participant with id: \(participant.getId())")
    }
    func onRemoteParticipantStartMedia(participant: Participant) {
        print("New media from id: \(participant.getId())")
    }
    func onRemoteParticipantStopMedia(participant: Participant) {
        print("Stopped media from id: \(participant.getId())")
    }
    func onRemoteParticipantLeft(participant: Participant) {
        print("Participant left with id: \(participant.getId())")
    }
    func onSessionError(error: Error) {
        print(error.localizedDescription)
    }
    func onRemoteParticipantNotification(message: String, participantId: String) {
        print("message: \(message) from participant \(participantId)")
    }
}
- (void)onSessionConnectedWithSessionId:(NSString *)sessionId participants:(NSArray<ParticipantAdapter *> *)participants {
    NSLog(@"Connected to session");
}
- (void)onSessionDisconnect {
    NSLog(@"Disconnected from session");
}
- (void)onRemoteParticipantJoinedWithParticipant:(ParticipantAdapter *)participant {
    NSLog(@"New participant");
}
- (void)onRemoteParticipantStartMediaWithParticipant:(ParticipantAdapter *)participant {
    NSLog(@"New media");
}
- (void)onRemoteParticipantStopMediaWithParticipant:(ParticipantAdapter *)participant {
    NSLog(@"Stopped media");
}
- (void)onRemoteParticipantLeftWithParticipant:(ParticipantAdapter *)participant {
    NSLog(@"Participant left");
}
- (void)onSessionErrorWithError:(NSError *)error {
    NSLog(@"Session failed with error: %@", error);
}
- (void)onRemoteParticipantNotificationWithMessage:(NSString *)message participantId:(NSString *)participantId {
    NSLog(@"message: %@ from participant %@", message, participantId);
}
- To get the Participant video, use the method getVideo()
- To get the Participant display name, use the method getDisplayName()
- To get the Participant id, use the method getId()
Swift
Objective-C adapter
class ParticipantView: UIView {
    var participant: Participant?
    func setParticipant(_ participant: Participant) {
        self.participant = participant
        participant.delegate = self
        let participantVideo = participant.getVideo() // View with video from the participant
        let displayName = participant.getDisplayName() // Participant display name
        let participantId: String = participant.getId() // Participant id
        self.addSubview(participantVideo)
    }
    //MARK: Actions to change participant properties
    @IBAction func onVolumeChanged(_ sender: UISlider) {
        participant?.volume = sender.value // values from 0.0 to 1.0
    }
    @IBAction func muteAudioAction(_ sender: UIButton) {
        participant?.isAudioEnabled.toggle() // enable/disable audio
    }
    @IBAction func muteVideoAction(_ sender: UIButton) {
        participant?.isVideoEnabled.toggle() // enable/disable video
    }
}
@property (weak, nonatomic) ParticipantAdapter* participant;

- (void)setParticipant:(ParticipantAdapter *)participant {
    _participant = participant;
    UIView* view = [participant getVideo];
    NSString *participantId = [_participant getId];
    _participantNameLabel.text = [_participant getDisplayName];
    _participant.delegate = self;
    [self addSubview:view];
}
//MARK: Actions in participant view
- (IBAction)changeVolume:(UISlider *)sender {
    _participant.volume = sender.value; // values from 0.0 to 1.0
}
- (IBAction)enableAudio:(UIButton *)sender {
    _participant.isAudioEnabled = !_participant.isAudioEnabled; // YES/NO turn audio on/off
}
- (IBAction)enableVideo:(UIButton *)sender {
    _participant.isVideoEnabled = !_participant.isVideoEnabled; // YES/NO turn video on/off
}
Swift
Objective-C adapter
extension ParticipantView: ParticipantDelegate {
    func onReconnecting() {
        print("Participant reconnecting")
    }
    func onReconnected() {
        print("Participant reconnected")
    }
    func onChangeMediaSettings() {
        print("Participant has changed media settings")
    }
    func onStreamUpdate(qualityInfo: StreamQualityInfo) {
        print("Participant has new quality info \(qualityInfo.quality)")
    }
}
- (void)onChangeMediaSettings {
    NSLog(@"On participant change media settings");
}
- (void)onStreamUpdateWithQualityInfo:(StreamQualityInfoAdapter *)qualityInfo {
    NSLog(@"On participant update quality");
}
- (void)onReconnecting {
    NSLog(@"On participant reconnecting");
}
- (void)onReconnected {
    NSLog(@"On participant reconnected");
}
The LocalParticipant class implements the Participant protocol.
- Camera preview
Swift
Objective-C adapter
localParticipant.startCameraPreview()
//MARK: switch camera action
@IBAction func onSwitchCamera(_ sender: UIButton) {
    localParticipant.cameraPosition = localParticipant.cameraPosition == .front
        ? AVCaptureDevice.Position.back
        : AVCaptureDevice.Position.front
}
func onResolutionChanged(to: VideoQuality, fps: Int32) {
    localParticipant.videoQuality = to // VideoQuality.Low, VideoQuality.Default, or VideoQuality.High
    localParticipant.fps = fps // normal values in range from 15 to 45
}
[_localParticipant startCameraPreview];
//MARK: Actions
//@property (nonatomic) BOOL isFront;
- (IBAction)switchCamera:(id)sender {
    _isFront = !_isFront;
    if (_isFront) {
        [_localParticipant setCameraPosition:AVCaptureDevicePositionFront];
    } else {
        [_localParticipant setCameraPosition:AVCaptureDevicePositionBack];
    }
}
- (IBAction)changeVideoQuality:(UISegmentedControl *)sender {
    if (sender.selectedSegmentIndex == 0) {
        _localParticipant.videoQuality = VideoQualityAdapterLow;
    } else if (sender.selectedSegmentIndex == 1) {
        _localParticipant.videoQuality = VideoQualityAdapterDefault;
    } else {
        _localParticipant.videoQuality = VideoQualityAdapterHigh;
    }
}
- (IBAction)changeFrameRate:(UISlider *)sender {
    _localParticipant.frameRate = sender.value; // normal values in range 15 to 45
}
Swift
Objective-C adapter
localParticipant.set(orientation: .landscapeRight)
// localParticipant.set(orientation: nil) //use to disable orientation lock
[_localParticipant setWithOrientation: AVCaptureVideoOrientationLandscapeLeft];
// [_localParticipant setWithOrientation: 0]; //use to disable orientation lock
Once coding is finished, you should be able to run the application.
- To understand better how to set up the Authentication please have a look at the Authentication overview.