Example of iOS application managing media devices
This example shows how to publish a WebRTC stream on Web Call Server and demonstrates selection of the source camera and specification of the following parameters for published and played video
...
A view with controls for publishing settings is displayed when the 'Local settings' button is tapped, and a view with controls for playback settings when the 'Remote settings' button is tapped.
Working with the code of the example
To analyze the code, let's take the MediaDevices example version with hash 79a318b, which can be downloaded with the corresponding build 2.25.2.
View classes
- class for the main view of the application: ViewController (header file ViewController.h; implementation file ViewController.m)
- class for view with publishing settings: WCSLocalVideoControlView (header file WCSLocalVideoControl.h; implementation file WCSLocalVideoControl.m)
- class for view with playback settings: WCSRemoteVideoControlView (header file WCSRemoteVideoControl.h; implementation file WCSRemoteVideoControl.m)
1. Import of the API. ViewController.m, line 11

```objectivec
#import <FPWCSApi2/FPWCSApi2.h>
```
2. Listing available media devices. The list of available media devices (cameras and microphones) is requested using the getMediaDevices method, which returns an FPWCSApi2MediaDeviceList object. WCSLocalVideoControl.m, line 12
FPWCSApi2 getMediaDevices code

```objectivec
localDevices = [FPWCSApi2 getMediaDevices];
```
3. Default microphone and camera are selected
- microphone selection (WCSLocalVideoControl.m, line 25)
FPWCSApi2MediaDeviceList.audio[0] code

```objectivec
_micSelector = [[WCSPickerInputView alloc] initWithLabelText:@"Mic" pickerDelegate:self];
//set default mic
if (localDevices.audio.count > 0) {
    _micSelector.input.text = ((FPWCSApi2MediaDevice *)(localDevices.audio[0])).label;
}
```
- camera selection (WCSLocalVideoControl.m, line 37)
FPWCSApi2MediaDeviceList.video[0] code

```objectivec
_camSelector = [[WCSPickerInputView alloc] initWithLabelText:@"Cam" pickerDelegate:self];
//set default cam
if (localDevices.video.count > 0) {
    _camSelector.input.text = ((FPWCSApi2MediaDevice *)(localDevices.video[0])).label;
}
```
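The default-device logic above can be isolated as a small helper: match the user's picker selection by label, and fall back to the first available device. The sketch below is illustrative TypeScript, not SDK code; the `MediaDevice` shape and the `pickDevice` name are assumptions mirroring the fields used in the Objective-C snippet.

```typescript
// Minimal device shape mirroring FPWCSApi2MediaDevice (fields assumed from the example).
interface MediaDevice {
  deviceID: string;
  label: string;
}

// Return the device whose label matches the picker selection,
// falling back to the first available device, as the example does for defaults.
function pickDevice(devices: MediaDevice[], selectedLabel?: string): MediaDevice | undefined {
  const match = devices.find(d => d.label === selectedLabel);
  return match ?? devices[0]; // undefined when the list is empty (no mic/cam at all)
}
```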
4. Constraints for the published stream. WCSLocalVideoControl.m, line 210
FPWCSApi2MediaConstraints.audio, FPWCSApi2MediaConstraints.video code
- New FPWCSApi2MediaConstraints object is created

```objectivec
- (FPWCSApi2MediaConstraints *)toMediaConstraints {
    FPWCSApi2MediaConstraints *ret = [[FPWCSApi2MediaConstraints alloc] init];
```
- A constraint for audio, specifying whether the published stream should have audio, is set to the 'Send Audio' toggle switch value

```objectivec
ret.audio = [_sendAudio.control isOn];
```
- If the published stream should have video (the 'Send Video' toggle switch is ON), video constraints are specified: source camera ID, width, height, FPS, bitrate and quality

```objectivec
if ([_sendVideo.control isOn]) {
    FPWCSApi2VideoConstraints *video = [[FPWCSApi2VideoConstraints alloc] init];
    for (FPWCSApi2MediaDevice *device in localDevices.video) {
        if ([device.label isEqualToString:_camSelector.input.text]) {
            video.deviceID = device.deviceID;
        }
    }
    video.minWidth = video.maxWidth = [_videoResolution.width.text integerValue];
    video.minHeight = video.maxHeight = [_videoResolution.height.text integerValue];
    video.minFrameRate = video.maxFrameRate = [_fpsSelector.input.text integerValue];
    video.bitrate = [_bitrate.input.text integerValue];
    video.quality = [_quality.input.text integerValue];
    ret.video = video;
}
```
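The constraints assembly above can be sketched language-neutrally. The TypeScript below is illustrative only: the type and field names mirror the Objective-C snippet (min/max pairs set equal to request an exact resolution and FPS) and are assumptions, not a real SDK API.

```typescript
// Shapes mirroring FPWCSApi2MediaConstraints / FPWCSApi2VideoConstraints (assumed).
interface VideoConstraints {
  deviceID?: string;
  minWidth: number; maxWidth: number;
  minHeight: number; maxHeight: number;
  minFrameRate: number; maxFrameRate: number;
  bitrate: number;
  quality: number;
}

interface PublishConstraints {
  audio: boolean;
  video?: VideoConstraints;
}

// Build constraints from the settings controls: audio follows the 'Send Audio'
// toggle; video constraints are filled only when 'Send Video' is ON.
function buildPublishConstraints(opts: {
  sendAudio: boolean; sendVideo: boolean;
  width: number; height: number; fps: number;
  bitrate: number; quality: number; deviceID?: string;
}): PublishConstraints {
  const ret: PublishConstraints = { audio: opts.sendAudio };
  if (opts.sendVideo) {
    ret.video = {
      deviceID: opts.deviceID,
      minWidth: opts.width, maxWidth: opts.width,     // exact resolution: min == max
      minHeight: opts.height, maxHeight: opts.height,
      minFrameRate: opts.fps, maxFrameRate: opts.fps, // exact frame rate: min == max
      bitrate: opts.bitrate,
      quality: opts.quality,
    };
  }
  return ret;
}
```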
5. Constraints for the played stream. WCSRemoteVideoControl.m, line 128
- New FPWCSApi2MediaConstraints object is created
```objectivec
FPWCSApi2MediaConstraints *ret = [[FPWCSApi2MediaConstraints alloc] init];
```
- A constraint specifying that the played stream should have audio is added

```objectivec
ret.audio = YES;
```
- If the played stream should have video (the 'Play Video' toggle switch is ON), video constraints are specified: width, height, bitrate and quality

```objectivec
if ([_playVideo.control isOn]) {
    FPWCSApi2VideoConstraints *video = [[FPWCSApi2VideoConstraints alloc] init];
    video.minWidth = video.maxWidth = [_videoResolution.width.text integerValue];
    video.minHeight = video.maxHeight = [_videoResolution.height.text integerValue];
    video.bitrate = [_bitrate.input.text integerValue];
    video.quality = [_quality.input.text integerValue];
    ret.video = video;
}
return ret;
```
The complete WCSRemoteVideoControlView toMediaConstraints method:
FPWCSApi2MediaConstraints.audio, FPWCSApi2MediaConstraints.video code

```objectivec
- (FPWCSApi2MediaConstraints *)toMediaConstraints {
    FPWCSApi2MediaConstraints *ret = [[FPWCSApi2MediaConstraints alloc] init];
    ret.audio = [[FPWCSApi2AudioConstraints alloc] init];
    if ([_playVideo.control isOn]) {
        FPWCSApi2VideoConstraints *video = [[FPWCSApi2VideoConstraints alloc] init];
        video.minWidth = video.maxWidth = [_videoResolution.width.text integerValue];
        video.minHeight = video.maxHeight = [_videoResolution.height.text integerValue];
        video.bitrate = [_bitrate.input.text integerValue];
        video.quality = [_quality.input.text integerValue];
        ret.video = video;
    }
    return ret;
}
```
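Some builds of the example keep the resolution as a single picker string such as "640x480" and split it before filling the width/height constraints. A sketch of that parsing, with the function name and null-on-malformed-input behavior being illustrative assumptions:

```typescript
// Parse a "640x480"-style resolution string into width/height,
// returning null when the text is not in the expected WidthxHeight form.
function parseResolution(text: string): { width: number; height: number } | null {
  const parts = text.split("x");
  if (parts.length !== 2) return null;
  const width = parseInt(parts[0], 10);
  const height = parseInt(parts[1], 10);
  if (!Number.isFinite(width) || !Number.isFinite(height)) return null;
  return { width, height };
}
```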
6. Mute/unmute audio and video.
The following methods are used to mute/unmute audio and video in the published stream
- [_localStream muteAudio]; (line 238)
- [_localStream unmuteAudio]; (line 240)
- [_localStream muteVideo]; (line 246)
- [_localStream unmuteVideo]; (line 248)
7. Connection to the server.
The ViewController method start is called when the Start button is tapped. ViewController.m, line 224
```objectivec
[self start];
```
In the method,
- an object with options for the connection session is created (ViewController.m, line 38)

```objectivec
FPWCSApi2SessionOptions *options = [[FPWCSApi2SessionOptions alloc] init];
options.urlServer = _urlInput.text;
options.appKey = @"defaultApp";
```
The options include the URL of the WCS server and the appKey of the internal server-side application.
- a new session is created with the createSession method, which returns an FPWCSApi2Session object (ViewController.m, line 42)

```objectivec
_session = [FPWCSApi2 createSession:options error:&error];
```
- callback functions for processing session statuses are added (ViewController.m, line 61)
```objectivec
[_session on:kFPWCSSessionStatusEstablished callback:^(FPWCSApi2Session *session){
    [self startStreaming];
}];
[_session on:kFPWCSSessionStatusDisconnected callback:^(FPWCSApi2Session *session){
    [self onStopped];
}];
[_session on:kFPWCSSessionStatusFailed callback:^(FPWCSApi2Session *session){
    [self onStopped];
}];
```
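The callback wiring above is effectively a small status dispatcher: one status starts streaming, any terminal status resets the UI. Sketched below in illustrative TypeScript; the status names mirror the kFPWCSSessionStatus constants and are assumptions.

```typescript
type SessionStatus = "ESTABLISHED" | "DISCONNECTED" | "FAILED";

// Mirror of the example's wiring: a successful connection starts streaming,
// disconnection or failure resets the interface via onStopped.
function dispatchSessionStatus(
  status: SessionStatus,
  handlers: { startStreaming: () => void; onStopped: () => void }
): void {
  if (status === "ESTABLISHED") {
    handlers.startStreaming();
  } else {
    handlers.onStopped();
  }
}
```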
If the connection is successfully established, the ViewController method startStreaming is called to publish a new stream.
In case of disconnection or connection failure, the ViewController method onStopped is called to make appropriate changes in the interface controls.
- the FPWCSApi2Session method connect is called to establish a connection to the server (ViewController.m, line 72)

```objectivec
[_session connect];
```
8. Stream publishing.
When connection to the server is established,
- object with stream publish options is created (ViewController.m, line 79)
```objectivec
FPWCSApi2StreamOptions *options = [[FPWCSApi2StreamOptions alloc] init];
options.name = [self getStreamName];
options.display = _videoView.local;
options.constraints = [_localControl toMediaConstraints];
```
The options include the stream name, the view for displaying video, and constraints for audio and video.
The WCSLocalVideoControlView method toMediaConstraints is used to get the constraints.
- new stream is created with FPWCSApi2Session method createStream, which returns FPWCSApi2Stream object (ViewController.m, line 84)
```objectivec
_localStream = [_session createStream:options error:&error];
```
- callback functions for processing stream statuses are added (ViewController.m, line 109)
```objectivec
[_localStream on:kFPWCSStreamStatusPublishing callback:^(FPWCSApi2Stream *stream){
    [self startPlaying];
}];
[_localStream on:kFPWCSStreamStatusUnpublished callback:^(FPWCSApi2Stream *stream){
    [self onStopped];
}];
[_localStream on:kFPWCSStreamStatusFailed callback:^(FPWCSApi2Stream *stream){
    [self onStopped];
}];
```
If the stream is successfully published, the ViewController method startPlaying is called to play the stream.
In case of failure, or when the stream is unpublished, the ViewController method onStopped is called to make appropriate changes in the interface controls.
- the FPWCSApi2Stream method publish is called to publish the stream (ViewController.m, line 120)

```objectivec
[_localStream publish:&error];
```
9. Stream playback.
When stream is published,
- object with stream playback options is created (ViewController.m, line 139)
```objectivec
FPWCSApi2StreamOptions *options = [[FPWCSApi2StreamOptions alloc] init];
options.name = [_localStream getName];
options.display = _videoView.remote;
options.constraints = [_remoteControl toMediaConstraints];
```
The options include the stream name, the view for displaying video, and constraints for audio and video.
The WCSRemoteVideoControlView method toMediaConstraints is used to get the constraints.
- new stream is created with FPWCSApi2Session method createStream, which returns FPWCSApi2Stream object (ViewController.m, line 144)
```objectivec
_remoteStream = [_session createStream:options error:&error];
```
- callback functions for processing stream statuses are added (ViewController.m, line 162)
```objectivec
[_remoteStream on:kFPWCSStreamStatusPlaying callback:^(FPWCSApi2Stream *stream){
    [self onStarted];
}];
[_remoteStream on:kFPWCSStreamStatusStopped callback:^(FPWCSApi2Stream *rStream){
    [_localStream stop:nil];
}];
[_remoteStream on:kFPWCSStreamStatusFailed callback:^(FPWCSApi2Stream *rStream){
    if (_localStream && [_localStream getStatus] == kFPWCSStreamStatusPublishing) {
        [_localStream stop:nil];
    }
}];
```
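Note the guard in the FAILED branch: the published stream is stopped only while its status is still PUBLISHING, which avoids stopping a stream that is already down. The decision can be sketched as a pure predicate (status names assumed, mirroring the callbacks above):

```typescript
type StreamStatus = "NEW" | "PUBLISHING" | "UNPUBLISHED" | "PLAYING" | "STOPPED" | "FAILED";

// Decide whether the local (published) stream should be stopped when the
// remote (played) stream changes status, mirroring the callbacks above.
function shouldStopLocal(remoteStatus: StreamStatus, localStatus: StreamStatus | null): boolean {
  if (remoteStatus === "STOPPED") return true;                        // playback stopped normally
  if (remoteStatus === "FAILED") return localStatus === "PUBLISHING"; // only while still publishing
  return false;
}
```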
If playback is successfully started, the ViewController method onStarted is called to make appropriate changes in the interface controls.
In case of failure, or when playback is stopped, the FPWCSApi2Stream method stop is called for the published stream to stop its publication.
```objectivec
[_localStream stop:nil];
```
- the FPWCSApi2Stream method play is called to play the stream (ViewController.m, line 174)

```objectivec
[_remoteStream play:&error];
```
10. Disconnection. ViewController.m, line 36
The FPWCSApi2Session method disconnect is called to close the connection to the server.
```objectivec
[_session disconnect];
```
6. Local camera and microphone testing
FPWCSApi2 getMediaAccess, AVAudioRecorder record, AVAudioRecorder stop code
```objectivec
- (void)testButton:(UIButton *)button {
    if ([button.titleLabel.text isEqualToString:@"Test"]) {
        NSError *error;
        [FPWCSApi2 getMediaAccess:[_localControl toMediaConstraints] display:_videoView.local error:&error];
        [_testButton setTitle:@"Release" forState:UIControlStateNormal];
        [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:&error];
        NSURL *url = [NSURL fileURLWithPath:@"/dev/null"];
        NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithFloat: 44100.0], AVSampleRateKey,
                                  [NSNumber numberWithInt: kAudioFormatAppleLossless], AVFormatIDKey,
                                  [NSNumber numberWithInt: 1], AVNumberOfChannelsKey,
                                  [NSNumber numberWithInt: AVAudioQualityMax], AVEncoderAudioQualityKey,
                                  nil];
        _recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
        [_recorder prepareToRecord];
        _recorder.meteringEnabled = YES;
        [_recorder record];
        _levelTimer = [NSTimer scheduledTimerWithTimeInterval: 0.3 target: self selector: @selector(levelTimerCallback:) userInfo: nil repeats: YES];
    } else {
        [FPWCSApi2 releaseLocalMedia:_videoView.local];
        [_testButton setTitle:@"Test" forState:UIControlStateNormal];
        [_levelTimer invalidate];
        [_recorder stop];
    }
}
```
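The levelTimerCallback selector scheduled above (its body is not shown in this document) would typically read averagePowerForChannel: from the metering-enabled recorder, which reports decibels (roughly -160 to 0 dB). One common way to map such a reading onto a 0..1 microphone level indicator is sketched below; the -60 dB silence floor and linear mapping are assumptions, not necessarily what the example itself does.

```typescript
// Map an audio power reading in decibels (as reported by AVAudioRecorder's
// averagePowerForChannel:) onto a 0..1 level for a simple meter.
function levelFromDecibels(db: number): number {
  const floorDb = -60;               // assumed silence floor
  if (db <= floorDb) return 0;       // anything below the floor reads as silence
  if (db >= 0) return 1;             // 0 dB is full scale
  return (db - floorDb) / -floorDb;  // linear interpolation between floor and 0 dB
}
```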
7. Session creation and connection to the server.
FPWCSApi2 createSession, FPWCSApi2Session connect code
The options include:
- URL of WCS server
- appKey of internal server-side application (defaultApp)
```objectivec
- (void)start {
    if (!_session || [_session getStatus] != kFPWCSSessionStatusEstablished || ![[_session getServerUrl] isEqualToString:_urlInput.text]) {
        ...
        FPWCSApi2SessionOptions *options = [[FPWCSApi2SessionOptions alloc] init];
        options.urlServer = _urlInput.text;
        options.appKey = @"defaultApp";
        NSError *error;
        _session = [FPWCSApi2 createSession:options error:&error];
        ...
        [_session connect];
    } else {
        [self startStreaming];
    }
}
```
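The condition at the top of start decides whether the existing session can be reused. Isolated as a predicate (illustrative TypeScript; the `SessionLike` shape and URL value are assumptions mirroring the checks above):

```typescript
interface SessionLike {
  status: "ESTABLISHED" | "DISCONNECTED" | "FAILED";
  serverUrl: string;
}

// A new session is needed when there is no session, the current one is not
// established, or the user entered a different server URL; otherwise the
// existing connection is reused and streaming starts immediately.
function needsNewSession(session: SessionLike | null, urlInput: string): boolean {
  return !session || session.status !== "ESTABLISHED" || session.serverUrl !== urlInput;
}
```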
8. Stream publishing.
FPWCSApi2Session createStream, FPWCSApi2Stream publish code
An object with the following stream options is passed to the createStream method:
- stream name
- view to display video
- video constraints
```objectivec
- (void)startStreaming {
    FPWCSApi2StreamOptions *options = [[FPWCSApi2StreamOptions alloc] init];
    options.name = [self getStreamName];
    options.display = _videoView.local;
    options.constraints = [_localControl toMediaConstraints];
    NSError *error;
    _localStream = [_session createStream:options error:&error];
    ...
    if (![_localStream publish:&error]) {
        UIAlertController *alert = [UIAlertController
                                    alertControllerWithTitle:@"Failed to publish"
                                    message:error.localizedDescription
                                    preferredStyle:UIAlertControllerStyleAlert];
        UIAlertAction *okButton = [UIAlertAction
                                   actionWithTitle:@"Ok"
                                   style:UIAlertActionStyleDefault
                                   handler:^(UIAlertAction *action) {
                                       [self onStopped];
                                   }];
        [alert addAction:okButton];
        [self presentViewController:alert animated:YES completion:nil];
    }
}
```
9. Preview stream playback
FPWCSApi2Session createStream, FPWCSApi2Stream play code
An object with the following stream options is passed to the createStream method:
- stream name
- view to display video
- video constraints
```objectivec
- (void)startPlaying {
    FPWCSApi2StreamOptions *options = [[FPWCSApi2StreamOptions alloc] init];
    options.name = [_localStream getName];
    options.display = _videoView.remote;
    options.constraints = [_remoteControl toMediaConstraints];
    NSError *error;
    _remoteStream = [_session createStream:options error:&error];
    ...
    if (![_remoteStream play:&error]) {
        UIAlertController *alert = [UIAlertController
                                    alertControllerWithTitle:@"Failed to play"
                                    message:error.localizedDescription
                                    preferredStyle:UIAlertControllerStyleAlert];
        UIAlertAction *okButton = [UIAlertAction
                                   actionWithTitle:@"Ok"
                                   style:UIAlertActionStyleDefault
                                   handler:^(UIAlertAction *action) {
                                       if (_localStream && [_localStream getStatus] == kFPWCSStreamStatusPublishing) {
                                           [_localStream stop:nil];
                                       }
                                   }];
        [alert addAction:okButton];
        [self presentViewController:alert animated:YES completion:nil];
    }
}
```
10. Mute/unmute audio and video.
FPWCSApi2Stream muteAudio, unmuteAudio, muteVideo, unmuteVideo code
```objectivec
- (void)controlValueChanged:(id)sender {
    if (sender == _localControl.muteAudio.control) {
        if (_localStream) {
            if (_localControl.muteAudio.control.isOn) {
                [_localStream muteAudio];
            } else {
                [_localStream unmuteAudio];
            }
        }
    } else if (sender == _localControl.muteVideo.control) {
        if (_localStream) {
            if (_localControl.muteVideo.control.isOn) {
                [_localStream muteVideo];
            } else {
                [_localStream unmuteVideo];
            }
        }
    }
}
```
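The handler above reduces to a toggle dispatcher over the stream's mute methods, skipping the event when no stream is published yet. A sketch with the stream interface assumed from the calls in the snippet:

```typescript
interface MutableStream {
  muteAudio(): void;
  unmuteAudio(): void;
  muteVideo(): void;
  unmuteVideo(): void;
}

// Apply a mute switch change to the published stream, ignoring the
// event when no stream is published (as the nil checks above do).
function applyMuteChange(
  stream: MutableStream | null,
  kind: "audio" | "video",
  muted: boolean
): void {
  if (!stream) return;
  if (kind === "audio") {
    if (muted) { stream.muteAudio(); } else { stream.unmuteAudio(); }
  } else {
    if (muted) { stream.muteVideo(); } else { stream.unmuteVideo(); }
  }
}
```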
11. Stream playback stop.
FPWCSApi2Stream stop code
```objectivec
- (void)startButton:(UIButton *)button {
    button.userInteractionEnabled = NO;
    button.alpha = 0.5;
    _urlInput.userInteractionEnabled = NO;
    if ([button.titleLabel.text isEqualToString:@"Stop"]) {
        if (_remoteStream) {
            NSError *error;
            [_remoteStream stop:&error];
        } else {
            NSLog(@"No remote stream, failed to stop");
        }
    } else {
        //start
        [self start];
    }
}
```
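The button handler branches on the current button title. The decision alone can be written as a pure function, which makes the three outcomes explicit (the action names are illustrative, not SDK identifiers):

```typescript
// Decide what the Start/Stop button should do based on its current title
// and whether a remote (played) stream exists, mirroring the handler above.
function startButtonAction(title: string, hasRemoteStream: boolean): "stop" | "start" | "none" {
  if (title === "Stop") {
    return hasRemoteStream ? "stop" : "none"; // nothing to stop without a remote stream
  }
  return "start";
}
```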
12. Stream publishing stop.
FPWCSApi2Stream stop code
```objectivec
[_remoteStream on:kFPWCSStreamStatusStopped callback:^(FPWCSApi2Stream *rStream){
    [self changeStreamStatus:rStream];
    [_localStream stop:nil];
    _useLoudSpeaker.control.userInteractionEnabled = NO;
}];
```