
Example of streamer with access to media devices

This streamer can be used to publish or play back the following types of streams on Web Call Server:

  • WebRTC
  • RTMFP
  • RTMP

and allows selecting media devices and parameters for the published video.

...

In the screenshot below, a stream is being published from the client.


Two video elements are displayed on the page:

  • 'Local' - video from the camera
  • 'PreviewPlayer' - the video as received from the server

Code of the example

The path to the source code of the example on WCS server is:

...

Here host is the address of the WCS server.

...

Analyzing the code

To analyze the code, let's take the version of file manager.js with hash ecbadc3, which is available here and can be downloaded with the corresponding build 2.0.212.

1. Initialization of the API.

Flashphoner.init() code

Code Block
languagejs
themeRDark
        Flashphoner.init({
            screenSharingExtensionId: extensionId,
            flashMediaProviderSwfLocation: '../../../../media-provider.swf',
            mediaProvidersReadyCallback: function (mediaProviders) {
                //hide device selection forms if the Temasys plugin is used
                if (Flashphoner.isUsingTemasys()) {
                    $("#audioInputForm").hide();
                    $("#videoInputForm").hide();
                }
            }
        })

2. List available input media devices.

Flashphoner.getMediaDevices() code

When input media devices are listed, the drop-down lists of microphones and cameras on the client page are populated.

Code Block
languagejs
themeRDark
    Flashphoner.getMediaDevices(null, true).then(function (list) {
        list.audio.forEach(function (device) {
            ...
        });
        list.video.forEach(function (device) {
            ...
        });
        ...
    }).catch(function (error) {
        $("#notifyFlash").text("Failed to get media devices");
    });
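
The forEach bodies are elided ("...") in the example. Purely as an illustration, the microphone drop-down might be filled like this (a sketch: the device.id and device.label field names are assumptions, while #audioInput is the select element used by getConstraints() below):

Code Block
languagejs
themeRDark
    // Illustrative sketch, not the example's actual code:
    // add each reported microphone to the #audioInput drop-down
    list.audio.forEach(function (device) {
        // device.id / device.label are assumed fields of the listed device
        $("#audioInput").append($("<option>").val(device.id).text(device.label));
    });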

3. List available output media devices.

Flashphoner.getMediaDevices() code

When output media devices are listed, the drop-down lists of speakers and headphones on the client page are populated.

Code Block
languagejs
themeRDark
    Flashphoner.getMediaDevices(null, true, MEDIA_DEVICE_KIND.OUTPUT).then(function (list) {
        list.audio.forEach(function (device) {
            ...
        });
        list.video.forEach(function (device) {
            ...
        });
        ...
    }).catch(function (error) {
        $("#notifyFlash").text("Failed to get media devices");
    });

4. Get audio and video publishing constraints from client page

getConstraints() code

Publishing sources:

  • camera (sendVideo)
  • microphone (sendAudio)
Code Block
languagejs
themeRDark
    constraints = {
        audio: $("#sendAudio").is(':checked'),
        video: $("#sendVideo").is(':checked'),
    };

Audio constraints:

  • microphone choice (deviceId)
  • error correction for Opus codec (fec)
  • stereo mode (stereo)
  • audio bitrate (bitrate)
Code Block
languagejs
themeRDark
    if (constraints.audio) {
        constraints.audio = {
            deviceId: $('#audioInput').val()
        };
        if ($("#fec").is(':checked'))
            constraints.audio.fec = $("#fec").is(':checked');
        if ($("#sendStereoAudio").is(':checked'))
            constraints.audio.stereo = $("#sendStereoAudio").is(':checked');
        if (parseInt($('#sendAudioBitrate').val()) > 0)
            constraints.audio.bitrate = parseInt($('#sendAudioBitrate').val());
    }

Video constraints:

  • camera choice (deviceId)
  • publishing video size (width, height)
  • minimal and maximal video bitrate (minBitrate, maxBitrate)
  • FPS (frameRate)
Code Block
languagejs
themeRDark
            constraints.video = {
                deviceId: {exact: $('#videoInput').val()},
                width: parseInt($('#sendWidth').val()),
                height: parseInt($('#sendHeight').val())
            };
            if (Browser.isSafariWebRTC() && Browser.isiOS() && Flashphoner.getMediaProviders()[0] === "WebRTC") {
                constraints.video.deviceId = {exact: $('#videoInput').val()};
            }
            if (parseInt($('#sendVideoMinBitrate').val()) > 0)
                constraints.video.minBitrate = parseInt($('#sendVideoMinBitrate').val());
            if (parseInt($('#sendVideoMaxBitrate').val()) > 0)
                constraints.video.maxBitrate = parseInt($('#sendVideoMaxBitrate').val());
            if (parseInt($('#fps').val()) > 0)
                constraints.video.frameRate = parseInt($('#fps').val());
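
For reference, with every option enabled the constraints object assembled by getConstraints() could end up looking like this (all values are illustrative):

Code Block
languagejs
themeRDark
    // Illustrative result of getConstraints() with all options enabled
    constraints = {
        audio: {
            deviceId: "default",   // selected microphone (example id)
            fec: true,             // Opus forward error correction
            stereo: true,          // stereo mode
            bitrate: 64            // audio bitrate (example value)
        },
        video: {
            deviceId: {exact: "front-camera-id"},  // selected camera (example id)
            width: 640,
            height: 480,
            minBitrate: 500,
            maxBitrate: 1500,
            frameRate: 30
        }
    };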

5. Get access to media devices for local test

Flashphoner.getMediaAccess() code

Audio and video constraints and the <div> element to display the captured video are passed to the method.

Code Block
languagejs
themeRDark
    Flashphoner.getMediaAccess(getConstraints(), localVideo).then(function (disp) {
        $("#testBtn").text("Release").off('click').click(function () {
            $(this).prop('disabled', true);
            stopTest();
        }).prop('disabled', false);
        ...
        testStarted = true;
    }).catch(function (error) {
        $("#testBtn").prop('disabled', false);
        testStarted = false;
    });
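
The stopTest() call above releases the captured devices; a minimal sketch, assuming Flashphoner.releaseLocalMedia() frees the media shown in the localVideo element:

Code Block
languagejs
themeRDark
    // A minimal sketch of stopTest(), not the example's exact code
    function stopTest() {
        // releaseLocalMedia() is assumed to stop the camera/microphone captured for the test
        Flashphoner.releaseLocalMedia(localVideo);
        testStarted = false;
    }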

6. Connecting to the server

Flashphoner.createSession() code

Code Block
languagejs
themeRDark
    Flashphoner.createSession({urlServer: url, timeout: tm}).on(SESSION_STATUS.ESTABLISHED, function (session) {
        ...
    }).on(SESSION_STATUS.DISCONNECTED, function () {
        ...
    }).on(SESSION_STATUS.FAILED, function () {
        ...
    });

7. Receiving the event confirming successful connection

ConnectionStatusEvent ESTABLISHED code

Code Block
languagejs
themeRDark
    Flashphoner.createSession({urlServer: url, timeout: tm}).on(SESSION_STATUS.ESTABLISHED, function (session) {
        setStatus("#connectStatus", session.status());
        onConnected(session);
        ...
    });

8. Stream publishing

session.createStream(), publishStream.publish() code

Code Block
languagejs
themeRDark
    publishStream = session.createStream({
        name: streamName,
        display: localVideo,
        cacheLocalResources: true,
        constraints: constraints,
        mediaConnectionConstraints: mediaConnectionConstraints,
        sdpHook: rewriteSdp,
        transport: transportInput,
        cvoExtension: cvo,
        stripCodecs: strippedCodecs,
        videoContentHint: contentHint
        ...
    });
    publishStream.publish();
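
The sdpHook option refers to the example's rewriteSdp function, which lets the SDP be modified before publishing; a no-op sketch (the exact hook signature is an assumption):

Code Block
languagejs
themeRDark
    // Hypothetical no-op SDP hook: return the SDP unchanged
    function rewriteSdp(sdp) {
        return sdp;
    }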

9. Receiving the event confirming successful streaming

StreamStatusEvent PUBLISHING code

Code Block
languagejs
themeRDark
    publishStream = session.createStream({
        ...
    }).on(STREAM_STATUS.PUBLISHING, function (stream) {
        $("#testBtn").prop('disabled', true);
        var video = document.getElementById(stream.id());
        //resize local if resolution is available
        if (video.videoWidth > 0 && video.videoHeight > 0) {
            resizeLocalVideo({target: video});
        }
        enablePublishToggles(true);
        if ($("#muteVideoToggle").is(":checked")) {
            muteVideo();
        }
        if ($("#muteAudioToggle").is(":checked")) {
            muteAudio();
        }
        //remove resize listener in case this video was cached earlier
        video.removeEventListener('resize', resizeLocalVideo);
        video.addEventListener('resize', resizeLocalVideo);
        publishStream.setMicrophoneGain(currentGainValue);
        setStatus("#publishStatus", STREAM_STATUS.PUBLISHING);
        onPublishing(stream);
    }).on(STREAM_STATUS.UNPUBLISHED, function () {
        ...
    }).on(STREAM_STATUS.FAILED, function () {
        ...
    });
    publishStream.publish();

10. Stream playback

session.createStream(), previewStream.play() code

Code Block
languagejs
themeRDark
    previewStream = session.createStream({
        name: streamName,
        display: remoteVideo,
        constraints: constraints,
        transport: transportOutput,
        stripCodecs: strippedCodecs
        ...
    });
    previewStream.play();

11. Receiving the event confirming successful playback

StreamStatusEvent PLAYING code

Code Block
languagejs
themeRDark
    previewStream = session.createStream({
        ...
    }).on(STREAM_STATUS.PLAYING, function (stream) {
        playConnectionQualityStat.connectionQualityUpdateTimestamp = new Date().valueOf();
        setStatus("#playStatus", stream.status());
        onPlaying(stream);
        document.getElementById(stream.id()).addEventListener('resize', function (event) {
            $("#playResolution").text(event.target.videoWidth + "x" + event.target.videoHeight);
            resizeVideo(event.target);
        });
        //wait for incoming stream
        if (Flashphoner.getMediaProviders()[0] == "WebRTC") {
            setTimeout(function () {
                if(Browser.isChrome()) {
                    detectSpeechChrome(stream);
                } else {
                    detectSpeech(stream);
                }
            }, 3000);
        }
        ...
    });
    previewStream.play();

12. Stop stream playback

stream.stop() code

Code Block
languagejs
themeRDark
    $("#playBtn").text("Stop").off('click').click(function () {
        $(this).prop('disabled', true);
        stream.stop();
    }).prop('disabled', false);

13. Receiving the event confirming successful playback stop

StreamStatusEvent STOPPED code

Code Block
languagejs
themeRDark
    previewStream = session.createStream({
        ...
    }).on(STREAM_STATUS.STOPPED, function () {
        setStatus("#playStatus", STREAM_STATUS.STOPPED);
        onStopped();
        ...
    });
    previewStream.play();

14. Stop stream publishing

stream.stop() code

Code Block
languagejs
themeRDark
    $("#publishBtn").text("Stop").off('click').click(function () {
        cvoExtension: cvo,
        stripCodecs: strippedCodecs$(this).prop('disabled', true);
        ...
    }stream.stop();
    publishStream}).publish(prop('disabled', false);

15. Receiving the event confirming successful publishing stop

StreamStatusEvent UNPUBLISHED code

Code Block
languagejs
themeRDark
    publishStream = session.createStream({
        ...
    }).on(STREAM_STATUS.UNPUBLISHED, function () {
        setStatus("#publishStatus", STREAM_STATUS.UNPUBLISHED);
        onUnpublished();
        ...
    });
    publishStream.publish();

16. Mute publisher audio

stream.muteAudio() code:

Code Block
languagejs
themeRDark
function muteAudio() {
    if (publishStream) {
        publishStream.muteAudio();
    }
}

17. Mute publisher video

stream.muteVideo() code:

Code Block
languagejs
themeRDark
function muteVideo() {
    if (publishStream) {
        publishStream.muteVideo();
    }
}

18. Show WebRTC stream publishing statistics

stream.getStats() code:

Code Block
languagejs
themeRDark
    publishStream.getStats(function (stats) {
        if (stats && stats.outboundStream) {
            if (stats.outboundStream.video) {
                showStat(stats.outboundStream.video, "outVideoStat");
                let vBitrate = (stats.outboundStream.video.bytesSent - videoBytesSent) * 8;
                if ($('#outVideoStatBitrate').length == 0) {
                    let html = "<div>Bitrate: " + "<span id='outVideoStatBitrate' style='font-weight: normal'>" + vBitrate + "</span>" + "</div>";
                    $("#outVideoStat").append(html);
                } else {
                    $('#outVideoStatBitrate').text(vBitrate);
                }
                videoBytesSent = stats.outboundStream.video.bytesSent;
                ...
            }

            if (stats.outboundStream.audio) {
                showStat(stats.outboundStream.audio, "outAudioStat");
                let aBitrate = (stats.outboundStream.audio.bytesSent - audioBytesSent) * 8;
                if ($('#outAudioStatBitrate').length == 0) {
                    let html = "<div>Bitrate: " + "<span id='outAudioStatBitrate' style='font-weight: normal'>" + aBitrate + "</span>" + "</div>";
                    $("#outAudioStat").append(html);
                } else {
                    $('#outAudioStatBitrate').text(aBitrate);
                }
                audioBytesSent = stats.outboundStream.audio.bytesSent;
            }
            ...
        }
    });
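
Since getStats() reports cumulative byte counters, the handler subtracts the previously saved values (videoBytesSent, audioBytesSent) to get a per-interval bitrate, so it is meant to be called periodically; a sketch (the one-second interval is an assumption):

Code Block
languagejs
themeRDark
    // Sketch: poll publishing statistics once per second while the stream is live
    setInterval(function () {
        if (publishStream) {
            publishStream.getStats(function (stats) {
                // update the bitrate display as shown above
            });
        }
    }, 1000);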

19. Show WebRTC stream playback statistics

stream.getStats() code:

Code Block
languagejs
themeRDark
    previewStream.getStats(function (stats) {
        if (stats && stats.inboundStream) {
            if (stats.inboundStream.video) {
                showStat(stats.inboundStream.video, "inVideoStat");
                let vBitrate = (stats.inboundStream.video.bytesReceived - videoBytesReceived) * 8;
                if ($('#inVideoStatBitrate').length == 0) {
                    let html = "<div>Bitrate: " + "<span id='inVideoStatBitrate' style='font-weight: normal'>" + vBitrate + "</span>" + "</div>";
                    $("#inVideoStat").append(html);
                } else {
                    $('#inVideoStatBitrate').text(vBitrate);
                }
                videoBytesReceived = stats.inboundStream.video.bytesReceived;
                ...
            }

            if (stats.inboundStream.audio) {
                showStat(stats.inboundStream.audio, "inAudioStat");
                let aBitrate = (stats.inboundStream.audio.bytesReceived - audioBytesReceived) * 8;
                if ($('#inAudioStatBitrate').length == 0) {
                    let html = "<div style='font-weight: bold'>Bitrate: " + "<span id='inAudioStatBitrate' style='font-weight: normal'>" + aBitrate + "</span>" + "</div>";
                    $("#inAudioStat").append(html);
                } else {
                    $('#inAudioStatBitrate').text(aBitrate);
                }
                audioBytesReceived = stats.inboundStream.audio.bytesReceived;
            }
            ...
        }
    });

20. Speech detection using ScriptProcessor interface (any browser except Chrome)

audioContext.createMediaStreamSource(), audioContext.createScriptProcessor() code

Code Block
languagejs
themeRDark
function detectSpeech(stream, level, latency) {
    var mediaStream = document.getElementById(stream.id()).srcObject;
    var source = audioContext.createMediaStreamSource(mediaStream);
    var processor = audioContext.createScriptProcessor(512);
    processor.onaudioprocess = handleAudio;
    processor.connect(audioContext.destination);
    processor.clipping = false;
    processor.lastClip = 0;
    // threshold
    processor.threshold = level || 0.10;
    processor.latency = latency || 750;

    processor.isSpeech =
        function () {
            if (!this.clipping) return false;
            if ((this.lastClip + this.latency) < window.performance.now()) this.clipping = false;
            return this.clipping;
        };

    source.connect(processor);

    // Check speech every 500 ms
    speechIntervalID = setInterval(function () {
        if (processor.isSpeech()) {
            $("#talking").css('background-color', 'green');
        } else {
            $("#talking").css('background-color', 'red');
        }
    }, 500);
}
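
As step 11 shows, the function is invoked for the playing stream; when level and latency are omitted, they fall back to the defaults of 0.10 and 750 ms:

Code Block
languagejs
themeRDark
    // Start ScriptProcessor-based speech detection with the default threshold and latency
    detectSpeech(stream);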

Audio data handler code

Code Block
languagejs
themeRDark
function handleAudio(event) {
    var buf = event.inputBuffer.getChannelData(0);
    var bufLength = buf.length;
    var x;
    for (var i = 0; i < bufLength; i++) {
        x = buf[i];
        if (Math.abs(x) >= this.threshold) {
            this.clipping = true;
            this.lastClip = window.performance.now();
        }
    }
}

21. Speech detection using incoming audio WebRTC statistics in Chrome browser

stream.getStats() code

Code Block
languagejs
themeRDark
function detectSpeechChrome(stream, level, latency) {
    statSpeechDetector.threshold = level || 0.010;
    statSpeechDetector.latency = latency || 750;
    statSpeechDetector.clipping = false;
    statSpeechDetector.lastClip = 0;

    speechIntervalID = setInterval(function() {
        stream.getStats(function(stat) {
            let audioStats = stat.inboundStream.audio;
            if(!audioStats) {
                return;
            }
            // Using audioLevel WebRTC stats parameter
            if (audioStats.audioLevel >= statSpeechDetector.threshold) {
                statSpeechDetector.clipping = true;
                statSpeechDetector.lastClip = window.performance.now();
            }
            if ((statSpeechDetector.lastClip + statSpeechDetector.latency) < window.performance.now()) {
                statSpeechDetector.clipping = false;
            }
            if (statSpeechDetector.clipping) {
                $("#talking").css('background-color', 'green');
            } else {
                $("#talking").css('background-color', 'red');
            }
        });
    },500);
}
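
Both detectors keep their timer in speechIntervalID, so detection can be stopped when playback ends; a sketch:

Code Block
languagejs
themeRDark
    // Sketch: stop speech detection, e.g. when the preview stream is stopped
    if (speechIntervalID) {
        clearInterval(speechIntervalID);
        speechIntervalID = null;
    }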