
Scaling WebRTC Conference Call


vardhaman surana

Question

It is observed that the receive bandwidth increases as more and more users join the call.

This happens even if the new users who join have their camera and mic off, and it causes call drops for participants who have low bandwidth.

Expected Behaviour: the receive bandwidth should stay low if all users have their mic and camera turned off.
Actual Behaviour: in a 7-user conference call, the minimum receive bandwidth is 600 kbps even when all users have their camera and mic off, i.e. about 100 kbps per publisher.

Possible reason: the JS SDK mutes by setting track.enabled = false, which does not stop the bandwidth usage.
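
For context, a minimal sketch of track.enabled = false versus actually stopping and detaching the tracks (assuming direct access to localStream and the underlying RTCPeerConnection, which the SDK normally manages internally):

// Mute the way the JS SDK does it: the tracks output black frames / silence,
// but the RTP senders stay active, so some bandwidth is still consumed.
localStream.getTracks().forEach(track => { track.enabled = false; });

// Alternative: stop the tracks and detach them from their senders,
// so nothing is transmitted at all for those tracks.
peerConnection.getSenders().forEach(sender => {
    if (sender.track) {
        sender.track.stop();        // release the camera / microphone
        sender.replaceTrack(null);  // returns a promise; stops sending on this sender
    }
});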

What can be done to reduce this bandwidth usage so that more participants can join the call?


7 answers to this question


I tried the video-track-null approach but it created some problems (maybe I did it the wrong way). Then I tried stopping the tracks (track.stop()) and, when re-enabling, adding a new track (obtained via getUserMedia) to the video sender (using replaceTrack). This worked well, but there was one problem on the subscriber side:

If the subscriber starts playing the stream while both the video track and the audio track of the publisher stream are stopped, then the subscriber cannot hear the publisher after the publisher unmutes their local mic. If the publisher starts the video track once and then turns it off again, everything works fine after that.
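
For reference, a rough sketch of the re-enable path described above (videoSender is assumed to be a saved reference to the RTCRtpSender that originally carried the camera track):

// After track.stop(), re-acquire the camera and swap the fresh track into
// the existing sender; replaceTrack() does not require renegotiation.
async function resumeVideo(videoSender) {
    const stream = await navigator.mediaDevices.getUserMedia({ video: true });
    await videoSender.replaceTrack(stream.getVideoTracks()[0]);
}

// The audio sender needs the same treatment with { audio: true } when the
// microphone is re-enabled.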

Maybe I am doing something wrong.

Can you please make the desired changes in the JS SDK and post an update here in this conversation? I am using an Ant Media cluster on AWS.

Apart from the changes on the client side, I am also curious about one thing: can Ant Media Server be programmed so that it does not send data to a subscriber when a publisher's mic and camera are both turned off? It would be a nice feature to add.



Hi Maans,

I tried your point. It is correct that the traffic doesn't decrease to zero, but it almost does; see the image below. The spikes in bits/s happen when the microphone and camera are turned on.

To be honest, I am not sure if we can decrease this to zero; however, I don't think this much usage will be problematic. I will still try removing tracks in a test environment to see whether that can bring the sender traffic to zero. Something can probably be adjusted, but I assume it won't make a big difference.

[Screenshot from 2021-01-19: sender bitrate graph; the bits/s spikes appear only while the camera and microphone are turned on]
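
For reference, the track-removal test mentioned above would look roughly like this on the client side (a sketch assuming direct access to the underlying RTCPeerConnection, which the SDK normally manages internally):

async function removeAllSenders(peerConnection) {
    // Unlike replaceTrack(null), removeTrack() changes the SDP, so the
    // updated offer has to be renegotiated with the server afterwards.
    peerConnection.getSenders().forEach(sender => peerConnection.removeTrack(sender));
    const offer = await peerConnection.createOffer();
    await peerConnection.setLocalDescription(offer);
    // ...send the updated SDP to the server over the signalling channel...
}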



For users with good bandwidth, say 3-4 Mbps, I accept that it is not an issue. But for people with bandwidth around 2 Mbps there is a problem.

I inspected the traffic and found that about 100 kbps is received per publisher even when the camera and mic are off.

So if a client with 2 Mbps wants to be part of a 10-person conference call, about 900 kbps will be used even when everyone has their camera and mic off (9 other publishers x 100 kbps). And if a bandwidth fluctuation causes the available bandwidth to drop below 900 kbps, they will drop out of the call.
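
As a quick sanity check of those numbers (the ~100 kbps per publisher is the figure observed above, not a fixed constant):

// Baseline receive bandwidth for one subscriber in an N-person call
// where everyone else is publishing with camera and mic off.
const perPublisherKbps = 100;   // observed above while camera and mic are off
const participants = 10;
const baselineKbps = (participants - 1) * perPublisherKbps;
console.log("Baseline receive bandwidth: " + baselineKbps + " kbps"); // 900 kbps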

If this could be improved, then in my conference solution I would be able to host meetings of around 10 participants even for clients with low bandwidth.

 



Hi,

For a publishing scenario where the camera and mic can be opened and closed separately, our current solution is the one that makes the most sense, since we need to keep the publishing session alive if the user closes only their camera or mic, which of course means continuing to send data, as you might guess. Stopping the camera track has no visible effect on the bandwidth. However, I can suggest a pretty good solution for the case where the user has disabled both their camera and microphone.

What I can offer for the case where a user disables both their camera and microphone is to simply call webRTCAdaptor.stop(publishStreamId) to end their publishing session. You can keep track of their mic and camera state (or maybe add a new button, it is your choice), and once both are turned off you can stop publishing, which effectively switches the user to playOnly mode: sending no data to the server, only receiving. Then you can call webRTCAdaptor.publish() again when they want to publish a stream. For example, I tried stopping the publish when the user turns the camera off (beware that this is just an example; done like this, it will also stop the microphone):

function turnOffLocalCamera() {
    webRTCAdaptor.turnOffLocalCamera();        // stop sending camera frames
    isCameraOff = true;
    handleCameraButtons();                     // update the sample page UI
    sendNotificationEvent("CAM_TURNED_OFF");   // notify the other participants
    webRTCAdaptor.stop(publishStreamId);       // stop publishing entirely (also stops the mic)
}
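
For completeness, a sketch of the opposite direction when the user turns the camera back on, assuming the SDK's turnOnLocalCamera() counterpart and the same sample-page helpers (handleCameraButtons(), sendNotificationEvent(), a CAM_TURNED_ON event mirroring the one above); the exact webRTCAdaptor.publish() parameters (token etc.) depend on the SDK version:

function turnOnLocalCamera() {
    webRTCAdaptor.turnOnLocalCamera();
    isCameraOff = false;
    handleCameraButtons();
    sendNotificationEvent("CAM_TURNED_ON");
    // Publishing was stopped while both camera and mic were off, so start
    // it again with the same stream id.
    webRTCAdaptor.publish(publishStreamId);
}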

Stopping the publish like this completely shuts down the sender traffic, but the user will continue to receive unless you call webRTCAdaptor.leaveFromRoom(). Don't forget that you also need to adjust the publish_finished callback handling, as in this change: https://github.com/ant-media/StreamApp/pull/130/files
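
Roughly, the idea behind that adjustment (a sketch only; the real change is in the linked pull request, and intentionallyStoppedPublish is an assumed flag set just before calling webRTCAdaptor.stop(publishStreamId)):

var intentionallyStoppedPublish = false; // set to true right before webRTCAdaptor.stop(publishStreamId)

var webRTCAdaptor = new WebRTCAdaptor({
    websocket_url: "wss://your-server:5443/WebRTCAppEE/websocket", // placeholder URL
    mediaConstraints: { video: true, audio: true },
    callback: function (info, obj) {
        if (info == "publish_finished") {
            if (intentionallyStoppedPublish) {
                // Both camera and mic were turned off on purpose: stay in the
                // room in play-only mode and only update the UI here.
            } else {
                // Previous behaviour for a real publish stop (e.g. leaving the room).
            }
        }
        // ...other events (publish_started, play_started, ...) handled as before.
    },
    callbackError: function (error) {
        console.error(error);
    }
});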

I think this will solve your problem decisively; I hope it works for you.

Regards

 



This solution will work, but it will be a bit different from other SFU conference solutions like Google Meet. Anyway, I think the current Ant Media Server is designed for live streaming and works quite well for that case.

Hopefully this issue (more of a feature request) will be addressed when the Ant Media team plans for large conference rooms (50-100 active participants at once).

Thanks, tahir, for sharing the solution and looking into the issue.

Regards


Thank you for your report and comments. I know it is a bit different, but it will be a stable solution for sure and easy to implement. Actually, the main difference from other SFU solutions is that we don't target the end user; we target streaming infrastructure. That is why the front-end pages are just sample pages, there to show that the product works and to give some insight into how to code against it. They are not meant for end users, which is why there are, of course, differences from Google Meet. Our product is really the server software; the SDKs and pages are there to support customers and to show good will. But of course we keep these kinds of requests and points in mind when developing the next versions, you can be sure of that. Customer insight and requests are shaping the product at this point.

Cheers


