Is it possible to build a few-to-many live streaming system with Ant?


Yokemuratakeshi

Question

Hi, there.

I'm planning to build a service in which a few people (up to 4) chat over voice while their conversation is broadcast to a large audience.
Very low latency is required for the streams between the "few people", because their conversation should feel natural; in other words, there shouldn't be any unnatural gaps in the conversation.
On the other hand, a bit more latency is acceptable for delivery to the audience, maybe around 10-15 seconds at worst, though faster is of course better.

Can this kind of system be built with Ant, or with some combination that includes Ant?

Any info is appreciated.

Thanks!





14 answers to this question


Hi yokemuratakeshi

Thank you for your interest. We have conference.html for this case. You can try it on our test page here: https://test2.antmedia.io:5443/WebRTCAppEE/conference.html
You can also test 1-to-1 WebRTC Ultra Low Latency here: https://antmedia.io/livedemo/

I'm hoping that I was able to answer your queries, yokemuratakeshi.
Please feel free to get back to us if you still have any questions.
Selim.

Hi Yokemuratakeshi,

We can provide a solution for your case in these two ways:

1- The actors use WebRTC with sub-second latency (about 0.5 seconds on average) among each other. Listeners receive the stream through HLS with 8-10 seconds of latency. Scaling listeners with HLS is much easier and more cost effective.

2- The actors use WebRTC with sub-second latency (about 0.5 seconds on average) among each other. Listeners can have about 0.5 seconds of latency as well via WebRTC. In this model, scaling is not as easy and it costs more.


Ant Media Server's WebRTC solution uses an SFU (Selective Forwarding Unit) structure, so each stream is delivered to listeners and actors as a separate stream; it is not mixed on the server side. In this solution, listeners and actors can mute any actor and listen to the others.
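For illustration, here is a minimal sketch of the actor (publisher) side using the JavaScript WebRTC adaptor that the sample pages (conference.html etc.) load. The option names, websocket URL, and stream id below are assumptions; check the HTML samples bundled with your server version for the exact API.

  // Minimal publisher sketch (assumed API; requires webrtc_adaptor.js from the server's
  // sample app and a <video id="localVideo"> element on the page).
  var webRTCAdaptor = new WebRTCAdaptor({
      websocket_url: "wss://your-server.example.com:5443/WebRTCAppEE/websocket", // placeholder host
      mediaConstraints: { video: true, audio: true },
      localVideoId: "localVideo",
      callback: function (info) {
          if (info === "initialized") {
              webRTCAdaptor.publish("actor1"); // stream id chosen by your application
          }
      },
      callbackError: function (error) {
          console.error("WebRTC error:", error);
      }
  });

In pattern 1, listeners would then simply play the corresponding HLS URL (for example https://your-server.example.com:5443/WebRTCAppEE/streams/actor1.m3u8) in any HLS-capable player.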

Ref:


I hope I was able to answer your queries. If you have any questions, please feel free to ask us. Thanks.

Best Regards,
Selim



[Figure: WebRTC SFU architecture diagram (webrtc-sfu.png)]

Seems like pattern 1 works better for me.

I want to test the setup, but patterns 1 and 2 both need the Enterprise edition to set up the SFU, correct?
If that's the case, I will need a trial version or a demo with this specific structure so that I can see if it works as expected.

Thanks!
----
  Takeshi Yokemura

Selim,

Let me ask a very basic question.
How can I set up the few-to-many live streaming?

- Edit the server app source and build?
- Edit a configuration file?
- Just check some options on Dashboard?
- Full automatic, just connect from the clients with certain options?

I haven't been able to find a doc, as there's no direct reference to terms like "few-to-many" or "SFU".
If there is a doc that covers exactly this, please give me a pointer to it.

Thanks!

Hi yokemuratakeshi,

- Edit the server app source and build?

You can package and build the project using the links below; you can fork it and build it on your side.


You can also customize your application by following the tutorial links below.

You can edit the default pages shipped with Ant Media Server or create your own. For example, for https://test2.antmedia.io:5443/WebRTCAppEE/index.html you can change everything in the HTML pages under serverRootFolder/webapps/application(LiveApp etc.)/index.html and so on.

- Edit a configuration file? & - Just check some options on the Dashboard? (same answer)
The application's configuration file is serverRootFolder/webapps/application(LiveApp etc.)/WEB-INF/red5-web.properties. You can also change these settings in the Dashboard / Application / Settings section.
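For illustration, a red5-web.properties file typically contains flag-style entries like the ones below. The exact key names are assumptions and can differ between server versions, so verify them against your own file or the Dashboard settings page.

  # Example entries in an application's red5-web.properties (key names assumed)
  settings.webRTCEnabled=true
  settings.hlsMuxingEnabled=true
  settings.hlsTime=2
  settings.hlsListSize=5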

- Fully automatic, just connect from the clients with certain options?
Could you give more detail about this question? Do you mean viewers or publishers by "clients"?
You can build some controls on the player HTML side. Please check our live demo page at https://antmedia.io/livedemo/

- I haven't been able to find a doc, as there's no direct reference to terms like "few-to-many" or "SFU".
If there is a doc that covers exactly this, please give me a pointer to it.

Our default structure is SFU mode; you don't have to activate anything.
Please review our blog posts on SFU mode:


I hope I could help you.

Best Regards,
Selim.

Selim, 

I'm sorry my previous question was confusing.
I just wanted to know:

- How to set up the N-to-N conversation (this can be done with the "conference" example, correct?)
- From the client side, how to grab the HLS stream which contains all the voices from the N speakers?

My previous question was just asking how deep I need to go to do this setup.

Hi yokemuratakeshi,

1- conference.html uses WebRTC -> WebRTC. Yes, conference.html is used for the N-to-N structure. You could also add credential checks on another page (e.g. a PHP page) to control access to the publishers' page.

2- I think you should have two pages: one for publishers and one for viewers.

Publishers use conference.html and publish with WebRTC.

Viewers use another page that has N players, e.g. Flowplayer. You can review our https://test2.antmedia.io:5443/WebRTCAppEE/play.html page; it plays one HLS stream in Flowplayer. You should add a player for each of your N streams.

For example, say you have 4 WebRTC publishers.
Each WebRTC broadcaster's URL looks like https://test2.antmedia.io:5443/LiveApp/streams/streamId.m3u8

You can add 4 Flowplayer instances to the viewer HTML page you create; see the sketch below.
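As a rough sketch of that viewer page, the snippet below plays 4 HLS streams with hls.js instead of Flowplayer (any HLS-capable player works the same way); the host name, application name, and stream ids are placeholders.

  <!-- Hypothetical viewer page: one <video> element per actor's HLS stream -->
  <video id="actor1" controls></video>
  <video id="actor2" controls></video>
  <video id="actor3" controls></video>
  <video id="actor4" controls></video>
  <script src="https://cdn.jsdelivr.net/npm/hls.js@latest"></script>
  <script>
    var streamIds = ["actor1", "actor2", "actor3", "actor4"]; // placeholder stream ids
    streamIds.forEach(function (id) {
      var video = document.getElementById(id);
      var url = "https://your-server.example.com:5443/WebRTCAppEE/streams/" + id + ".m3u8";
      if (Hls.isSupported()) {
        var hls = new Hls();            // hls.js player instance for this stream
        hls.loadSource(url);
        hls.attachMedia(video);
      } else if (video.canPlayType("application/vnd.apple.mpegurl")) {
        video.src = url;                // Safari plays HLS natively
      }
    });
  </script>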

I hope I could help you.

Best Regards,
Selim.

Selim,
Thanks for the detail!

The audience-side app should explicitly open N HLS streams, correct?

If that's the case, is the synchronization of those N streams guaranteed or not?
In other words, can this kind of thing happen: the voices from Actor A and Actor B overlap for Audience X but not for Audience Y?

Thanks!

Hi yokemuratakeshi,

The audience-side app should explicitly open N HLS streams, correct?

Yes, correct

If that's the case, is the synchronization of those N streams guaranteed or not?

There may be some minor issues with synchronization.

The delay will be similar for all end users, because all of the HLS resources pass through the same network route.

To make it more deterministic, serving the HLS broadcast resources over a CDN and setting the HLS segment time parameter to its minimum value (1 second) can help keep the delay equal for each viewer.
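For example, if your version exposes the segment duration through the red5-web.properties file mentioned earlier (key name assumed), the 1-second setting would look like this; it can also be changed from the Dashboard settings.

  # Assumed key name for the HLS segment duration, in seconds
  settings.hlsTime=1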

If you want premium support for this use case, please contact us at contact@antmedia.io.

I hope I could help you.

Best Regards,
Selim