I've been looking at the examples and explanations in this Ant Media article: https://antmedia.io/how-to-merge-live-stream-and-canvas-in-webrtc-easily/
We want to broadcast a canvas that has a manipulated version of the video stream drawn onto it. However, the article above no longer seems to match the current Ant Media versions and demo files.
For example, none of the demos contain "initWebRTCAdaptor(localStream);" anymore; they initialize differently. So how do you attach a canvas instead of a video stream in the current versions? See the sketch below for the part I have so far.
Does anyone have basic demo code working for this?
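For reference, here is a minimal sketch of the browser-level part, using only standard Web APIs rather than anything Ant Media specific. The canvas id `myCanvas` and the final attachment step are assumptions; how the resulting stream is actually handed to the current WebRTCAdaptor is exactly what I'm asking about.

```javascript
// Sketch: capture a manipulated camera feed from a canvas as a MediaStream.
// Assumes a <canvas id="myCanvas"> exists on the page.
const video = document.createElement('video');
const canvas = document.getElementById('myCanvas');
const ctx = canvas.getContext('2d');

navigator.mediaDevices.getUserMedia({ video: true, audio: true }).then((camStream) => {
  video.srcObject = camStream;
  video.play();

  // Repeatedly draw (and manipulate) the camera frame onto the canvas.
  function draw() {
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    // ...any pixel manipulation goes here...
    requestAnimationFrame(draw);
  }
  draw();

  // Capture the canvas as a 25 fps video track and re-attach the mic audio.
  const canvasStream = canvas.captureStream(25);
  camStream.getAudioTracks().forEach((track) => canvasStream.addTrack(track));

  // canvasStream is what should be broadcast; how to attach it to the current
  // WebRTCAdaptor (a localStream-style option? a replace-track call?) is the
  // open question here.
});
```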
1 answer to this question