I'm trying to modify the Android LiveVideoBroadcaster app to stream video from a MediaProjection screen capture instead of the camera.
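Roughly, the capture side looks like this (a sketch, not my exact code: the RGBA_8888 format, the size/density values, and the convertToYv12/queueToEncoder helpers are just illustrative names for what I'm doing):

```java
import android.graphics.PixelFormat;
import android.hardware.display.DisplayManager;
import android.hardware.display.VirtualDisplay;
import android.media.Image;
import android.media.ImageReader;
import android.media.projection.MediaProjection;
import android.os.Handler;

// Render the MediaProjection into an ImageReader so frames arrive as Images.
void startCapture(MediaProjection mediaProjection, int width, int height,
                  int densityDpi, Handler handler) {
    ImageReader imageReader = ImageReader.newInstance(
            width, height, PixelFormat.RGBA_8888, 2);

    VirtualDisplay display = mediaProjection.createVirtualDisplay(
            "ScreenCapture", width, height, densityDpi,
            DisplayManager.VIRTUAL_DISPLAY_FLAG_AUTO_MIRROR,
            imageReader.getSurface(), null, null);

    imageReader.setOnImageAvailableListener(reader -> {
        Image image = reader.acquireLatestImage();
        if (image == null) return;
        // Convert the Image planes to YV12, then hand the bytes to the encoder.
        byte[] yv12 = convertToYv12(image);        // my own conversion helper
        queueToEncoder(yv12, image.getTimestamp()); // my own queueing helper
        image.close();
    }, handler);
}
```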
I'm converting the returned Images to YV12 and using MediaCodec to encode with the same settings as the Ant Media camera example.
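The encoder is configured roughly like this; the bitrate/fps/keyframe values mirror the camera path, and COLOR_FormatYUV420Planar is my assumption for accepting YV12-style buffers (not every device encoder advertises it):

```java
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import java.io.IOException;

// H.264 encoder set up for byte-buffer input (values illustrative).
MediaCodec createScreenEncoder(int width, int height) throws IOException {
    MediaFormat format = MediaFormat.createVideoFormat(
            MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 2_000_000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 2);

    MediaCodec encoder = MediaCodec.createEncoderByType(
            MediaFormat.MIMETYPE_VIDEO_AVC);
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    return encoder;
}
```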
Connecting with RTMPMuxer and sending frames via writeVideo, I can see that the broadcast is registered on my Linux server. However, unlike the original camera version, which shows a green icon with the broadcasting message in LiveApp, the status just reads "Broadcasting" without the green icon. If I click "Play" I get the screen "Stream will start playing automatically when it is live", and VLC registers as a viewer but never plays anything. The same happens if I use the camera version of the Android app to try to play the stream: it never starts.
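For reference, the sending side is roughly this drain loop. The RTMPMuxer here is the one already used by LiveVideoBroadcaster, and the signatures/timestamp units are as I'm using them in my project, so treat this as a sketch rather than the exact code:

```java
import android.media.MediaCodec;
import java.nio.ByteBuffer;

private volatile boolean streaming = true;  // toggled when the broadcast stops

// Drain encoder output and pass each encoded buffer to RTMPMuxer.writeVideo().
// Codec-config buffers (BUFFER_FLAG_CODEC_CONFIG) also come through this loop.
void drainEncoder(MediaCodec encoder, RTMPMuxer muxer) {
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    while (streaming) {
        int index = encoder.dequeueOutputBuffer(info, 10_000);
        if (index < 0) continue;  // no output yet, or format/buffers changed

        ByteBuffer buffer = encoder.getOutputBuffer(index);
        byte[] data = new byte[info.size];
        buffer.position(info.offset);
        buffer.get(data, 0, info.size);

        int timestampMs = (int) (info.presentationTimeUs / 1000);
        muxer.writeVideo(data, 0, data.length, timestampMs);

        encoder.releaseOutputBuffer(index, false);
    }
}
```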
I can write the MediaProjection bytes to an .mp4 file successfully.
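That local test path is just MediaMuxer fed from the same encoder output, roughly like this (names illustrative):

```java
import android.media.MediaFormat;
import android.media.MediaMuxer;
import java.io.IOException;

// Local .mp4 path that works: the same encoder output written with MediaMuxer.
MediaMuxer startMp4Muxer(String outputPath, MediaFormat encoderOutputFormat)
        throws IOException {
    // encoderOutputFormat is the format the encoder reports via
    // INFO_OUTPUT_FORMAT_CHANGED before the first encoded frame.
    MediaMuxer muxer = new MediaMuxer(
            outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    int videoTrack = muxer.addTrack(encoderOutputFormat);
    muxer.start();
    // Later, for each drained buffer:
    //   muxer.writeSampleData(videoTrack, outputBuffer, bufferInfo);
    // and on stop: muxer.stop(); muxer.release();
    return muxer;
}
```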
Are there certain configurations needed to broadcast video from a source other than the camera (which works fine)?
I can't use WebRTC, as this app will potentially have hundreds of people viewing a stream at a time.
I'm quite keen to get this working, so if I need to clarify anything for someone to share some thoughts and ideas, please let me know.