SRT using Gstreamer


matmccann

Question

I am using GStreamer and want to send SRT to Ant Media Server (Enterprise Edition). The tutorial uses OBS or FFmpeg; however, we are doing real-time streaming analytics and cannot introduce such a tool, because the AI pipeline already introduces a lot of lag between ingestion from the cameras and sending to the Ant Media server (we program everything ourselves).

 

1. Are there any examples of how to send SRT to Ant Media Server with GStreamer using `gst-launch-1.0`?

2. What SRT connection modes do you support (caller, listener, rendezvous), and does Ant Media Server act as the UDP server (i.e. the SRT listener)?

 

I have struggled for quite a while now and searched through all your documentation, but it seems that most of the suggestions for ingestion involve programs like FFmpeg or OBS.
For me, I am programmatically creating GStreamer pipelines and need to configure many of the underlying properties.
- here is a simple pipeline I use to test sending SRT to the Ant Media server (dockerized):

# exports in a Linux terminal
export VID=/path/to/test.mp4   # placeholder: any local H.264 MP4 test clip
export LOCAL_IP=172.23.0.11
export LOCAL_PORT=20001
export HOST_IP=172.23.0.4
export HOST_PORT=4200
export STREAM=stream
# attempt to launch a simplified pipeline using gst-launch to test connectivity
# (before creating it programmatically)
GST_DEBUG=3 gst-launch-1.0 -v filesrc location=$VID ! qtdemux ! queue \
  ! h264parse ! avdec_h264 ! videoconvert ! x264enc tune=zerolatency \
  ! "video/x-h264, profile=high" ! mpegtsmux alignment=7 \
  ! srtsink uri="srt://$HOST_IP:$HOST_PORT/$STREAM?streamid=WebRTCAppEE/stream1" \
    latency=1000 localaddress=$LOCAL_IP localport=$LOCAL_PORT
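For reference, the URI handed to srtsink above decomposes as follows; srtsink uses only the host, port, and query string, so the extra `/$STREAM` path component is most likely ignored (a sketch; the app name and stream id are taken from the command above):

```shell
HOST_IP=172.23.0.4
HOST_PORT=4200
APP=WebRTCAppEE    # Ant Media application name (from the command above)
STREAM_ID=stream1  # broadcast stream id on the server

# srtsink reads the target and SRT socket options from the URI; latency,
# localaddress, and localport can alternatively be set as element
# properties, as the command above does.
SRT_URI="srt://${HOST_IP}:${HOST_PORT}?streamid=${APP}/${STREAM_ID}"
echo "$SRT_URI"
```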

 

- this fails with `cannot get pbkeylen` and `Connection does not exist`

 

rad_perc@c97612813b15:/inference$ GST_DEBUG=srt*:5 gst-launch-1.0 -v filesrc location=$VID ! qtdemux ! queue ! h264parse ! avdec_h264 ! videoconvert ! x264enc tune=zerolatency ! "video/x-h264, profile=high" ! mpegtsmux alignment=7 ! srtsink uri="srt://$HOST_IP:$HOST_PORT/$STREAM?streamid=WebRTCAppEE/stream1" latency=1000 localaddress=$LOCAL_IP localport=$LOCAL_PORT
0:00:00.050725205 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:186:gst_srt_object_new:<GstSRTSink@0x562f763c09f0> Starting up SRT
0:00:00.050808991 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:561:gst_srt_object_set_uri:<GstSRTSink@0x562f763c09f0> set uri to (host: 127.0.0.1, port: 7001) with 0 query strings
0:00:00.050830451 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:561:gst_srt_object_set_uri:<srtsink0> set uri to (host: 172.23.0.4, port: 4200) with 1 query strings
Setting pipeline to PAUSED ...
0:00:00.053295991 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:999:gst_srt_object_open_full:<srtsink0> Opening SRT socket with parameters: application/x-srt-params, poll-timeout=(int)-1, latency=(int)1000, mode=(GstSRTConnectionMode)GST_SRT_CONNECTION_MODE_CALLER, localaddress=(string)172.23.0.11, localport=(uint)20001;
0:00:00.053333036 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:869:gst_srt_object_connect:<srtsink0> Binding to 172.23.0.11 (port: 20001)
Pipeline is PREROLLING ...
00:06:56.040383/SRT:RcvQ:worker*E:SRT.c: HS VERSION=5 but no handshake extension found!
00:06:56.040424/SRT:RcvQ:worker*E:SRT.c: processAsyncConnectRequest: REJECT reported from HS processing, not processing further.
00:06:56.040428/SRT:RcvQ:worker*E:SRT.c: RendezvousQueue: processAsyncConnectRequest FAILED. Setting TTL as EXPIRED.
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100196764001facd9405005ba10000003001000000303c0f183196001000668ebe3cb22c0, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100196764001facd9405005ba10000003001000000303c0f183196001000668ebe3cb22c0, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100196764001facd9405005ba10000003001000000303c0f183196001000668ebe3cb22c0, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
Redistribute latency...
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100196764001facd9405005ba10000003001000000303c0f183196001000668ebe3cb22c0, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1, interlace-mode=(string)progressive, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)3.1, profile=(string)high, codec_data=(buffer)0164001fffe100196764001facd9405005ba10000003001000000303c0f183196001000668ebe3cb22c0, width=(int)1280, height=(int)720, framerate=(fraction)30/1, pixel-aspect-ratio=(fraction)1/1
/GstPipeline:pipeline0/avdec_h264:avdec_h264-0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:src: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1
/GstPipeline:pipeline0/GstVideoConvert:videoconvert0.GstPad:sink: caps = video/x-raw, format=(string)I420, width=(int)1280, height=(int)720, interlace-mode=(string)progressive, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono, pixel-aspect-ratio=(fraction)1/1, chroma-site=(string)mpeg2, colorimetry=(string)bt709, framerate=(fraction)30/1
Redistribute latency...
/GstPipeline:pipeline0/GstX264Enc:x264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3.1, profile=(string)high, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3.1, profile=(string)high, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:sink_65: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3.1, profile=(string)high, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)3.1, profile=(string)high, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-site=(string)mpeg2, multiview-mode=(string)mono, multiview-flags=(GstVideoMultiviewFlagsSet)0:ffffffff:/right-view-first/left-flipped/left-flopped/right-flipped/right-flopped/half-aspect/mixed-mono
0:00:00.087775977 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:248:gst_srt_sink_set_caps:<srtsink0> setcaps video/mpegts, systemstream=(boolean)true, packetsize=(int)188
0:00:00.087785618 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:256:gst_srt_sink_set_caps:<srtsink0> 'streamheader' field not present
0:00:00.087789673 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:285:gst_srt_sink_set_caps:<srtsink0> Collected streamheaders: 0 buffers
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
/GstPipeline:pipeline0/GstSRTSink:srtsink0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188
0:00:00.087881974 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:248:gst_srt_sink_set_caps:<srtsink0> setcaps video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamheader=(buffer)< 47400030a600ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0002b0280001c10000e041f00c050448444d5688040ffffcfc1be041f00a050848444d56ff1b443f5a3175c0 >
0:00:00.087891468 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:264:gst_srt_sink_set_caps:<srtsink0> 'streamheader' field holds array
/GstPipeline:pipeline0/MpegTsMux:mpegtsmux0.GstPad:src: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamheader=(buffer)< 47400030a600ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0002b0280001c10000e041f00c050448444d5688040ffffcfc1be041f00a050848444d56ff1b443f5a3175c0 >
0:00:00.087908432 17058 0x562f762c4c00 DEBUG                srtsink gstsrtsink.c:285:gst_srt_sink_set_caps:<srtsink0> Collected streamheaders: 2 buffers
/GstPipeline:pipeline0/GstSRTSink:srtsink0.GstPad:sink: caps = video/mpegts, systemstream=(boolean)true, packetsize=(int)188, streamheader=(buffer)< 47400030a600ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0000b00d0001c100000001e020a2c32941, 474020308b00ffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffffff0002b0280001c10000e041f00c050448444d5688040ffffcfc1be041f00a050848444d56ff1b443f5a3175c0 >
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
0:00:00.088338476 17058 0x562f762c4c00 DEBUG              srtobject gstsrtobject.c:1223:gst_srt_object_send_headers:<srtsink0> Sending 2 stream headers
New clock: GstSystemClock
0:00:00.310989454 17058 0x562f762c4c00 WARN               srtobject gstsrtobject.c:1252:gst_srt_object_send_headers:<srtsink0> error: Connection does not exist
ERROR: from element /GstPipeline:pipeline0/GstSRTSink:srtsink0: Could not write to resource.
Additional debug info:
gstsrtobject.c(1252): gst_srt_object_send_headers (): /GstPipeline:pipeline0/GstSRTSink:srtsink0:
Connection does not exist
Execution ended after 0:00:00.223051365
Setting pipeline to NULL ...
0:00:00.311523261 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:1194:gst_srt_object_wakeup:<srtsink0> waking up SRT
0:00:00.311786383 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:1194:gst_srt_object_wakeup:<srtsink0> waking up SRT
0:00:00.319534877 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:1032:gst_srt_object_close:<srtsink0> Closing SRT socket (0x1dbbcc3a)
Freeing pipeline ...
0:00:00.319668441 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:224:gst_srt_object_destroy:<srtsink0> Destroying srtobject
0:00:00.322096578 17058 0x562f763be0d0 DEBUG              srtobject gstsrtobject.c:231:gst_srt_object_destroy:<srtsink0> Cleaning up SRT
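To narrow down whether the rejection in the log above comes from the sender or from the server, one option is to point the same pipeline at a plain local SRT listener first (a debugging sketch; `srt-live-transmit` ships with the SRT tools, and the ports here are arbitrary):

```shell
# 1. Run a bare SRT listener that just forwards the payload to local UDP.
#    If the gst-launch pipeline connects here, the sender side is fine and
#    the rejection is specific to the server's handshake/streamid handling.
srt-live-transmit 'srt://:4200?mode=listener' 'udp://127.0.0.1:5000' &

# 2. Re-run the pipeline against the local listener (videotestsrc keeps the
#    test independent of the file source):
gst-launch-1.0 videotestsrc num-buffers=300 ! x264enc tune=zerolatency \
  ! mpegtsmux ! srtsink uri="srt://127.0.0.1:4200?mode=caller"
```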


As SRT ingestion and extremely low-latency WebRTC are two features offered by Ant Media, it was an easy choice of technology; however, the SRT ingestion is difficult to work out. Can I get some support on this? Has anyone else sent SRT to Ant Media with GStreamer before?

4 answers to this question

Thank you! I have found a way by doing UDP -> srt-live-transmit, but it introduces latency. Perhaps we could jump on a quick call sometime and chat about it? I'm actively developing this and willing to put time into finding a good solution that we can share with others 🙂
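If the extra relay hop is the main source of delay, the SRT latency window can be tightened on the relay itself via URI query parameters (a sketch; `latency=50` is an assumed value to experiment with, and the addresses match the pipelines described below):

```shell
# srt-live-transmit accepts SRT socket options as URI queries; lowering
# `latency` below the 120 ms default shrinks the send/receive buffering
# window at the cost of less tolerance to packet loss.
srt-live-transmit 'udp://224.224.255.255:4210' \
  'srt://172.23.0.4:4200?streamid=WebRTCAppEE/stream1&latency=50'
```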


I got a baseline solution working using srt-live-transmit. 

For demonstration purposes, I have included a simplified version of my gstreamer pipeline.  

## PIPELINE 1
- this creates a single UDP stream with multiple video substreams (6 in this diagram)
RTSP (10.0.0.5)  ---> encoding ---> mpegtsmux ---> rtpmp2tpay ---> udpsink (udp://224.224.255.255:4210)
RTSP (10.0.0.6)  ---> encoding ---> |
RTSP (10.0.0.7)  ---> encoding ---> |
RTSP (10.0.0.8)  ---> encoding ---> |
RTSP (10.0.0.9)  ---> encoding ---> |
RTSP (10.0.0.10) ---> encoding ---> |
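The diagram above can be sketched as a single gst-launch command (shown with two sources for brevity; the RTSP URLs and element choices are assumptions, not the exact production pipeline, and this assumes the sources already carry H.264 — otherwise insert a decode/re-encode step as in the original pipeline):

```shell
# mpegtsmux exposes request sink pads, so each RTSP branch links to the
# named muxer with `mux.`; the muxed TS is RTP-payloaded and multicast.
gst-launch-1.0 -v \
  mpegtsmux name=mux ! rtpmp2tpay ! udpsink host=224.224.255.255 port=4210 \
  rtspsrc location=rtsp://10.0.0.5/stream ! rtph264depay ! h264parse ! mux. \
  rtspsrc location=rtsp://10.0.0.6/stream ! rtph264depay ! h264parse ! mux.
```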


## PIPELINE 2
- This packages SRT (with multiple substreams) for AntMediaServer
./srt-live-transmit 'udp://224.224.255.255:4210' 'srt://172.23.0.4:4200?streamid=WebRTCAppEE/stream1'


Questions:
(1) I can send SRT to Ant Media and see in the UI that it has an available broadcast, but I cannot watch it if the SRT stream has multiple substreams (it works fine with a single video substream). Is there a way to send a protocol stream (SRT, RTSP) to Ant Media and view the substreams in the GUI? (please see antmedia_view.png)

(2) Can I send an SRT stream with multiple substreams to Ant Media and have the stream usable via WebRTC (i.e. can the WebRTC client select the substream it wants, or select all substreams over one websocket connection)?

(3) Can Ant Media ingest an SRT stream with multiple substreams and save it to disk?

Resources:
(A) launch.sh: read the instructions at the top of the file. This creates PIPELINE 1 and shows the command to run PIPELINE 2, which forwards the UDP stream to Ant Media.
(B) PAUSED_PLAYING.png: this shows the pipeline created with launch.sh.
(C) launch_test.sh: similar to launch.sh, but it creates a video file (output.ts) so you can inspect it and prove that creating the video substreams in a single MPEG-TS container works.
(D) specs.txt: I used launch_test.sh to create output.ts, then used an FFmpeg tool (ffprobe) to get a printout of the video and its substreams. View this file to look at the substreams (Stream #0:0 --> Stream #0:5).
(E) antmedia_view.png: this shows that I can get the SRT stream to Ant Media, but I could not get it to show the video. If I send SRT with a single video substream, it works; with more than one video substream, it doesn't.
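The inspection step in (D) can be reproduced with a one-liner (a sketch; output.ts is the file produced by launch_test.sh):

```shell
# List every elementary stream in the MPEG-TS container; with six video
# substreams muxed in, this should report Stream #0:0 through Stream #0:5.
ffprobe -hide_banner output.ts
```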
 

PAUSED_PLAYING.png

antmedia_view.png

install_gstreamer.sh install_srt.sh launch.sh launch_test.sh specs.txt
