
HLS Live Stream with S3FS

Andreas H


Our live streams usually run 3-4 hours. We record the full stream to disk so viewers can seek within it (DVR), and we use adaptive bitrate (ABR) with multiple renditions. Currently we write the HLS segments to a large local disk and upload them to S3 after the stream ends, serving them via AWS CloudFront. Uploading these recordings after each live stream makes our operational processes more complex, time-consuming, and error-prone, so we would prefer a more seamless workflow that avoids the large local disks.

Has anyone successfully used S3FS or Goofys as a file system backend in production and is willing to share some experiences?

Risks that I see currently:

  1. Does direct uploading to S3 increase the latency for LL-HLS/HLS or even create problems?
  2. Can S3FS keep up with the large number of small files from LL-HLS and multiple Adaptive Bitrate renditions? – Maybe this is just a matter of tuning S3FS.
  3. Can AntMedia Server deal with S3's "eventual consistency" limitations?
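To make question 2 concrete, this is roughly the kind of mount I would be testing. The bucket name, mount point, and cache path are placeholders, and s3fs option names can vary between versions, so treat this as a sketch rather than a recommended configuration:

```shell
# Sketch of an s3fs mount tuned for many small segment files; bucket name,
# mount point, and cache path are placeholders. Check `s3fs --help` / the
# man page for your version before relying on specific option names.
s3fs my-hls-bucket /mnt/hls \
  -o use_cache=/var/cache/s3fs \
  -o parallel_count=20 \
  -o multipart_size=10 \
  -o stat_cache_expire=2
# use_cache:          local disk cache to absorb bursts of segment writes
# parallel_count:     more concurrent uploads for multiple ABR renditions
# multipart_size:     smaller multipart chunk size (MB) for short segments
# stat_cache_expire:  short TTL so frequently rewritten playlists are re-stat'd

# Goofys alternative (fewer POSIX guarantees, often faster for write-once files):
goofys --stat-cache-ttl 1s --type-cache-ttl 1s my-hls-bucket /mnt/hls
```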

An alternative could be to send the data directly to AWS MediaStore via its HTTP PUT API, using the app setting settings.hlsHttpEndpoint, if that proves to be compatible.
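If the HTTP endpoint route works, each finished segment and playlist update would be pushed with a plain PUT, along these lines (the endpoint URL and file names are made up for illustration, not taken from a real setup):

```shell
# Illustrative PUT of one HLS segment and the updated playlist to an
# HTTP ingest endpoint; the URL and file names are placeholders.
ENDPOINT="https://example-mediastore-endpoint/livestream"
curl -f -X PUT --data-binary @segment_0001.ts "$ENDPOINT/segment_0001.ts"
curl -f -X PUT --data-binary @playlist.m3u8 "$ENDPOINT/playlist.m3u8" \
     -H "Content-Type: application/vnd.apple.mpegurl"
```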


I'm eager to hear your experiences.

There have been no answers to this question yet
