Scalable, Ultra Low Latency and Adaptive WebRTC Streaming (antmedia.io)
43 points by kerrarbone on Oct 4, 2020 | hide | past | favorite | 24 comments


Looking at the GitHub repository, this appears to be a fork of the Red5 media server. It would be interesting to know why it was forked instead of contributing upstream. At first glance, Red5 still seems to be actively maintained.

- https://github.com/ant-media/ant-media-server

- https://github.com/Red5/red5-server


This is a different product from Red5. There are many additions on top of Red5.

The community edition adds MP4 recording, IP camera support, HLS support, WebRTC ingestion, a web panel, etc.

The enterprise features are totally different from Red5. There is not a single line of Red5 source code in the enterprise features of the project; everything is implemented from scratch.


Related: Low-Latency HLS (an extension of HLS, invented by Apple) is now part of the IETF HLS specification.

[1] https://www.wowza.com/blog/ietf-incorporates-low-latency-hls... [2] https://www.wowza.com/blog/apple-low-latency-hls


I was very surprised to hear that Apple actually acted upon some of the community feedback. In particular, Apple's LL-HLS was originally supposed to rely on HTTP/2 Push (leading to issues with most CDNs), while the community solution makes use of HTTP/1.1 chunked transfer.

- https://github.com/video-dev/hlsjs-rfcs/blob/lhls-spec/propo...

- https://developer.apple.com/videos/play/wwdc2020/10228/
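
To illustrate the chunked-transfer side, here's a minimal Python sketch of an origin sending out a still-growing segment over HTTP/1.1 chunked transfer encoding. The segment pieces, timing and port are made-up placeholders, not anything from the LHLS proposal or Apple's spec:

    # Serve a segment piece by piece as the encoder produces it, instead of
    # waiting for the whole segment to finish; this is the HTTP/1.1 chunked
    # transfer mechanism the community proposal builds on.
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import time

    class ChunkedSegmentHandler(BaseHTTPRequestHandler):
        protocol_version = "HTTP/1.1"  # chunked transfer requires HTTP/1.1

        def do_GET(self):
            self.send_response(200)
            self.send_header("Content-Type", "video/mp4")
            self.send_header("Transfer-Encoding", "chunked")
            self.end_headers()
            # Placeholder "fragments"; a real origin would flush each
            # moof+mdat pair as soon as the packager emits it.
            for piece in (b"fragment-1", b"fragment-2", b"fragment-3"):
                self.wfile.write(f"{len(piece):X}\r\n".encode() + piece + b"\r\n")
                self.wfile.flush()
                time.sleep(0.2)  # stand-in for encode/package time per piece
            self.wfile.write(b"0\r\n\r\n")  # terminating chunk

    if __name__ == "__main__":
        HTTPServer(("127.0.0.1", 8080), ChunkedSegmentHandler).serve_forever()

The point is that the player can start downloading a segment while it is still being produced, which is where the seconds get saved without needing HTTP/2 Push.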


CMAF support is coming in Ant Media Server v2.2. It supports both HLS and DASH in a low-latency way.


Does the nginx RTMP module support LHLS already?


Hasn't nginx-rtmp been unmaintained for several years?


It seems that way. I hadn't noticed. Is there some alternative that's being maintained?


I built a project for my company a little over a year ago, and it's live now. It's a recruitment tool where candidates fill out a questionnaire with several questions of three types: multiple choice, free text, and video recording.

When searching for a tool to record the videos, I was very happy to find this open source server, so I wouldn't be locked into some service provider. The documentation is quite good and I had no difficulty integrating it into our project, despite being mostly a front-end and NodeJS developer.

Now that the project is live, we are having a few problems with some users. The videos our users record are stored and watched later by specialists; we don't watch them live. But after a user starts and stops recording, sometimes the .mp4 video file is not found on the server. On the plus side, Ant Media has an admin page with CPU and memory usage, and it also reports when there's been a server crash.

It asks for my email address, the support service receives the log files, and they reply at most a few days later. We're in contact right now, and I'm sending them all the information I can to help resolve this issue. Of course we tested the integration ourselves, with 5 concurrent users recording, but not a single one of our tests has failed the way our users are reporting.


Only superficially familiar with WebRTC - what does this thing do to get the claimed 'low latency'?


Quite simply, this isn't low latency.

The way you drop latency is by reducing or eliminating buffers, that's pretty much it.

There's a trade-off with H.264 and HEVC/H.265, as with most video codecs: the larger the buffer you have, the more delta frames you can fit between key frames.

This means lower bandwidth and better compression.

So it's a constant struggle between compression and latency.
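
A back-of-the-envelope illustration of how those buffers and key-frame spacing add up; all the numbers here are made-up assumptions, not anything from Ant Media or the WebRTC stack:

    # Toy latency budget: GOP length, encoder look-ahead and the player's
    # jitter buffer all translate directly into delay.
    fps = 30
    gop_frames = 60             # key frame every 2 s: better compression
    encoder_lookahead = 8       # frames the encoder holds for rate control
    jitter_buffer_s = 0.2       # seconds the player buffers against jitter

    join_wait = gop_frames / fps / 2        # a new viewer waits ~half a GOP for a key frame
    encode_delay = encoder_lookahead / fps  # frames sitting inside the encoder
    total = join_wait + encode_delay + jitter_buffer_s
    print(f"~{total * 1000:.0f} ms before any network time")  # ~1467 ms

    # Shrinking the GOP and look-ahead cuts the delay, but costs bitrate and
    # quality -- the compression-vs-latency struggle described above.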

The lowest latency that WebRTC can get is about 4 ms one way, 8 ms round trip, at least based on the project I started for real-time streaming: https://github.com/3DStreamingToolkit/3DStreamingToolkit

We proved you could do round trip times in the real world from cloud to customer in under 25ms.

But it comes with a lot of tradeoffs and cost and complexity to make it work.

Google Stadia fundamentally uses WebRTC under the covers, although they use QUIC instead of TCP/ICE.

If you ripped out all of the congestion management from the video engine in webrtc, you'd reduce another several ms roundtrip at the cost of basically no network resilience.


I'm interested in this too, and why the "low latency" mode is enabled in everything but the community version.


The same reason they ask for monthly fees for a self-hosted solution: they really want to be paid.


I'd assumed that, so to be more specific I'm wondering if there are any technical or patent-related reasons why this is restricted to paid/commercial editions.


I don’t consider 500ms “low latency”. You should be below 250ms before you start touting it as a key selling point.


Live streaming video usually has a latency of 6 to 30 seconds, so in that context anything below a second is very low latency

Edit: the topic of my bachelor's thesis included low latency live streaming, so I hope I have at least some knowledge :)


This application is for real-time communication more so than live streaming.


You are extremely lucky if you get a contemporary online livestream under 10 seconds; AFAIK this includes all of YouTube and the vast majority of HLS-based streams, which often have 30 seconds of latency or worse.

Getting buffers down to 500ms is already pretty insane, assuming reliability doesn't suffer. 250ms is well within the envelope of a single drop on a 3G network killing playback.
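
For reference, a rough calculation of where those tens of seconds come from in plain HLS; the segment length, buffer depth and other numbers are just typical assumptions:

    # Classic HLS latency budget with illustrative defaults.
    segment_duration_s = 6.0   # common HLS segment length
    buffered_segments = 3      # players typically buffer ~3 segments before playing
    encode_package_s = 1.0     # encoder + packager delay
    cdn_network_s = 1.0        # origin -> CDN -> client

    latency = segment_duration_s * buffered_segments + encode_package_s + cdn_network_s
    print(f"~{latency:.0f} s behind live")  # ~20 s, i.e. the usual 6-30 s ballpark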


There are plenty of Twitch streams that have latency under 2 seconds in low latency mode. You have to watch the source stream and the streamer can't use larger buffers, but it has been done for well over a year. If a streamer has the chat window on screen, you can type something in and see it on stream within two seconds. I have seen private Mixer streams with 2-second latency as well.


According to many sources, sub-second is considered to be ultra low latency. 500ms is good for now but 250ms would be great.


WebRTC is low latency by itself; otherwise you couldn't have a video call. The challenge is to serve one WebRTC stream to 1,000,000 viewers while maintaining low latency.
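
A quick sketch of why that fan-out is the hard part (the per-node capacity below is a made-up assumption): one origin can't hold a million peer connections, so you end up with a relay/SFU tree, and every tier adds another hop of latency.

    # How many relay nodes and tiers a naive fan-out tree would need.
    viewers = 1_000_000
    streams_per_node = 1_000   # assumed forwarding capacity of one SFU/edge node

    edge_nodes = -(-viewers // streams_per_node)  # ceiling division -> 1000 edges
    tiers, reach = 0, 1
    while reach < viewers:
        reach *= streams_per_node
        tiers += 1
    print(edge_nodes, tiers)   # 1000 edge nodes in 2 relay tiers, each tier one more hop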


I always wondered what optimization techniques Zoom applies to provide a smooth experience. It’s one of the best live streaming products I have used


Half a SECOND does not sound like “low” latency, much less “ultra low”...


As far as I know, sub-second is considered ultra low latency for now... 250ms would be great.



