This issue was moved to a discussion.
You can continue the conversation there. Go to discussion →
Low-Latency HLS (LLHLS) has been released #766
Browsers on the same device share a single TCP connection when connecting to the same server over HTTP/2.
In LLHLS,
Amazing work! Thanks!
@getroot - Is this in the v0.13.2 release?
Hello
@IanMitchell77 No. It was released today as a pre-alpha version. It is currently available in the master branch. It will be fully tested and included in the next release.
Hi, great work! Amazing how active development is!
LLHLS should also work well on the edge. Of course, I recommend using a different HTTP cache server or CDN (like CloudFront) rather than OME as an edge for LLHLS. Please share your Server.xml and ovenmediaengine.log files.
Sure, here they are. Let me know if I can do anything else. Log: ovenmediaengine.log Config: Server_edge.txt
@heye Thanks to you I found the cause of LLHLS not working in OME Edge. I will fix this problem. Until then, please test in Origin.
@getroot congratulations and thank you for the hard work! Testing now and I see OME crashes with LLHLS added to my configuration. There is an output profile for Thumbnails in the same application that is using LLHLS and that is where the crash seems to happen:
@bchah The
I've fixed LLHLS to work in edge mode. To operate LLHLS in Edge mode, the following settings are required.
@bchah I solved this problem. Please confirm.
@getroot The crashes are fixed but the Thumbnail publisher now has a new problem:
Calls to the thumbnail URL (e.g. https://myapp.xyz:8000/live/helloWorld_preview/thumb.jpg) return a 200 response, but the contents of x.jpg are empty. I noticed a new
@bchah The problem was that the Thumbnail publisher did not work over HTTP/2. I have fixed it. Thank you very much!
@getroot It works very well now, amazing. Thank you for the fast fix!
@getroot just confirming a few tests for you: Thumbnail publisher fixed on HTTP/2 ✅ When trying to load LL-HLS with SRT Provider, I see this in the OME log:
I hope this provides some good clues 🥇
Thanks for the thorough testing!
@getroot Here you go! One more clue I found: I could not reproduce the above error in the log with the SRT Provider unless SignedPolicy was also enabled. Without SignedPolicy there is no error in the logs (but the symptom of failed playback is the same).
For our use case, we can sometimes have multiple streams/players on one page. Does this mean the log will only show one connection per device, no matter how many streams they watch? That seems like it could be a significant performance improvement over HLS with multiple requests per player.
To be precise, that means watching the same stream on multiple players (browsers) on one device. That is, even if sessions share the same TCP connection, if each session plays a different stream, OME can distinguish them individually. That's how it's implemented now, but it doesn't mean it's forever impossible. I have a few ideas and will experiment with them.
Is it possible to set up the LLHLS port without TLS and use a CDN/load balancer for TLS support, and still get the benefits of LLHLS?
https://datatracker.ietf.org/doc/html/draft-pantos-hls-rfc8216bis I misunderstood this document as saying that using HTTP/2 in LLHLS is mandatory, but it is only a recommendation. I have modified some code and tested LLHLS on HTTP/1.1, and confirmed that it works well. When playing LLHLS over HTTP/1.1, Chrome (hls.js) opens 4 connections at the same time, so HTTP/2 has a significant performance advantage. However, there will be cases where HTTP/1.1 must be used; several CDNs, including CloudFront, still only support HTTP/1.1 between the CDN and Origin. I'll let you know when this is done. @Adam1901 After this is done, it should work fine on non-TLS ports as well. Of course, it will work with HTTP/1.1.
[Update]
Hi @getroot! Working great on my end if I disable CORS in Chrome (
@cwpenhale Try changing your application name (app -> edge) in your edge_conf.xml file
@getroot this did it!
@getroot You can see the difference in the picture. Thank you!
@danruser Great! You also have the same problem that I fixed today.
I changed the default chunk duration (part segment) to 0.5 seconds. In many of my tests, I found 0.5 to be the most stable; at this setting, the latency is about 2-3 seconds. 0.2 is excessive, as it results in 20 requests per second. Of course, if you have a very good network and cost is not a concern, 0.2 will work just fine.
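In Server.xml this corresponds to the LLHLS publisher settings. A hedged fragment follows; the element names match OME's documented LLHLS publisher options, but the segment values here are illustrative assumptions, not a tested configuration:

```xml
<Publishers>
  <LLHLS>
    <!-- Part (chunk) duration in seconds; 0.5 proved the most stable in testing. -->
    <ChunkDuration>0.5</ChunkDuration>
    <!-- Illustrative values for full segments. -->
    <SegmentDuration>6</SegmentDuration>
    <SegmentCount>10</SegmentCount>
  </LLHLS>
</Publishers>
```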
I noticed while testing today that hls.js doesn't use #EXT-X-PRELOAD-HINT (OvenPlayer uses hls.js). This is not yet implemented in hls.js and is tracked in video-dev/hls.js#3988. Once it is implemented, latency will be reduced by one part segment duration. THEO Player implements #EXT-X-PRELOAD-HINT, so you can experience lower latency there: https://www.theoplayer.com/ll-hls-test-page I hope hls.js will be updated soon.
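For context, in draft-pantos-hls-rfc8216bis the preload hint advertises the next partial segment before it is fully produced, so the player can issue its request one part early. A hedged sketch of the relevant media playlist lines (URIs and sequence numbers are made up):

```
#EXT-X-PART:DURATION=0.5,URI="chunklist_part_100_0.m4s",INDEPENDENT=YES
#EXT-X-PART:DURATION=0.5,URI="chunklist_part_100_1.m4s"
#EXT-X-PRELOAD-HINT:TYPE=PART,URI="chunklist_part_100_2.m4s"
```

A player that honors the hint saves roughly one part duration of latency versus one that waits for the part to appear as a regular #EXT-X-PART entry.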
@JIEgOKOJI I recently fixed an issue causing TLS to crash. Could you please check if your problem is related to this?
Hey @getroot! I'm continuing to look at the architecture for LLHLS with OME, and I'm wondering if I get any benefit from using a caching proxy like NGINX with the Origin(1) <- Concentrator(1) <- Edge(n) pattern I mentioned in #774. If I request an ABR LLHLS playlist from OME at the edge, and each edge already has an OME edge server running on it, do I gain anything from adding a caching proxy? I would imagine the answer is that I would gain viewer capacity at the expense of latency. From viewing the logs, it looks like OME already "caches" the stream it's consuming via OVT, which would mean my concentrator has a bandwidth requirement of
Here is the NGINX config snippet I'm using: https://gist.github.com/cwpenhale/2d606e2d62f0519cf04aae55f736cb23 Thanks!
In your structure, it would be effective to use an HTTP reverse proxy server instead of an OvenMediaEngine for the Edge.
Just do the LLHLS packaging on the Relay Server once and let the HTTP reverse proxy servers (Edges) fetch and deliver it. This allows the edge server to operate with lower resources (because the edge does not do LLHLS packaging), and all the edges serve the same chunks and segments, so the player can keep playing naturally when it moves to another edge. In this structure, you can also hand the role over to a CDN like CloudFront instead of your own edge. And on the Relay server, set the
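As a sketch of that edge role, a minimal NGINX reverse-proxy cache might look like the following. The upstream address, server name, certificate paths, and cache TTL are all assumptions, and this is not tuned for LL-HLS blocking playlist requests, so treat it only as a starting point:

```nginx
# Shared cache for LLHLS parts, segments, and playlists.
proxy_cache_path /var/cache/nginx/llhls levels=1:2 keys_zone=llhls:10m max_size=1g;

server {
    listen 443 ssl http2;
    server_name edge.example.com;

    # Placeholder certificate paths.
    ssl_certificate     /etc/nginx/certs/edge.crt;
    ssl_certificate_key /etc/nginx/certs/edge.key;

    location / {
        # The Relay OME serving LLHLS; CDNs and proxies commonly
        # talk HTTP/1.1 to Origin, which OME now supports.
        proxy_pass http://relay.internal:3333;
        proxy_http_version 1.1;

        proxy_cache llhls;
        # Playlists change every part, so keep the TTL very short;
        # media parts are immutable and could be cached much longer.
        proxy_cache_valid 200 1s;
    }
}
```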
Does enabling HTTP/2 come with keep-alive?
For clarity, the latest OME supports:
HTTP/1.0 Keep-Alive
Note that Keep-Alive is an unofficial specification that only exists in HTTP/1.0.
I tested the compatibility of videojs with OME and found it to work well. You can also test with videojs.
Hello. I copied the minimal configuration and I am running:

```
docker run -p 1935:1935 -p 8080:8080 -p 3333:3333 -p 3478:3478 -p 8081:8081 -p 10006-10010:10006-10010/udp -v ${PWD}/examplellhls.xml:/opt/ovenmediaengine/bin/origin_conf/Server.xml airensoft/ovenmediaengine:latest
```

but I get the error. Do you have any idea what the problem could be?
@javier171188
Thank you very much @dimiden, that was the problem and your commands solved it.
Hello, I am trying to implement LLHLS, and I'm using OvenPlayer. I tried the configuration let player = OvenPlayer.create(
I would appreciate any help or guidance you can provide in this regard, thank you.
@javier171188 Hi. Your configuration should be working properly.

```javascript
let player = OvenPlayer.create(`${remoteConferenceName}`, {
  sources: [
    {
      type: "hls",
      file: `http://localhost:8080/app/${remoteConferenceName}/llhls.m3u8`,
      label: "conference",
    },
  ],
  autoStart: true,
  autoFallback: true,
  loop: true,
});
```

The minimum requirements are:
And any questions
What do you mean by "works sometimes"? We'll also add to the manual a clear configuration for low-latency HLS streams, as you suggested. Thank you.
Thanks for your answer @SangwonOh. I hope this is only a latency problem; do you have any recommendations about which parameters I should try to adjust? I would like to add that when I comment out the bypass, codec, and bitrate for the audio in the configuration, the stream has no sound, but the video loads perfectly every time. I am not sure how to express "works sometimes" in technical language, so I will give an example. After starting the server, one user starts streaming with localInput.startStreaming( then, in three different browsers, three users start to receive the stream with let player = OvenPlayer.create( and it takes less than 30 seconds to load the video; everything works fine. After that, the viewers stop the player with player.remove(); and the streamer stops transmission with localInput.remove(); I then want to repeat the same process, but this time it takes more than a minute to load the stream. Since I never thought about a latency problem, I just closed everything assuming that something went wrong. Thanks for your time; I hope this is only a configuration issue.
@javier171188 Your system uses WebRTC input. In this case (like a conference), it is better to use WebRTC output as well. LLHLS takes some time to create the initial segments, so the stream can take more than ten seconds to start depending on your config. Also, LLHLS does not support the Opus audio sent by WebRTC, so you have to encode it as AAC in the configuration. And LLHLS has a latency of around 3 seconds, so use WebRTC playback, which is less than 1 second.
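For illustration, a hedged OvenPlayer source configuration for WebRTC playback instead of LLHLS. The signaling URL format and port follow OME defaults, but the hostname, app name, and stream name are placeholders, and OvenPlayer.create itself only runs in a browser:

```javascript
// Hypothetical WebRTC playback sources; hostname, app, and stream
// names are placeholders, and 3334 assumes OME's TLS signaling port.
const webrtcSources = [
  {
    type: "webrtc",
    // OME's WebRTC signaling URL (wss:// when TLS is enabled).
    file: "wss://ome.example.com:3334/app/stream",
    label: "webrtc-sub-second",
  },
];

// In a browser with OvenPlayer loaded, this would create the player:
// const player = OvenPlayer.create("player_div", {
//   sources: webrtcSources,
//   autoStart: true,
// });
```

Switching the source type from "hls" to "webrtc" is the only player-side change; the latency difference comes entirely from the protocol.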
@javier171188 And this is not an LLHLS issue, so please create a new issue
You can easily try OvenMediaEngine LLHLS on THEO Player's Demo page.
LLHLS is now available in the latest master branch.
LLHLS is a low-latency streaming protocol that aims at a latency of about 2 to 4 seconds, unlike WebRTC's sub-second latency. But, as you know, it is HTTP-based, so you can deploy it with an existing CDN.
LLHLS originally required HTTP/2, and since browsers only support TLS-based HTTP/2, TLSPort was essential.
[2022.05.21 edited] LLHLS performs much better over HTTP/2, so using the TLS Port is recommended. OME now also supports LLHLS over HTTP/1.1, as some CDNs only use HTTP/1.1 to connect to Origin.
LLHLS playback URLs are in the following format:
https://domain[:TLS Port]/<App Name>/<Stream Name>/llhls.m3u8
The Server.xml example below is the minimum setting for RTMP Input / LLHLS Output.
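A rough sketch of what such a minimal Server.xml can look like. The element names follow OME's documented schema, but the ports, app name, and segment values here are placeholder assumptions rather than a verified configuration:

```xml
<Server version="8">
  <Name>OvenMediaEngine</Name>
  <Bind>
    <Providers>
      <RTMP>
        <Port>1935</Port>
      </RTMP>
    </Providers>
    <Publishers>
      <LLHLS>
        <Port>8080</Port>
        <!-- TLS port for HTTP/2 playback; requires certificates in <Host>. -->
        <TLSPort>8443</TLSPort>
      </LLHLS>
    </Publishers>
  </Bind>
  <VirtualHosts>
    <VirtualHost>
      <Name>default</Name>
      <Host>
        <Names>
          <Name>*</Name>
        </Names>
      </Host>
      <Applications>
        <Application>
          <Name>app</Name>
          <Type>live</Type>
          <Providers>
            <RTMP />
          </Providers>
          <Publishers>
            <LLHLS>
              <ChunkDuration>0.5</ChunkDuration>
              <SegmentDuration>6</SegmentDuration>
              <SegmentCount>10</SegmentCount>
            </LLHLS>
          </Publishers>
        </Application>
      </Applications>
    </VirtualHost>
  </VirtualHosts>
</Server>
```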
If HTTP2 or LLHLS is set to <Enable>false</Enable> in <Modules>, it will not work, so please check this part. If <Modules> is not present in Server.xml, these modules are enabled by default.
You can test with the following players:
OvenPlayer: https://demo.ovenplayer.com
THEO Player: https://www.theoplayer.com/ll-hls-test-page
Mac / iOS Safari browser: older versions do not support LLHLS, so please use the latest version of Safari.
And many other players support LLHLS.
Thanks a lot for your feedback!