Microstuttering due to wrong capture or encoding rate: 59.94Hz instead of 60Hz #2286
I also increased Sunshine's log verbosity (to debug and even verbose), but the only references to a framerate that I could find are lines like this:
I have also now found #1998, which has some superficial similarities. But that issue is for Windows with Nvidia, and there are apparently log messages indicating that Sunshine reduces its capture rate on purpose. There are no such messages here, and it's a completely different capture and encoding pipeline.
Newest moonlight logs:
BUT:
Yet, Sunshine's video stream is still at 59.94 fps instead of the expected 60.00.
This time I'm using h264 software encoding on the host, instead of VA-API h265. The resulting stream still has that (almost) 59.94 fps framerate:
I really don't get it. If my host's display were configured to refresh at 60Hz, I'd suspect that it's lying, that the true refresh rate is actually 59.94Hz, and that due to some unintended syncing while capturing, the stream ends up at 59.94Hz instead of a clean 60. But as in the previous comment, the screen currently stays at 165Hz while streaming. Is it possible that some games or emulators output their frames at 59.94Hz and some unintended(?) locking/synchronizing happens when Sunshine captures the frames? Or is Sunshine maybe misconfiguring the encoders somehow, so that they expect 59.94 fps while Sunshine is feeding them 60?
In order to rule out an unfortunate choice of streamed games/emulators, I now tested a simple stream of my desktop showing glxgears. The host screen has been manually set to 1920x1080@60Hz. On the host, glxgears runs as perfectly as possible at 60fps (it's mostly showing 59.999, 60.000 or 60.001 fps).
(This time the host config was: X11, KMS capture, AMD VA-API h265 encoding.) The difference between framerates of, say, 59.94Hz and 60Hz is such that a discrepancy of one whole frame builds up roughly every 17 seconds, i.e. approximately 4 times per minute.
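For reference, the back-of-the-envelope arithmetic behind that 17-second figure (my own calculation, not taken from any log):

$$\frac{1}{60.00\,\mathrm{Hz} - 59.94\,\mathrm{Hz}} = \frac{1}{0.06\,\mathrm{Hz}} \approx 16.7\,\mathrm{s}$$

So a 59.94 fps stream falls one full frame behind a true 60Hz source about every 17 seconds, which matches the periodic hitches described above.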
Here are the best practices for getting a smooth stream:
Before determining that there are issues with your stream, double check that the game itself is not dropping frames before you start troubleshooting. It is suggested to use RTSS or other benchmarking software (such as MangoHud on Linux) to display frametimes, so it is obvious whether the frame drops are happening on the client side or the host side. Closing this issue as it is not a Sunshine bug or issue.
Thanks for posting the suggestions. Though I'm still not convinced that there is not a capture/encoding issue lurking somewhere:
That's quite fishy in my opinion. But I do understand that the initial report may have come across as too unspecific or may have smelled of user error (which I can't exclude entirely). I won't insist unless I have something major to add. PS: the smooth-stream suggestions above would make a valuable addition to the official documentation.
I guess I am having a very similar, if not the same, issue, hence I would keep this bug open for further analysis. I posted my issue both on Reddit and the Discord server but never opened a report here on GitHub. Original issue: I've been unable to achieve a smooth video stream, which contrasts sharply with the seamless experience provided by Steam Link. Issue Summary:
Troubleshooting Steps Taken:
Technical Details:
Configuration:
Given the steps I've already taken and the configuration details provided, I'm at a loss for what to try next. Additionally, the Reddit thread shows other users in the same boat.
I did some more testing, but I'll spare you the details (tcpdump, computing timestamp differences of image bursts, some basic statistics). Not only is the average frame time slightly higher than the expected 16.667 ms (for 60Hz), but the median is longer too. (This was a stream of a regular desktop from a host in an Xorg session.) I inspected the code in kmsgrab.cpp, in particular this part here: Sunshine/src/platform/linux/kmsgrab.cpp Lines 1411 to 1425 in de97eac
I'm definitely not an expert on this, but I am wondering how precise that sleep in the capture loop actually is, since any overshoot appears to carry over into the following frames. Again, not an expert, but wouldn't it be possible to do something like computing an absolute target time for each frame and sleeping until that, so the overshoot doesn't accumulate?
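To make the suggestion a bit more concrete, here is a minimal sketch of what I mean (this is not Sunshine's actual code; `capture_loop_sketch` and `grab_and_encode_frame` are made-up names, and `requested_fps` stands for whatever framerate the client asked for):

```cpp
#include <chrono>
#include <thread>

// Sketch only: schedule each capture against an absolute deadline derived
// from a fixed start point, so that overshooting one sleep does not push
// every subsequent frame back as well.
void capture_loop_sketch(double requested_fps, bool &streaming) {
  using namespace std::chrono;
  const auto frame_period =
      duration_cast<steady_clock::duration>(duration<double>(1.0 / requested_fps));

  auto next_frame = steady_clock::now();
  while (streaming) {
    next_frame += frame_period;                 // absolute target for this frame
    std::this_thread::sleep_until(next_frame);  // late wake-ups are not carried over
    // grab_and_encode_frame();                 // placeholder for the actual capture work
  }
}
```

With `sleep_until` on an absolute schedule, a wake-up that is a few dozen microseconds late only delays that one frame; the next deadline is still computed from the original timebase, so the average rate stays at the requested 60 fps.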
@gschintgen I think it's worth trying. If you put up a PR it would be easier for people to test. |
Ok, will do.
I'm definitely on to something ;-)
The quality of the stream (frame rate, microstutters) improves both subjectively and objectively. Given the absolutely perfect 60Hz, I'm convinced that the one slight stutter was actually just host-side and streamed as-is. The PR will come over the weekend. (The 59.94 Hz NTSC frequency was just a coincidence after all.) @peperunas I'm not sure any of this touches the nvidia-related codepaths, but I'm very grateful that you got the issue reopened!
@gschintgen That's great news! I did not do anything; you are doing the heavy lifting :-) Please feel free to contact me; I would be glad to test and assist with the development. I did not have time to dig as deep as you did, but now that we have a place to focus on, I'd gladly help however possible. Great job, and thanks @ReenigneArcher for re-opening the issue!
I filed PR #2333.
You must be logged into GitHub, and then you can grab most Linux artifacts from here: https://github.com/LizardByte/Sunshine/actions/runs/8493409909#artifacts That will have everything except the AppImage and Flatpak.
I recently switched from Windows to Arch and was noticing this stuttering issue every few seconds. I tried a bunch of the things listed above that I already knew about, and it didn't help. I then tested the latest build from the PR related to this issue, with X11 capture, and it fixed all stuttering issues for me. Just thought you might find this feedback helpful.
Thank you very much for your feedback! How do you encode the captured frames? In hardware I suppose? AMD/Intel/Nvidia?
Ahh shoot, yes, I meant to mention that. I'm on Nvidia and using NVENC. I haven't played around with patching NvFBC yet, but I'll probably test that and report back if nobody else gets to it before me.
Great job @gschintgen and thanks for your contribution! I briefly tested your fork and it seems the microstutters are gone / reduced significantly. Looks great for me!
Perhaps a stupid question, but does this affect Windows as well? I've always had microstutters on my Windows 11 Pro build using an Nvidia RTX 3080. My performance overlay also shows the dreaded 59.94 fps despite using a 60Hz display on the client (Logitech G Cloud), but I haven't looked into this enough to understand what causes it.
I don't think so. The Windows capture loop (or are there multiple different ones?) seems to be this one: Sunshine/src/platform/windows/display_base.cpp Lines 159 to 186 in bb7c2d5
It is quite a bit more involved than the one on Linux, but it does seem to compute a specific theoretical target time for each frame that is to be captured. This is done specifically in order to not accumulate sleep overshoot, as the Linux loop still does. See PR #1288 (which I only found now using git blame). So in essence the present type of bug (accumulating overshoot of waiting times) was fixed for Windows about 8 months ago. Then again, I'm quite new to this entire codebase and my C++ is rusty to say the least.

In my experience the on-screen latency stats fluctuate a bit too much to properly distinguish e.g. 59.94 from 60.00Hz. The first step would be to ensure that you're not inadvertently synchronizing to 59.94Hz on your host (e.g. vsync to a display at that refresh rate). I also found the moonlight-qt console output (which I've just saved as a logfile) quite helpful, since it gives global stats instead of a momentary snapshot. Once you have noteworthy evidence you can file a new issue to properly document under what conditions the stream doesn't achieve the expected framerate.

If you are constantly updating the screen at a precise 60fps (via vsync or a framecap) and request a 60fps stream, then the stream should indeed have 60.0 fps, as illustrated here. (Caveat: I read multiple times that on Windows frames are apparently only captured when the actual display content has changed, hence the need to constantly update the screen contents if you want to replicate this testing methodology.)
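As a side note on how little it takes to end up at 59.94 (illustrative arithmetic on my part, not a measurement from either codebase):

$$\frac{1}{59.94\,\mathrm{Hz}} - \frac{1}{60.00\,\mathrm{Hz}} \approx 16.683\,\mathrm{ms} - 16.667\,\mathrm{ms} \approx 17\,\mathrm{\mu s}$$

An average sleep overshoot of only about 17 µs per frame is already enough to turn a requested 60 fps capture into the observed 59.94 fps, which is why this kind of drift is so easy to accumulate and so hard to spot in momentary stats.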
Thanks for confirming. I will get some testing done and collect enough evidence to open a new issue if applicable.
I think this is related to #754.
Is there an existing issue for this?
Is your issue described in the documentation?
Is your issue present in the nightly release?
Describe the Bug
I'm experiencing microstuttering on my moonlight clients. This can be alleviated using the frame pacing option, but I still decided to investigate further. I found that even though both the sunshine host system and the moonlight client system use 60.00Hz refresh rates, the stream is captured at 59.94Hz, i.e. the old NTSC-based framerate.
According to multiple reports online this discrepancy can induce periodic pacing irregularities.
For example, here's a discussion on moonlight: https://www.reddit.com/r/cloudygamer/comments/px88e0/moonlight_how_to_reconcile_host_and_client/
while here's a recent discussion about Playstation Portal with the exact same problem (59.94 vs 60): https://www.reddit.com/r/PlaystationPortal/comments/198mohz/stutter_cause_ps5_outputting_5994_fpshz/
The example video linked in the second reddit discussion above nicely illustrates the kind of microstuttering that I'm also observing.
Expected Behavior
Sunshine captures and transmits at 60.00Hz if that's the frequency asked for by moonlight.
Additional Context
My report is based on moonlight-qt 5.0.1 running on a Pi4 without an underlying desktop environment. (I can gladly provide more details.) Since I did not trust my TV to properly distinguish between 59.94Hz and 60Hz in its OSD, I checked using the `tvservice` command. I also checked the TV's EDID.
Apparently the TV doesn't even support this resolution at 59.94Hz. So the client should indeed be running at 60.00Hz.
On the host (AMD / Wayland / Gnome), I'm using gnome-randr.py to change my resolution and refresh rate on the fly. This tool is working great for Gnome/Wayland. Here is the relevant output:
The asterisk indicates the active resolution.
Yet, here's an excerpt of moonlight's log:
As you can see, sunshine seems to transmit at the historic NTSC framerate of 59.94Hz instead of the 60 Hz used by the client, the host and configured in moonlight. If I leave the stream running for longer, it averages at exactly 59.94. This can't be a coincidence, right?
In the same logfile I also have these entries (excerpt of `grep 60`). If you need more info, please tell me.
Host Operating System
Linux
Operating System Version
Ubuntu 22.04.4
Architecture
64 bit
Sunshine commit or version
0.22.2
Package
Linux - deb
GPU Type
AMD
GPU Model
AMD RX6650XT
GPU Driver/Mesa Version
mesa 23.3.6
Capture Method (Linux Only)
KMS
Config
Apps
No response
Relevant log output