Test new -tune=ssimulacra2
#17
Here are my findings on a dataset from https://www.compression.cc with multiple codecs at default encoding effort. Let me know if you expect significantly different results at other speeds.

Web-quality range and 4:2:0 (restricting AVIF bpp to the [0.03, 1.03] range and images to 2-3 megapixels, to match the 10th to 90th percentiles at https://github.com/webmproject/codec-compare/wiki/Bits-per-pixel-of-Internet-images):
For the same Butteraugli score: [chart]
For the same DSSIM score: [chart]

High-quality range 70-90 and 4:4:4 (the restriction to the AVIF quality range 70 to 90 is arbitrary):
For the same Butteraugli score: [chart]
For the same DSSIM score: [chart]
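As a side note on reproducing that filtering step, here is a minimal Python sketch of restricting a result set to the [0.03, 1.03] bpp range and 2-3 megapixel images. The record field names (`width`, `height`, `file_size_bytes`) and the sample values are hypothetical, not taken from the actual codec-compare data.

```python
# Minimal sketch of the bpp / megapixel restriction described above.
# The record layout and values are assumed for illustration only.
results = [
    {"name": "img_a", "width": 2048, "height": 1365, "file_size_bytes": 180_000},
    {"name": "img_b", "width": 1024, "height": 768, "file_size_bytes": 40_000},
]

def bits_per_pixel(record):
    # bpp = compressed size in bits divided by the number of pixels.
    return record["file_size_bytes"] * 8 / (record["width"] * record["height"])

def megapixels(record):
    return record["width"] * record["height"] / 1_000_000

# Keep encodes in the web-quality bpp range on 2-3 megapixel sources.
kept = [
    r for r in results
    if 0.03 <= bits_per_pixel(r) <= 1.03 and 2.0 <= megapixels(r) <= 3.0
]
print(kept)
```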
Thanks @y-guyon for running the tests! I took a quick look, and I can already tell they're very informative. Tune ssimulacra2's changes mainly deal with bit redistribution, so I wouldn't expect wildly different results at other speeds. That said, I'd also recommend testing speed 3, as it enables rectangular partitions, larger transforms, and restoration filters while still being fast enough for some production scenarios. It'd be interesting to see how the tune's tweaks interact with the larger available tooling repertoire.

Edit: I'd also suggest testing 10-bit at speed 6, as the additional internal precision can make a significant difference in preventing banding, which SSIMULACRA 2 heavily penalizes. At very high quality, SSIMULACRA 2 scores can increase by a whole point or more.
Here's another comment: AOMediaCodec/libavif#2412 (comment)
At speed 3 the gains are in the same ballpark:

Web-quality range and 4:2:0 (restricting AVIF bpp to the [0.03, 1.03] range and images to 2-3 megapixels, to match the 10th to 90th percentiles at https://github.com/webmproject/codec-compare/wiki/Bits-per-pixel-of-Internet-images):
For the same Butteraugli score: [chart]
For the same DSSIM score: [chart]

High-quality range 70-90 and 4:4:4 (the restriction to the AVIF quality range 70 to 90 is arbitrary):
For the same Butteraugli score: [chart]
For the same DSSIM score: [chart]
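To make the "for the same Butteraugli/DSSIM score" comparisons concrete, here is a rough sketch of one way to read file sizes off two rate-distortion curves at a matched metric score via linear interpolation. This is only an illustration, not the method codec-compare actually uses, and the data points below are made up.

```python
import numpy as np

def size_at_score(points, target_score):
    """Interpolate file size at a given metric score.

    points: (metric_score, file_size_bytes) pairs for one codec/tune.
    np.interp expects the x values (scores) to be increasing, so sort first.
    """
    pts = sorted(points)
    scores = np.array([p[0] for p in pts], dtype=float)
    sizes = np.array([p[1] for p in pts], dtype=float)
    return float(np.interp(target_score, scores, sizes))

# Made-up rate-distortion points (score, size in bytes), not measured data.
default_tune = [(60.0, 30_000), (70.0, 55_000), (80.0, 110_000)]
ssimulacra2_tune = [(60.0, 27_000), (70.0, 50_000), (80.0, 100_000)]

target = 70.0
ratio = size_at_score(ssimulacra2_tune, target) / size_at_score(default_tune, target)
print(f"size ratio at matched score {target}: {ratio:.2f}")
```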
My dataset is 8-bit. Do you have a 10-bit corpus with a permissive license to recommend?
Thanks @y-guyon for the follow-up! It's reassuring to see the gains for SSIMULACRA 2 hold up proportionally at speed 3.
My apologies, I should've clarified. By "10-bit", I was thinking of using the CLIC dataset, but converted to 10-bit before encoding (e.g. by specifying a 10-bit output depth to the encoder).

For example, this image of a flower against a blurry background from CLIC:
The 10-bit image is 9% smaller, yet it scores 2 points higher. Both images were encoded at speed 6, with tune SSIMULACRA2. Close visual inspection of the 8-bit image reveals subtle banding/blocking that's not present in the converted 10-bit image:
Here are the source and encoded files for reference: 8bit vs 10bit.zip
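For reference, here is a minimal sketch of the 8-bit to 10-bit conversion step mentioned above, assuming an 8-bit full-range RGB source loaded with Pillow. The file name is a placeholder, and the exact conversion used for the test above may differ.

```python
import numpy as np
from PIL import Image

# Placeholder file name; any 8-bit RGB source image works the same way.
rgb8 = np.asarray(Image.open("source_8bit.png").convert("RGB"), dtype=np.float64)

# Scale full-range 8-bit samples (0..255) to full-range 10-bit (0..1023).
# Rounding x * 1023 / 255 maps 0 -> 0 and 255 -> 1023 exactly, unlike a
# plain left shift by two bits, which tops out at 1020.
rgb10 = np.round(rgb8 * (1023.0 / 255.0)).astype(np.uint16)

# The 10-bit samples can then be handed to the encoder in whatever 10-bit
# input format it accepts, with a 10-bit output depth requested.
print(rgb10.min(), rgb10.max(), rgb10.dtype)
```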
Recently, libaom landed a new SSIMULACRA2 tune, specifically optimized for still pictures: https://aomedia-review.googlesource.com/c/aom/+/194662. It's currently available on the main branch.

The new tune consists of a set of optimized defaults (QMs, deltaq, rdmult, luma/chroma allocation tweaks, etc.) that improve SSIMULACRA 2 scores, and in our preliminary testing, it also has a more favorable subjective quality profile.

It'd be nice to have results for the SSIMULACRA 2 tune (in addition to the default tune) so we can get a more complete look at how other metrics react (especially Butteraugli and DSSIM).
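For anyone reproducing this, here is a rough sketch of encoding the same source once with the default tune and once with the new tune through avifenc. It assumes avifenc is linked against a libaom build that includes the change; passing the tune as `-a tune=ssimulacra2` is an assumption based on how other codec-specific libaom options are forwarded, and the quality/speed values are arbitrary.

```python
import subprocess

SOURCE = "input.png"   # placeholder source image
QUALITY = "75"         # arbitrary quality setting
SPEED = "6"            # default-ish encoding effort

# Baseline: default tune.
subprocess.run(
    ["avifenc", "-q", QUALITY, "-s", SPEED, SOURCE, "default_tune.avif"],
    check=True,
)

# New tune: forwarding "tune=ssimulacra2" through avifenc's -a (advanced
# codec option) flag is an assumption; it only works if the linked libaom
# actually ships the new tune.
subprocess.run(
    ["avifenc", "-q", QUALITY, "-s", SPEED, "-a", "tune=ssimulacra2",
     SOURCE, "ssimulacra2_tune.avif"],
    check=True,
)
```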