TLDR how is that bad?
Well, if you encode at high bit depth, the removal of the noise won’t create visible banding (at most barely visible at 10 bpc, completely invisible at 12 bpc), which was my point. But the generated noise can still prevent banding during playback if the player doesn’t dither (which most don’t by default).
Denoise-noise-level sets a denoising strength, and that denoising is applied to every frame whenever denoise-noise-level > 0. The denoised frame is then compared with the unaltered frame in some (sadly very unsuitable) way, grain parameters are derived from that difference, and matching noise is synthesized and added back on top of the denoised frame at decode time. Because the implementation is so shitty, the visual energy removed during denoising and the visual energy added back by noise synthesis can diverge drastically.
So, no matter what denoise-noise-level you choose, the result will be far from optimal. And stronger levels won’t just create unnecessary noise, but also create ugly grain patterns, which can become quite obvious beyond denoise-noise-level 10 or so.
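For intuition, here is a rough conceptual sketch of that pipeline in Python. This is not libaom’s code; the denoiser, the one-number “grain model”, and every name here are placeholders, just to show where the removed and re-added energy can drift apart:

```python
import numpy as np
from scipy.ndimage import gaussian_filter  # stand-in denoiser, not what libaom uses

def simulate_grain_pipeline(frame: np.ndarray, denoise_noise_level: int):
    """Conceptual sketch of the denoise-noise-level flow (all names illustrative)."""
    if denoise_noise_level <= 0:
        return frame, None

    # 1. Denoise the source; strength scales with denoise-noise-level.
    denoised = gaussian_filter(frame.astype(np.float32),
                               sigma=denoise_noise_level / 10.0)

    # 2. The residual (source minus denoised) is what gets analyzed to fit a
    #    grain model; only the model parameters end up in the bitstream.
    residual = frame.astype(np.float32) - denoised

    # 3. Crude stand-in "grain model": a single standard deviation. The real
    #    model (auto-regressive grain plus scaling LUTs) is far more complex,
    #    and the mismatch between removed and re-added energy is exactly
    #    where the visible divergence comes from.
    grain_sigma = float(residual.std())

    # 4. At playback, the decoder synthesizes fresh noise from the parameters
    #    and adds it to the decoded (denoised) frame.
    synthesized = np.random.normal(0.0, grain_sigma, size=frame.shape)
    reconstructed = np.clip(denoised + synthesized, 0, 255).astype(np.uint8)
    return reconstructed, grain_sigma

frame = np.random.randint(0, 256, size=(64, 64), dtype=np.uint8)
reconstructed, sigma = simulate_grain_pipeline(frame, denoise_noise_level=10)
```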
If AV1 noise synthesis “removes” banding, that banding was never part of the video in the first place; your video player created it during bit-depth reduction, since you’re viewing on an 8-bit display. This can be prevented with dithering, which AV1 noise synthesis can substitute for.
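Purely as an illustration (a made-up 10-bit-to-8-bit conversion in Python, not how any real player does it), the difference between truncating and dithering during bit-depth reduction looks roughly like this:

```python
import numpy as np

def to_8bit_truncate(frame_10bit: np.ndarray) -> np.ndarray:
    """Plain right-shift from 10-bit to 8-bit: smooth gradients collapse
    into flat steps (banding)."""
    return (frame_10bit >> 2).astype(np.uint8)

def to_8bit_dithered(frame_10bit: np.ndarray) -> np.ndarray:
    """Add sub-LSB random noise before rounding, so the lost precision shows
    up as fine noise instead of hard steps."""
    noise = np.random.uniform(-0.5, 0.5, size=frame_10bit.shape)
    return np.clip(np.round(frame_10bit / 4.0 + noise), 0, 255).astype(np.uint8)

# A smooth 10-bit gradient: truncation turns it into flat bands, while
# dithering (or, at playback, synthesized film grain) masks those steps.
gradient = np.linspace(0, 1023, 1920, dtype=np.uint16)[None, :].repeat(64, axis=0)
banded = to_8bit_truncate(gradient)
dithered = to_8bit_dithered(gradient)
```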
WebM is just a video container, not a codec. WebP uses the rather outdated intra-frame image compression from the VP8 video codec, which can perform quite a bit better than JPEG at very low quality, but at the near-transparent quality images are usually encoded at, it very often doesn’t even beat JPEG.
It’s entirely an encoder issue…
That’s the dumbest thing I’ve read today
now that’s delusional…
Whether 2-pass has benefits for the “constant quality” mode in AV1 encoding depends entirely on the encoder. For aomenc, you always want to use 2-pass for slow offline encoding, as otherwise important features are disabled.
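For aomenc that typically looks something like the sketch below (file names and the quality/speed values are placeholders; verify the flags against your build’s `aomenc --help`):

```python
import subprocess

# Hypothetical file names; the flags themselves exist in aomenc, but check
# them against your build before relying on this.
SRC = "input.y4m"
STATS = "aomenc_stats.log"
OUT = "output.webm"

common = [
    "aomenc",
    "--passes=2",          # enables the 2-pass-only features mentioned above
    "--end-usage=q",       # "constant quality" rate-control mode
    "--cq-level=24",       # quality target (lower = higher quality)
    "--cpu-used=4",        # speed preset; lower is slower and better
    f"--fpf={STATS}",      # first-pass stats file shared by both passes
]

# Pass 1: analysis only, writes the stats file.
subprocess.run(common + ["--pass=1", "-o", "/dev/null", SRC], check=True)

# Pass 2: the actual encode, reads the stats file.
subprocess.run(common + ["--pass=2", "-o", OUT, SRC], check=True)
```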