
Lossy compression is definitely anti-dither, that’s true. But I think a lot of the banding you see in streaming is actually bad color handling and/or miscalibrated devices… very dark colors are especially prone to banding, often because TVs are turned up brighter than they should be, and sometimes because compression algorithms ignore color space & gamma.


What you generally want, rather than lossy compression on top of dithering, is lossy compression of higher-bit-depth data; the decoder can then add the dithering to represent the extra bits.
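For concreteness, here’s a minimal numpy sketch of that decode-side idea (hypothetical code, not any real decoder’s API): the decoder receives higher-bit-depth samples (10-bit here) and must output 8-bit. Plain requantization snaps every sample in a region to the same level, which is exactly what shows up as banding; adding dither noise before requantizing makes the average output track the true value, at the cost of fine grain.

```python
import numpy as np

def requantize(samples_10bit):
    """Round 10-bit samples straight to 8-bit (drop 2 bits). Causes banding."""
    return np.clip(np.round(samples_10bit / 4.0), 0, 255).astype(np.uint8)

def requantize_dithered(samples_10bit, rng):
    """Same, but with +/-0.5 LSB rectangular dither added before rounding."""
    noise = rng.uniform(-0.5, 0.5, size=samples_10bit.shape)
    return np.clip(np.round(samples_10bit / 4.0 + noise), 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)

# A flat dark patch whose true value (10-bit 9 -> 8-bit 2.25) falls between
# two 8-bit levels -- the worst case for banding.
patch = np.full(10000, 9, dtype=np.uint16)

plain = requantize(patch)                    # every sample becomes level 2
dithered = requantize_dithered(patch, rng)   # a mix of 2s and 3s

print(plain.mean(), dithered.mean())         # dithered mean is close to 2.25
```

The dithered output uses only the same two 8-bit levels, but the ratio of 2s to 3s encodes the fractional 10-bit value, which is how the decoder “represents the extra bits.”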


Sure, of course, technically, but that circles back to @quietbritishjim’s point: if you have the bandwidth for, say, 8 bits per channel, then just display it without dithering; dithering would reduce the quality. If you don’t have the bandwidth for 8 bits per channel, then dithering won’t help: the color resolution is already too low, and dithering will lower it further. In other words, dithering always reduces the color resolution of the signal, so when a compression constraint is involved, it’s pretty difficult to find a scenario where dithering makes sense. This is why dithering isn’t used for streaming; it’s mainly used to meet the input color resolution constraints of various media or devices, or for stylistic reasons.



