Hacker News

OpenCV does use linear interpolation by default. What you'd need is something that helps against aliasing, for example first blurring the image with a kernel of the appropriate size, or using a scaling method like OpenCV's INTER_AREA.


What actually helps here is downscaling in a linear colorspace, and correctly detecting the source image's colorspace first.
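For illustration, a NumPy-only sketch of what "averaging in linear light" means, assuming an sRGB source image (the 2x2 black/white checkerboard is a hypothetical example; the transfer functions are the standard sRGB ones):

```python
import numpy as np

def srgb_to_linear(c):
    """Decode 8-bit sRGB values to linear light in [0, 1]."""
    c = c / 255.0
    return np.where(c <= 0.04045, c / 12.92, ((c + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(c):
    """Encode linear light back to 8-bit sRGB."""
    c = np.where(c <= 0.0031308, 12.92 * c, 1.055 * c ** (1 / 2.4) - 0.055)
    return np.round(c * 255.0).astype(np.uint8)

# A black/white checkerboard: averaging the encoded sRGB values gives 128,
# which looks too dark; averaging in linear light and re-encoding gives ~188,
# matching the perceived brightness of the original pattern.
checker = np.array([[0, 255], [255, 0]], dtype=np.uint8)
naive = np.round(checker.mean()).astype(np.uint8)          # 128
linear = linear_to_srgb(srgb_to_linear(checker).mean())    # ~188
```

The gap between the two results is exactly the darkening artifact that shows up when an image is downscaled directly on gamma-encoded values.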


Colorspaces are an issue with scaling/averaging, but that's not what's happening here.



