Hacker News

Doesn't that assume no compression in the transfer? Surely, if streaming compression is getting better, there's some lossless compression that can be used for display signals?


What does your video card/monitor do when it's asked to display a frame that can't be compressed?
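This is the crux of it: lossless compression has no guaranteed ratio, so a worst-case frame can fail to shrink at all. A quick illustrative sketch (using zlib as a stand-in codec, 1080p at 24-bit color assumed for the numbers):

```python
import random
import zlib

# Assumed frame geometry: 1080p, 24-bit color.
WIDTH, HEIGHT, BYTES_PER_PIXEL = 1920, 1080, 3
size = WIDTH * HEIGHT * BYTES_PER_PIXEL

# A flat gray frame compresses to almost nothing...
flat = bytes([128]) * size
# ...while noise (think film grain or a static effect) is
# essentially incompressible: zlib output even exceeds the input.
random.seed(0)
noise = random.randbytes(size)  # Python 3.9+

flat_c = len(zlib.compress(flat))
noise_c = len(zlib.compress(noise))

print(f"raw frame:   {size} bytes")
print(f"flat frame:  {flat_c} bytes compressed")
print(f"noise frame: {noise_c} bytes compressed")
```

So a display link relying on lossless compression would still have to budget bandwidth for the uncompressed worst case, or fall back to dropping to lossy mode for frames that don't compress.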


Generally this would have to be done in hardware, bumping up the price by a non-negligible amount. It would also add input lag on the monitor, something manufacturers are trying hard to avoid.


It would have to run in real time (60 frames per second), and I just don't see significant savings being possible anytime soon.
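To put the real-time constraint in numbers, here is a back-of-the-envelope sketch (4K at 60 Hz with 24-bit color chosen as illustrative figures, not from the thread):

```python
# Uncompressed link bandwidth and per-frame time budget
# for an assumed 4K, 60 Hz, 24-bit-color display.
width, height, bits_per_pixel, fps = 3840, 2160, 24, 60

bits_per_frame = width * height * bits_per_pixel
gbps = bits_per_frame * fps / 1e9
frame_budget_ms = 1000 / fps

print(f"{gbps:.2f} Gbit/s uncompressed")     # ~11.94 Gbit/s
print(f"{frame_budget_ms:.2f} ms per frame") # ~16.67 ms
```

The compressor and decompressor would each get well under 16.67 ms per frame, end to end, on a ~12 Gbit/s stream, which is why this tends to be dedicated silicon rather than a general-purpose codec.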



