
I don't think holding human perception up as a standard for video games holds up. For audio and video it makes sense, but we are a long way off for video games.

For video games there is a huge "storage waste" factor, one I think is more important than the human perception limit. If you look at modern games, you could probably cut a double-digit percentage of most games' size if a capable team had the time to optimize for disk space. It's simply not done because it offers little advantage. I think this waste factor will scale with the complexity of video games regardless of graphical fidelity.


