
I got a 3090 in the depths of the 2021 GPU shortage and for my purposes (mainly JRPGs) I can still run basically every new game at 4k max settings. I don't really see much need to upgrade.

Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that GPUs won't be a bottleneck anymore.



Any additional capability will be quickly filled with bloat, especially in the video game industry, where optimization is for post-release.

Expect a lot of creatively bankrupt tech demos with eye-watering hardware requirements.


Man, I sure love a blurry mess of pixels that looks like it's straight out of the Wii U catalogue and requires a 4070 to run at 1080p60.

Graphics programming is a lost art, buried deep below an *unreal* amount of abstraction layers


I see what you did there.


Oh, and DLSS 4.0: a lot of AI frames, while the actual responsive frame rate enters cinematic territory...


I got a 3080 which I managed to pre-order at MSRP; up until ~1.5 years ago that thing was selling on the used market for more than I paid for it.

> Wonder if the AI rush will result in a situation where the state of the art is so far beyond what's needed for gaming that gpus won't be a bottleneck anymore.

I dunno, it seems the scaling is different for AI. Like AI is more about horizontal scaling and gaming is more about vertical scaling (after you get to native 4k resolutions).


An RTX 5090 is nowhere near enough if you want to play graphically demanding games on the latest high-resolution VR headsets. Many are close to 4k*4k per eye, so essentially an 8k screen that needs to run at a stable 90 fps not to cause nausea, and ideally higher.
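The throughput gap is easy to sanity-check with some back-of-the-envelope arithmetic (the per-eye resolution and refresh rate below are rough figures for a high-end headset, not any specific model):

```python
# Rough pixel-throughput comparison: high-res VR vs. a flat 4k/60 display.
# Assumed numbers: ~4k x 4k per eye, two eyes, 90 Hz minimum for comfort.
vr_pixels_per_sec = 4096 * 4096 * 2 * 90        # ~3.02 billion pixels/s
flat_4k_pixels_per_sec = 3840 * 2160 * 60       # ~0.50 billion pixels/s

ratio = vr_pixels_per_sec / flat_4k_pixels_per_sec
print(f"VR needs roughly {ratio:.1f}x the pixel throughput of 4k/60")
```

So even before VR-specific overhead (rendering the scene once per eye, lens-distortion passes, the stricter latency budget), the raw fill-rate demand is several times that of a flat 4k monitor, which is why a 5090 still comes up short in demanding titles.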



