> Gaming was supposed to be one of the best drivers for 8K adoption.
While the step from 1080p/1440p to 4K is a visible difference, I don't think going from 4K to 8K would be a visible one, since the pixels are already invisible at 4K.
However the framerate drop would be very noticeable...
OTOH, afaik for VR headsets you may want higher resolutions still due to the much larger field of vision
I usually still play at 1080p on my Steam box because my TV is like nine feet away and I cannot tell a difference between 1080p and 4k for gaming, and I would rather have the frames.
AAA games have been having really bad performance issues for the last few years while not looking much better. If you wanna game in 8K you are gonna need something like a NASA supercomputer.
AAA games are struggling for a lot of reasons, and consoles are struggling as well. PC gamers tend to use a more traditional monitor setup and won't buy a gigantic television. At least, not for gaming.
We can’t render modern games at decent frame rates at 4k without going down the path of faking it with AI upscaling and frame generation.
There was no hope of actual 8k gaming any time soon even before the AI bubble wrecked the PC hardware market.
Attempting to render 33 million pixels per frame seems like utter madness, when 1080p is a mere 2 million, and Doom/Quake were great with just 64,000. Let's have more frames instead?
(Such a huge pixel count for movies while stuck at a ‘cinematic’ 24fps, an extremely low temporal resolution, is even sillier)
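For anyone who wants to check the arithmetic, here's a quick sketch (plain width x height math for the standard 16:9 modes; the 64,000 figure is Doom's 320x200 software renderer, and the framerates below are just illustrative):

    # Raw pixel counts behind the numbers above.
    resolutions = {
        "Doom/Quake (320x200)": (320, 200),
        "1080p": (1920, 1080),
        "4K UHD": (3840, 2160),
        "8K UHD": (7680, 4320),
    }

    for name, (w, h) in resolutions.items():
        print(f"{name}: {w * h:,} pixels per frame")

    # Throughput makes the trade-off clearer: 8K at 60 fps is ~2 billion
    # pixels/s, while 1080p at a much smoother 240 fps is ~0.5 billion.
    print(f"8K @ 60 fps: {7680 * 4320 * 60:,} pixels/s")
    print(f"1080p @ 240 fps: {1920 * 1080 * 240:,} pixels/s")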
Yeah, it's not only the huge required jump in raw fill rate: to get the most out of a 4K TV you also need higher-detail models and textures, and that means a huge jump in VRAM, which never materialised.
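Rough numbers, if it helps. Everything below is purely illustrative (uncompressed RGBA8 textures with a full mip chain, FP16 render targets), not any particular engine's budget:

    MiB = 1024 ** 2

    # Uncompressed RGBA8 texture with a full mip chain (~4/3 overhead).
    def texture_mib(size, bytes_per_texel=4, mip_overhead=4 / 3):
        return size * size * bytes_per_texel * mip_overhead / MiB

    # A single RGBA16F render target at a given output resolution.
    def render_target_mib(w, h, bytes_per_pixel=8):
        return w * h * bytes_per_pixel / MiB

    print(f"2K texture: {texture_mib(2048):.0f} MiB, 4K texture: {texture_mib(4096):.0f} MiB")
    print(f"4K target: {render_target_mib(3840, 2160):.0f} MiB, "
          f"8K target: {render_target_mib(7680, 4320):.0f} MiB")
    # Every doubling of texture or target resolution roughly quadruples the
    # memory, and a real frame uses many targets and thousands of textures.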
I don't see a future in which we play at 4K at top settings either without AI upscaling/interpolation. Even if it were theoretically possible, the performance budgets developers work with going forward will assume that frame generation and upscaling are used.
So anyone who wants only "real frames" (non-upscaled, non-generated) will need to lower their settings or only play games a few years old. But I think this will become so natural that no one even thinks about it. Disabling it will be like someone lowering AA settings or whatever: something only done by very niche players, like parts of the CS community today, who play at 4:3 aspect ratios and lower AA settings for maximum visibility, not fidelity, and so on.
> While the step from 1080p/1440p to 4K is a visible difference
I even doubt that. My experience is that on a 65" TV, 4K becomes indistinguishable from 1080p beyond 3 meters. I even tested that with friends on The Mandalorian; we couldn't tell 4K and 1080p apart. So I just don't bother with 4K anymore.
Of course YMMV if you have a bigger screen, or a smaller room.
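If you want to sanity-check that with geometry rather than eyeballs, here's a quick sketch, assuming a 16:9 panel and the usual ~1 arcminute figure for 20/20 acuity:

    import math

    # Angular size of one pixel on a 16:9 panel, in arcminutes.
    def pixel_arcminutes(diagonal_in, horizontal_pixels, distance_m):
        width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
        pixel_pitch_m = width_m / horizontal_pixels
        return math.degrees(math.atan2(pixel_pitch_m, distance_m)) * 60

    for label, px in [("1080p", 1920), ("4K", 3840)]:
        print(f'65" {label} at 3 m: {pixel_arcminutes(65, px, 3.0):.2f} arcmin per pixel')

    # Both come out at or under ~1 arcminute, the usual 20/20 acuity limit,
    # which lines up with not being able to tell them apart at that distance.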
For reasonable bitrate/resolution pairs, both matter. Clean 1080P will beat bitrate-starved 4K, especially with modern upscaling techniques, but even reasonable-compression 4K will beat good 1080P because there's just more detail there. Unfortunately, many platforms mess with this relationship, like YouTube requiring 4K uploads to get the better bitrate tiers, when for many devices a higher-rate 1080P would be fine.
I'm curious: for the same Mb per second, how does the viewing quality of 4K compare to 1080p? I mean, 4K shouldn't be able to carry more detail per se in the stream given the same amount of data over the wire, but maybe the scaling and the way the artifacts end up can alter the perception?
If everything is the same (codec, bitrate, etc), 1080P will look better in anything but a completely static scene because of less blocking/artifacts.
But that’s an unrealistic comparison, because 4K often gets a better bitrate, more advanced codec, etc. If the 4K and 1080P source are both “good”, 4K will look better.
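To put a number on the question above: at the same bitrate, 4K has a quarter of the bits available per pixel, which is roughly why it blocks up first. The 15 Mb/s and 24 fps below are just illustrative values:

    # Bits available per pixel at a fixed bitrate (values are illustrative).
    def bits_per_pixel(mbps, width, height, fps=24):
        return mbps * 1_000_000 / (width * height * fps)

    for label, (w, h) in [("1080p", (1920, 1080)), ("4K", (3840, 2160))]:
        print(f"{label} @ 15 Mb/s, 24 fps: {bits_per_pixel(15, w, h):.3f} bits/pixel")
    # 4K ends up with exactly 1/4 the bits per pixel, so the encoder has to
    # throw away more detail per pixel, which shows up as blocking in motion.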
Yeah, I have a hard time believing that someone with normal eyesight wouldn't be able to tell 1080p and 4K Blu-rays apart. I just tested this on my TV; I have to get ridiculously far back before the difference isn't immediately obvious. This is without the HDR/DV layer, FWIW.
10 feet is pretty far back for all but the biggest screens, and at closer distances, you certainly should be able to see a difference between 4K and 1080P.
For the 30 to 40 degree FoV as recommended by SMPTE, 10ft is further back than is recommended for all but like a 98in screen, so yes, it’s too far back.
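For reference, the distance band implied by a 30 to 40 degree FoV is just geometry; a rough sketch, assuming 16:9 panels, with feet rounded:

    import math

    # Viewing distance for a given horizontal field of view on a 16:9 panel:
    # distance = screen_width / (2 * tan(fov / 2)).
    def distance_for_fov_ft(diagonal_in, fov_deg):
        width_m = diagonal_in * 0.0254 * 16 / math.hypot(16, 9)
        return width_m / (2 * math.tan(math.radians(fov_deg) / 2)) / 0.3048

    for diag in (65, 77, 98):
        near = distance_for_fov_ft(diag, 40)  # wider FoV -> sit closer
        far = distance_for_fov_ft(diag, 30)   # narrower FoV -> sit further
        print(f'{diag}": {near:.1f}-{far:.1f} ft for a 30-40 degree FoV')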
It very much depends on the particular release. For many 4K releases you don't actually get that much more detail because of grain and imperfect focus in the original film.
There are so many tricks you can do as well; resolution was never really the issue. Sharpness and fidelity aren't the same as charming and aesthetically pleasing.
> While the step from 1080p/1440p to 4K is a visible difference
It really isn't.
What you are likely seeing is HDR which is on most (but not all!) 4K content. The HDR is a separate layer and unrelated to the resolution.
4K versions of films are usually newly restored with modern film scanning - as opposed to the aging masters created for the DVD era that were used to churn out 1st generation Blu-Rays.
The difference between a 4K UHD without HDR and a 1080p Blu-Ray that was recently remastered in 4K from the same source is basically imperceptible from any reasonable viewing distance.
The "visible difference" is mostly better source material, and HDR.
Of course people will convince themselves what they are seeing justifies the cost of the upgrade, just like the $200 audiophile outlet and $350 gold-plated videophile Ethernet cable makes the audio and video really "pop".
I know the thread is about TVs, but since gaming has come up, it's worth noting that at computer viewing distances the differences between 1080p/1440p and 4K really are very visible (though in my case I have a 4K monitor for media and a 1440p monitor for gaming, since there's zero chance I can run at 4K anyway).