Hacker News

Outside of networked multiplayer games, I can’t think of any reason this would be “bad.”


Even slow-paced games will experience tearing. The parent poster has inadvertently made their setup about as far from ideal as it gets.

They picked a random 24 FPS target (maybe based off old movies for some reason?), but their monitor refreshes at 165 Hz. The mismatch between frame pushes and refreshes is severe enough that they will see tearing even with mostly static backgrounds.

Basically, nothing the parent poster is doing makes sense. Neither for power savings nor viewing pleasure.


> They picked a random 24 FPS target (maybe based off old movies for some reason?)

Not random, and not "old movies". Modern-day movies are still 24 fps; it's part of what makes movies look the way they do: motion blur.

> but their monitor refreshes at 165Hz

So? It refreshes the image 165 times a second regardless of the video's frame rate; it just so happens that many of those 165 refreshes will show the same image over and over. Pause a video (so 0 fps, or 1 fps depending on your take) and the monitor is still refreshing 165 times a second, because the refresh rate has nothing to do with the video.


The problem is the misalignment between frames being pushed into the buffer and the monitor drawing them.

You will experience tearing with this much misalignment. It's why things like G-Sync/FreeSync even exist.
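To make the misalignment concrete, here's a toy model using the numbers from this thread (nothing from a real graphics API): it tracks where in the monitor's scanout each un-synced 24 fps buffer swap lands. A swap that happens mid-scanout is exactly the condition that produces a visible tear line.

```python
# Toy model: where in the monitor's scanout does each un-synced buffer
# swap land? A swap mid-scanout means the top and bottom of the screen
# show different frames: a tear line.
FPS = 24            # the poster's frame-rate cap
HZ = 165            # the poster's monitor refresh rate
refresh = 1 / HZ    # seconds per scanout

# Fraction of the way through a scanout at which each of the first 24
# swaps occurs (0.0 = start of a refresh, just under 1.0 = end).
phases = [((k / FPS) % refresh) / refresh for k in range(24)]

mid_scanout = [p for p in phases if 0.1 < p < 0.9]
print(f"{len(mid_scanout)} of 24 swaps land mid-scanout")
```

Because 165/24 = 6.875, each swap drifts 0.875 of a scanout relative to the previous one, so the tear point cycles through eight positions on the screen, second after second.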

> motion blur

You can enable motion blur in most game settings...


I thought tearing occurred from trying to run frame rates beyond the refresh rate of the monitor? A frame rate below the refresh rate should have ample opportunity to flush correctly, or maybe I'm not understanding the tech in play.

If the game is an RPG or any casual genre where timing is not important and the lack of it doesn't impact gameplay or competitiveness, what does it matter if they run at 24 fps? Maybe they run that low to screen-capture at a lower frame rate and not needlessly burn CPU converting 60 or 160 fps down to 24? And most games cap at 30 or 60, yet higher-refresh-rate monitors don't have tearing issues, or everyone would be throwing a fit. My son has a 120 Hz monitor and plays countless games capped at 60. Is it the fact that they all divide into the refresh rate that stops the issue?


https://www.displayninja.com/what-is-screen-tearing/

It happens in both directions.

Yes, staying at an integer divisor of your refresh rate is the general advice. So 60 would be OK on a 120 Hz screen.

The OP's comment that started all this is doing 24 fps at 165 Hz, a ratio of 6.875 refreshes per frame. They claim to be in the gaming world too, which makes their bizarre decision even more puzzling. Just don't do this.
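A quick arithmetic check (just a hypothetical one-liner, nothing monitor-specific) shows which caps divide evenly into 165 Hz, and why 24 is a poor fit:

```python
# Frame-rate caps that divide evenly into a 165 Hz refresh, so every
# frame is displayed for a whole number of refreshes.
HZ = 165
even_caps = [fps for fps in range(1, HZ + 1) if HZ % fps == 0]
print(even_caps)  # the divisors of 165; 24 is not among them
```

Since 165 = 3 x 5 x 11, its divisors skip all the usual caps like 24, 30, and 60.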


Cargo culting is the best, which is what you are doing here. /s

Even if it is exactly a multiple or even the same number, without vsync, you’ll still get tearing. Why? Because fps is a dynamic number. It’s a measure of how many frames your card can generate in a second. By the time you measure, the frames are already output and gone from the buffer.

The refresh rate on the monitor is a constant, never changing value.

Even if you set the fps to 30, you’ll sometimes render 30.1 or 29.9 frames in a second due to random jitter from the geometry/shader complexity quickly changing on the screen.

There is no such thing as constant fps.
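That claim can be illustrated with a toy simulation (the +/- 2 ms jitter figure is an assumption for illustration, not a measurement): cap the nominal frame time at 1/30 s, add random per-frame noise, and count the frames completed in one second.

```python
import random

# Toy model of a "30 fps" cap: each frame nominally takes 1/30 s, but
# scene complexity adds random jitter (+/- 2 ms, an assumed figure) to
# every frame time, so the number of frames finished in a given second
# drifts around 30 rather than hitting it exactly.
random.seed(1)
TARGET = 1 / 30  # nominal frame time at a 30 fps cap

t = 0.0
frames_in_first_second = 0
while True:
    t += TARGET + random.uniform(-0.002, 0.002)
    if t > 1.0:
        break
    frames_in_first_second += 1

print(frames_in_first_second)  # near 30, but not pinned to it
```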


That article badly misunderstands what vsync is. All vsync does is tell the GPU to wait until the monitor is done displaying a frame before switching buffers. That’s it. Turning on vsync eliminates tearing no matter what frame rate or what form of multiple buffering is used.
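A toy model of that description (illustrative only; the function name is made up, not a real API): defer every buffer flip to the next refresh boundary, and no swap ever lands mid-scanout, even at 24 fps on a 165 Hz monitor.

```python
import math

# Toy model of vsync: the buffer flip is deferred to the next vertical
# blank, so flips only ever happen between scanouts, never mid-frame.
FPS, HZ = 24, 165
refresh = 1 / HZ

def vsynced_swap_time(render_done):
    """Return the time of the next refresh boundary after render_done."""
    return math.ceil(render_done / refresh) * refresh

swaps = [vsynced_swap_time(k / FPS) for k in range(24)]
# Every swap sits on a scanout boundary, so there is no tear line,
# despite the 24 fps / 165 Hz mismatch.
aligned = all(abs(s / refresh - round(s / refresh)) < 1e-6 for s in swaps)
print(aligned)  # True
```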


Not tearing. It can still use vertical sync. Done naïvely, the time between frames would jitter a bit, though at 165 Hz it wouldn't be too bad.



