
On my PC, I limit my FPS to 24 (via graphics card settings). It uses far less power, and runs considerably cooler.


This is bad, in a gaming setting, for several reasons.

You should at least make your frame limit be equal to your screen refresh rate.


Meh, gsync keeps it sane. But no, my screen refresh rate is 165 Hz. If it were running that many fps, it would result in ~200 watts of power being used. Maybe this is "good" for gaming, but it is def bad for my wallet, which has to pay the power bill. Maybe when electricity prices go back down, I'll turn it up.


> Meh, gsync keeps it sane.

GSync isn't magic. It can only do so much. In this situation, where you've tied your frames to 24 (for some reason?) and your monitor refresh rate is 165Hz, GSync isn't going to save you. GSync will help you if there are sudden, brief drops in FPS below the refresh rate and keep things in sync. It will not save you if your refresh rate is 165Hz and your frame rate is 24...

> Maybe when electricity prices go back down, I'll turn it up

I don't think you'll notice the extra $5 a month it costs to run that a couple hours a week.

There should at least be a real reason for doing this. Saving a smidge of power is not one of them.

You should, instead, under-clock your monitor refresh rate to something low, like 75 or 60Hz, then frame-limit to that number as well.


Gsync/variable refresh rate should be able to make that work. Even if the display doesn’t go down to 24Hz, it only needs to drop to 144Hz to be an integer multiple of 24Hz - you’d just get each frame being displayed for 6 refreshes.


How high is the electricity cost in your area for you to notice an extra 200 watts during gaming? Assuming $0.30/kWh, if you play 8 hours every day (which is a lot), you'll increase your bill by $14.40 a month. If you only play 2 hours a day on average, the increase is only $3.60.
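For anyone who wants to plug in their own numbers, the arithmetic is just watts times hours times price. A quick Python sketch, using the same assumed 200 W delta and $0.30/kWh:

    # Back-of-envelope cost of an extra 200 W of GPU draw while gaming.
    # Assumptions from above: $0.30/kWh and a 30-day month.
    EXTRA_WATTS = 200
    PRICE_PER_KWH = 0.30

    def monthly_cost(hours_per_day, watts=EXTRA_WATTS, price=PRICE_PER_KWH):
        kwh = watts / 1000 * hours_per_day * 30
        return kwh * price

    print(monthly_cost(8))  # 14.4 -> ~$14.40/month at 8 h/day
    print(monthly_cost(2))  # 3.6  -> ~$3.60/month at 2 h/day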


As I mentioned in another comment down-thread, my cost (after taxes + fees) is €0.60 per kWh.


Ok so $7 a month then. What you're doing to "save costs" is absurd, and the approach is entirely incorrect.

Downclock your monitor to something like 60Hz, then pin your frames there too.


I'm not sure how you reached $7...

0.2 kW * 8 hours * 30 days * €0.60/kWh =

1.6 kWh/day * 30 days * €0.60/kWh =

48 kWh * €0.60/kWh =

€28.80 per month

I game about 3-6 hours a day, but I also do a lot of (Unreal) programming on the same computer. So according to measurements, it draws about 2-3 kWh a day. Without limiting the framerate, it's closer to 10 kWh per day. We're talking savings of ~€4 per day. That's over €100 per month in savings.

> Downclock your monitor to something like 60Hz, then pin your frames there too.

I'm not sure what my monitor refresh rate has to do with fps. They are two totally independent measurements that are very rarely synced if you are doing any kind of graphically intensive work, even if you pin them to the same rate. The monitor will still double-post frames every so often, or even skip one. Another good example is a paused video, which is 0 fps, but the monitor doesn't care. It just keeps showing the same frame over and over again. The same thing happens here, and with (g|v)sync, there's never any tearing.


Monitor refresh rate and the FPS your GPU can generate are closely related[1].

Again, your setup is so far from ideal, you should reconsider. Reduce your refresh rate to some multiple of 24 if you insist on 24 for some unknown reason.

It's one of those situations where you are being so clever you're hurting yourself and not realizing it.

[1] https://www.displayninja.com/what-is-screen-tearing/


I think you are lacking some basic fundamentals and/or not reading what I’m writing.

Gsync is enabled. There is no tearing. Monitor refresh rate and fps have literally nothing to do with each other when the fps is less than the refresh rate.


It has to be a divisor of the refresh rate, not necessarily equal to it. E.g. with 30 fps on a 60 Hz screen you will do fine. But with a high refresh rate screen, and gsync/freesync on top of that, it most probably does not matter at all, because the refresh rate of the screen should adjust.


Outside of networked multiplayer games, I can’t think of any reason this would be “bad.”


Even slow-paced games will experience tearing. Here, the parent poster has inadvertently made their setup beyond non-ideal.

They picked a random 24 FPS target (maybe based off old movies for some reason?), but their monitor refreshes at 165Hz. The mismatch of frames and refreshes is so far off that they will experience tearing even with mostly-static backgrounds.

Basically, nothing the parent poster is doing makes sense. Neither for power savings nor viewing pleasure.


> They picked a random 24 FPS target (maybe based off old movies for some reason?)

Not random, and not "old movies". Modern-day movies are still 24 fps; it's part of what makes movies look the way they do: motion blur.

> but their monitor refreshes at 165Hz

So? It refreshes the image 165 times a second, regardless of video framerate. It just so happens that many of those 165 refreshes in this case will be the same image over and over. Pause a video (so 0 fps, or 1 fps depending on your take) and the monitor is still refreshing 165 times a second, because the refresh has nothing to do with the video in this case.


The problem is the misalignment between frames being pushed into the buffer and the monitor drawing them.

You will experience tearing with this much misalignment. It's why things like GSync/FreeSync even exist.

> motion blur

You can enable motion blur in most game settings...


I thought tearing occurred from trying to run frame rates beyond the refresh rate of the monitor? A frame rate below the refresh rate should have ample opportunity to flush correctly, or maybe I’m not understanding the tech in play.

If the game is an RPG or any casual genre where timing is not important and the lack of it doesn’t impact gameplay or competitiveness, what does it matter if they run at 24 fps? Maybe they run that low to screen-capture at a lower frame rate so they don't needlessly burn CPU converting 60 or 160 fps down to 24? And most games cap at 30 or 60, yet higher refresh rate monitors don’t have tearing issues, or everyone would be throwing a fit. My son has a 120 Hz monitor and plays countless games capped at 60. Is it the fact they all divide into the refresh rate that stops the issue?


https://www.displayninja.com/what-is-screen-tearing/

It happens in both directions.

Yes, staying at some divisor of your refresh rate is the general advice. So 60 would be ok on a 120 Hz screen.

The OP's comment that started all this is doing 24 fps at 165 Hz, which is a ratio of 6.875. They claim to be in the gaming world too, which makes their bizarre decision even more puzzling. Just don't do this.


Cargo culting is the best, which is what you are doing here. /s

Even if it is exactly a multiple or even the same number, without vsync, you’ll still get tearing. Why? Because fps is a dynamic number. It’s a measure of how many frames your card can generate in a second. By the time you measure, the frames are already output and gone from the buffer.

The refresh rate on the monitor is a constant, never changing value.

Even if you set the fps to 30, you’ll sometimes render 30.1 or 29.9 frames in a second due to random jitter from the geometry/shader complexity quickly changing on the screen.

There is no such thing as constant fps.
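To make that concrete, here's a toy Python sketch of a naive frame limiter (not how a real engine does it): even with a fixed budget, variable render cost and sleep granularity mean the measured fps drifts around the target instead of sitting exactly on it.

    import random
    import time

    TARGET_FPS = 30
    FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

    def render():
        # Stand-in for real rendering work; cost varies with scene complexity.
        time.sleep(random.uniform(0.005, 0.015))

    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < 2.0:  # measure for ~2 seconds
        frame_start = time.perf_counter()
        render()
        # Naive limiter: sleep off whatever is left of the frame budget.
        leftover = FRAME_BUDGET - (time.perf_counter() - frame_start)
        if leftover > 0:
            time.sleep(leftover)
        frames += 1

    print(f"measured fps: {frames / (time.perf_counter() - start):.2f}")  # hovers near 30, rarely exactly 30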


That article badly misunderstands what vsync is. All vsync does is tell the GPU to wait until the monitor is done displaying a frame before switching buffers. That’s it. Turning on vsync eliminates tearing no matter what frame rate or what form of multiple buffering is used.
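Roughly, the loop looks like this. Just a toy Python sketch; the three helpers are made-up stand-ins for whatever the driver/swap chain actually exposes, the point is only where the wait happens:

    # Toy sketch of a double-buffered render loop with vsync on.
    # render_into_back_buffer, wait_for_vblank and swap_buffers are hypothetical
    # stubs, not a real API.

    def render_into_back_buffer():
        pass  # draw the next frame off-screen (stub)

    def wait_for_vblank():
        pass  # block until the monitor finishes scanning out the current frame (stub)

    def swap_buffers():
        pass  # make the back buffer the new front buffer (stub)

    def render_loop(vsync_enabled=True):
        while True:
            render_into_back_buffer()
            if vsync_enabled:
                wait_for_vblank()  # the swap waits for the display...
            swap_buffers()         # ...so the front buffer never changes mid-scan-out: no tearing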


Not tearing. It can still utilize vertical sync. If done naïvely the time between frames would jitter a bit though. At 165Hz it wouldn’t be too bad.


That's kinda like saying to save money you eat only plain rice, uncooked. Like yes it will technically work but most people would not be willing to subject themselves to that.


That sounds horrifying. I can't comfortably watch video at 24 fps, let alone play an interactive game. But to each his own, who am I to dictate your preferences?


I'm not the GP, but it makes sense to me. I can't even see a difference between 24 and 60 FPS. It has to get down into the teens before I can tell. I haven't capped my graphics like that, but that's more because it never occurred to me than because it would bother me.


Honest question, not intended to be disparaging: do you not see a difference between the various animations on [1], for example? To me, they are very apparent.

[1] https://www.testufo.com/


This website doesn't do the comparison justice and doesn't seem very accurate. For example, I tried this on my linux work computer (which doesn't offer vsync, or a graphics card) and compared it to my gaming pc. The non-vsync version looked much better than the vsynced version.

I went outside and took a 4K video at 24 fps and played it back. It looks smooth as butter, just like a movie. Then I took the same video but recorded at 60 fps and re-encoded it at 24 fps. It looks like the example on this website, where it looks jumpy.

I assume this is because of motion blur.


The bottom one looks different. Like I said, I can tell once the FPS gets into the teens. The top two look exactly the same to me. I realize they are different, but I can't see it.


https://www.testufo.com/framerates#count=4&background=stars&...

This is better, because we're talking about the lower range right now anyway.

That said, I think the difference between 144Hz and 72Hz is a lot more subtle and can't be seen if you try to glance at them directly. Try viewing them with peripheral vision and there's a difference in smoothness. That shows up in things like moving a window around (or even just moving a mouse). It makes a huge difference in input latency/response, but the difference can't be seen easily by just staring directly at a moving image.


Wow! I can easily see the difference between 120 Hz and 60 Hz. 30 Hz (bottom one) looks absolutely horrible to me. I guess there is a large gap between how various people perceive the world.


About 10 fps is my lower limit. I can’t tell the difference for anything higher than 18 fps.


How much money could it possibly save? It might be more worth it to sell your GPU & buy a cheaper one. You might even be able to run at a higher framerate with those savings!


10-20 Euro per month, just from gaming. Across everything, it makes a difference of about 50-100 Euro a month.

20fps uses 70 watts, full speed is 200 watts. Times 5 hours, times 30 days, times €0.60.
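Spelled out (a quick sketch with those same numbers):

    # Savings from capping: 200 W uncapped vs 70 W capped, 5 h/day, 30 days, €0.60/kWh.
    saved_kwh = (200 - 70) / 1000 * 5 * 30  # 19.5 kWh per month
    print(saved_kwh * 0.60)                 # 11.7 -> roughly €12/month, within the 10-20 range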



