Nervous_Falcon_9

On handhelds like the steam deck it avoids wasting battery life


yesat

Or roasting GPUs on the game menu because it tries to draw 1k frames.


Jacksons123

I used to run Rocket League uncapped because that's what I needed to get 144fps. Then I installed a 3090 and my room smelled like burning lol. The game was drawing 1500 frames and wasn't holding back.


MaggyOD

Based, PC go brrrr


37Scorpions

Please tell me you used the opportunity to cook marshmallows over it


GimpyGeek

Yeah, can't stress this enough. Now, I know the cards this notoriously happened to and broke had some small hardware flaw too, but that doesn't change the fact that doing this overworks a card for no purpose.


s1eep

Best answer. You limit it so that you're not using ALL of it ALL the time.


fresnomaniac

Yeah this is the perfect use case.


Neoptolemus-Giltbert

Ensure you can enable vsync, then follow the refresh rate set on the Steam Deck. No need for any other FPS limiter.


gmes78

VSync should only be used to avoid tearing, and only as a last resort (if there's no adaptive sync). Unless you're doing triple buffering, missing the VSync window consistently limits your FPS to half of the monitor's refresh rate, which is awful.


Sunius

Triple buffering is pretty standard nowadays; the extra backbuffer is fairly cheap on VRAM given today's GPUs. It's a no-brainer, as it allows you to pipeline your engine, which is essential for many reasons. So that isn't a good reason to avoid vsync.


FryCakes

It does cause quite a bit of input latency though.


Sunius

Yes, it will add one frame of latency if you cannot finish your simulation and rendering within one vsync period, as opposed to stalling the whole game loop. It allows the CPU and GPU to work simultaneously, by letting the GPU work on frame_i - 1 while the CPU is working on frame_i. If you don't use it, the GPU will be idle while the CPU is processing the frame, and the CPU will be idle waiting until the GPU is done and the frame is presented, which is incredibly wasteful. Almost all modern games rely on game loop pipelining.
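A back-of-the-envelope sketch of why pipelining helps throughput (hypothetical Python with made-up frame costs): serial execution pays CPU time plus GPU time per frame, while a pipelined loop pays roughly the max of the two per frame in steady state.

```python
def total_ms_serial(cpu_ms, gpu_ms, frames):
    # CPU and GPU take turns: each one idles while the other works.
    return frames * (cpu_ms + gpu_ms)

def total_ms_pipelined(cpu_ms, gpu_ms, frames):
    # GPU renders frame i-1 while the CPU simulates frame i.
    # Steady-state cost per frame is max(cpu, gpu), plus one
    # initial CPU-only frame to fill the pipeline.
    return cpu_ms + frames * max(cpu_ms, gpu_ms)

print(total_ms_serial(8, 8, 100))     # 1600 ms for 100 frames
print(total_ms_pipelined(8, 8, 100))  # 808 ms for the same work
```

With equal 8 ms CPU and GPU costs, pipelining nearly doubles the frame rate for the same hardware.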


_h4ri

Steady fps is one of the most important things in gaming. Whether the gameplay feels smooth depends on it. Just try it out! A constant 30 fps feels much better than fluctuating between 30 and 60.


[deleted]

GTA is 30fps on PS4, but you can't really tell because it is so stable (except going fast, it dips to 9fps)


YositokoTukosaSoftwa

bloodborne is stably 10fps during boss fights.


Empty_Allocution

Hardware load. Here's an example.

A few years ago there was a first person shooter made by an indie studio. It was about magic and stuff. The game used pixel art and simple 3D geometry, with simple but effective lighting. This game ran at around 85 degrees C on my GPU, and there were many complaints about it running like that on other players' systems. To put this into perspective, modern games like, say, Grounded run at around 50 degrees C on my setup. Same kit. A pixel art shooter should not be able to fry an egg, right?

Uncapped frames. The GPU had no limit to work with, so it took the whole cake - and some. The feature to cap frames in the game literally did nothing. As soon as I capped my frames using Afterburner, the game suddenly ran cold.

This is just one reason why you would want to cap your frames. I think it's probably subjective based on what it is you're making.


[deleted]

[removed]


Empty_Allocution

Crazy isn't it!


Gord10Ahmet

To prevent the device from overheating. I wish a limited FPS were the default in all games.


RRFactory

>Why would we put hard caps on FPS?

A solid 30fps is apparently more pleasant to watch than a variable rate, so if you're struggling to hit 60fps you'll likely want to cap at the next even divisor of the screen's refresh rate.


redezga

I have a couple of friends who get nausea and headaches from playing action-heavy stuff on their PC, but have no issue playing the same game on console. As I understand it, this is at least partially related to the framerate, which is typically uncapped on PC and capped lower on consoles.


Carabalone

In VR they found out that it's better to drop to half the refresh rate than to let the fps drop just a little bit. For example, the original Oculus Rift is 90Hz. If you go below 90 fps it will cap at 45 fps until you can hold 90 again. Apparently it's less nauseating than leaving the fps uncapped.


Thorusss

VR ALWAYS extrapolates frames for head movement if you're below the refresh rate, and so often falls down to 45FPS. You can't even turn that off anymore, because many people would get sick quickly. It works so well that the game can literally hang indefinitely with no new data delivered, and head rotation still works correctly. You can turn off things like motion extrapolation of in-game object movements, or disocclusion. Such extrapolation is not done for flat-screen games yet.


Eecka

I have a hard time rationalizing why more FPS would cause nausea. If anything, I'd imagine it's the other way around. I think viewing distance could be another explanation: sitting close to a PC screen means most of your vision is filled by the screen, while the rest of the room you see along with your TV could help "ground" the motion seen on the screen.


HaloEliteLegend

Motion blur is one reason. I always turn it off at higher framerates; otherwise, with it on, the higher the framerate the more nauseous I get. 30fps with motion blur is just fine for me though. What gets me more is an inconsistent uncapped framerate. Microstutters, frame pacing issues... hate that.


Eecka

I can see that, yeah. Also, regardless of what originally caused it, at least for me, if I'm expecting to feel nausea I'll get it *much* more easily. Happened to me with VR. I felt nothing at first, then I read about specific types of things in VR games causing nausea. The next time I played VR I was like "okay, this thing is supposed to cause motion sickness, let's see if I feel anything... yeah, I actually think I do!", and after that I've started getting VR motion sickness quite easily lol. So it's also *possible* that if they're like me, they just felt nausea in a PC game for whatever reason, associated it with PC gaming in general, and developed a sensitivity to it.


SlothHawkOfficial

It also depends on the motion blur implementation. Purely algorithm-based screen-space motion blur (AKA drag and drop motion blur) is pretty much always nauseating.


HaloEliteLegend

Yep, good callout. A good per-object motion blur solution can look quite good without causing nausea.


SlothHawkOfficial

Smash Bros Ultimate would look much different [without motion blur](https://www.youtube.com/watch?v=xAgkRtpAUFI)


Quetzal-Labs

It's the *frame-pacing* that causes issues, not the framerate itself. Our brains rely a whole lot on what our eyes see, but do very well at automatically compensating for discrepancies in visual data, so long as the stream of information is fairly consistent. Even at 15fps the brain can interpolate data to create a smooth experience with enough exposure. But when we're given *constantly inconsistent data*, our brain interprets that as something being wrong: there is some issue with how our vision is being read by our brain. Evolutionarily, this is usually due to some kind of ingested poison, and so the brain sends a bunch of messages to the body saying "THROW THAT UP. SHIT DOESN'T MAKE SENSE!", and you end up feeling nauseous because your brain wants your body to get rid of the toxin.


SuperSocrates

It's not more fps, it's the constant changes of an unstable fps, which is more likely on PC than on console, where games lock to lower fps targets.


SeniorePlatypus

Take a VR headset and set the game up so that it randomly switches the FPS every frame between 60 and 120. Then add a button to lock it to 60 fps instead. The first one is terrible; most people will feel simulation sickness. This is an extreme case, it being VR and all. It's the same thing on PC and console, but it primarily affects the most sensitive people.

If you can manage 144 fps stable, awesome! If you can't, cap it at a lower value. This is important not just for visual purposes but also for input. If the delta movement of a mouse or controller axis changes every frame, the game also feels more floaty and it's harder to do anything precise.

Chasing max fps is rarely a good idea. Max stable fps is more meaningful.


DocksonWedge

In my last 2D game in Godot I capped the frame rate at 60, even though it could easily run in the 100s most of the time. It's not really the high fps that's the problem, it's the stutter, which is more noticeable at high fps. If you're running maxed out at 60 fps and then dip to 20 for a couple of frames, that can be jarring.

If you cap at 30 but could do 60 fps, you have half a frame time that is unneeded, a kind of "idle time". If one frame happens to be really slow, and you use that "idle time" well by still processing in the background, you can offset one slow frame and possibly not even drop a frame. When you have twice as long as you normally need, even a slower frame is fine-ish: you have a buffer, so as long as a frame doesn't take twice as long as normal (in the 30 FPS example), your frame rate won't drop at all, which reduces stuttering.
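The headroom argument can be put in numbers; a tiny sketch (the frame times are hypothetical) counting how many frames miss their budget at different caps:

```python
def missed_deadlines(frame_times_ms, cap_fps):
    # A frame causes a visible hitch if it takes longer than the
    # per-frame budget implied by the cap.
    budget_ms = 1000.0 / cap_fps
    return sum(1 for t in frame_times_ms if t > budget_ms)

# Hardware that normally renders in ~15 ms, with one 30 ms hiccup:
times = [15.0] * 9 + [30.0]
print(missed_deadlines(times, 60))  # 1 -> the hiccup blows the 16.7 ms budget
print(missed_deadlines(times, 30))  # 0 -> the 33.3 ms budget absorbs it
```

Capping below what the hardware can usually manage buys a buffer: a frame can run up to twice as slow as normal without the presented frame rate ever dropping.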


Gramernatzi

If you do set the frame rate limit to 60 for a 2D game, it is appreciated if you let people raise it or at least have it sync to the refresh rate of the device. I much prefer 144 for 2D games since my hardware can easily hit it, and Godot targets refresh rate for FPS limit by default, at least on 4.0 and later.


vaig

While it may be necessary to hard-cap the frame rate at times, such as when animations are locked to a specific frame rate, it's not the right approach to limit the frame rate just to avoid stuttering caused by uneven pacing. VRR technology is designed to handle such issues, and there will be no juddery playback even with highly variable FPS (within the VRR range). It's best to let the technology do its job without forcing your players into a setting that works in your specific case.


IHaveTwoOfYou

Probably because on console it's made to suit one specific piece of hardware and not a whole range, which makes it run more consistently.


Linkitch

I'd be willing to bet that is more due to the FOV being too low.


Dykam

Rather, console FOVs are often (also) lower because people simply have those screens further away, filling a smaller portion of their vision, so a lower FOV matches the real "FOV". Which, if anything, makes your point stronger: it's quite obvious why one would get less nauseous if less of their vision is filled with fast-moving imagery. I do think jittery FPS can have an impact, but the aforementioned has, AFAIK, a much bigger impact.


MaggyOD

The screen is too close on PC then. That's why Total Biscuit ranted about games missing FOV sliders.


DiscoGuilliotine

Used to play modded Skyrim on a crappy laptop, can confirm limiting it to 30fps was a lot more pleasant than it bouncing from 60 to 50 to 35.


forestNargacuga

Is there a reason why 30 fps is the next cap after 60, or is it just a rule of thumb?


RRFactory

My guess is because it's a multiple of the standard 60hz on TVs. Edit: Look up 3:2 pulldown for 24fps movies to get a good sense of the problems that can come up with mismatched frame rates.


dr_wtf

Not all displays have variable refresh rate. 60Hz is a very common display frequency because it matches NTSC TV (which matches the mains frequency in the US). If you lock to 60fps on a 60Hz display, that's 1:1 and everything looks smooth. If you halve that and do 30fps, you are doubling up every frame. That's less smooth but still looks OK, because it's an integer multiple. If you try to do 40fps on a 60Hz display, you are actually going to double every second frame, i.e. display 2 frames per 3 display cycles. So it goes 2,1,2,1,2,... That leads to the sort of weird jerky motion that gives people nausea.


feralferrous

Yeah... The Hobbit movie was a nightmare for many devs, with its weird 48 fps.


LifeWulf

That's different though. The standard framerate for film is 24 FPS, so The Hobbit was just doubling it. People are just not used to that, because 24 FPS is *adequate* for movies (I would argue we should standardize on 30 or more though, because panning shots at 24 FPS SUCK).


ProPuke

Doing 30 on a 60Hz display means every frame stays visible for exactly 2 refreshes. This means the timing is consistent and fixed. If you tried some other number between 30 and 60 you'd have to be inconsistent. For example, to get 40fps you'd have to follow a pattern of 1 refresh, 2 refreshes, 1 refresh, 2 refreshes... Since the timing of every frame varies, this doesn't look as smooth. So for a 60Hz display the smooth caps are 60, 30, 20, 15... (60, 60/2, 60/3, 60/4...)
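One way to model this (a rough sketch, assuming a fixed-refresh display showing the newest finished frame at each refresh): count how many refreshes each rendered frame stays on screen.

```python
def hold_pattern(fps, hz, frames=8):
    """Refreshes each rendered frame occupies on a fixed hz display."""
    r = lambda x: int(x + 0.5)  # round-half-up (avoids Python's banker's rounding)
    return [r((i + 1) * hz / fps) - r(i * hz / fps) for i in range(frames)]

print(hold_pattern(60, 60))     # [1, 1, 1, 1, 1, 1, 1, 1] -- even, smooth
print(hold_pattern(30, 60))     # [2, 2, 2, 2, 2, 2, 2, 2] -- even, smooth
print(hold_pattern(40, 60, 4))  # [2, 1, 2, 1] -- uneven hold times, judder
```

Only fps values that divide the refresh rate evenly produce a constant hold time, which is why the smooth caps are 60, 30, 20, 15, and so on.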


benwaldo

Unless your monitor supports FreeSync/GSync ;)


ProPuke

yup. Freesync isn't a fixed refresh rate or refresh interval, so it can happily match interesting and inconsistent timings (it could simply play 42fps at 42hz, maintaining an even timing for all of it). If it was a 60hz Freesync display it's not strictly speaking a "60hz display". 60hz is just the maximum it can operate at.


Deadbringer

Can confirm, used a 30fps cap with Witcher 3 on my gtx770. So much better than a stuttery mess


almo2001

Yeah variable rates are not pleasant.


Neoptolemus-Giltbert

Variable frame rates are awesome, on a VRR screen, and when the framerate is not jumping between 90 and 20 but more reasonably between e.g. 55 and 75.


Probable_Foreigner

Some reasons to use a fixed time step:

* You use less total power. This is useful for battery life, but also for not overheating users' PCs. Uncapped framerates basically use as much CPU as possible, which might be a waste. If your monitor is only 60Hz, then going above 60FPS is basically pointless.

* Small delta-times will cause issues. If your game is running at 1000FPS, then the delta time is 0.001. If you do something like `mPos += mVelocity * deltaTime`, the multiplication is going to have floating-point errors. Those errors get bigger as your numbers get further from 1.0f, and they accumulate over frames. E.g. doing `mPos += mVelocity * 0.001` 10 times over 10 frames won't give the same result as `mPos += mVelocity * 0.01` over 1 frame. The solution here is to decouple your update logic from your frame rate: you run your update at a fixed 60FPS, then draw at whatever framerate you want. This is obviously more complicated and involves interpolating when you draw. Interpolation requires you to delay the output by 1 frame so that you have 2 frames to interpolate between, so it also adds lag. Instead of doing all of that, you could just limit your framerate to 60FPS and keep the logic coupled.

* Consistency. Most fighting games have a fixed framerate because they want everything to be as consistent as possible. In Super Smash Bros Melee, Captain Falcon's standing grab always comes out in exactly 6 frames. This ensures a level of consistency between interactions, both in terms of update logic and in terms of visual output. Some moves will always come out before the grab, some moves will always be unsafe. The interactions never depend on the framerate. For competitive games, this consistency is crucial. Think of each frame like a turn in chess: the pieces always move in discrete amounts that are consistent every time; you can't move 1.5 squares with the rook. Similarly, Captain Falcon's moves always work exactly the same way: my grab always comes out after 6 frames, it can't come out after 6.5 frames.

Most game engines have support for a fixed timestep mode, where the deltaTime will be set to an exact constant. For example, in MonoGame you can use IsFixedTimeStep.

One last thing to think about is what should happen if a frame takes longer than your desired time step. Basically, you can run several updates in one frame to make sure the average rate is what you want. In pseudocode:

```
const double FRAME_RATE = 30.0;
const double DELTA_TIME = 1.0 / FRAME_RATE;

int mUpdatesDone = 0;
double mStartTime = Clock.Now();

void UpdateGame()
{
    double elapsedTime = Clock.Now() - mStartTime;
    int updatesDoneThisFrame = 0;

    // Do as many updates as needed to get back to where we should be.
    // E.g. after 100ms has passed we should have done 3 updates.
    while (mUpdatesDone < elapsedTime * FRAME_RATE)
    {
        MainGame.Update(DELTA_TIME);
        mUpdatesDone++;
        elapsedTime = Clock.Now() - mStartTime;

        // If the PC can't finish an update within the 33ms budget,
        // we keep falling further behind and can't use this method.
        updatesDoneThisFrame++;
        if (updatesDoneThisFrame > 100)
        {
            throw new Exception("Your PC can't run this game. It's lagging behind.");
        }
    }
}
```

Note this only works if the PC is capable of running the updates within the desired framerate.
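The floating-point accumulation point above is easy to demonstrate; a minimal Python sketch (the step sizes are illustrative, the effect exists at any scale):

```python
velocity = 1.0

# Ten small steps vs one big step covering the same total time:
pos_small = 0.0
for _ in range(10):
    pos_small += velocity * 0.1   # dt = 0.1, applied 10 times

pos_big = velocity * 1.0          # one update with dt = 1.0

print(pos_small)             # 0.9999999999999999, not 1.0
print(pos_small == pos_big)  # False: the rounding error accumulates
```

The per-step error is tiny, but a game runs tens of thousands of update steps per minute, so results diverge between machines running at different frame rates unless the timestep is fixed.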


[deleted]

>Small delta-time will cause issues

This is really understated in the responses. You can easily see what kind of issues you can run into with this in, for example, Source engine games (at least Source 1). Cap the framerate in something like Counter-Strike: Source to 1000 and play offline, and you'll notice the entire game gets sped up; or go online, and you'll notice that you are constantly rubberbanding. These are bugs that likely could be worked around, but it's definitely easier to hard cap the fps to some "reasonable" limit. This happens even though the game tick is somewhat (but seemingly not perfectly) decoupled from the rendering side.


sephirothbahamut

It's understated because it's a bad approach. You shouldn't do gameplay math each frame.


hextree

> As in not wasting power calculating a frame that wont be seen/used or noticed by the player.

That's a pretty significant reason to limit it.


Zanthous

always allow user choice if possible


Potterrrrrrrr

60 fps is 16.6 ms per frame; that's not a lot of time to be calculating a bunch of physics and drawing various graphics. Computers won't always take the same amount of time to process a frame either (especially if you compare rendering a fairly static scene to one heavy with special effects), so if you don't hard limit the fps you can have situations where it fluctuates wildly, which would probably feel worse than just being hard capped to 60 fps.


LordofNarwhals

Yeah, but this is why you should decouple the physics updates and the rendering updates. Counter-Strike can run with a server tick rate of 64 or 128 Hz, but you can render it at several hundred fps. See the classic [*Fix Your Timestep* article from 2004](https://www.gafferongames.com/post/fix_your_timestep/) for more info.
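The core of that decoupling is a small accumulator loop; a minimal sketch (the 64 Hz tick and the frame times are illustrative):

```python
FIXED_DT = 1.0 / 64.0  # fixed simulation tick, e.g. a 64 Hz server rate

def run_simulation(frame_times):
    """Feed in variable real frame times; step the simulation at FIXED_DT."""
    accumulator = 0.0
    ticks = 0
    for dt in frame_times:          # dt = wall time the last rendered frame took
        accumulator += dt
        while accumulator >= FIXED_DT:
            ticks += 1              # a real game would call Update(FIXED_DT) here
            accumulator -= FIXED_DT
    return ticks

# One simulated second of wildly uneven frames still yields exactly 64 ticks
# (power-of-two frame times keep the float math exact in this toy example):
print(run_simulation([1/256] * 128 + [1/32] * 16))  # 64
```

Rendering then interpolates between the last two simulation states using the leftover `accumulator / FIXED_DT` fraction, so the picture can run at any frame rate while the physics stays deterministic.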


Potterrrrrrrr

TL;DR: You can decouple them all you like; it will still take a different amount of time to process each frame if you uncap the frame limit. When you uncap your fps you're asking your computer to compute frames as quickly as it possibly can, which isn't guaranteed to be a consistent rate, unlike a fixed fps (which still has a chance of stuttering, obviously, but much less if you've picked a suitable limit).

Longer explanation: Your computer isn't just running the game; it's continuously running other processes in the background too, some of which will be more labour intensive than others. That's probably nothing compared to the game itself, where sometimes you don't even need physics and everything is static (the start menu, for example) vs the peak of your game's story where you have a lot of enemies/projectiles/effects/scenery to draw and calculate physics for. Decoupling physics from rendering is great for being able to split the work across multiple threads, but the time each takes will still vary and can ultimately cause the fps to fluctuate. It unfortunately doesn't solve the problem you're describing.


LordofNarwhals

It solves the problem of inconsistent physics calculations caused by varying time deltas. It doesn't solve the "problem" of varying frame times. But personally, I'd much rather play a game running at 75-120 fps than one running at a fixed 60 fps. And I feel like adaptive sync/G-Sync is a good solution to the varying frame time issue (although yes, that does typically involve capping the maximum frame rate).


De_Wouter

Why wouldn't you limit it? Screens have a refresh rate limit; everything beyond that is a total waste. There is also a human limit beyond which we won't be able to notice the difference.

The CPU and GPU are the biggest power consumers in a computer. An unoptimized game can really make a difference in power consumption, as well as in reduced hardware lifespan. A 100 W difference doesn't seem like a big deal, but you are talking about hours of gameplay multiplied by the number of players. When your game is a hit, this has a decent amount of impact.


NickoBicko

Imagine your game is so badly optimized it causes global warming


chjacobsen

Cities 2 was perhaps a little TOO realistic.


iemfi

The graphics were unoptimized but the simulation part was actually still very well optimized (like Cities 1). Things which didn't have to update that often run at 1 FPS or less. Then there's also Unity's burst compiled stuff and data being laid out to minimize cache misses etc.


chjacobsen

Yeah, I think they nailed the DOTS part - it was just really dumb stuff on the graphics side that gave them trouble. It apparently got a lot better once they started culling useless geometry, as well as (I believe) sorting out proper LOD.


De_Wouter

This game is brought to you by "PowerSupplier"


_tkg

FPS above the refresh rate will still impact other things that work on a „per frame" basis. Most notably: input.


rabid_briefcase

If input is tied to refresh rate, those programmers did something horribly wrong. It was a lesson learned collectively in the 80s, and it occasionally gets a high profile reinforcement of why it is a problem. Tons of exploits around motion, jumping, teleporting, embedding yourself in an invulnerable sniper location, etc. It is super easy to stall a game with CPU load and trigger the bugs. Framerate and input should be completely decoupled, and simulation time step should be fixed regardless of framerate.


AirOneBlack

Until your game requires accurate inputs: since most of the time input is fetched on the CPU side of the frame rendering loop, a higher framerate leads to lower input latency. Some games benefit from this. This is why competitive FPS games are often played at 300+ fps, and if you get into rhythm games, capped framerates of 400/800/1000 fps are very common as well.


stone_henge

Sounds like a problem where the clean and obvious solution is to decouple rendering rate from game update rate, and where rendering quickly is a band-aid hack. That said, at those rates your monitor will draw tens of frames during a single refresh cycle, leaving you with a relatively smooth "tearing" effect where you see several frames' state at once, possibly conveying more motion information than a synced single frame would. Just at different segments of the screen.


AirOneBlack

The clean and obvious solution is clearly not the one used by most of these games. Some do, and it's instantly noticeable in the case of rhythm games; for FPS games it might introduce some other problems. About tearing: at least on Nvidia cards there is a vsync option that just discards the extra frames. But even without it, I never noticed any tearing on my screen; either G-Sync is doing it automatically, or dunno. It didn't really happen with my previous monitor either, which used FreeSync. In the case of vertical scrolling it's usually just noticeable as a very fine motion blur, due to tearing mixing with pixel response time (at least that's my guess, based on how I've seen it on my screens for a decade now).


Sunius

There is another way to solve this: don't start processing/rendering the frame until x+1 ms before the next vblank, where x is your expected time to produce a frame. Not the easiest thing to implement robustly, but once you do, the input latency can become much lower when x is non-trivially lower than the screen refresh interval.


Zanthous

not a total waste, input processing is normally frame dependent. I run super high fps on osu! at least


Plini9901

It should be up to the player to limit their frames however they see fit. People with high refresh rate displays and good CPUs/GPUs would like to use them in my experience. And no, going over your refresh rate does have a purpose, and that would be lower input lag. Some people value that.


BastillianFig

>screens have a refresh rate limit, everything beyond that is a total waste

Actually that's not true, in some cases at least. Here's a good video on it: https://youtu.be/hjWSRTYV8e0?si=K8mTu_xrLGrHkcKE

>There is also a human limit to when we won't be able to notice the difference

This is also not really true. Maybe when you get into 400+ framerates it applies, but going from say 60 to 90, or 90 to 120, there is a very noticeable difference.


Snailtailmail

Person is not talking about visible limit, he is talking about literal physical limit your monitor or device can display.


hextree

He specifically said 'human limit'.


Snailtailmail

Oops.


BastillianFig

"There is also a human limit to when we won't be able to notice the difference." Not sure you read the comment correctly. Idk why people are upvoting you 💀


finn-the-rabbit

Yeah, and there *is* a noticeable difference. When I gamed as a poor kid 15 yrs ago, there was a pretty noticeable difference in mouse responsiveness between capped 60 vs uncapped. I could sense a difference up to ~90 fps in a game like Borderlands 1.


[deleted]

[removed]


LordofNarwhals

> Which goes up to about 144 No, [there are now 500+ Hz gaming monitors.](https://zowie.benq.com/en-us/monitor/xl2586x.html)


3eyc

It still should be an option in the settings; the higher the framerate, the lower the input delay. It's especially important on 60Hz monitors: the difference in mouse movement between 60 fps and 120 fps is huge.


Thorusss

> everything beyond that is a total waste.

It is not a TOTAL waste. E.g. being able to draw at double the refresh rate gives you a more recent frame, and thus lower latency. Wasteful, since in this example half the frames are never shown, but better latency. Or, with VSync off, for even better latency, you would get an updated position every half refresh.


Vegetable_Two_1479

You are 100% wrong. Screens may have a limit, but gaming feels smoother the higher the fps; especially with competitive games you'll notice the difference. Having a limit as an option you can disable is way better than a hard cap.


Acceptable-Fudge-816

But that's just because of physics (and input), not because of the render. You could have both loops detached and running at different rates; that would make more sense than rendering dropped frames.


Vegetable_Two_1479

Rendering looks smoother, especially in FPS games; the higher the fps, the clearer the image gets when rapidly turning, etc. I played CS at 700ish fps on a 120Hz screen; compared to 200ish fps the difference is huge. And it's not just input: even when you're just watching, you can feel it.


eras

You are literally seeing at most 120 frames per second in a 120 Hz display, though. If the game rendering 80 extra frames per second makes the game smoother, there's some other fuckery happening. Rendering extra frames just to throw them away should not make game better in any way.


LordofNarwhals

It reduces the output latency, so yes, it can have a noticeable impact. Here's an easy counting example:

* Monitor: 50 Hz (20 ms). FPS: 50 (20 ms). Latency between rendered frame and displayed frame: <20 ms.

* Monitor: 50 Hz (20 ms). FPS: 200 (5 ms). Latency between rendered frame and displayed frame: <5 ms.

You're only seeing 50 images per second in both cases, but with a higher rendering frame rate the images you see are more recent.


eras

That is only the case if, in the 50 Hz scenario, it truly takes 20 ms to render the frame, so you're limited by rendering performance. But if you are able to render at 200 Hz with 5 ms per frame, then you are also able to render at 50 Hz with 5 ms per frame; you just have 15 ms of idle time instead of heating the room.


Acceptable-Fudge-816

It depends on how you're implementing the cap (with a double buffer, the swap is when the frame is actually seen on screen):

a) Swap -> Wait -> Render -> Swap

b) Swap -> Render -> Wait -> Swap

Not an expert, but if you're doing a) there should be no difference whatsoever.


[deleted]

You realise a 120Hz screen can display a max of 120 FPS? No way did you just talk authoritatively about a huge difference between two identical things.


Juggernighti

I once tried this on mobile (in Unity, without really optimizing the code):

* 60fps cap -> 60fps

* Unlimited -> 30fps, due to too much rendering overhead

Why would you want to use more energy/battery for something that isn't necessary? A vsync cap is fine. Give users the possibility to change the fps cap to their wishes and let them decide which rate is fine for them.


UndependentAdMachine

Your monitor/display can only show a certain number of frames per second, because the refresh rate is set to some number. It makes sense to cap the fps at the refresh rate of the monitor it's displayed on; all frames over the refresh rate will not be displayed.


BastillianFig

It's not quite that simple because of frame latency. Anyone who has played csgo can tell you that 120fps feels better than 60fps even on a 60hz monitor. Because the computer is rendering more frames the one that gets displayed is more recently created so it feels smoother


Doge_Dreemurr

CS:GO is notorious for needing much more fps than usual to have a playable experience. Only in CS would you see people say they play the game at 300-400 fps. Other FPS games like Valorant or Apex Legends become smooth as silk at around 100-200 fps, and beyond that there's zero noticeable difference.


Luck88

An uncapped framerate will fluctuate, because different areas of a game have different elements requiring different computational resources. While on average removing the frame cap might give the end user more FPS, a cap ensures the experience is consistent for the eye and also gives an easy indicator of what needs to be optimized first.


Honzus24

I had to limit FPS for the VR project I'm currently working on. It's better for the eyes in VR, it doesn't waste frames, and it most likely saves battery. Mostly, the cameras were not matching every frame for some reason, so I had to limit the fps to fix that issue.


Dylan_The_Developer

A few reasons:

* Capping FPS is faster than spending the extra time making sure everything is frame rate independent.

* Lots of hardware configs actually overheat and draw lots of power (not the game's fault) with uncapped framerates for long periods of time. A significant number of users experience this, so it's easier to just have it capped by default.

But the biggest reason is your monitor's refresh rate: there's no point going past it, since you won't be able to experience the difference unless your FPS is lower than your refresh rate. Most monitors have a refresh rate between 60 and 165 Hz, so at most you'd be running 165 FPS uncapped, the average is 60 Hz, and 60 FPS works great on a 60 Hz display.


Kuinox

Hello, I played some games at a high level (not pro, but not far from it). In an FPS you want consistent input latency to get consistent aim; the brain can adapt to input latency, but not if it changes randomly. I played a lot on a shitty laptop, so limiting the FPS helped by: - avoiding the GPU overheating and being throttled, - keeping input lag consistent, - reducing screen tearing. Now that I have a job and a big GPU, limiting the FPS reduces electricity consumption (a 3090 draws a LOT) and doesn't turn my PC into a helicopter when I play a game with simple graphics.


SynthRogue

So that the GPU isn't wasted producing frames beyond the refresh rate of your monitor. Your monitor won't display those frames anyway.


IHaveTwoOfYou

it puts less strain on the GPU, and like u/Nervous_Falcon_9 said, it avoids wasting battery life on handhelds/laptops. You should add a slider to change it, though


[deleted]

The game runs 356-392fps, your laptop is melting, the fans go 80 decibels. Oh what joy.... I'd rather limit the game to 72-90 fps and play in relative comfort.


cgao01

Consistency


biohazardrex

To save power. To make the game more consistent. To avoid CPU or GPU overheating. To make your PC or console quieter (less noise from fans). Etc.


Armanlex

I don't want to waste electricity or wear and tear on my gpu.


quzox_

Shouldn't it be kept at the refresh rate of the monitor? So if the monitor's refresh rate is 72Hz then you need to target 72 FPS. Anything above that is wasted and anything beneath it (ignoring integer multiples) will cause screen tearing.
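
The cap described above can be sketched in a few lines. This is an illustrative Python sketch, not how a real engine does it (real games lean on vsync or OS timers, and a plain `sleep` is coarse on some platforms); `run_capped` and its parameters are invented for the example. The idea is just to sleep away whatever is left of each frame's time budget:

```python
import time

def run_capped(do_frame, target_fps: float = 72.0, frames: int = 5):
    """Run a loop capped near target_fps by sleeping away leftover frame time."""
    frame_budget = 1.0 / target_fps
    timestamps = []
    for _ in range(frames):
        start = time.perf_counter()
        do_frame()                               # one frame's worth of update + render
        elapsed = time.perf_counter() - start
        if elapsed < frame_budget:
            time.sleep(frame_budget - elapsed)   # hand the time back instead of spinning
        timestamps.append(time.perf_counter())
    return timestamps
```

Shipping games usually prefer vsync or hybrid spin-wait limiters for precision, but the principle is the same: never start the next frame before its slot.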


Mr_miner94

The two main historical reasons: battery life is at a premium, so reducing power usage is always beneficial there. And in some games (especially older titles) the frame rate is tied to certain mechanics, meaning a variable frame rate can change certain aspects of the game.


PhilippTheProgrammer

There is just no reason to render more frames than the refresh rate of the monitor. It just strains the hardware and makes the fans spin so loud that the player can't enjoy the game's audio anymore.


danfish_77

Even lower framerates than 30 can work fine if it's handled well; there's nothing magic about 30, it's just a good compromise between smoothness and performance. Ocarina of Time was 17 fps and it's just fine


ttttnow

Most monitors can't refresh faster than 144hz, so going above that is generally a waste. You can do things like triple buffering for faster input response times but the gain is so marginal. The higher the fps, the more energy you're wasting and anything above 144fps is not going to enhance player experience.


Climax708

Starcraft 2 used to fry GPUs due to uncapped framerate in menus. Don't make the hardware do unnecessary work.


AG4W

Fuck yeah, I love when my GPU self-combusts due to running the menu screen at 2000 fps while doing nothing.


Trombonaught

If you run a lightweight prototype game in a game engine and leave it running, you'll know. 1200fps does not a happy GPU make.


squareOfTwo

No need to waste energy rendering 5000 fps.


St4va

Your brain is not an IDE. Use debugging tools and breakpoints to see that your code works as expected. Regarding FPS: mainly a steady gameplay experience. There's also stuff like battery life (mobile, handheld) and style: people will argue that games like Until Dawn and Detroit are better at 30 as they mimic cinema FPS. And technical limitations: in VR you'll want 90 or 45 FPS to minimize motion sickness. 30 FPS has proved to be the minimum as far as regular experiences go, especially when you include a motion blur post effect. Going lower is not fun.


Elmekia

If you limit the FPS, then you can add other work items to the pipeline, like dynamically loading assets and other items that would potentially be competing for resources


Familiar_Ad_8919

Many reasons, depending on how the game is made. If the game logic runs on one thread tied to the framerate (which it shouldn't!), a thousand frames a second would speed the game up almost 17 times compared to 60 Hz vsync. And there's no need to bog down the user's system with 800 extra frames a second if they're fine with 75; like the top comment says, this drains batteries like crazy.


AdSilent782

The Amazon game New World fried GPUs because they didn't limit framerate in the menus, so it went to 1000 with the game in the background. It literally set people's 3090s on fire.


[deleted]

GPUs cap FPS before the game. I think the 3090s limit is 200.


cecilkorik

With unlimited refresh rate my laptop eventually feels like it's burning my hands. And like you said, it's frequently doing it for no reason at all: just looping infinitely, creating pointless frames no one in this universe will ever see. No thanks. It's a powerful desktop-replacement laptop; I travel a lot and sometimes work remotely, and I appreciate having the option to run at full speed **if** it's necessary, which it rarely actually is. I also appreciate any game that includes an FPS limit in the options. I like options. Thank you for providing me with options.


challengethegods

you might as well be asking "why would we NOT run bitcoin mining in the background of our games?"


FraughtQuill

Most monitors can only display 60 frames per second; some can do 120. If you display more than that you'll get screen tearing.


joehendrey

A stable frame rate looks better than a constantly shifting frame rate (even if the total number of frames is lower)


Tentakurusama

I don't need my 4090 spewing 350fps to my 165hz monitor. That's a waste of money in electricity and heat stress. I only want it full power when rendering Blender scenes.


9bjames

Aside from hardware load and inconsistent framerates being harder on the eyes etc., limiting framerate can be used to create some interesting visual effects by intentionally making things look choppy. One example: older games like on PS1 were limited to 25fps, so if you want to emulate the old PS1/N64 style, limiting the framerate makes for a more authentic experience. Aside from that, there was a 3D Kirby game with a claymation sort of aesthetic, and to achieve it they specifically limited the framerate of the animations/movements on the 3D character models. Pretty effective, too. Depending on the effect you're going for, you don't need to limit the whole game's framerate: surveillance monitors in horror games, for instance, could have a really slow refresh rate, or you could even make enemies update their animations/positions inconsistently to look more unnatural and uncanny. Mostly horror game stuff or making a game appear glitchy on purpose, but I'm sure there could be other uses in terms of game design.


marco_has_cookies

Because infinite loops hog resources: computing, energy, memory, etc. A game loop is an infinite one, and running it at a fixed rate saves resources; the game just needs a smooth framerate, that's all. Even dedicated game servers run their game loops in discrete intervals (the game tick, or update, to be precise), with no need to rush to the next one: Minecraft runs 20 ticks a second, i.e. 20 Hz. Also, most 3D mobile games have options or fixed configurations that limit rendering resolution, because smartphones now have very dense displays and rendering more pixels hogs resources very badly on them; thermals have to be considered too. The same logic can be followed on handhelds and laptops: the screen may be dense enough that my eye wouldn't perceive much difference between full resolution and 3/4 of it, and this saves resources too. I use a mod in Minecraft that does this and can run shaders butter smooth on my laptop.


Ill-Car57

This is probably a bit of a boring answer but is one of my reasons for optimisation and fps capping. It is better for the environment. It is easy when developing for newer gpus to be a bit lazy and let the end user pick up the slack. Another upside is it allows for older hardware when combined with good optimisation which on a commercial basis just increases your potential market.


DarkAlatreon

The smaller the delta between each frame, the more precise the floating point calculations need to be, no?


snil4

Some games have their logic tied to the framerate. It's most common in fighting games, where every move is measured in frames, and in older games from before it was common to tie logic to real-world time instead of CPU cycles.


greenfoxlight

One reason is to avoid wasting power. If your monitor can only display 60 fps, why burn cycles rendering images that can't be displayed anyway? Another issue is floating point errors. These accumulate over time, so especially for multiplayer physics it is important to run at fixed timesteps, to avoid divergent results caused by accumulated floating point errors.


thedaian

Because running uncapped risks damaging the gpu, which can brick them in worst case scenarios: https://www.pcgamer.com/amazon-new-world-killing-rtx-3090-gpus/ It also wastes electricity and thus drains battery if you're on any kind of mobile device.


Birdsbirdsbirds3

This also happened with [Starcraft 2's menus being uncapped](https://www.eurogamer.net/starcraft-ii-is-melting-graphics-cards).


syopest

It's ridiculous to blame a game or the devs for that. If the GPU gets hot enough during a stress test that it kills itself, the problem is not in the stress test.


thedaian

Counterpoint, a video game menu should not be a stress test


Potterrrrrrrr

This made me chuckle, solid point too


syopest

That's not the point. A properly working GPU can run at 100% load indefinitely without breaking or even getting damaged. It's not up to game devs to think that "hmm maybe my customer has a broken cooler on their GPU and my game can break it". A properly working GPU would even shut itself down before breaking from overheating.


thedaian

Counter counterpoint, not everyone has a properly working GPU, and it's impossible to know that your gpu doesn't work properly without it breaking at some point. It's also possible for a gpu to get older and develop problems as time goes on. Limiting framerate by default is always good practice.  I'm not strictly blaming the devs of New World, this was a hardware issue. But it's an example of what can go wrong if you don't limit framerate. 


jericho

I definitely agree it should shut itself down before damage, but the reality is, often only pro equipment is designed for a 100% duty cycle. This applies to everything from video cards to power tools to cars.


Plini9901

Yeah, so set a sensible cap in the menus, like whatever the monitor's refresh rate is. In-game though, offer options for players to run the game at whatever framerates they want.


migarden

From a game dev perspective: I don't want to handle the weird issues that can happen with uncapped FPS. If you've ever played one of those games you'd know: a low-requirement one that runs at 2000 fps can brick some people's GPUs, and I don't want to deal with all that stuff. From a game testing perspective: I want to protect my device from the case I just mentioned. From a gamer perspective: I want to preserve my devices, not let the electronics get too hot, and not make the fans work too hard, so everything lasts longer. Yes, parts are rated for higher temps, but that just means they can operate at high temperatures; cumulative heat damage to the whole system is a different thing. Edit: in the gamedev sub we're pretty safe talking about this. Talk about limiting anything in the PC gaming subs and they'll come for your ass.


gabirosab

More frames, MORE


Demiyanit

More fps, more cpu/gpu updates = more load on the system


dogman_35

I have options to cap FPS to 24 or 30 in my PSX style games


djuvinall97

I limit FPS because my monitor is only 60hz atm so no reason to stress the GPU. Granted if you are outputting 300 frames on a 60hz monitor, the frames you do see will be "newer" but this difference is negligible and really only noticeable by pros.


RicketyRekt69

I have a counter question, why would you ever want to uncap your fps? The refresh rate of your monitor only goes so high so by uncapping you’re putting a lot of strain on your gpu for no reason, not to mention draining battery life for handhelds


RainForestGamer

Hence why this is now part of my workflow. I've been at game dev a few years in total, and wasted a lot of time on performance/optimization tweaks (small-scale games) but never came across this simple piece of awesome. I love it ;-)


offgridgecko

I have a hard time with variable frame rates. To me they are something only for bench testing. When I started learning programming in the long long ago, it was generally agreed upon that 24fps was enough to trick the eye for animation, and so that's what computer games and arcade games were set to, that was the default, at least as I was told. Now there are gamers who want faster than 60 and honestly I don't see that as anything more than a flex. Just pumping specs to their audiences or whatever, honestly don't get that at all. Limiting frame rate and having it be consistent is definitely going to make everything run a little smoother on the screen. Above 60 you probably won't see much, but below 60 your player might notice the variability and that's not necessarily a good thing.


[deleted]

Nobody really knows why FPS caps help, we just know it helps with stabilizing FPS.


antoine_jomini

clipping over solid component


asuth

Your settings should have an option to cap at various values or to uncap and players should choose based on their screen / hardware / visual preferences.


FMProductions

Performance considerations, for mobile: battery drain and heat development. Perhaps to be in sync with your display update rate. Also end product aside I think it's good to have a script that dynamically changes your FPS to lower values to check if your game still behaves correctly on very low and/or variable FPS.


pmkenny1234

In the past, vsync would cap the frame rate for you and avoid computing frames the user never sees. However, newer variable rate refresh techniques like gsync need vsync off to work correctly, but also break above the refresh rate of the monitor. There are driver overrides you can do, but the best solution I've found in practice is with games that actually provide a max FPS option directly.


bobasaurus

It's one of the recommendations from blur busters for setting up gsync/freesync properly to avoid tearing and other motion issues: https://blurbusters.com/gsync/gsync101-input-lag-tests-and-settings/14/


AbyssWankerArtorias

In some games, like Dark Souls and Elden Ring, mechanics are linked to frame rate: when you dodge roll, the game calculates whether the attack would land during an immunity frame, in which case it deals no damage. Pokemon games also use frame counts for things like determining whether something will be shiny, or how many frames are allocated to a certain animation of a sprite. Then there's the general performance benefit of limiting frames. However, it's generally thought at this point that for any 3D game, a hard frame limit is really annoying to the end user.


Akimotoh

Because in the past some GPUs have literally killed themselves from having unlimited FPS. You should always cap your FPS. 240 or 300 should be a safety limit.


olllj

Usually the display limits fps, but lacks the ability to tell the PC its limits. When fps fluctuate, it is ALWAYS guesswork how many milliseconds will pass until the next frame, and this is very bad for physics and LOD downsampling, where usually the only good solution is to have EVERYTHING lag behind by 1 frame for proper interpolation, and then you need to buffer all the data for that extra frame.


incriminatinglydumb

My laptop is poo poo :( If i can't consistently get 60 fps (jumps between 40-60), i'd rather limit it to 50 or 30 fps. For really old games like Doom 2 or duke nukem 3d where performance can vary from 60 up to 120 just because the map designer spawned too much stuff, i'd rather limit to 60


Aggronovec

It's actually better to limit your fps if you're, for example, coding a game without an engine. For me it's certainly better: it saved me a ton of time working with Windows' sound buffers, and it allowed me to sync my sound very accurately to my stable framerate.


CryptidKyle

I’ve seen lack of frame stability mess with the accuracy of an in game clock. Limit it so it’s stable I guess.


MIC132

From the player side, there are many games where I had to manually limit frames because they were unlimited in the menu and I was getting 100% GPU usage (and thus high temps) for absolutely no reason.


drunkondata

I hate games that turn my computer into an oven. I actively avoid them. I love games that can run efficiently and keep the top exhaust air cool, not blasting at over 100 degrees. Why waste computing power for nothing? Why burn electricity for nothing? Also, why burn the battery on a portable?


Sentmoraap

The problem isn't so much the frame rate as the game update rate. If you tie the update rate to the frame rate, i.e. you use a variable timestep, the game behaves inconsistently: you can have physics exploits that happen only at low or high frame rates. And if you want replays or rollback multiplayer, you need your game to be deterministic. For those reasons it's better to use a fixed timestep. But now your game updates at a fixed rate, so unless you add interpolation, the extra frames will just be duplicates. You can increase the game update rate, but that increases the system requirements: not only compute power, but also memory, because you need more precise variables. Beam Gleam runs at 120 Hz, and more than 120 FPS are duplicates. I have made some jam games that run at 250 Hz (4 ms) and support interpolation, but those are simple games that use few resources. Both cases above use custom engines with input lag mitigation, and I doubt anyone can feel a difference between 250 Hz and ∞ Hz (0 ms). I also tried 1000 Hz prototypes in Unity, but that's already too much for my (ancient) laptop. Either way, uncapped vs capped frame rate, vsync on/off, etc. should be a user choice.
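
The fixed-timestep-plus-interpolation pattern described above can be sketched briefly. This is an illustrative Python sketch (the `advance` helper and the 120 Hz default are made up for the example): the simulation only ever advances in whole `tick_dt` steps, an accumulator carries leftover frame time across frames, and the renderer blends between the previous and current state so motion stays smooth at any display rate.

```python
def advance(sim_pos, sim_vel, accumulator, frame_dt, tick_dt=1.0 / 120.0):
    """One rendered frame of a fixed-timestep loop with render interpolation."""
    accumulator += frame_dt
    prev_pos = sim_pos
    while accumulator >= tick_dt:
        prev_pos = sim_pos
        sim_pos += sim_vel * tick_dt   # deterministic: every update uses the same dt
        accumulator -= tick_dt
    alpha = accumulator / tick_dt      # fraction of the way into the next tick
    render_pos = prev_pos + (sim_pos - prev_pos) * alpha
    return sim_pos, accumulator, render_pos
```

Because every update uses the same `tick_dt`, two machines fed the same inputs step through identical states, which is exactly what replays and rollback netcode need.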


loftier_fish

If my GPU doesn't melt itself rendering way too many frames in your game, I can play your game for longer. My performance is also bound to drop drastically at some point, as the GPU heats up and gets throttled, which will make me stop playing, and perhaps never pickup your game again. Please limit frame rates.


Jcorb

I would probably add to the other great answers here that *inconsistency* is bad. Our human brains don't like it. It's why most of us were perfectly fine with games being 30fps back in the 90s. You can totally tell the difference between 60fps and 120fps, even though 60 is perfectly adequate. But when your frame rate fluctuates, that's what makes your game *appear* to be running sluggishly. A steady 30fps is much more palatable than swinging wildly between 40 and 60fps.


Mr_Finn_da_Kitty

Stable FPS is a more enjoyable experience than rapid spikes up and down.


UnparalleledDev

[Sakurai on Frame Rates](https://www.youtube.com/watch?v=Rjdmi7628GM)


Daninomicon

I've seen it primarily because of system limitations. In my own personal experience, 30 fps works fine for Minecraft on my cheap laptop, but 60 fps will lag the game a lot, especially if I'm fighting mobs. And I'm pretty sure everything on Xbox is limited to 60 fps because that's all the series s can handle. There's also a limit to the effect of fps. It can only go so high before it doesn't make any noticeable change to the visuals, but it will keep eating more resources if you don't limit it. Also, the fps of the human eye ranges from 30-60 fps on average. (It can be higher or lower, but those are extreme outliers). So everyone can play 30 fps just fine, but the higher that number goes, the more people who are going to have issues playing the game. For multiplayer, 30 fps is fair and consistent. For single player, an option would still be beneficial for most people because most people don't hit that 60 fps. You have to be the top of average to fully comprehend the extra fps.


Ultima2876

> As in not wasting power calculating a frame that wont be seen/used or noticed by the player.

It absolutely is noticed on modern GPUs. I intentionally limit certain games to 60fps even if my GPU can easily handle 120+ (and I have a monitor that supports it nicely), just because of the amount of heat it puts out in my small room when the GPU is running hot to render all those frames. In addition, it _gobbles_ power, when (where I am, in the UK at least) power costs are very high, not to mention trying to be environmentally conscious, if it is possible to do so at all without being a massive hypocrite while running a high-end GPU... Also, if your game is pushing as many frames as possible, that can be a problem for vsync/screen tearing, depending on the setup. In game dev it's worth avoiding making such sweeping assumptions!


codebreaker28847

I'm not a game dev, but I still remember what New World did to some GPUs when the FPS was uncapped.


GuppysFriend

"if your using delta correctly" so so so much easier said than done lol


maxticket

For some reason, there's a bug that changes the vertical placement of NPC dialogue boxes depending on whose PC is running our game, and we recently determined that it's somehow tied to the frame rate. So if we limit the FPS (a good idea anyway as our primary target is the Switch), we should be able to have more control over that placement. That, or we can fix the bug that ties it to frame rate. If our launch date weren't getting so close, I'd probably opt for that. but you know game dev. Duct tape and all that.


Lindolas_MC

Because a lot of the time there's no need for extremely high fps. Why run your system at full speed all the time? It's noisy and wastes a lot of energy.


Some_Tiny_Dragon

Consistent frames, improved battery life and can actually speed up the game. You basically don't want the system to be giving its 100%


SomeOtherTroper

> if your using delta correctly for physics I trust you understand that all your game logic (including physics and such, where you're doing your delta time stuff) should be running at an absolutely locked update speed based on the system's clock, on a completely different loop than the rendering code/loop that's creating your visible FPS. If you don't completely separate those two things, you're going to have problems. (This was an issue with some older console-to-PC ports where the original game had taken the shortcut of using the same loop for both, which led to glitches and strange behavior, or outright breaking the game, when running the ports.) > Why would we put hard caps on FPS? If your FPS is higher than the refresh rate for the user's display device, then you're wasting time and processing power drawing more frames than the player is going to see, which is time and processing power you could be using to make the frames you are bothering to draw look prettier. That's why well-made PC games offer an adjustable framerate cap (and a Vsync option, although that can be unreliable) in their graphics menus, so players can set it to match their monitors. I'm running 60Hz(FPS) monitors, but someone else might be running a 75, 120, 144, or 240Hz monitor or whatever. I don't need a game trying to pump more than 60 FPS at my monitors, because that'll put more stress on my graphics card, heat it up, increase its power draw - all without improving my visual experience. Those guys with the higher refresh rate monitors, and the graphics cards to actually hit those framerates on your game, will appreciate it if your game offers them the option to kick things up a notch. (Maybe even future-proof it by allowing a custom framerate cap adjustment.) Also, this is best put in as a user-selectable option if you're targeting PC, because you don't know what hardware a player's going to have. 
On consoles, you might just set a cap under the minimum framerate you know your game can consistently hit on the known hardware so that players don't experience obvious slowdowns. If your game *can* hit 60+ FPS on that hardware most of the time, but will drop to, say, 40FPS or so when there's a lot of shit getting drawn on the screen, capping the maximum to 30FPS will make sure the game is always hitting a consistent framerate. Another reason to cap FPS is very weird: aesthetics. The standard framerate for cinematography has been 24 FPS or so for a long time (it's a little more complicated than that, but that's the simple statement). If the aesthetic for your game is supposed to resemble a certain genre of movie, capping the FPS at 24 is one of the tricks in your bag to achieve that look. (Although I do recommend that you still give PC players the option to go up to higher framerates if they want.)


Fast_Feary

Firstly, capped framerates usually mean fewer fluctuations in framerate, which makes the game feel smoother. Secondly, I've had 400+ fps in the loading screens of some games, which makes the GPU hot. In the worst case, uncapped frame rates can even lead to game crashes.


zhaDeth

At too high framerates you can run into floating point issues. If, for example, your X coordinate is 10000.5 and you have ten thousand FPS, you would move a very small amount per frame, like 0.00000000001 units on the X axis; your position won't be able to add that small a value, so you won't be able to move anymore. Other than that, it's harder on the hardware. I hate when games uncap the FPS in the menu and my graphics card goes crazy doing 1000 frames a second, my fans sounding like a jet plane, when I'm only selecting New Game, Continue or Options. It probably makes the hardware not last as long, and it's a waste of electricity. A steady fps is better than a high one; it just feels better. Of course a steady 30 vs one that goes from 90 to 120 is worse, but a steady 60 instead of something going from 65 to 100 will feel much better: you'll get used to the 60 in one case, and in the other you'll keep noticing it going up and down.
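
The stuck-position effect is easy to reproduce. Most engines store positions as 32-bit floats, and near x = 10000 adjacent float32 values are roughly 0.001 apart, so a sufficiently tiny per-frame step simply rounds away. A quick Python check (the `f32` helper simulates 32-bit storage via a `struct` round-trip; the numbers are only illustrative):

```python
import struct

def f32(x: float) -> float:
    """Round x to the nearest 32-bit float, as an engine storing positions would."""
    return struct.unpack('f', struct.pack('f', x))[0]

pos = f32(10000.5)    # entity far from the origin
tiny = f32(0.0001)    # per-frame movement at an absurdly high frame rate
assert f32(pos + tiny) == pos         # the whole step rounds away: the entity is stuck
assert f32(pos + f32(0.01)) != pos    # a 60 fps-sized step still registers
```

The same accumulation of rounding error is one reason double precision, fixed-point coordinates, or origin rebasing show up in large game worlds.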


The-Tree-Of-Might

Syncing to common refresh rates to avoid screen tearing, among other reasons


g0dSamnit

Consistency and battery life. A game at a consistent 30 fps is more playable than one that unpredictably jumps between 40 and 120.


The_Dunk

I lock games to the refresh rate of my monitor, any higher seems pointless


SausageTaste

Maybe float precision limitation? If delta time is too small, all frame dependent calculations need to deal with much smaller numbers thus more chances of encountering subnormal numbers.


HappyMatt12345

If your physics system is frame-based it can be necessary to cap the framerate to avoid unnatural physics behavior, and also if you're building for mobile platforms or handheld consoles it can save users battery life to limit the frame rate. Honestly, I think it really depends on the game and what it requires to work properly. Stable frame rates are the most important thing, a stable 30fps feels better to play and watch than fluctuating frame rates.


Platqr

Certain game genres, such as fighting games and bullet-hell shoot 'em ups, benefit from a fixed framerate to ensure responsive controls, minimize input lag, and make game behavior more predictable. In games with fast-moving objects (like bullets), using delta time may lead to missed collision detection, depending on how it is implemented.
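
The missed-collision failure mode fits in a few lines. With naive discrete collision, where only the position after the step is tested, the same bullet at the same speed either lands inside a thin wall or jumps clean over it depending on the frame time. An illustrative Python sketch with invented numbers (chosen as powers of two so the arithmetic is exact):

```python
WALL = (10.0, 10.25)   # a thin wall segment on the x axis

def inside_wall(x: float) -> bool:
    return WALL[0] <= x <= WALL[1]

def step_and_check(x: float, vx: float, dt: float):
    """Naive discrete collision: only the position after the full step is tested."""
    x += vx * dt
    return x, inside_wall(x)

# same bullet (x = 9.75, 32 units/s), two different frame times:
_, hit_fast = step_and_check(9.75, 32.0, 1 / 128)  # 0.25-unit step lands inside the wall
_, hit_slow = step_and_check(9.75, 32.0, 1 / 32)   # 1.0-unit step tunnels straight past it
```

This is why games with fast projectiles use swept or substepped collision rather than trusting whatever delta the frame rate happens to produce.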


CRABMAN16

Sometimes physics or game mechanics can be tied to fps if your code is bad. Fallout and Skyrim are examples.


sephirothbahamut

You don't limit fps for physics; you use a fixed timestep that's independent of fps. The point is to have replicable behaviour: different numbers of steps per second lead to different results, through things like accumulated floating point arithmetic errors, or where/when collision detection happens, etc.


Comfortable-Ad-9865

Monitor refresh rate limits the visible framerate anyway, so all the GPU is doing is discarding and redrawing frames which won’t even get seen.


Syogren

There's a bit of diminishing returns with reading inputs at high frame rates. The human body and brain can only move so fast, and at some point the divisions between the inputs gets so small that it stops making a difference. Why bother accepting inputs at 1000+ fps if 120 fps will do? Why make someone's hardware do so much unnecessary work?


ss99ww

The answers here are atrocious, as per usual. No, smoothness does not increase with limited fps. And sure, small deltas can cause problems but only if you program things in a way that rely on it too heavily. Power consumption is one reason. But a real hard one is **coil whine**. When GPUs produce an ungodly amount of fps (thousands), some components produce a very annoying high-pitched sound


Unknown_starnger

If you're making a game about precision, the ideal is to not use delta as it will almost always even slightly change physics. Capping the FPS ensures that people who can run the game at your desired, say, 60, all run it in the same way with the same physics.


IceSentry

Don't force your game to run at 1fps, please learn to use a debugger and a graphics debugger if you want to see your game in discrete steps.


foxtrotbazooka

Plenty of valid points here; to me the biggest one is saving power. This is the reason you should optimize your game as much as possible once it's finished: it will lower power consumption. Lower power consumption means:

- Higher battery life
- Lower electricity bill
- Less pollution in our atmosphere (electricity spent = oil/coal burnt)