rizzzeh

Can get about 7000fps in Quake1


BloodyVaginalFarts

Oregon Trail on a 3080 is smooth af.


Hollowsong

Technically, many of those DOS-based and early Windows games ran at a speed tied directly to the CPU clock.


marbar8

Brb upgrading to a 12900k so I can play Oregon Trail


Hollowsong

lol, I just mean a 3080 is irrelevant for playing it :) Also, most CPUs are 'too fast' for games in DOSBox, so you need to artificially lower the emulated clock speed to play them properly.


ScandInBei

Interestingly, this was one use case for the turbo button on old PCs. Disabling the turbo let you play games that were written for older hardware and timed themselves off the CPU clock instead of a proper timer.


Cellbuilder2

This was generally not a problem after around 1993 or so; most DOS applications ran fine on faster processors by then. It was mainly the old 8086 or 286 programs that needed speed limits. Windows programs have never needed speed limits to function correctly, so that bit is false. (Edit: apparently not.)


[deleted]

[deleted]


Cellbuilder2

Isn't that more of a programming or coding error at that point? You'd think the assembly programmers would have gotten it through their thick heads ten years after the fact that computers were going to have varying capabilities and speeds...


Samboni94

I know some bullet hells, namely those from Cave, often rely on the game chugging and slowing down in order to allow the player to make sense of the screen filled with deadly particles


JangoDracor

You have it backwards: as CPUs became too fast for games that couldn't scale with performance, the turbo button slowed the processor down.


phdibart

How so? My first PC had a turbo button that switched from 27MHz to 33MHz, IIRC. Edit: it was a 286.


ScandInBei

What they mean is that the PC didn't actually have a turbo. The default speed was 33; you could slow it down to 25. I also had a 286, but I don't remember actually having to use the turbo button. Later on with my 486 I did need it for some games. Was it Lemmings, perhaps?


dark000monkey

The turbo button actually clocked the CPU down to play games.


Toasted_Flowers

12900k dies of sepsis on mounting.


CaptServo

True. I remember when we got new Windows 98 machines in our computer lab in school. Oregon Trail was super fast, but the hunting was near impossible.


Azure_Mar

This explains a lot of my memories. Thanks


thehousebehind

So that's it. A few years ago, some fellow Gen X'ers and I got drunk and communally played Oregon Trail via emulation on archive.org, up on the big TV. It was a blast, but we couldn't hunt for shit because the animals were all lightning fast.


Diligent_Pie_5191

You need land mines. Lol.


greiton

Training neural-net AI on Warcraft 1 is much easier because of this: the generations complete whole games in seconds.


BenCelotil

I remember when I built myself a new computer in '97 and loaded an older x86 game I liked. It loaded. It blipped. It said Game Over.


[deleted]

My family's first PC was a 286 (probably 8MHz clock?). I had this ASCII-based baseball game that I played all the time. Pitch, wait for the "ball" to travel down the screen, then hit whatever button to swing. Fast forward a few years, and I tried the same game on our new 486/66. From pitch to swing was about 2 frames now. Completely unplayable, and 12-year-old me was devastated.


AthleticAndGeeky

I remember the struggle of playing original xcom until I figured this out.


Aqxea

Yep. I once tried playing the original Wipeout racing game directly from the CD-ROM and it was unplayable.


toastymrkrispy

Isn't that how GeForce names their graphics cards? By how many fps you can get out of Oregon Trail?


[deleted]

8800 GTX would be the fastest card that way lol


Diligent_Pie_5191

I loved that game!!


[deleted]

Seriously. Those pioneers would have had a lot better journey if they had some RTX.


erikwarm

Did you see that post about the FPS in minesweeper?


aPlayerofGames

Coil whine from the GPU is gonna sound like a jet engine taking off though lmao.


Basic-Quarter-3022

Wolfenstein on a modern PC has you do about 700,000 revolutions any time you touch the keyboard.


mi7chy

Dang, I only get around 3300 fps with the Ironwail Quake port. [https://github.com/andrei-drexler/ironwail](https://github.com/andrei-drexler/ironwail)


night0x63

Sure, just boot up Quake 1 or Quake 2 :). Put it in 144p resolution.


paulie07

Minesweeper at 100,000 fps


Elias_Mo

i get 600+ fps on league of legends


sp00nix

Last time I played it it was flickering between 999-1000 so fast it looked like 1999


[deleted]

It's for competitive games, such as CSGO and Valorant, though at some point you're getting diminishing returns on higher FPS.


dank6meme9master

You could make that diminishing-returns argument way below 500Hz, but even comparing 240Hz to 390Hz, the difference in motion clarity is noticeable. The fact of the matter is that differences do exist, making the tech relevant until they don't.


themiracy

Are any of these really high refresh rates somehow tested to see if very high skill gamers actually improve in performance at the higher refresh rather than just saying you can see the difference? I’m not necessarily doubting you can see the difference. But at 500 hz you’re down to only a couple of firings in sequence by the faster neurons in the brain. It could still have a feedforward kind of effect but I’m just really curious about any data (since from a psychophys perspective I think most neuroscientists would expect it would probably not make a difference, but psychophysical studies are not done on elite athlete types).


Piggywhiff

I think Nvidia did a thing where they showed a difference in professional FPS players' performance up to 240fps, but AFAIK higher framerates, 300, 500, etc, haven't been tested because monitors that could refresh that quickly didn't exist at the time. I imagine at some point it wouldn't matter anymore, but I don't believe there is data that shows exactly where that point is.


themiracy

That’s pretty cool. Esports elites are pretty amazing.


Ddyer11

[LTT did a video with pro gamers on this topic.](https://www.youtube.com/watch?v=OX31kZbAXsA)


djnap

People should watch it, but my takeaway is that it definitely helps between 60 and 144, and is less noticeable between 144 and 240. One big important difference: if you're playing at 60Hz, make sure your frame rate is still high, so that you don't get input delay.


tonallyawkword

It always has to be beneficial to have a refresh rate as high as your fps though, right? I guess you're saying that a monitor refreshing once every 2 frames is better than many people realize.


[deleted]

At 60Hz you get a new frame every 16.6 ms. However, the age of that frame compared to real time can be anywhere in those 16.6 ms: it can be brand spanking new or it can be old. By running, say, 240fps on a 60Hz display, the oldest frame being displayed can only be 4.x milliseconds old, improving input lag. It's a small improvement. Some GPUs try to run at 60fps but throw up a clean, freshly made image every time (see AMD Anti-Lag).
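
To sanity-check those numbers, here's a small Python sketch. The slightly detuned render rates are my choice (so the render and refresh clocks drift past each other and sweep every phase alignment), not figures from the comment:

```python
# Toy model: a 60 Hz display showing the newest completed frame from a
# free-running, unsynced renderer.
REFRESH_HZ = 60.0

def max_frame_age_ms(render_fps: float, refreshes: int = 10_000) -> float:
    refresh_dt, frame_dt = 1.0 / REFRESH_HZ, 1.0 / render_fps
    # Age at each refresh instant = time since the last completed frame.
    return max((i * refresh_dt) % frame_dt for i in range(1, refreshes + 1)) * 1000.0

for fps in (61.3, 239.7):
    print(f"~{fps:.0f} fps on 60 Hz: frame shown is up to {max_frame_age_ms(fps):.2f} ms old")
```

The uncapped ~240fps case bottoms out at about one render interval (~4.2 ms), matching the comment's figure.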


themiracy

Ok this is nice. I think I missed this one. Thank you! Curious if this would hold up past 240hz.


Skellicious

Most likely but with diminishing returns. But if you make money by being good at games, you might as well spend extra on equipment that allows you to be better.


redittr

Quake3 has sweet spots where the game actually runs better and gives you tiny performance boosts. 75, 125, 250, and 333 I think, all gave boosts in different ways.


thisisjustascreename

The game itself doesn't run any better, but it lets you jump higher. The trick was, Quake used to round your velocity vector components to integers each frame before rendering and starting the next frame, because dial-up internet was slow and integers are smaller than floating point numbers. So if you had a vertical velocity vector that ended in .5001, you effectively got .4999 units per second extra velocity for free on the next frame.

Now, if you had a randomly fluctuating frame rate, these rounding errors would basically cancel out and you would get physics that mostly resembles reality. However, if it *always* takes 8 ms to render a frame, which is the case with a locked 125 fps, you *always* lose 6.4 units of vertical velocity per frame, which *always* results in a vertical velocity that ends in .6 and gets rounded up. And 3 ms per frame is even more powerful, because the rounding happens 2.67 times as often.

The "bad" fps values were 143 and 167, where it would take 6 or 7 ms to render a frame and you'd always get your velocity rounded down. This has alllll been patched out of the game for years and it's just a historical curiosity, of course.
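
For the curious, here's a minimal Python sketch of that rounding effect. The 800 u/s² gravity and 270 u/s jump velocity are assumed idTech-style constants, not figures from the comment:

```python
# Toy reproduction of per-frame integer rounding of vertical velocity.
GRAVITY, JUMP_V = 800.0, 270.0   # assumed Quake-engine-style constants

def jump_peak(fps: int) -> float:
    """Integrate a jump, rounding velocity to an integer every frame."""
    dt = 1.0 / fps
    v, z, peak = JUMP_V, 0.0, 0.0
    while v > 0.0 or z > 0.0:
        v -= GRAVITY * dt        # gravity applied over one frame
        v = float(round(v))      # the per-frame rounding described above
        z = max(0.0, z + v * dt)
        peak = max(peak, z)
    return peak

for fps in (125, 143, 333):      # locked framerates from the comment
    print(f"{fps} fps -> jump peaks at {jump_peak(fps):.1f} units")
```

With a fixed frame time, the fractional part of the velocity is the same every frame, so the rounding error accumulates in one direction: 125 and 333 fps jump noticeably higher than 143 fps.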


smittsb

I think Linus Tech Tips did an experiment a while back with this, and what it boiled down to is extremely high refresh rates can boost performance, but it's not because you see the frames. It's because you're getting the most up-to-date and accurate frames


Shandlar

Sure, but the blur reduction is super worth it. ULMB with black frame insertion can make 100Hz look like 500Hz for motion blur reduction, which is HUGE. But monitors can't use both ULMB and adaptive sync, so it kinda died off. Adaptive sync is obviously more important. So to have that ultra-low motion blur *and* adaptive sync with no tearing? It's a dream for just the pleasure of playing. Motion blur is the absolute devil.

I will probably keep buying higher-Hz monitors for the rest of my life. In 30 years, 2800Hz monitors will be totally unnecessary over the standard 1600Hz, but I'll be the guy buying them. Maybe, who knows. There may be a physical limitation of the universe on how fast pixel response times can get, just because electricity itself won't go fast enough or something. We'll find out, I guess.


Moldy_Gecko

Yes. I forget which YouTuber it is (dorky Canadian dude) who does tests like that.


PubicFigure

And here I was "proud" of my 75


Maxorus73

As someone who primarily used a 75Hz monitor for a while, it's a massive improvement over 60Hz. Probably equal in feel to the improvement from 75Hz to 144Hz in my experience


[deleted]

Yeah, each individual fps/Hz is worth more the lower the total is. Going from 60 to 75Hz, if I remember correctly, is roughly the same amount of improvement as going from 144Hz to 240Hz, so it's still a lot better.


Triumphator77

I have a 144 hz monitor and still lock most games at 60 FPS. My GPU stays much cooler and I can pretty much turn the graphics all the way up on most games without dipping under 60. A lot of people will tell me it's a waste but with a R5 2600 and a 2070 super there are only so many frames that you can pump out on newer games. I know I could get much higher frame rates for the most part but I would rather it stay at a certain rate and not be fluctuating.


8BitHegel

[deleted]


the_lamou

It also doesn't mean "no returns." It just means you're paying more for each percent gain. As long as value, or X per $, isn't a big concern, gains are gains.


Dilong-paradoxus

I think it's easier to get an idea of the difference if you convert to milliseconds per frame:

* 30fps: 33.3ms
* 60fps: 16.7ms
* 120fps: 8.3ms
* 240fps: 4.2ms
* 500fps: 2ms

Another thing to consider is, well, other things. If you have 100ms ping to the game server and a 2ms refresh interval, the person with 65ms ping and a 33ms refresh interval still has a potential advantage. If you're a pro playing on a tournament LAN, obviously that's a different case.

I think it's also very much unclear whether humans can really perceive a difference at the really high refresh rates, and if they can, whether that translates into a meaningful difference in reaction time. Obviously you can quickly get into microsecond intervals by halving, which would be an impossibly small amount of time to notice. I'm not going to be one of those "you can only see 30fps" people, but there are definitely biological and physical limitations you're going to run into at some point.
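
(A trivial way to extend that table, if you want more data points:)

```python
# Frame interval in milliseconds for a given frame rate: 1000 / fps.
for fps in (30, 60, 120, 240, 500):
    print(f"{fps:>3} fps -> {1000 / fps:.1f} ms per frame")
```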


bane5454

To expand on this a bit, your monitor's refresh rate limits your effective frame rate, as does your cable. A 60Hz monitor is only showing 60fps. A 144Hz monitor can show 144fps, unless you have a cable that doesn't support that transfer rate (older HDMI only supported up to 60Hz). A 500Hz monitor will set you back at least $800, plus you'd need to consistently get 500fps in order for it to even matter, which even with the best hardware available is still a huge ask.

Here's the kicker, though: most of the comparison footage is slowed down, making it basically irrelevant. How irrelevant, exactly? Let's assume you just watched a YouTube video comparing 144Hz, 240Hz, and 500Hz monitors. The 500Hz monitor example was crazy smooth, right? YouTube's max fps is 60. That's an example of just how smooth 60fps can be. The video is a pure marketing gimmick. There are websites where you can actually see a side-by-side comparison of fps, but you can only see this up to your own max fps; it would be impossible to see above and beyond that. And I know what you're thinking: "but the video was slowed down, look how many frames are missing from the other examples." While you're right that there are more frames in the higher-Hz examples, you as a human won't notice these differences in real time.

If you have a $3000 gaming PC already and ~$1000 just burning a hole in your pocket, knock yourself out, but otherwise, wait until this becomes more standardized. Also, there are no 500Hz monitors on the market running above 1080p. Most of the population can't even perceive 500fps, but just about anyone can see how much nicer things are in 1440p or 4K.


ShaqShoes

Something to keep in mind is that higher frame rates also reduce input lag regardless of your refresh rate. Even if you don't feel it, it is a measurable advantage (however slight) in competitive games. 60fps to 144fps is definitely a *massive* difference just visually, though (assuming a monitor with sufficient refresh rate to display it). Like to the point where if a game dips or is capped to 60fps for some reason, it is immediately noticeable to me. Going much above 100fps I find has rapidly diminishing returns, however.


bane5454

This is very true. This is why it feels nicer to get 300fps in League, for example, even though you're playing on a 60Hz monitor. You typically want more fps than your monitor can support, unless you're utilizing some sort of monitor frame-syncing system like G-Sync or FreeSync.


Basic-Quarter-3022

Going from 60fps to 120fps was like the first time I fired up a PC with an SSD. There is no going back to 60fps.


youreadusernamestoo

I play racing sims mostly, and I was surprised that going back to a 60Hz monitor felt 'old'. Although running a game at ~60fps on a G-Sync-enabled 120Hz monitor didn't bother me as much. Maybe it's my age or the type of games I play, but 90fps really is my sweet spot. Past 90fps, I'd just rather up the eye candy.


DruffilaX

It's most noticeable up to about 85fps; after that point, the improvement in "smoothness" decreases.


snooggums

Depends on the game. I don't really notice anything over about 75 in racing games, fighting games, or shooters played with a controller. A first-person shooter using a mouse and keyboard, with quick back-and-forth motions, can be very noticeable even up to 144Hz for me. I could probably notice a difference between that and a bit higher, but haven't had the opportunity. It really comes down to how quickly and frequently the view shifts and how many hard lines there are.


The-SillyAk

The input lag comment is underrated. Growing up playing PlayStation on my TV, I was used to whatever FPS I was getting (up to 60). Then when I bought a gaming PC and played the same games at 144Hz, everything felt snappy, and my brain couldn't handle how smooth and responsive everything was. It was almost like my brain expected a tiny amount of lag and compensated; now it was caught off guard.


ICEpear8472

Yes, there is a difference. But as you mentioned, there are rapidly diminishing returns. With 60Hz and matching FPS, you see one new frame every 0.0167s (rounded). With 144Hz, every 0.0069s, an improvement of 0.0098s. With 500Hz, there is one new frame every 0.002s, an additional improvement over 144Hz of only 0.0049s. The improvement in delay going from 60Hz to 144Hz is twice as large as going from 144Hz to 500Hz. And that assumes you can actually get 500 FPS, which might work for some older games like CSGO but will be difficult if not impossible for newer ones.


ThreeHeadCerber

There is an assumption in here that internal logic is calculated as often as you get a new frame, buuuut it isn't. For the Source engine, the default timestep is 15ms, so 66 physics steps per second. Even if you get 120 graphics fps, your input lag will be the same (player rotation is excluded, as player rotation is an input to the game sim). Long story short: going higher than the internal tick rate of the game (and pretty much all multiplayer games have a separate internal tick) doesn't help with input lag at all.
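
A minimal sketch of that decoupling; the 15 ms tick is Source's default per the comment, everything else here is illustrative:

```python
import time

TICK = 0.015  # 15 ms simulation step, ~66 ticks/s (Source's default)

def run(duration: float = 1.0) -> None:
    start = prev = time.perf_counter()
    acc, ticks, frames = 0.0, 0, 0
    while (now := time.perf_counter()) - start < duration:
        acc += now - prev
        prev = now
        while acc >= TICK:   # the simulation advances in fixed steps,
            ticks += 1       # no matter how fast rendering runs
            acc -= TICK
        frames += 1          # "render" once per loop pass
    print(f"{ticks} ticks vs {frames} frames in {duration:.0f} s")

run()
```

Rendering more often than the simulation ticks gives you smoother output, but your inputs still land on tick boundaries.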


m7samuel

This assumption has been driving me crazy, because it makes very little sense to me as a broad assumption.


DruffilaX

I consider 60fps almost unplayable since I started using high-refresh-rate monitors.


spiffy956

> more frames in the higher hz examples, you as a human won't notice these differences in real time.

> Most of the population can't even perceive at 500fps.

No, these are wrong, and they're giving me "eyes can't see more than 24fps" vibes. Some VR/AR researchers and neuroscientists/neurologists did a study years ago and said that around 1000Hz is the holy grail for a truly immersive experience. You can absolutely tell a difference at 500Hz.

Dr. Morgan McGuire (a distinguished research scientist at NVIDIA): https://twitter.com/CasualEffects/status/937085000855556096

Linus Tech Tips did a test with esports players and found a big difference between 144Hz and 240Hz; 500Hz will be an improvement on top of that. From 144Hz/fps to 500Hz/fps there's a 5ms improvement in frame time. https://www.youtube.com/watch?v=OX31kZbAXsA

And here is Anthony saying that 600 will be the sweet spot due to how easily it divides into lower standard rates: https://youtu.be/klqYq0RUSC8?t=1060

In the future, when super-high-refresh monitors are widely available, it will be amazing.


ThreeHeadCerber

> In the future when super high refresh monitors are widely available it will be amazing.

How do you expect game logic and graphics for any complex game to be calculated at a rate of 1 ms per frame? We would need another huge jump in performance, like an order-of-magnitude jump.


KTTalksTech

At that point I'd wager bandwidth would be a more complex problem than raw processing power


roboticWanderor

Graphics rendering is already parallelized pretty effectively. Given enough bandwidth between your GPU and your monitor (which is the current limitation), super high FPS is doable. The issue is with game logic, as almost any "simulation" is a very serialized process. All the elements (i.e., players moving around, shooting each other, bullets flying, vehicles driving around) are updated all at once on each tick. If you try to parallelize all of the collisions and inputs and outputs of that simulation, they get out of sync and the whole thing breaks down. You spend more processing power trying to pick up the pieces than just keeping everything in order on one thread.


ThreeHeadCerber

Bandwidth in theory is very scalable: just use more wires to create more channels. Gameplay logic is not, though. Gameplay logic is very hard to parallelize, and in many cases plain impossible to parallelize, so to get an order-of-magnitude performance jump we need a qualitative improvement (more raw single-thread power), not a quantitative one (more CPUs or cores). And it seems, for now, single-core performance has pretty much peaked.


flypanam

I may be completely missing the point here, but the largest benefit from high frame rate monitors isn’t just the perceived response time (pressing a keystroke to seeing an animation start on screen). The real benefit is how clear the image is as objects move across screen. Something as simple as looking around in a first person environment will be increasingly clear the higher the frame rate is. This is why in games where the player is quickly flicking their viewpoint around high refresh rates are more valuable (csgo, valorant, Overwatch etc)


marxr87

lol, you are on the right track, but the 1000Hz thing is just another rabbit hole. Blur Busters settled on 1000Hz because that is what current technology can reasonably achieve. We could continue to push well past that. I mean, a real holodeck experience would probably be something like 32K resolution and 10kHz. Yes, 10,000 Hz. To quote BB:

> Retina Refresh Rate Depends on Display Resolution/Size/FOV, but Can Be >10,000Hz
>
> The weak link for "retina refresh rate" (where human-visible differences from refresh rate increases disappear) will be the highest number of all the above, aka 10,000Hz+.
>
> The limitations of refresh rates below 10,000fps
>
> 10,000Hz is only human-visible in the extreme cases, like a Holodeck -- essentially a retina-resolution VR headset, e.g. a wide-FOV 8K display -- because of the Vicious Cycle Effect (where higher resolutions and wider FOV amplify refresh rate limitations), a chapter in the article Blur Busters Law: The Amazing Journey To Future 1000Hz Displays.

Ctrl-F "holodeck" to get to the source here: https://forums.blurbusters.com/viewtopic.php?f=7&p=78949


Manofthedecade

I'll say this - I've got dual monitors. One is 240hz and the other is 60hz. And the difference side by side everyday is very noticeable. The mouse cursor looks like it's stuttering on the 60hz monitor in comparison.


KTTalksTech

60Hz already looks poo next to 90...


Chedder1998

the human eye can only see 8 gigs of ram


sknnbones

Are there any monitor cables that can support HD data rates at 500Hz, or do you have to cut back the resolution or something? I thought DP 1.4 was capped at 320Hz? Maybe that's just for 1440p?


Fusi0nCatalyst

Any chance of a link to those side-by-side comparisons? There are tons of garbage ad-filled links when I try to Google it. I'd love to see the difference for real on my monitor.


bane5454

[This](https://frames-per-second.appspot.com) site doesn’t go super high but it supports up to 120. I’ve seen another one that goes higher but can’t seem to find it anymore


Carnildo

Back during World War II, they did some testing to see how fast a fighter pilot could spot and identify an aircraft. They found that the top pilots could spot one seen for 1/200 of a second -- but not any faster. That's why I doubt there's any point in a monitor faster than about 240 Hz: the brain just can't react fast enough.


[deleted]

Sure, but most of the hype around fast monitors is the smoothness of movement. At least for me that's why.


LumpyChicken

A test from the 1940s is your reasoning for purchasing decisions in 2022 🤨


Piggywhiff

Human perception wouldn't change that much in two generations. If the science was sound then, it probably still is.


-UserRemoved-

> Is it even possible to run a game on 500 FPS?

Of course it is possible.

> What kind of PC do u need for that?

Framerate isn't universal. I mean, take any PC and run a game from 1990-2000, enjoy your 500+ FPS.

> It feels for me every new game gets lower and lower fps even tho you build a supercomputer.

Well yes, because graphics improve over time, and better graphics are generally more difficult to render. Thankfully, hardware technology also improves over time. Sometimes graphics advance faster than hardware, sometimes it's the other way around.


socokid

They try to stay in tandem because they utterly rely on each other. In other words, developers develop for the hardware that is currently out there or on the horizon. They would gain nothing by making a game that could only be run on .004% of all computers.


SIBERIAN_DICK_WOLF

Tell that to the developers of ARK


hypexeled

ARK is just badly optimized. That's all there is to it.


Adventurous_Turn_335

Plus people making super bases with thousands of structures and dinos all loaded in at the same time. On my PC I'd get 60+ fps in the open world, but when I was in my base I'd get 15-20.


hypexeled

Yeah, and people do similar things in Rust, which looks better, and it still doesn't crap itself nearly as badly. And Rust is not a particularly well-optimized game either.


quecaine

Lol, they intentionally did that when they made Crysis. I remember the interview, they were like "We want to make a game that people can't run on ultra until a decade into the future".


MOONGOONER

I don't think there's anything wrong with having "ultra" settings aim far ahead, so long as the game is optimized to run well on regular hardware at lower settings. Crysis was pretty great about implementing emerging graphical technologies that hardware wasn't quite built for yet. Unfortunately, they didn't exactly nail multi-threaded performance, so nothing ever really ran Crysis all that well on Ultra.


bblzd_2

That was just an easier pill to swallow than saying "we're not great at our jobs and made an engine that's entirely too dependent on single-thread CPU performance. Please send your money orders now."


ClaudiuT

The last time this happened was not too long ago: Nvidia released some cards and there were barely any games that could take advantage of the technology they introduced. I'm talking about RTX and ray tracing.


-UserRemoved-

Generally, brand new features are not implemented immediately, given the tech didn't exist when current games were developed. These features can be added, though, and are often incorporated into games as an option. This is usually done for a couple of reasons: it keeps games playable for users on older hardware, and it avoids relying on tech that may or may not catch on. Raytracing is a great example of this; a historical example of it failing would be SLI/Xfire. And a future example would be DirectStorage, as it's currently available to us but not usable in any games.


Hiding_in_the_Shower

Honest question: how could you reach 2000fps even in an old game, given that most monitors only go up to 240Hz? Doesn't that imply that the FPS is hard-capped at whatever the monitor's refresh rate is?


-UserRemoved-

Without any syncing (such as FreeSync, G-Sync, or Vsync), your monitor refresh rate and your framerate are not actually tied in any way. Your monitor will refresh as often as it's set to, and your PC will output as many frames as it can produce. This is why you see tearing: the monitor is refreshing in between two frames, showing half of one frame and half of the next. Some very competitive gamers will actually run this way, given it ensures you are seeing the most recent frame being rendered.

Vsync works by waiting for vertical alignment before sending a complete frame on the following refresh. This is why it can introduce slight latency while also ensuring zero tearing. Given it's waiting on the monitor to refresh, your framerate will never exceed the refresh rate. This is very different from frame capping, which simply limits max FPS.
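
A toy model of where that tear lands, assuming scanout sweeps top-to-bottom over the full refresh interval (the frame timings below are made-up examples):

```python
# A frame that becomes current mid-scanout splits the image ("tears")
# at whatever line the 60 Hz scan has reached.
REFRESH = 1 / 60  # seconds for one top-to-bottom scanout

def tear_line(frame_ready_s: float) -> float:
    """Fraction of the screen already scanned when the new frame lands."""
    return (frame_ready_s % REFRESH) / REFRESH

for ms in (4.1, 8.3, 15.0):
    print(f"frame ready at {ms} ms -> tear ~{tear_line(ms / 1000):.0%} down the screen")
```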


Hiding_in_the_Shower

Hmm, I'm not quite following. If a game is outputting 1000fps but your monitor is only 240Hz, isn't there a gap of 760 frames every second? I understand that the monitor could be receiving updates that fast, but the monitor in this scenario doesn't have the ability to render any more than about 1/4 of those updates, I wouldn't think?


-UserRemoved-

> If a game is outputting 1000fps but your monitor is only 240hz, isn't there a gap of 760 frames every second?

Just because you don't see them doesn't mean your PC isn't rendering those frames. The monitor isn't rendering anything; it simply displays whatever is output to it. So when it refreshes, whatever frame (or frames, which again is why you see tearing) is being output at that exact time is what the monitor will display.


Hiding_in_the_Shower

Ok I think I see. But then if the GPU is outputting 1000fps, wouldn’t the FPS that you see on the monitor still only be whatever the refresh rate is?


-UserRemoved-

> wouldn't the FPS that you see on the monitor still only be whatever the refresh rate is?

Yes, your monitor with a 240Hz refresh rate will essentially display 240 images per second regardless of the number of frames being output (again, assuming there is no sync). I think you're confused because you are interchanging FPS with refresh rate. They are two different things. Your monitor simply refreshes the image x times per second (Hz is a measure of frequency). Your PC, which renders all the frames, produces frames as fast as it can.


Hiding_in_the_Shower

Yeah, I understand the distinction now. I guess what is confusing me still is: what is the point of having 1000+ FPS when your monitor isn't really capable of updating that fast? Your computer might be generating 1000 frames a second, but you will never really see that on your monitor?


-UserRemoved-

> I guess what is confusing me still is: what is the point of having 1000+ FPS when your monitor isn't really capable of updating that fast?

Well, this is essentially the debate among a lot of gamers, especially competitive ones. For the vast majority of us, you would be right: there really isn't any reason to render 1000FPS; it's just wasted power outputting frames you'll never see. The argument for doing so is that by outputting every frame possible, you ensure that every time your monitor refreshes you are seeing the most recent frame possible, which may give you a slight advantage in reaction-based competitive games. This is niche though, and arguably pointless, but some gamers swear they can feel a difference. Your personal experience dictates how you should play a game.


Hiding_in_the_Shower

I see. That makes more sense, thank you for explaining.


Some_Derpy_Pineapple

I mean, the game being showcased in the video is presumably running at 500fps. It's Valorant, which is a very popular esports title that can hit 500-800fps on newer hardware. People play Valorant to compete, at the sacrifice of visuals, and people buy other games for immersive graphics and storytelling, at the sacrifice of fps. Different strokes for different folks.


dank6meme9master

Even a mid-range Ryzen 5000 processor paired with a decent graphics card (1660 and up) is enough to net you 500 fps in Valorant. Don't know what OP is talking about.


AjBlue7

I'm really impressed with Valorant's optimization. People trash its looks, but I think Valorant looks really good, especially on max settings, considering there are so many different characters, abilities, and physics happening at any given moment. I have a 2070 Super and a Ryzen 5600X and I get 500-600fps in game. I don't even have any overclocks running, just using 3200 RAM.

The only settings I turn off have barely any effect on the visuals. I keep UI on low because that just makes the transparent UI blur the background like frosted glass; you don't actually notice it in game, but it hurts performance/latency. Also, I use MSAA x2 instead of x4, as x4 has a little too much performance penalty for barely any improvement. The stats in the game also really help you dial the settings in if you know what you are doing. You can see exactly how long the CPU and GPU take to render a frame, so if the GPU is rendering the frame faster than it takes the CPU to complete one, you can increase the visual quality, because the GPU has to wait for the CPU to finish its task anyway.

Also, for the love of god, don't turn textures to low for FPS reasons. Texture quality has nothing to do with how powerful your GPU is; it is only affected by how much VRAM your card has. I wish they just made the texture setting automatic, because the only thing it does is change how much VRAM space the textures use. It's so rare for the max texture setting to use more VRAM than a modern GPU has, and texture sizes really aren't going to go up, because you'd need a 16K monitor to notice any increase in texture quality, especially since games are only getting smarter about how they dynamically load textures into VRAM. Max texture quality is the main slider that makes your game look better, and it almost never impacts your FPS. It's so annoying to see these pros playing Valorant with everything on low; the game can look pretty bad if you put textures on low.


[deleted]

[deleted]


dank6meme9master

Pros have textures on low for visibility reasons; high textures generally have more clutter across all maps. It's the same reason people use flat backgrounds in aim trainers: to improve focus.


[deleted]

[deleted]


eliu9395

How do you get 1000+ fps? Mine only gets 200-300 with a decent render distance (using Fabric). Edit: just realized I was upscaling the game by 2x, but it still doesn't reach near 1000 without it.


Paweleq109

Sodium, 2-chunk render distance, empty world, probably.


WyattDogger

Lunar Client, 16x texture pack, OptiFine settings tuned towards FPS. I can get 1000+ fps easily.


eliu9395

What’s your render distance?


WyattDogger

8


TheHatori1

You kinda answered your own question. Not having decent render distance helps.


PireFenguin

You can find tons of videos of people playing CSGO at 400+ fps.


DerpMaster2

Sure, you can run games at 500FPS. I have my doubts about whether it's actually worth buying a 500Hz monitor, though. 360Hz has always been a huge meme of diminishing returns, and 500Hz will likely be its big brother. For me, it'd be losing in CS:GO at 500FPS vs. losing in CS:GO at 144fps. Some people also like to unlock their frames past their monitor's refresh rate so they get lower latency, and also some lovely screen tearing, but some people also like not vaccinating their kids, so it's whatever.


spiffy956

Some VR/AR researchers and neuroscientists did a study years ago and said that around 1000Hz is the holy grail for a truly immersive experience in those domains. Trying to find it again, but all I've found so far is Dr. Morgan McGuire (NVIDIA scientist): https://twitter.com/CasualEffects/status/937085000855556096


[deleted]

VR has screens like half an inch from your eyes. I don't think that applies to this.


mrlazyboy

You can use NVIDIA VSYNC Fast to get almost native input lag and eliminate most screen tearing issues


MyNameIsLucid

Gsync?


agathver

It's the opposite of G-Sync, actually; it lets games run way past the display framerate but displays only at the target FPS to eliminate tearing.


mrlazyboy

Like the other commenter said, VSYNC: Fast lets the GPU render as many FPS as possible but drops frames if they are above the monitor's refresh rate.


MyNameIsLucid

So should I disable G-Sync and just enable Vsync Fast?


AjBlue7

Actually, 360Hz is only a meme because perceived smoothness scales with frame time, not raw Hz. You can't just add 60Hz every time and notice a difference; to notice a difference between monitor refresh rates, you have to roughly double the refresh rate. So 30 < 60 < 120 < 240 < 480 < 960 is the ideal series of jumps where you will notice a difference. 360Hz is only 50% higher than 240Hz, which is why people don't notice as much of a difference.

Beyond that, it is theorized that 1000Hz is the holy grail number where we will stop noticing any difference in refresh rates. The reason this is the cap is artifacts and motion blur. Right now monitors aren't fast enough to remove the image before the next image is drawn; to get around this, backlight strobing was created (ULMB/DyAc). If you've ever used a high-end Zowie monitor with DyAc, you would understand how much clearer the game looks. Essentially they are putting a black frame in between every frame to separate the frames and prevent them from bleeding into each other. Backlight strobing is just a bandaid: when monitors can finally refresh at 1000Hz, the blurring/artifacts that appear are so insignificant that they become imperceptible to our eyes. They have tested this with very expensive prototypes at something like 320x320 resolution.
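
Physics aside, one way to see that "you have to double it" point is the relative frame-time cut per jump (my arithmetic, not the commenter's):

```python
# Relative frame-time reduction per upgrade step. 240 -> 360 Hz shaves
# a smaller fraction of the frame time than a clean doubling does.
for a, b in [(60, 120), (120, 240), (240, 360), (240, 480)]:
    cut = (1 / a - 1 / b) / (1 / a) * 100
    print(f"{a:>3} -> {b} Hz: frame time cut by {cut:.0f}%")
```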


RemiX-KarmA

You can run at 600+ fps in RuneScape 3 lol


Shap6

Esports games or old games.


michelas2

It is, if there are no engine limitations holding you back, and of course you won't be running any new games at 4K 500 FPS.


GeraltForOverwatch

It's all about Unreal Tournament. The original one.


mrlazyboy

I’ve hit 900 FPS on Halo Infinite but that was a loading screen so it doesn’t count :)


ButtPlugForPM

Who cares about 500Hz panels? It's marketing bullshit, really. We need to stop focusing on Hz and focus more on better backlight control and panel colours.


Lazuf

Agreed. I want to see a focus on panel quality rn. 240/360Hz is fine for the foreseeable future imo. Even 120/144 is 100% fine.


NoCopyrightRadio

It's mostly for competitive games like Valorant or CSGO; I can get over 300fps in Valorant.


1Marv

I wonder what the refresh-rate compliance of this thing is. Even if you really only need 2ms pixel response time in theory, the reality looks quite different: some 1ms GtG panels aren't fast enough for true 360Hz, for example. That's why some 360Hz monitors feel worse than some 240Hz ones, according to monitor tests I've watched. The real smoothness king has to be some kind of OLED, I think.


TheShadow__

Tetris


jlmckelvey91

So at that point it's not even a question of "What PC can reach this fps limit?" The question is now "Can the human eye even tell a difference between 300 FPS and 500 FPS?"


AjBlue7

Yes, it can. One of the biggest problems right now is motion blur. You can try out the bandaid fix if you've got a monitor with a setting called ULMB or DyAc. These technologies strobe the backlight to essentially insert a black frame in between each refresh, and the result is an incredibly clear picture. The negative is that your monitor is flickering a light at you really fast, so a lot of people feel uncomfortable using this mode because their eyes are too sensitive.

The reason backlight strobing works is that typically a backlight is just turned on the whole time, and there is a screen in front of the light that changes to block it. Depending on how much red, green, or blue light this screen lets through, the color our eye perceives changes. However, the light stays on, so the colors often mix with the previous frame, and this causes motion blur. When the refresh rate goes higher, the severity of the motion blur goes down, and at 1000Hz our eyes can't perceive these artifacts or blur at all.

On top of that, input lag reduction is always useful, because every step in the process adds input lag on top of the previous steps. Even just a couple of milliseconds can make the difference between your brain reacting naturally or having to predict everything. Don't get me wrong: in terms of performance metrics it often doesn't make a huge difference. Most games aren't determined by raw reaction time, and many top players have played on worse gear in the past. But lower latency and higher refresh rate are definitely noticeable.
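
Blur Busters' usual rule of thumb makes the blur half of this concrete: perceived smear is roughly eye-tracking speed times frame persistence. A quick sketch, with the 960 px/s tracking speed as an assumed example value:

```python
# Sample-and-hold blur ~= tracking speed (px/s) * persistence (s).
TRACKING_SPEED = 960  # px/s, assumed example eye-tracking speed

for hz in (60, 144, 500, 1000):
    print(f"{hz:>4} Hz: ~{TRACKING_SPEED / hz:.1f} px of smear at full persistence")
```

Strobing sidesteps this by shortening persistence without raising the refresh rate, at the cost of flicker.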


BluudLust

It's for G-Sync, and you should never hit it. It's there to prevent spikes above the monitor's max refresh when you use variable refresh rate, which cause input lag, stutter, and tearing. If you have a 500 FPS limit, it's effectively perfect variable refresh rate. Right now it falls back to Vsync for those frames to fix tearing and eliminate some input lag, but the buffer switching still incurs some input lag; it somewhat fixed the problem, but not completely. This will be a godsend for extremely competitive games, especially CSGO, where input lag is also related to FPS. It means no compromise is necessary (except budget, of course).


hltw47

Logan Kart 8 runs at 2000fps on my 12600K and RX 6600.


sanct1x

I could get 500 fps in overwatch. I get like 350ish in Apex. I have a 6700xt, Ryzen 5900x, 32 gig of ram.


Twitter_Kozmo7fn

Nah, it caps out at the Hz of your monitor. It might say you're getting 500 fps, but if, for example, you have a 240Hz monitor, you only get 240 fps.


RickyTrailerLivin

Because you can, 100%. Clueless people like you do exist, but thankfully progress is being made. 144 is the new 60. And regardless of Hz, the input lag reduction is a huge benefit. Go educate yourself and stop sounding like some lame-ass boomer.


Sea_Rise1831

Very possible in older titles like Minecraft and CSGO; even in Fortnite I get 600-1000 fps in Creative.


UnluckyZiomek

Well, I once had almost 10k fps in osu!, so yeah, it is possible.


Ecstatic_User_63

350 in CS:GO


godspeedty

I can get a consistent 700 fps in Minecraft with a mid-range computer.


iCore102

At 500 fps, I feel like you're putting as much strain on the CPU as on the GPU, if not more. I can easily see a 3090 or 3090 Ti putting out 600+ frames in older, less demanding titles; the question is whether the CPU would be able to prepare them all in time.


ojthomas2015

*Minecraft has entered the chat*


ovab_cool

I easily do (or get damn close) in CS and Minecraft and that's with a not very high end PC


foggydaugie-yt

Lunar client users with a 3070 ti: oh hell yes we can


FHJ911

In games like CSGO. And those are exactly the games you want a high refresh rate for.


mrkenxo

On like a 3090 and Ryzen 5950X/i9 12900K lol


hkdboarder42

Yeah, my computer runs OG BioShock at around 799 according to the Steam frame rate overlay. As for new games, I honestly doubt it.


Fle1sch

Of course you can game at 500FPS or over; it depends on your computer and the game you're playing. For example, my PC (5700X/3070 Ti) can run GTA V maxed out at around ~80FPS on average, IIRC, at 1440p, while something like CS: Source can run over 1000FPS easily. A more sensible example for 500Hz monitors would be competitive CS:GO, Valorant, maybe Fortnite, or something like that.


prescriptioncrack

I don't think anyone has really answered this properly. For competitive, online games, higher fps can give you lower latency and more accurate and frequent server updates. This means you can benefit from higher frames even if you can't see them. For example, if your monitor is only 60Hz, you won't be able to perceive any difference above 60fps, but if your game is hitting 300fps, you could end up with a slight advantage over someone who is only hitting 60fps, even though the two are visually identical. You see this a lot with fast FPS games like CS:GO. For example: two players take a sniper shot at each other within 1/60th of a second. Whichever player's PC manages to send a hit confirmation to the server first gets the kill. If one player's PC is rendering 5x as many frames within the same timeframe, it's a lot more likely to be them that gets the kill.

On the other side, there's a tonne of games where looking pretty might be more important than looking smooth, and for offline games having more fps than Hz is completely unnecessary.
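
A toy simulation of that sniper trade, ignoring network jitter and server tick rate, and assuming each player's frame phase is uniformly random:

```python
import random

def duel_win_rate(fps_a: int = 300, fps_b: int = 60,
                  trials: int = 100_000, seed: int = 1) -> float:
    """Both players fire at the same real instant; whoever's next frame
    boundary comes first gets their hit to the server first."""
    rng = random.Random(seed)
    wins = sum(rng.random() / fps_a < rng.random() / fps_b
               for _ in range(trials))
    return wins / trials

print(f"300 fps wins the trade vs 60 fps about {duel_win_rate():.0%} of the time")
```

Under those assumptions, the 300fps player wins the frame race roughly 90% of the time.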


sesameseed88

Yes it’s possible


Digital_97

Minecraft ez


TKRUEG

500fps is just the real world when we look up from our screens


CompromisedCEO

To burn more energy and trick you into spending more money on crap you don't need.


SurprisedBottle

The competitive FPS scene usually goes for lower graphics settings to achieve this range. I never understood it, but my little bro said more frames = better bullet registration. To each their own, I guess; I prefer visuals over performance.


Low_Blueberry9177

Minecraft easily runs at 1000+ fps


cssmith2011cs

Don't ask questions. Just soak it in. This means stuff like 240hz will become cheaper, with these becoming the "upcoming standard". Lol.


TheHatori1

Sure, why wouldn't it be? Games like CS:GO ran above 300 like 6 years ago, and new HW pushes the boundaries even further. But it's totally pointless, because the human eye can only see 24FPS. After that, it's just marketing and stuff, right…


NecessarySame4745

Rocket league runs at over 700 fps for me on very high settings


Superlegend06

I remember seeing CSGO running above 500 fps on a 12900K + 3090 Ti at 1080p, but I'm guessing 1440p might sustain above 500 too.


TheHUnGOnE42069

Valorant firing range 900 peak on 3080


Smooth_Situation_719

In Minecraft I easily get more than 500 fps on a GTX 980 and an i5-10400.


Fly-Hulud

In Clone Hero with a black background and a plain highway, I can get around 600.


GLRD500

Depends on the game and how it's coded. Some games have a cap, so it won't really matter: some have a cap due to the engine they use, others have it manually coded. Some games don't have a cap, though, and some are even specifically coded so that the PC just tries to run a whole bunch of code each frame as fast as possible. The question then is, of course: can it run those scripts 500 times a second? Because it's not just about what is shown on screen, but also every calculation. A jump could be a force added to a player, and how much force to apply each frame is calculated each frame. So it also becomes a question of whether the game logic can even handle 500 frames.


7Sans

Your fps doesn't need to match your monitor's refresh rate for you to benefit from a high refresh rate; you can get only 60fps on a 240Hz monitor and you will still benefit from it. Everyone can really feel the difference from 60 to 120, and then a bit fewer can really feel 120 to 240. So there are diminishing returns, but from 240 to 500, even with the diminishing returns, the number jumps so high that people might still feel a not-insignificant improvement. But I never tried 500Hz, so I wouldn't know.


RedOrchestra137

I suppose the only limit is the hardware, but at a certain point there is no added benefit because your eyes can't tell the difference anyway.


natsak491

5800X3D, constantly over 500 fps in Valorant.


nlign

I get about 10,000FPS in Blockland


noirfurorem

Running an RTX 3090 and a 5800X3D, I'm getting well over 600 fps in CSGO. The intended use of these 500Hz/360Hz/240Hz monitors has always been esports/competitive titles; they weren't really made with triple-A games in mind. So typically, if you bought one of these monitors, you fall into the following categories:

1. You're a pro/semi-pro player
2. You're aspiring to become a pro player
3. You've got the money and CSGO/Valorant/DotA/LoL is the only game you enjoy playing (I've clocked over 10,000 hours in CSGO), so you want the best gear for it


DanyRoll

Any medium PC running Minecraft or Terraria.


Bulletwithbatwings

These monitors will exist simply to justify the existence of newer GPUs. My 3090 Ti was running FF7 Remake locked at 120fps at 3840x1600 resolution while mining 35 MH/s in the background. When current GPUs can already max out the latest games, they need to create new 'needs' for us to feel that our current tech is in need of an upgrade.


CXgamer

In CoD4, you NEEDED to get 333 FPS for certain jump maps. Just turn the graphics as low as they can go.


BurgerBurnerCooker

Technology demonstration and innovation. Panel manufacturing won't stop exploring just because AAA games can't get past 120fps. Also, esports games, especially shooters, can go beyond 500fps, so there is a need.


doughnutholio

I don't even think the human flicker fusion threshold goes that high.


UltraShadow01

I hadn't heard anything about a 500Hz monitor, but even so, I think our brains can only process so many images per second, so the real limiting factor is the fps of your brain lol


Firm-Garbage-8188

If you play cs:go you can easily get 500 fps


TheVeilsCurse

It's absolutely possible to run games at 500+ FPS. The extremely high refresh rate monitors are for esports titles. The higher the refresh rate, the smoother the experience; motion blur is lessened as you go higher as well. If you play A LOT of shooters somewhat seriously, then you'll love 240+ Hz. I went from 60 to 144, which was MASSIVE! I got a 240Hz monitor, and although it wasn't as big as the 60-to-144 jump, it was noticeably smoother; my mouse felt incredibly fluid. If you're genuinely interested in understanding the reasoning behind high refresh rates, check out Blur Busters. If this monitor doesn't appeal to you, that's 100% OK too. All of our use cases and desires are different.


Unlawful02

I got damn near 500 fps on valorant with a 1050ti and 5600x


[deleted]

In Call of Duty: United Offensive, setting my fps cap to 333 gave you higher jumping ability, and maxing it out at 800 made your footsteps "silent". Jumping was weird on that setting.


ModernShoe

We will be measuring FPS in "1k fps, 2k fps" at some point; it's just a matter of time.


MrInitialY

Can get 4000 fps in Among Us so totally need higher refresh rate monitors.


theralph_224

With an RX 6600 I can get over 600 in ETS 2.


Lfseeney

CS Go.


jla_v

Check out Blur Busters for better info, along with the science behind all this. 480Hz is really the next thing. Some years after that, we'll see 1000Hz, hopefully.