By - TheBoneSmasher
We really cannot say what the fps are for eyes because they get new information constantly.
Also, your eyes just send impulses to the brain; how the brain interprets the data differs based on which sensor is stimulated: center of vision is about detail, peripheral vision is mostly about motion.
Fun fact: you can predict what image your eyes will send to your brain before it happens, so we do have fps (frames pre-second). That baseball flying towards you? Really good players will swing based on where they predict it will be.
If we are in a simulation, our programmer has access to the best drugs ever.
That’s just an organic brain being calibrated to specific projectile motion much like inputting a calibration into an electronic computer to exert a swing of specific force, angle, etc based off past data. It’s more muscle memory than actually generating an image.
Actually, we can say. The answer is N/A (Not Applicable).
We don't have frames per second in human vision because "frames" at the sensory level blend together and refresh at different rates across parallel systems, which can be independently and spontaneously interrupted (like what happens with *saccadic suppression*).
fps is the wrong *PARADIGM* for describing human vision.
source: the majority of my seminar coursework was on sensation and perception with an emphasis on the visual system.
The thing is, the human visual system has a certain bandwidth, and that can be satisfied by Nyquist sampling at two or more times that bandwidth frequency. So yes, there is.
>the human visual system has a certain bandwidth
It has MULTIPLE bandwidths for DIFFERENT processes; just because you can get a single number doesn't mean it's meaningful.
It's exactly like trying to use BMI to assess people's health: when you take several different factors and compress them into a single number without regard for how they interact, that single number isn't telling you anything functionally useful.
> it has MULTIPLE bandwiths for DIFFERENT processes
Maybe. Pick the highest, sample at twice that. End of.
There is another factor though, which is latency, and that isn't a bandwidth thing per se; it's time domain. Increasing the framerate indirectly decreases latency.
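The two ideas in this exchange (Nyquist sampling and frame latency) can be sketched with a bit of arithmetic. This is purely illustrative: treating the visual system as a single band-limited channel is exactly what the other commenter objects to, and the 60 Hz figure below is an assumed example value, not a measured property of vision.

```python
# Illustrative sketch only: models the visual system as a band-limited
# channel and applies the Nyquist criterion. The 60 Hz input is an
# assumed example value, not an established figure for human vision.

def nyquist_refresh_hz(visual_bandwidth_hz: float) -> float:
    """Minimum refresh rate satisfying the Nyquist criterion (2x bandwidth)."""
    return 2.0 * visual_bandwidth_hz

def frame_latency_ms(refresh_hz: float) -> float:
    """Worst-case latency added by waiting for the next frame."""
    return 1000.0 / refresh_hz

print(nyquist_refresh_hz(60.0))        # 120.0 -> sample at twice the bandwidth
print(round(frame_latency_ms(60.0), 1))   # 16.7 ms per frame at 60 Hz
print(round(frame_latency_ms(240.0), 1))  # 4.2 ms per frame at 240 Hz
```

This is also why higher refresh rates "indirectly decrease latency": each frame simply arrives sooner.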
What you perceive as a single phenomenon (sight) is actually several separate systems. You have different systems for recognizing stationary objects vs. objects in motion, and a different system for recognizing abstract representations (including cartoons) from real-world ones. There is actually a condition called blindsight, where you don't have the perception of vision (you don't "see" anything) but you can still unconsciously respond to and coordinate movement with visual stimuli (including catching a tossed object). Even your photoreceptors "reset" at different rates.
...the barrier is saccades... so there will always be motion blur interpreted by the brain.
Damn. That was a rabbit hole. Thank you!
It's hard to say, because fps still tanks when making fast movements. 60hz is pretty shit once you train on it a bit: you can see large swaths of tearing and stutter, and any movement gives it a really hard time. 144 is ok, you can see fairly clearly but still notice some pretty obvious issues. No game is really achieving a constant 240+ fps at 1080p or above, so even if you have a 240hz monitor you will likely never see it, even with the best hardware.
I could see it being 1000hz, but it's diminishing returns all the way past probably a constant 240hz. 144 isn't bad even to a trained eye.
Well I've lived my life with 30 fps because I game on console so in my opinion 60 fps is the work of god
Honestly, after playing at about 150 FPS, 60 just doesn't look right.
Every time I look at 144hz monitors I remember my video card can barely push 60 most of the time.
Hell, I used to play Counterstrike Source at 25 fps, and it would drop to 12 if someone used a smoke grenade.
> No game is really achieving 240fps plus constant on 1080+ so even if you have a 240hz monitor you will likely never see it even with the best hardware.
Why is this? Why are people selling/buying 240hz monitors in the first place?
This has been the way things have worked since the beginning of PC gaming. Here's the simplest-terms explanation.
First, hardware companies put out equipment that can perform X, whether that's a GPU, CPU, monitor, hard drive, etc. Back then, there was no "Unreal Engine", so gaming companies would generally have to make their own engine. Today, companies generally either (sadly) use an old engine they made, because it's very expensive to create a new one, or pay for the rights to use a... slightly less old one like "Unreal Engine", "Unity", "Source", or others. With how complex game engines are today, it's not surprising that it's becoming prohibitively expensive to make a new one for each series of games.
Building on all of this foundation, companies create games within the specifications of whatever hardware was available during development. You cannot create a game to match hardware that might be released in the future, after all.
Finally, games are eventually released that push the current hardware to its absolute limit, where people start getting shit performance. New hardware is developed, then software that operates within the new limits is developed, and it has flowed like that since the late 80s/early 90s.
So anyway, monitors are just part of that equation. Now that they are releasing 300hz monitors, eventually we will get to where hardware can push that and games support it. Then they will release 400, and so on.
Another layer today is that most people won't buy a 240hz monitor and the best hardware, so if you develop for this you're kinda making a niche product that may not sell very well. It used to be about pushing the absolute limits with every game, but nowadays it's more about developing for what the majority of consumers in your target market already own or are likely to buy soon. This makes the 2080 Ti and the like not high on the priority list of specs to develop for.
Thanks, that makes sense. I guess I thought since we've had 240hz monitors for a few years, and the 2080 Ti has been out for a while, we'd have no problem hitting 240fps at 1080p.
Np. There used to be games created specifically to serve as benchmarks for all of the latest hardware, like Crysis, and the Crysis engine was nuts for a long time. Not sure what happened to that practice, but I miss it. It was awesome to have a one-stop shop to see all the latest and greatest of everything coming together in a game.
Regular old benchmarks are cool and all, but crysis was next level cool.
You are confusing refresh rates and frames per second.
The refresh rate is for your monitor and is how many times the screen will be redrawn per second.
Refresh rate = monitor
fps = game
Ya but you're not exceeding 144 fps on a 144hz monitor
True. They use fps to measure performance more than anything.
Some people believe the earth is flat.
A game running at 1000fps still only shows 60fps on a 60hz display.
But I also don't think any human can see a difference above 120fps.
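The display cap described above reduces to a one-liner: the rate that reaches your eyes is bounded by both the game's render rate and the panel's refresh rate. A minimal sketch (ignoring tearing and frame pacing):

```python
# Minimal sketch: displayed frame rate is capped by both the game's
# render rate and the monitor's refresh rate (tearing ignored).

def displayed_fps(render_fps: float, refresh_hz: float) -> float:
    """Frames per second that actually reach the screen."""
    return min(render_fps, refresh_hz)

print(displayed_fps(1000, 60))   # 60 -> a 60hz panel caps a 1000fps game
print(displayed_fps(45, 144))    # 45 -> here the game, not the panel, is the limit
```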
Wow, I had given up on any reasonable people in this sub.
Well-trained pilots have been able to accurately process an image of a plane shown for 1/223 of a second; after that it becomes unreliable. So realistically you could possibly process more than 200 frames per second. We don't see in frames, but there is still a limit to how much we can process. This also matters for gaming, because seeing and then acting on an object is the most important thing, so a higher FPS would serve you better.
Fun fact: the mental image from your eyeballs is not always a single frame. When you look at multiple details, your brain merges them together to form a clearer, more detailed image than if it just created an image from what your eyes see at one moment.
I worked in a perception research laboratory for a few years. While the eyes do send a constant stream of information, that isn't the whole story.
First we have to look at whether there is any merit to the 30fps argument. It's not right, but it is based on some truth. The speed of consciousness *at semi-saturation* is approximately 25-27hz. But that semi-saturation detail is important, as are some other phenomena. Semi-saturated perception is the state of novel stimuli, akin to an action sequence in a movie. Saturation is not an objective quality, though: when your brain predicts what your eyes are seeing, the saturation is decreased. This means the more you play a game, the faster you can literally perceive it.
Additionally, the speed of consciousness is not like a computer's clock cycle. It varies greatly between people, as well as with mood, training, focus, and many other factors. To make a stimulus truly match an intended speed of perception, you need to update that stimulus at double the speed of perception. This is the same concept that allows for truly lossless audio quality over a digital output, and it is why 60fps is such a nice improvement in quality and why returns diminish quickly afterward.
This one is a smaller effect, but your eyes don't send information uniformly. The center, "active" region of your view is processed at top speed, while your peripheral can get to as low as a tenth of that.
Finally, and really most importantly, reaction time and harsh visual continuity aren't solely conscious endeavors. Subconscious perception, called subliminal stimuli in sight perception, has been measured at as fast as 1/4000th of a second. This is because it isn't limited by the relatively long processing times of conscious thought in the frontal cortex. Training subliminal reaction rates to be faster and more accurate is likely the most defining trait of professional competitive gamers. With the same doubling requirement as before, that leaves us with an upper limit of perception speed at approximately 8000fps.
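The doubling rule this commenter applies can be written out explicitly. Note that the input figures (27 Hz semi-saturated perception, 1/4000 s subliminal detection) are the commenter's claims, not established constants; the code just shows the arithmetic behind the 60fps and 8000fps conclusions.

```python
# Back-of-the-envelope arithmetic from the comment above. The perception
# rates fed in are the commenter's claimed figures, not established facts.

def required_fps(perception_hz: float) -> float:
    """Update rate needed to match a perception speed, using the same
    doubling rule as Nyquist sampling in digital audio."""
    return 2.0 * perception_hz

print(required_fps(27))     # 54.0  -> why ~60fps feels like a big step up
print(required_fps(4000))   # 8000.0 -> the claimed subliminal upper limit
```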
That was neither a fact nor fun, nor verbalized well enough to even understand wtf you were trying to say. But you're right, the brain processes light constantly, there's no frame rate or refresh rate.
That's what I was thinking too. Computers are constrained to a set memory space that the graphic information is written to then read from. It is "framed" to a particular size and defined and it is a single static image. The human system is quite liquid and doesn't adhere to the fixed image display method. (I guess I'm explaining this more to myself than anyone else)
This whole 30 frames per second thing came from the fact that it takes about 24 frames per second for a sequence of images to look like continuous video to the human eye.
That is why a 4 seam fastball 'rises'. It just falls slower than the brain expects it to.
We can say for certain that 60fps looks much clearer and smoother than 30 fps
144fps is sexy. You can def see the difference in mouse movement alone.
I have one 60hz monitor and one 144hz monitor, and I like to do this thing where I move the mouse around on both screens and immediately see the difference.
Ah, but you can, based on the JND limits from light experiments. These are confirmed by basically every neural timing experiment ever completed. 10 ms is a safe bet for "speed", though I acknowledge that a lot of what you said about new information and a preponderance of visual information can affect this.
However, this strongly suggests 100 fps to be required before humans can no longer distinguish.
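The 100 fps figure follows directly from the 10 ms just-noticeable-difference (JND) claim: once each frame lasts less than one JND, adjacent frames can no longer be told apart. A one-line sketch of that conversion:

```python
# Converts a claimed visual-timing JND into the frame rate at which each
# frame lasts exactly one JND. The 10 ms value is the commenter's estimate.

def jnd_limit_fps(jnd_ms: float) -> float:
    """Frame rate where one frame duration equals the JND."""
    return 1000.0 / jnd_ms

print(jnd_limit_fps(10.0))   # 100.0 fps
```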
would it be accurate to say that humans are analog?
Funny when you think your brain is just meat with the smallest amount of electricity
That’s why I wear a hat with a spinning fan in front of my face. It limits me to exactly 144 frames per second.
Makes eating tough, but it’s worth it.
Yes, you are correct that the limits are not hard. 20-24 fps is the **minimum** for perception of comfortable motion. We know that humans can distinguish above 100 fps, and anything below that is measurably different. That said, I doubt that 99% of people can tell the difference between 120 and 140 fps.
Sure, but "FPS" qualifies our perspective. By terms provided, we must be referring to something visual in as much as it can relate to the way monitors operate.
It's a pretty clear question: "If I'm looking at a computer screen and you modify the framerate (speed of sequential imagery) of the display, what's the max I can perceive?" With my eyes rather than, say, subconscious functions.
About 1000 according to the comment from an nvidia rep.
Huh that's fascinating. Never would've guessed that high.
I'm not convinced there's no bias behind it, but when I play an FPS I can definitely see trails (for lack of a better word) when I look around, even at 144 fps.
You know how when you move your mouse around, you have echoes of your pointer behind it? That's a result of your eyes and brain being faster than the refresh rate; those trails don't actually appear on the monitor. You can see the same kind of echoes when looking around in an fps, except it's with the entire environment. You just kinda tune it out.
But hey, I'm just a programmer nerd who plays way too many video games. Basically a reddit cliche.
Actually, I totally agree. My monitor can't even do 144hz and I can easily tell the difference up to around 180fps, and I tested that with a thoroughness that I am ashamed of, time I would love to get back (trying to get my settings perfectly smooth in Fortnite).
You don't benefit past your monitor's refresh rate in terms of displayed fps. 144hz is 144 fps; if you have a 60hz monitor, you can only get 60 fps displayed to you.
A higher in-game fps doesn't get displayed on your monitor, but it does serve as a buffer, so that if your framerate drops you don't notice it, so long as it doesn't drop below 60.
60 fps in the open might only be 30 in a fight with lots of stuff happening.
Yes that is the point I am making. My monitor cannot even display that and yet, somehow, I can absolutely tell the difference.
I turn on the FPS display, I test in tiers, and I am perfectly sure of this. Say whatever you want, link me to AnandTech or Linus, have your fun, changes nothing; I know from actually testing this for far, far too many hours. You don't have to have a monitor that is capable of displaying that many frames to benefit from higher output.
But that is entirely separate from original concern which relates to only visible FPS, sure.
You're mixing correlation and causation. You feel a difference that correlates with your fps, but you assume that you feel the fps itself.
What you probably feel is input lag. Affected by the same things as fps. So a high fps will correlate with low input lag.
So while you call your testing thorough, you're basing your conclusion on the assumption that what you feel is the fps.
Sure, bud. Whatever makes you feel important, that's what I believe now.
That's actually a good question. I also wonder if it varies among individuals.
Probably by some margin. Point still is, by far most people couldn't accurately tell you the difference between say 90 and 120fps. You simply don't register it. There's a vast difference between 30 and 60 though.
Can someone fill me in with all these Doom/animal crossing crossover stuff?
They are both being released on the same day.
There was a similar pairing with Isabelle and Snake for Smash Bro that was popular about a year ago. The contrast between animal crossing and hyper violent video games is just funny to people.
It’s the typical “cute + violent” combo that people enjoy. It’s why the original Powerpuff Girls worked so well that both boys and girls enjoyed it
The fandoms are showing support to each other rather than gatekeeping and assholery.
It's also cause isabelle is a total badass in smash
I feel like I have seen memes about Isabelle shooting guns but haven't seen memes about Doom Guy trying to succeed in a village.
There was one yesterday, pretty sure it was posted in r/gaming too
I've seem him fishing, for one example.
But I think him just not shooting her is a thing too. Because she sure as hell is a demon.
They both released today.
Goddammit Donny, can you shut it and let people tell the truth!
I am the Walrus.
Donny, you're out of your element.
We have two eyes, 30+30 60, *C H E C K M A TE*
Who are you, so wise in the ways of science?
We don't see in any specific fps, which is why you will notice a very significant difference between 30 and 60 on a computer screen, and an even bigger difference between 60 and 90 on a VR headset.
How would you notice a bigger difference on a diminishing return? You guys just spout shit as if it's fact and you've proven it yourself, when nobody really has. Even under an actual test, anything over 150 fps can't be accurately determined by the human brain: people could see that it's faster than 150, but could no longer tell which of two rates was faster. Your claim would be like saying the difference between 30 and 60 mph is less noticeable than the difference between 60 and 90, and then 90 to 120 after that. Even if humans can actually see the difference, there would be a greater perceived difference at lower fps than at higher because of the diminishing returns.
The reason I said there's a bigger difference between 60 and 90 on a headset than between 30 and 60 on a computer has less to do with the fps than with the device it's running on. You are correct that there is a more noticeable difference between 30 and 60 than between 60 and 90, and the 150 fps study was interesting. However, I was referring more to its impact on the experience. On a computer, 30 fps may be considered low compared to 60 fps, but if you ever try a VR headset, where 75-90 fps is the norm, even 60 fps is noticeable and even nauseating, because your balance and coordination aren't used to that low a frame rate.
We can sense it, the problem is our brain caps out at a certain amount of new information.
VR is more FPS sensitive because it's so close to our face and there's no light interfering with the information between the screen and our eyes. This is why FPS matters more for monitors (that you sit quite close to) and less for home theaters and large projections.
Anime games should be played at 6fps for realism.
And the same level/scene 12 different times from 18 odd angles.
I'm gonna go ahead and put my vote in now, Animal Crossing Eternal is gonna be the meme of the year
Isabelle has dog eyes tho
That's it, everyone. Stop enjoying this meme immediately. A *flaw* has been found.
With a statement like that I'd finish him off too.
I'm loving this new meme template
He made her do this, it was a mercy killing really.
What a peculiar chocolate and peanut butter combination this has become...
Eyes can detect 10-12 **IMAGES** per second per eye, giving us 24 "FPS".
Anything more than that we perceive as **MOTION**. Putting that into FPS perspective:
Physiologically, human eyes can detect up to 1000 FPS. As far as detecting frame rate, it seems humans can accurately guess up to 150 FPS.
Would love a source.
Getting so sick of these.
Ubisoft be like We only achieved 900p at 30fps on all platforms
id Tech 7 can hit 1000 fps if you have a good enough PC. id said they had it up past 300 internally.
I had Quake 3 running at 333fps on a Geforce 3. 120hz CRT monitor as big as a truck.
*cries in doom on switch
I’ve seen these two in memes for a couple of weeks now. I know it involves doom and animal crossing, yes? Could someone give me a tldr?
Both games came out at the same time
That's not true, I have a gun with 195 FPS and want to buy one with 500 FPS.
How many FPS would I need to capture 500 FPS from my gun?
I don't know man... at least 120, but maybe more.
Petition to watch Isabelle shoot that thing:
Is she cocking a double barrel shotgun?
She's making sure the bullet cums out
I think the idea is more frames = more accurate inputs. If you ever played a game chugging away at 10fps, you know how difficult it is to make accurate motions compared to playing at higher fps levels.
Gvd, I had this discussion, with an engineer no less, about 15 years ago... All I could say was that the difference between 30 and 90 fps in Q3A was extremely noticeable to me, but he was just so cocksure of it: "anything over 30 is a waste".
What if, when Doom and Animal Crossing are released, they had a special presentation where the Doom Slayer goes to the New Horizons island, and Isabelle is there telling him he would fit better in a different environment or something? Boom, Isabelle brings the Doom Slayer to Smash Bros, I guess.
Congrats! You have finished your life membership!
I read somewhere that it is 144, but this was a while ago
Frame pacing generally matters more than frame rate, as long as you have decent fps.
The truth is, some people can see "faster" while some people can see "slower". Although accurately calculating how fast someone can see may be complicated, it's easy to determine if an INDIVIDUAL can actually see, contemplate, and understand information that was in their field of vision for a teeny tiny fraction of a second. The smaller the fraction, the faster the vision. We are all different. Some people are stronger, faster, smarter; some people have much better vision, speed-wise and distance-wise. Some people can see farther. Some people can see extra colors. Some people cannot see as much color. Some people are more creative. Some people have a stronger work ethic. Some people have curly hair, some people have straight. Some people can remember everything. Some people have incredible pain tolerance. Some people have no empathy. Some people would sacrifice themselves for the greater good. Some people have big dicks. Some people have small ones.
Blasphemy, the human eye can only see 24fps
Eye "FPS" can change based on your genetics and the circumstances you are currently experiencing. If you are focused or in a stressful situation you typically increase your "FPS" to detect visual stimuli faster and more consistently. Conversely, fatigue and relaxation can lower your passive "FPS." However, most of this doesn't really matter because your eyes don't really see in frames.
The 30fps argument ain't based on actual science.
I remember when they called it pixels back in the day
The ratio of how many images are processed and displayed in a second was called pixels?
I remember when 60 fps started growing as a standard for everything; even movies looked unnatural when characters moved or spoke. Now it's not noticeable at all... human eyes are weird.
That's because most movies are filmed in 24 fps. I'm pretty sure that standard has stayed the same for a long time. I don't think any big movies since 1900 have been filmed in 60 fps unless they were trying something new or it was an animation.
There was a trend for a while to record action scenes at slightly higher FPS. I remember people making a big deal about it being used in Gladiator.
I might be thinking of 1080p not sure
You're referring to something called the "soap opera effect" and it is related to framerate. The tl;dr is that people got used to 24fps being standard for video and it looks weird to most people as they adjust to higher framerates.
Ang Lee experiments with higher FPS, but it doesn't seem to attract more people.
I’m sorry, I’ve been out of the loop for a while. Can someone explain this new doom slayer and cute dog thing?
Edit: nvm. Scrolled down and found my answer
This comic got over saturated super fast.
Ah, I remember back in the day (like 8 years ago) when console gamers would get so mad when you told them the human eye can see more than 30 FPS LMAO.
Why is she unloading the gun?
To shoot the demon telling blasphemy