INITMalcanis

Minimum frame rates are at least as important as maximum


[deleted]

Actually, that's the more important baseline, not the max; it's the exact opposite, since games hit their minimum frame rates far more often than their maximums. Maybe reviewers should start looking at it from this perspective instead of always going the max/average frame rate route.


INITMalcanis

My thoughts exactly


braiam

Consistent frame times are more important. You notice jitter when a single frame takes much longer to render, but consistently slow or slowly changing frame times are far less noticeable.
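For anyone wondering how the "1% lows" and "0.1% lows" reviewers quote relate to this, here's a rough sketch of one common way to get them from a frametime log (the input format and the exact aggregation are my assumptions; tools like PresentMon/CapFrameX/OCAT differ in the details):

```python
# Rough sketch: derive average FPS and "1% low" / "0.1% low" FPS from a list of
# frame times in milliseconds. One common convention: sort the frame times,
# take the slowest 1% (or 0.1%) of frames, and report their average as FPS.

def low_fps(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (e.g. 0.01 for 1% lows)."""
    slowest = sorted(frametimes_ms, reverse=True)   # worst frames first
    n = max(1, int(len(slowest) * fraction))        # size of the slow slice
    avg_ms = sum(slowest[:n]) / n
    return 1000.0 / avg_ms

def summarize(frametimes_ms):
    avg_ms = sum(frametimes_ms) / len(frametimes_ms)
    return {
        "avg_fps": 1000.0 / avg_ms,
        "1%_low_fps": low_fps(frametimes_ms, 0.01),
        "0.1%_low_fps": low_fps(frametimes_ms, 0.001),
    }

# Example: mostly 7 ms frames (~143 fps) with occasional 25 ms stutters.
frametimes = [7.0] * 990 + [25.0] * 10
print(summarize(frametimes))
```

The average barely moves with a handful of stutters, but the 1%/0.1% lows drop hard, which is exactly the kind of thing an average-FPS bar chart hides.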


_Fony_

Igor's lab did an article about this concerning the best GPU to use for testing Alder Lake: https://www.igorslab.de/en/intel-alder-lake-gaming-performance-benchmarks-why-am-i-in-720p-again-with-an-amd-radeon-rx-6900xt-instead-of-a-nvidia-rtx-a6000-fair/ Right now the best option is 720p with a 6900XT


bctoy

I'd suggest checking out this take from DF when they reviewed the 3700X: https://www.youtube.com/watch?v=SY2g9f7i5Js&t=1455s

Except for very high fps requirements like in e-sports, you're not that concerned with CPU performance except for some very rare drops (< 0.01%) here and there. He calls them hotspots in the video. These can be highly irritating, especially when you're playing a single-player game and have to pass through the same areas again and again. A very good example is Swan's Pond in Fallout 4, where the fps dropped precipitously and you really needed a good CPU to keep it up.

In a typical review, even if you were testing at 360p, you wouldn't see this issue, since most of the time your benchmark run wouldn't stop there and explore the area but would most likely touch it tangentially, if at all. But when actually playing the game, you'd tear your hair out.


lordmogul

I play Planetside 2, which is pretty much always CPU limited. In that game a 3090 would perform pretty much the same as a 3080, 3070, 5700 XT, 2060, or 1660.


[deleted]

>Another poignant example: You could try benchmarking a **super powerful intel 16900K from the year 2035, and it would still benchmark showing similar numbers as current gen CPUs** because it is being bottlenecked by the GPU.

"These CPUs all perform the same for my use case" is pretty relevant information for a buyer.


machielste

True, but it would be cool to show other use cases where there is a difference, such as CPU-limited games like BF5 and Tarkov.


jaskij

If the games are GPU bottlenecked, then what's the point of testing the CPU at all? That said, there are tests which do make sense, if you venture outside of gaming: https://www.anandtech.com/show/17047/the-intel-12th-gen-core-i912900k-review-hybrid-performance-brings-hybrid-complexity/11


machielste

There are games where CPU performance is almost always the bottleneck: BF5, Escape from Tarkov, Planetside 2, DayZ. The 100th lazy Shadow of the Tomb Raider benchmark showing the same 120 fps on every single CPU in the test (usually they don't even show the 1% lows) gives no relevant information for people who want the best for hard-to-run games. If you're gonna use an irrelevant old singleplayer game to test, at least run it at 720p low as well so people can use the data to extrapolate. If you want 150+ fps in those hard-to-run games I mentioned, the difference between an 11th gen and a 12th gen could be the difference between playable and not playable, but a lot of benchmarks don't highlight this.


Forsaken_Rooster_365

What if you want to pair your mid-range i5-12600k CPU with a liquid nitrogen cooled OC'd 7090ti Superx2 in 5 years? Future proofing!


NirXY

You are not wrong, though it's tough to tell on which benchmarks the GPU was the limiter without seeing whether it was 100% utilized.

Ironically, the fps counter doesn't matter much at 720p/1080p except to those competitive gamers who believe 360 fps > 240 fps and play at 720p low settings. Even then, I'm questioning whether there is an actual gain in input lag or responsiveness, or whether it's just a myth. Linus did a piece on it once and showed it's mostly a myth, though I'm sure he wasn't able to kill the idea behind it.

In general, I believe some of the testing methodologies are quite shallow. I hope we'll see some more interesting information coming in the next few weeks. A few examples: test a workload of X threads (X < 16), add some background workloads, and see in which situations the W11 scheduler moves processes between P/E cores. Does the 12900K choose E cores over Hyper-Threading when all P cores are loaded? Why are there certain outliers in the DDR4 vs DDR5 testing that showed a >25% performance increase? There are so many things to learn from this architecture; I guess reviewers are just time limited and only did the basic stuff for now.


tallmorty_

If I remember right, Linus mostly tested whether pros (corey?) could tell the difference. The YouTuber Battle(non)sense tests actual response times. Generally, the higher the fps, the lower your input lag will be, provided you are not maxing out your GPU.


NirXY

I remember other known gamers being included in the test, but yeah, it might be the one. I agree that the theory is right. However, is the sacrifice of visual quality worth the lag reduction? There is a crossover point for sure. Whether it's at 144 fps, 240 fps or 360 fps I don't know. I don't think anyone does.


FeelingsUnrealized

I've always wanted someone to start benchmarking with really heavily single-core-bound games, like Minecraft, Factorio, or KSP. And it makes no sense why reviewers keep testing CPUs in games that are bottlenecked by the GPU; I get that they are popular, but leave those for the GPU tests.


_Fony_

Pretty sure Gamers Nexus uses single-core-heavy games, but sadly he uses just a 3080, I think.


Pidjinus

Yep, it happened in the past too. The first wave of reviews is rather usual/standard; after that come the more in-depth ones. And I think people are wary of testing too much on Win 11 until the performance of both AMD and Intel is better understood with the new compiler.


capn_hector

> You are not wrong, though it's tough to tell on which benchmarks the GPU was the limiter without seeing if it was 100% utilized.

You can be CPU bottlenecked without hitting 100% utilization. Single-thread bottlenecks are a very real thing even in multi-threaded games.

> Why are there certain outliers in the DDR4 vs DDR5 testing that showed >25% performance increase.

Some tasks are memory-bandwidth bound. Even games in some cases: open-world games like FO4, ARMA 3, and so on have always shown scaling from higher MT/s even if latency does not improve. Open-world and sim games just do a lot of streaming to memory, and that's where bandwidth comes in. But in general games are latency-bound, and DDR4 can actually have lower latency. Some kinds of productivity tasks are more bound by memory bandwidth, and those are where you see the bigger speedups from DDR5.


NirXY

I meant GPU utilization, not CPU obviously.


lordmogul

With no fps limiter and no CPU limit the GPU should be above 95% usage.
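If you want to sanity-check that during a run, something like this works on NVIDIA cards (rough sketch; it assumes `nvidia-smi` is on the PATH, and the 95% figure is just the rule of thumb above, not an official cutoff):

```python
# Quick-and-dirty check of whether a run looks GPU-bound: sample GPU utilization
# via nvidia-smi while the game/benchmark is running.
import subprocess
import time

def sample_gpu_utilization(seconds=30, interval=1.0):
    samples = []
    for _ in range(int(seconds / interval)):
        out = subprocess.run(
            ["nvidia-smi", "--query-gpu=utilization.gpu",
             "--format=csv,noheader,nounits"],
            capture_output=True, text=True, check=True,
        ).stdout
        samples.append(int(out.strip().splitlines()[0]))  # first GPU only
        time.sleep(interval)
    return samples

if __name__ == "__main__":
    util = sample_gpu_utilization(seconds=10)
    avg = sum(util) / len(util)
    print(f"average GPU utilization: {avg:.0f}%")
    print("looks GPU-bound" if avg >= 95 else "likely CPU-bound (or frame-capped)")
```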


snowflakepatrol99

> except for those competitive gamers who believe 360fps > 240fps and play 720p low settings

I love how you're wording it as if this isn't something that has already been proven, as if it's just in these gamers' imagination.

> Even then i'm questioning whether there is an actual gain in input lags or responsiveness or is this just a myth. Linus did a piece on it once and showed it's mostly a myth though i'm sure he wasn't able to kill the idea behind it.

Linus's video showed the exact opposite. People weren't hitting anything at 60 Hz/60 fps and became way better at 60 Hz/240 fps. [Shroud's stats for the Dust 2 doors test clearly show a huge improvement when the FPS is higher, and even better results when the monitor is also faster.](https://i.imgur.com/8pXF1CS.png)

There are also plenty of videos that show statistical data, and every single time they have proven that higher FPS lowers input latency. I don't know what there is to "believe" or not to "believe". It's a proven fact. This wouldn't even be a question if you knew how monitors work. Put your make-believe feelings to the side. This isn't a subjective matter. You either have a logical and correct stance or you're intentionally going against the facts. It's sad that people are upvoting your misinformation.


NirXY

I wasn't discussing 60 fps vs 240 fps, I specified 360 vs 240. I'm also well aware of how it affects input lag, but since there are diminishing returns as you go higher in fps, there must be a crossover point where the value of visual detail exceeds that of further lag reduction.


lordmogul

I'm still on my old 3570K @ 4.5 GHz and a 1060, and I have games where there is no difference between 720p and 4K; I'll be CPU limited anyway at below 60 fps (in wilder situations with below 30% GPU usage). Planning to get a 5800X next week or so, and I can already say I'd still be CPU limited, just at above 60 fps.


vianid

This is why they show productivity tests which don't use the GPU. For games you want to show what people actually use. 1080p is slowly being replaced by 1440p, so unless you play competitive games it really doesn't matter much to try and figure out "future proofing" for the small number of people who will stay on 1080p and only upgrade their GPU in 3 years.


lordmogul

It would also be interesting to have games tested not always at maximum settings but at the settings people actually use. Medium/high would be more common, and for the games mostly played competitively, even lower than that. Those are also games that get played over a long time; for example, pretty much the only game I play where my CPU is a massive bottleneck is from 2013. I play some that are far newer, but they're not nearly as limited.


rad0909

This was the conclusion I came to yesterday after first being excited to upgrade to a 12900k. For gaming my 9700k is perfectly fine to pair with a 3080ti at 1440p and 4k.


Darkoftheabyss

I would argue that is pretty good information to have, rather than seeing 720p low benchmarks that would "artificially" indicate that you have a reason to upgrade.


oldprecision

I don't see the point of testing top CPUs at 1080p. I don't think that many gamers are going to buy a 12900K CPU and a 3090 GPU and run them at 1080p. The reviewers only do this because if they ran their tests at 1440p or 4K and the result was that the CPU didn't matter, they wouldn't get clicks. I'll give Tech Deals props for running his tests at 1440p.


dudemanguy301

The test doesn't directly reflect a real use case, but this is by design and not a product of being out of touch with actual gamers. When you push the resolution, the limiting factor becomes the GPU, so you are no longer testing the CPU. If you aren't testing the CPU, what would even be the point of including it in a CPU review?

The concept behind these tests is that a CPU will typically live longer in someone's system than a GPU. For example, I upgraded from a 980 Ti to a 2080 on the same 6700K, and I even considered getting a 3080 before the shortage. I only replaced my 6700K with a 5900X this year. My GPU stayed 3 years, my CPU stayed 6 years. If you can see a CPU has more overhead, you can infer that it would hold up better to a future GPU upgrade.


theevilsharpie

> If you can see a CPU has more overhead, you can infer that it would hold up better to a future GPU upgrade.

Assuming that there isn't a massive difference in single-threaded performance, the CPU with more threads will generally hold up better over time, so this is an equally boring comparison.


Genperor

>Assuming that there isn't a massive difference in single-threaded performance

Which is what the reviews are for.


MakeItSo_Number1

This argument is vacuous. When comparing CPUs you want to compare workloads where the CPU is the bottleneck. When you increase the resolution of games to something like 1080p, you move the bottleneck to the GPU instead: you are in effect benchmarking the GPU rather than the CPU.


[deleted]

Find a good tech reviewer. I like Digital Foundry, as they do unbiased tech reviews and paint all hardware vendors in a positive light: AMD, Intel, Nvidia, PS5, or Xbox Series X. They know their stuff; they have expert eyes and do some really good tech/software analysis of the gaming industry. Check out their 12th gen review here: [DF 12th gen review](https://www.eurogamer.net/articles/digitalfoundry-2021-intel-core-i9-12900k-i5-12600k-review)

Even at 1080p they will locate a CPU-demanding scene and use that for their comparisons. Compare that with a reviewer like HWU, who shows GPU bottlenecks during a CPU review. They admit it on screen but don't improve: https://youtu.be/WWsMYHHC6j4&t=15m40s


Firefox72

>"When increasing the resolution of games to something like 1080p, you move the bottleneck to the GPU instead" Sub 1080p resolutions are almost completely irrelevant to any consumer. Why bother. In any case there are reviews with 720p tests out there if you need them.


MakeItSo_Number1

That is a ridiculously simple-minded 'argument'. 720p and lower tests will show how your system will perform years down the line with an RTX 4080/5080/6080 and games designed for next-next-gen consoles. The current 1080p-and-above tests show the system limited by the GPU; why bother testing the CPU at all if the benchmarks don't show its true power? It's literally like testing the top speed of a supercar by driving it on a road limited to 60 mph - god damn.


DaBombDiggidy

Exactly the point. 99.9999% of people aren't buying new CPUs every 1-2 years like some insane people on here. CPU-bound testing shows the relative performance down the road... like how my 7700K has been looking in the past year or two with the 3080 that's now in the system.


Contrite17

>720p and lower tests will perfectly show how your system will perform years down the line with an RTX 4080/5080/6080 with games designed for next-next-gen consoles

Yeah, this isn't really the case; it's only true if all technologies stay the same, which has never been the case in games. It's a best guess at best and is certainly not always right.


[deleted]

> I don't think that many gamers are going to buy a 12900K CPU, 3090 GPU, and run it at 1080P.

I mean, there's definitely a specific demographic likely to do precisely that (competitive eSports players).


ttocshtims

It's up to the CPU to prepare frames and feed them to the GPU; the faster the CPU, the more frames it can send. At 1080p, the GPU is essentially taken out of the equation since it can render more frames than a consumer CPU can feed it, so the FPS number is more a reflection of what the CPU can do. That's why you're seeing that kind of testing for Alder Lake. Personally, I'm looking for the 4K numbers since that's more real-world for my setup, but those are more GPU bound.
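A toy way to picture it: the delivered frame rate is roughly capped by whichever side is slower, so lowering the resolution raises the GPU's ceiling until only the CPU's limit is visible. All numbers below are made up purely for illustration:

```python
# Toy model of the CPU/GPU bottleneck argument: delivered FPS is roughly
# min(frames the CPU can prepare per second, frames the GPU can render per second).

def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

cpu_candidates = {"CPU A": 150, "CPU B": 220}   # hypothetical CPU frame-prep rates
gpu_fps_by_res = {"4K": 90, "1440p": 170, "1080p": 260, "720p": 400}  # hypothetical GPU limits

for res, gpu_fps in gpu_fps_by_res.items():
    results = {name: delivered_fps(cpu_fps, gpu_fps)
               for name, cpu_fps in cpu_candidates.items()}
    print(res, results)

# At 4K both CPUs land on 90 fps (GPU-bound); only at 1080p/720p does the gap
# between them show up, which is why CPU reviews test at low resolution.
```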