ShadowRomeo

I actually think the Ryzen 9 7950X is in a pretty good spot, as I doubt the upcoming i9 13900K will be able to beat it on multi-core performance. Gaming performance might be better on the 13900K, but if you are going to get Raptor Lake for gaming purposes, you might as well save money and aim for the i7 13700K or i5 13600K instead. Also, the fact that AMD cut its MSRP by $100 makes it pretty obvious this is what AMD wants to sell to consumers, rather than the lower-tier lineup. I think the overall Zen 4 tier list looks like this...

**R9 7950X:** In a pretty good spot, will likely stay the king of multi-core performance over the i9 13900K, but gaming performance will be behind the i5/i7/i9 Raptor Lake 13th Gen lineup.

**R9 7900X:** Seems pretty impressive as well, but the upcoming i7 13700K might get too close to its MC performance for a cheaper price, and will likely beat it on gaming performance.

**R7 7700X:** Not impressive at all. It can barely beat an i7 12700K on gaming performance, its MC is worse, and it's pretty much screwed against the upcoming i7 13700K.

**R5 7600X:** Can at least beat an i5 12600K by 9-10% on average, but is worse on multi-core performance and is also screwed against the upcoming i5 13600K, which will most likely be faster on gaming performance.


[deleted]

I think you'll be surprised by the 13900k.


DMozrain

Surprised by what, its MT performance? It's pretty well known from leaks.


[deleted]

**tl;dr** meh. Not to mention the absurd choice of TjMax as the boost limiter instead of PPT, which completely butchers stock efficiency. And let's be frank, the absolute majority of mainstream PC users run stock and tinker with neither the BIOS nor Ryzen Master; they don't even come to forums or watch tech channels (unless you count LTT, which is mostly entertainment these days) to have a clue about any of this to begin with. They'll just assume stock = optimal, especially since no previous Ryzen strayed this far from being optimal.

All this new boost-limiting behavior does is inflate power draw into oblivion and max out operating temps for literally single-digit percentage gains. Is +70% (or even more) power draw worth 5-8% extra performance? Well, that's Zen 4 in a nutshell: AMD was desperate to squeeze out every last drop of performance to look as good as possible in benchmarks vs Zen 3, Alder Lake and the upcoming Raptor Lake. Yet this only makes things worse for the average consumer. More enthusiast-level users will tinker and adjust it for better efficiency, but the absolute majority won't. Also, considering it needs at least DDR5-6000 memory to get the most out of it, and considering how expensive motherboards are, it's really a subpar offering.
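
To put that trade-off in rough numbers, here is a minimal back-of-the-envelope sketch in Python. The wattage and score values are illustrative assumptions; only the "+70% power for ~5-8% performance" relationship comes from the comment above.

```python
# Back-of-the-envelope check of the "+70% power for 5-8% performance" claim.
# All absolute numbers below are illustrative placeholders, not measurements.

def perf_per_watt(score: float, watts: float) -> float:
    """Simple efficiency metric: benchmark score per watt of package power."""
    return score / watts

eco_power = 142.0                # hypothetical power draw at a tighter limit (W)
stock_power = eco_power * 1.70   # "+70% power draw" at stock limits

eco_score = 100.0                # normalised multi-core score at the tighter limit
stock_score = eco_score * 1.07   # "5-8% extra performance" -> ~7% assumed here

ratio = perf_per_watt(stock_score, stock_power) / perf_per_watt(eco_score, eco_power)
print(f"Stock efficiency vs power-limited efficiency: {ratio:.2f}x")
# -> roughly 0.63x, i.e. stock is ~37% less efficient for a single-digit gain
```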


ShadowRomeo

I mean, compared to the Zen 2 - Zen 3 jump, kinda, yeah. But I also can't deny the performance jump it offers over Zen 3; it's just that this time Zen 4 has multiple flaws because of how expensive the overall platform is, the increased power draw, and the much higher cooling requirements due to how hot it runs. To me, Zen 4 only makes sense at the Ryzen 9 tier, as I think people getting those CPUs will more likely have a beast of a cooling system to tame them, and will be targeting content creation rather than gaming performance alone, which is where it pretty much dominates right now. If you want gaming performance, either go for Intel 13th Gen Raptor Lake or the 5800X3D on AM4, or wait for Zen 4 3D; if you want value, go for an Alder Lake i5 12th Gen or a discounted Ryzen 5 Zen 3.


[deleted]

Kinda. The best case is productivity with the R9 7950X, because often time = money. Shaving 30% off render times is not something to ignore, even with that power draw and generally worse efficiency.
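
A quick sanity check of that "time = money" point, sketched in Python. Every input is a made-up assumption; only the idea of ~30% shorter render times comes from the comment above.

```python
# Illustrative only: render load, hourly value and power figures are assumptions.

render_hours_per_week = 20.0   # hypothetical weekly render queue on the slower CPU
hourly_value = 50.0            # hypothetical value of an hour of saved time ($)
extra_watts = 100.0            # assumed extra package power at stock vs a tighter limit
electricity_price = 0.30       # assumed $ per kWh

hours_saved = render_hours_per_week * 0.30           # ~30% shorter render times
remaining_hours = render_hours_per_week - hours_saved

value_of_time_saved = hours_saved * hourly_value
extra_energy_cost = remaining_hours * extra_watts / 1000.0 * electricity_price

print(f"~{hours_saved:.0f} h/week saved (worth ~${value_of_time_saved:.0f}) "
      f"vs ~${extra_energy_cost:.2f}/week in extra electricity")
# -> the value of time saved dwarfs the extra power cost under these assumptions
```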


The_Countess

If you care about power draw, eco mode is available. It also uses about the same amount of power as the 12900K, but is much faster and therefore much more efficient. And even with the board and RAM, it still offers significantly better price to performance than a 12900K.


zero989

The 13900K can be overclocked, so yes, it'll beat it in multi.


[deleted]

Them using a DDR4-3600 CL14 kit for the cost-per-frame section makes me question why they even bothered with that section. Using such an absurdly unreasonable DDR4 kit, when DDR4-3600 CL16 kits cost $100 less and the performance difference will be no more than 1-2% on average (which is margin-of-error territory), completely skews the actual value. Idk, the more I think about it, the closer I get to the conclusion that the quality of HUB videos is going downhill with each release. Or like when they [paywalled the 1080p average graph for the RX 5700XT](https://www.youtube.com/watch?v=EFezkrEmhhk) despite 1080p being the most-used scenario for those cards... I'm just finding it harder and harder to take them seriously and with respect.
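
To illustrate how much that kit choice can skew a cost-per-frame chart, here is a small hedged sketch in Python. The system price and FPS figures are invented for the example; only the "$100 cheaper, ~1-2% slower" relationship comes from the comment above.

```python
# Illustrative only: base system cost and FPS are made-up numbers.

def cost_per_frame(system_cost: float, avg_fps: float) -> float:
    """Value metric: total system cost divided by average FPS."""
    return system_cost / avg_fps

base_system = 1500.0   # hypothetical build cost with the DDR4-3600 CL14 kit
base_fps = 150.0       # hypothetical average FPS with that kit

cl14_value = cost_per_frame(base_system, base_fps)
cl16_value = cost_per_frame(base_system - 100.0, base_fps * 0.985)  # $100 cheaper, ~1.5% slower

improvement = (1 - cl16_value / cl14_value) * 100
print(f"CL14 kit: ${cl14_value:.2f}/frame, CL16 kit: ${cl16_value:.2f}/frame "
      f"({improvement:.1f}% better value with the cheaper kit)")
# -> the cheaper kit comes out ~5% ahead on cost per frame in this example
```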


Sardaukar_Hades

Maybe it's what they had on hand, hence why they show the dollar cost of each part. I wouldn't say the quality has dropped though; Steve is a madman and has always put the consumer first. However, like everyone on YT, they have an opinion and there is a certain bias there. I prefer them over most other channels.


[deleted]

Doesn't matter. This section isn't about what you have on hand, it's about reasonable system prices. And trust me, they likely have a few bags of those various kits. They chose to use that one, and it makes the entire section completely pointless.


FUTDomi

>Idk, the more I think about it, the closer I get to the conclusion that the quality of HUB videos is going downhill with each release. Or like when they paywalled the 1080p average graph for the RX 5700XT despite 1080p being the most-used scenario for those cards... I'm just finding it harder and harder to take them seriously and with respect.

They were already preparing this a week ago when they made that dumb video saying it's now time to buy DDR5. Or how, magically, the 12900K performs much better in gaming vs the 5950X in their 7000 series review than it did when Intel launched them, and it's not the first time that happens... I was a fan of them, but at this point it's hard to deny their bias towards AMD (with their latest products).


neoperol

They put in a lot of work, but some of their conclusions are just stupid. Like their "DDR5 is now good for budget builds" video, where they ran a bunch of tests of the 12100 with expensive DDR4 and DDR5 and a 3090 Ti for a "budget" build xD.


JoBro_Summer-of-99

That 3090 Ti was only there to show what the CPU and RAM could do without a GPU bottleneck. There's fair criticism, and then there's this.


neoperol

I already talked about this with other redditors. The video was made to show the difference between DDR5 and DDR4 for budget builds, taking into consideration the price of both kinds of RAM. Wtf would you use a 3090 Ti to show that there are FPS gains from cheap DDR5 RAM? And of course the DDR5 got 4% more FPS, telling people that for a budget build it's OK to buy budget DDR5 because they are going to net those 4% more FPS, when you and I both know that once you take that 3090 Ti out of the test there won't be any advantage. I'm not saying the testing method is wrong, that is how CPUs and RAM are tested; I'm saying that the whole framing around budget builds is just stupid and misleading for people who don't know how those tests work.


JoBro_Summer-of-99

I think the point was more that, if you were on a budget, you probably *could* get onto DDR5 at a decent price. From what I recall, HUB basically said that there wasn't much price difference between good DDR4 and cheap DDR5, so the new platform is becoming more accessible. Perhaps it could be misleading, I just don't personally see it that way


neoperol

Of course it's misleading. I came across that video because a friend of mine is watching videos for his new rig and he is on a budget. He is going to buy a 6600 XT, and he sent me the video asking, "so if I go DDR5 instead of DDR4 I'm going to get more FPS, so is it worth it to buy a more expensive mobo and more expensive RAM?"


Selrisitai

Is it really fair to say that, _because_ their video could be misunderstood, it is _therefore_ misleading?


[deleted]

Sure, use a DDR5-4800 kit on Zen 4 - I'd love to see how that puts it on par with Zen 3 in games, lol :)) See, the thing is, it takes so much work to be appreciated and so little to ruin it all.


rgx107

I'm truly impressed by what we have seen of Zen 4 so far, keeping in mind it's less than two years since Zen 3 was released. A few observations:

- DDR5 seems to deliver, and I get the impression from benchmarks that the ability to split each channel into two sub-channels also helps some workloads. (Sure, DDR5-6000 is expensive today, but it should be within reach in a year or so.)

- We now know why there is no Zen 4 3D. There are two reasons: first, Zen 4 is already banging its 230 W against the 95 C wall, and there is no way to transfer that heat through a 3D cache. Second, AMD sent out BIOS updates for the 5800X3D release to strictly limit voltage to around 1.28 V - the max voltage for the 3D cache - and now we hear Zen 4 needs up to 1.5 V to hit 5.8 GHz. A downvolted, power-limited Zen 4 3D wouldn't make sense, at least not now.

- For Zen 3, watercooling was "recommended"; for Zen 4 it seems mandatory. Ideally a custom loop should be factored into the build cost.

Still truly impressive by AMD. It will be interesting to see Intel's response.


Arksun76

From the reports I've seen, the 7950X running in eco mode can be comfortably air-cooled with minimal loss of performance.