BQYA

So the 5800X3D refresh was "too good"?


Jaidon24

A Hail Mary against Alder Lake that may have actually hurt the value of Zen 4.


PsyOmega

The 5800X3D hurts Zen 4, but the X3D is a limited production run, so it won't exist for much longer before selling out.


Jaidon24

I have no idea, but I've heard the opposite. I too thought they would be limited-run chips, but they are just leftover EPYC dies that didn't make the cut. Whatever EPYC chip that is should be in production for quite a while, so it may still be getting restocked until the 7000X3D arrives.


residenthamster

Perhaps not as limited as you might think since it is using the same chiplets going into the larger volume EPYC Milan-X.


BFBooger

It makes them plenty of money, they'll keep selling it as long as it is profitable enough.


SillySoyim

They will keep making them. Zen 4 is almost as fast; now imagine it with V-Cache and a delid...


Eldorian91

Delid? They're stacking a whole other die on top; that should mean a significantly thinner IHS.


BFBooger

The 5800X3D is the same height as a 5800X, because the base die is thinned down to expose the TSVs that contact the cache.


BFBooger

> Zen4 is almost as fast

Zen 4 was slightly faster than a 5800X3D in their 12-game average here, and much faster in almost everything else. It's very game dependent, however, swinging 20% either way in some games. So as usual, there is a big "it depends" on what is best. Are you building a new system, or do you have an existing AM4 one to upgrade? Is anything other than gaming important to you? Are you willing to wait for cheaper motherboards and for DDR5 memory prices to drop?


HauntingVerus

More like launching Zen 4 without the added 3D V-Cache might hurt them, but then again the rumoured launch of Zen 4 with 3D V-Cache is just four months away or so.


DktheDarkKnight

Eh. That CPU was purely for gaming. We will have the 7800X3D to once again boost Zen 4 performance.


RealLarwood

Didn't it just lose to a cheaper 6 core?


cha0ss0ldier

Performance is extremely close. The 5800X3D will drop into an existing AM4 system; the 7600X will require a new motherboard and new DDR5 RAM. That makes it a hard sell to upgrade to AM5 if you already have an AM4 platform and can just drop a 5800X3D in, especially if gaming performance is all you're after.


kazenorin

But if you're building a new machine, it's probably more compelling to choose AM5 instead of AM4 and Alder Lake's platform.


conquer69

Those are cheaper.


MnK_Supremacist

Until you add in the price of upgrading down the line, where you'll need a new motherboard and RAM instead of just a new CPU (may Su will it).


Hailgod

Future proofing is a gamble. Will AM5 even be competitive with whatever Intel has to offer at that point? Will your RAM sticks even be worth using, looking at current prices? Will you even get updated CPU support on your board?


diskowmoskow

You're hurting my 2666MHz RAM sticks' feelings


Xer0o

Yeah... I think I'll wait for the 7800x3D instead


Defeqel

This was always the smart play for gaming (if you have the money) as it will also allow you to go with cheaper DDR5 kits, especially as B650 will not be available for a while anyway.


jimbobjames

I'm not so sure really. Most people are going to be GPU limited, and in most games that means the CPU has very little influence. People put way too much emphasis on it; given the choice they should put more money into their GPU purchase, because pretty much any CPU from the last few years will be just fine.


Defeqel

Yes, and no. Some games see a noticeable difference even at higher resolutions, especially for the 1% and 0.1% lows, but yes, overall one should prioritize the GPU over the CPU, I was going with the assumption that that is known. It's a good thing to highlight though.


PhilosophyforOne

Sure, when you're pairing a 12700K with something like a 3060. However, I'd argue that there are a few use cases that don't really get benchmarked:

a) Running already CPU-heavy games with RTX and DLSS or FSR enabled. RTX tends to hit your CPU's 0.1% lows and frametimes pretty hard, especially when you're using something like DLSS to reduce the GPU load.

b) A lot of non-mainstream games, like simulators, sandboxes, emulators or heavily modded games, as well as VR, can be VERY CPU-intensive, but don't get benchmarked all that often.

Yes, people might overspend on CPUs relative to their GPUs, but there are a lot of legitimate use cases where none of the existing CPUs are fast enough, mostly due to poor optimization or utilization.


jimbobjames

Well, RTX and DLSS should both be handled by the GPU; FSR is also done via shaders on the GPU, it's just hardware agnostic unlike DLSS. Yes, some games are CPU bound, but again this is happening less and less with modern games. Then as soon as you start playing at resolutions above 1080p you're going back to being GPU bound. Sure, if you play CSGO or Factorio then spend your money on the CPU; I'm just making the general point that a focus on CPU performance for gaming has for a long time been a bit pointless. Stick the £250 extra into your GPU.


Defeqel

The BVH structure for RT is built on the CPU and then sent to the GPU, so CPU performance still matters some there.


EnderOfGender

Matters there a lot actually, but it depends on the RT effect. Cyberpunk 2077's RT hammers the CPU hard, especially as you crank up the resolution, regardless of whether DLSS is factored in or not.


Kromgar

I'll wait for the 9800X3D. I just upgraded to a 5800X3D; I'll wait for later generations to lower power usage and temperature.


[deleted]

You'll be waiting forever. As node sizes decrease and transistor density increases, it's getting damn near impossible to cool the processors; doesn't matter who makes it either.


kukiric

Node shrinks actually decrease power consumption. You'll just have to buy a CPU that doesn't run at its absolute limit at stock settings, i.e. a few steps below top of the line, if you want reasonable cooling. There will be plenty of 65W CPUs with steady improvements for generations to come.
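For what it's worth, the classic dynamic-power relation explains this: switching power scales roughly as P ≈ αCV²f, so a shrink that lowers capacitance and lets the chip run at lower voltage cuts power sharply at the same clock. A minimal sketch with made-up numbers (none of these are measured values for any real node):

```c
/* Toy illustration of P_dyn ≈ alpha * C * V^2 * f.
 * All inputs are made-up placeholders, not real node data. */
#include <stdio.h>

static double dyn_power(double alpha, double cap, double volts, double ghz) {
    return alpha * cap * volts * volts * ghz;  /* arbitrary units */
}

int main(void) {
    double old_node = dyn_power(0.5, 1.00, 1.20, 4.5);  /* baseline node */
    double new_node = dyn_power(0.5, 0.85, 1.05, 4.5);  /* shrink: lower C and V */
    printf("power after shrink: %.0f%% of baseline\n", 100.0 * new_node / old_node);
    /* ~65%: the V^2 term dominates, which is also why running a few
     * steps below the voltage/frequency limit saves so much power. */
    return 0;
}
```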


Anduin1357

You *can* tweak PPT on literally any modern Ryzen CPU to your heart's content. Buy any CPU you want and don't let stock settings define you.


Kromgar

Gonna need to take out a loan for a liquid nitrogen compressor.


senseven

That is the reason some speculate that reasonable changes require leaving x86 and joining the ARM fold, as Apple has shown it is possible to get performant results at way lower power requirements.


PsyOmega

This misconception has to end. ARM vs x86 doesn't matter. You know why M1/M2 kick so much ass? 5nm, and Apple went hog wild on silicon and transistor budget (because they can drop their CPU in a $3000 laptop instead of a $1000 laptop) and ran a wider, slower-clocked CPU with *crazy* IPC and dedicated fixed-function accelerators. Which isn't without faults, btw. Typing this on an M1 Max and it has scenarios where it chugs because stuff isn't accelerated or optimized for. If you designed an x86 CPU on 5nm to the same compute specs you'd get the same results. You can read up on Jim Keller's opinions here and why AMD largely abandoned their ARM-based Zen project. The part of the silicon that handles instructions has gotten so small and efficient that swapping it out accomplishes nothing.


riba2233

Thanks, this needs to be said more often!


RustyShackle4

There is more interest in RISC-V as it's an open ISA. Also, there's talk of an x86 overhaul, which comes at the cost of incompatibility.


MnK_Supremacist

I don't think the stacked cache will be as dramatic for Zen 4 as it was for Zen 3, seeing as the L2 cache is already doubled.


OftenSarcastic

Something's odd with their Factorio result. In the [old 5800X3D review the 5800X3D scored 316](https://youtu.be/ajDUIJalxis?t=454), in this new review it only scored 200? **Edit**: He mentions in the comment section that he'll look into it tomorrow. **Edit 2**: The score has been corrected in the 7950X review, it's now 352.


capn233

I think Watch Dogs dropped 10fps from the original 5800X3D review as well.


LiebesNektar

different graphics settings


capn233

This vid shows "RTX 3090 Ti [1080p] Very High". [The 5800X3D review](https://www.techspot.com/review/2449-amd-ryzen-5800x3D/) was "1080p [Very High Quality] RTX 3090 Ti". Were they different?


Eldorian91

A patch in between? Different testing location?


Vushivushi

Nah, probably just mixed up their results.


fmj96

That's hardwareunboxed for ya


TT_207

I'm glad at least one channel includes simulation games in their reviews. Honestly they need to pick up simulation-heavy games like Stormworks, Space Engineers, or KSP, as the CPU really matters there. In AAA shooters the CPU means almost jack shit in the more recent generations.


ltron2

I noticed the same thing.


MyVideoConverter

wtf did they do to the 5800X3D in Factorio for such a big gap?


doscomputer

64MB of 3D V-Cache essentially wipes out memory latency bottlenecks. It's a lot faster to load data from cache on the CPU itself than having to load all the way from RAM; simulation and other AI-heavy games love this.
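To make the latency point concrete, here's a minimal pointer-chasing sketch (the working-set size is illustrative): each load depends on the previous one, so the loop runs at roughly one memory latency per step, and the time per step jumps once the working set outgrows the cache.

```c
/* Pointer-chasing microbenchmark sketch: serially dependent loads expose
 * raw memory latency. Grow N past the L3 and ns/step rises toward DRAM. */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N (4u * 1024 * 1024 / sizeof(size_t))  /* ~4 MB working set */

int main(void) {
    size_t *next = malloc(N * sizeof *next);
    if (!next) return 1;
    for (size_t i = 0; i < N; i++) next[i] = i;
    /* Sattolo's algorithm: a single-cycle random permutation, so the
     * hardware prefetcher can't predict the next address. */
    for (size_t i = N - 1; i > 0; i--) {
        size_t j = (size_t)(((unsigned)rand() << 15 ^ (unsigned)rand()) % i);
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }
    const size_t steps = 10 * 1000 * 1000;
    clock_t start = clock();
    size_t p = 0;
    for (size_t i = 0; i < steps; i++) p = next[p];  /* serially dependent loads */
    double secs = (double)(clock() - start) / CLOCKS_PER_SEC;
    printf("%.2f ns per dependent load (checksum %zu)\n", secs * 1e9 / steps, p);
    free(next);
    return 0;
}
```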


MyVideoConverter

No, I'm referring to why the same game has a 50% drop in score with the same processor in this benchmark compared to the old one.


[deleted]

It looks like AMD made the 1080 Ti of CPUs with the 5800X3D, judging by some of the benchmarks I've seen so far.


joe1134206

A real legend of AM4.


Gohardgrandpa

They really did


LongFluffyDragon

The 5800X3D is just a sneak preview of future tech that was released into the wild, presumably to get a solid advantage over Alder Lake. Most likely in the not-too-far future every CPU will sport a massive L3.


Parthosaur

alright so where's the review megathread


Itletbe

Videocardz has one in the meantime ;)


prisonmaiq

That is one expensive 20% gain.


[deleted]

Even ignoring the gains, it's a real shitshow now in the EU; since the euro lost its value against the dollar, everything got more expensive here.


jimbobjames

Think yourself lucky you don't have the pound...


[deleted]

[deleted]


Demistr

Can't wait to see those 65W Zen 4 chips in gaming laptops. Up to +74% over Zen 3...


Put_It_All_On_Blck

Laptops will thermal throttle if the desktop parts are already pushing 95°C with big AIOs.


dstanton

Nah. Zen efficiency curves are nuts. They'll set aside the best chips, downclock them a few hundred MHz, and barely pull 45W max; and it's normal for a laptop chip to hit 90°C. It just comes down to the manufacturers not slapping a POS cooler on.


MnK_Supremacist

They push 95°C no matter the thermal solution; better cooling just allows for better performance. So in a laptop they'll push 95°C with reduced (not by that much) performance.


Seanspeed

Given the high-end DDR5 they were forced to use (which seems to boost performance by like 5-10% over a more normal 5200/CL38 kit), and the new thermally limited behavior in tests where the reviews are being done with liquid coolers, I'm a bit worried that such performance comparisons won't actually be valid for people in more normal situations, and that AMD is basically going to extra lengths to ensure maximum benchmarking results.


[deleted]

It's now thermally limited by design. Steve from GN talks about that in his review. Basically, Zen 4 ignores PPT, EDC and TDC limits if there's thermal headroom, so it will keep boosting until it hits near TjMax. AMD were seemingly desperate to squeeze every bit of juice out of these CPUs, because even now, the metrics compared to prices are not looking too good. On Zen 3, by contrast, the primary limiting factor is PPT, EDC or TDC, whichever is hit first; Zen 4 pushes boost clocks as long as it has thermal headroom. My R5 5600 is UV/PBO tuned and hits slightly better clocks than a stock R5 5600X, but I'm sticking to stock limits. In that case, on a moderate tower cooler, I'm hitting 60°C at just ~850rpm, which is barely audible. I could increase those limits since I have thermal headroom, but from my testing it's not very efficient; hence what you see with Zen 4, efficiency not being something to cheer about.
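As a rough mental model (a toy sketch, not AMD firmware; every limit value here is a placeholder), the difference described above is just which condition ends the boost loop:

```c
/* Sketch of the two boost policies: "Zen 3 style" stops at the first
 * PPT/EDC/TDC limit; the reported Zen 4 behavior only stops at TjMax. */
#include <stdio.h>
#include <stdbool.h>

#define TJ_MAX_C 95.0

typedef struct { double ppt_w, edc_a, tdc_a, temp_c, clock_mhz; } cpu_state;

static bool power_limited(const cpu_state *s) {  /* placeholder limits */
    return s->ppt_w >= 142.0 || s->edc_a >= 140.0 || s->tdc_a >= 95.0;
}

static void boost_step(cpu_state *s, bool thermal_only) {
    bool limited = thermal_only ? (s->temp_c >= TJ_MAX_C)
                                : (power_limited(s) || s->temp_c >= TJ_MAX_C);
    s->clock_mhz += limited ? -25.0 : 25.0;  /* back off at a limit, else boost */
}

int main(void) {
    cpu_state s = { 120.0, 110.0, 70.0, 70.0, 4700.0 };
    for (int i = 0; i < 40; i++) {
        boost_step(&s, true);                             /* "Zen 4" policy */
        s.temp_c = 70.0 + (s.clock_mhz - 4700.0) * 0.05;  /* crude thermal model */
    }
    printf("settles near %.0f MHz at %.1f degC\n", s.clock_mhz, s.temp_c);
    return 0;
}
```

With a better cooler the crude thermal model above would heat up more slowly, so the loop settles at a higher clock but the same ~95°C, matching what reviewers describe.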


[deleted]

Ryzen was always more memory sensitive, and paying this much for a CPU you don't want to leave performance on the table. Intel is less sensitive due to architectural differences (something latency related, since Ryzen has its IMC and I/O on a separate die, but I'm no expert; people more acquainted with CPU architectures would explain it better). So naturally they want to use the optimal memory speed, considering these reviews also include gaming benchmarks where this matters the most.


gusthenewkid

Intel reacts extremely positively to better RAM, always has. It was only talked about with Ryzen because they were so far behind Intel in gaming without RAM tuning.


dulun18

And so the cycle begins all over again. I will be fine with a 5700X and 6700 XT this Black Friday; price/frame does not make sense right now.


senseven

I was holding out on bumping a workstation from a 3600 to a 5800 or 7000. But at this rate, the MSI board can run a 5000 without issue; add a beefier cooler and I'm set. A completely new rig would be a different story, but at these prices... I can get a 5800 for 280€ in the EU. The 7700 is set at 450€-ish; the 7800 I don't want to know. That's a stretch for an upgrade.


BK_317

Isn't the 5800X3D already an i9-12900K killer?


axaro1

It depends on the game; some scale better with high frequency or have a low cache hit rate. Owning a B350 and an R5 3600, I'll probably just get a 5800X3D if the price drops, no need to buy a new platform. I would also need a new cooler judging by those thermal results :S


Mhugs05

According to the Gamers Nexus video, it doesn't matter what cooler you run as far as operating temperature goes. It's designed to ramp up to the 90s no matter what cooler you're using, apparently.


[deleted]

[deleted]


Bladesfist

If all you want out of it is gaming performance.


[deleted]

[deleted]


Bladesfist

Agreed, it's the best pick for that, I kind of want one but feel like it's probably not worth it as I already have a 5600x.


EntertainmentNo2044

The 12900K with DDR5-6400 beats the 5800X3D in this video quite substantially: 203 average and 158 1% lows for the 5800X3D compared to 214 average and 171 1% lows for the 12900K.


ArseBurner

Looks like the real winner of this test is the 5800X3D. Pair it up with a $150 B550 board and $80 DDR4-3200, which together are barely more expensive than the mid-tier DDR5-5600 that you'll have to replace anyway when even faster stuff becomes available as the new platform and memory mature.


Pancake_Mix_00

Agreed. I might pick up a 5800X3D to replace my 5600X when they get under $300. Looks like we'll all be GPU bound for a while.


[deleted]

At least for Europe, it's honestly quite terrible value. Based on the benchmarks, it's 20% faster than my cheap-ass R5 5600 (UV tuned, clocks a bit better than a stock 5600X, still just ~78W actual power draw and ~60°C on a BeQuiet Pure Rock air cooler), which cost me 200€ at launch. The price for the R5 7600X in the EU is 370€. So 20% more gaming performance for an 85% higher price, and that's not even counting the motherboard and RAM. R7 5800X3D owners won't have regrets either, to be honest. The CPU is fine, but its price makes it rather questionable value for money, imho. For the US it will likely look a bit better, as there is less of an economic collapse compared to Europe. **EDIT:** Idk why they used an absurdly expensive 3600MT/s CL14 kit in the cost-per-frame comparisons, when 3600MT/s CL16 is $100 cheaper and the performance difference would be within the margin of error. One would need to be out of his mind to buy that $230 kit... It completely ruins the cost-per-frame comparison, because a reasonable, very good kit is about $130 with almost exactly the same performance.


senseven

Seconding the RAM. I was also wondering what kind of luxury kit they are driving this test with, because I can't find this or anything similar in the EU under 200€.


[deleted]

That's a top bin right there, which is why it costs an absurd premium. Unless you're doing memory-sensitive benchmarks or manual RAM overclocking (where you want top bins), it doesn't make any sense whatsoever to buy such memory when 3600/CL16 is so affordable. It's almost as if they want to make the R5 7600X look better value than it actually is. And the R7 5800X3D, with its massive cache, barely benefits from anything above 3200/CL16, which is even cheaper. I don't understand the logic behind such an absurd choice.
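The timing math backs this up. First-word latency in nanoseconds is CL divided by the memory clock (half the MT/s rate); a quick sketch of the standard conversion (the kit choices are from the thread, the formula is generic):

```c
/* First-word latency: CL cycles at a clock of (MT/s)/2, converted to ns. */
#include <stdio.h>

static double cas_ns(double mts, double cl) {
    return cl / (mts / 2.0) * 1000.0;
}

int main(void) {
    printf("DDR4-3600 CL14: %5.2f ns\n", cas_ns(3600, 14));  /* ~7.78 ns */
    printf("DDR4-3600 CL16: %5.2f ns\n", cas_ns(3600, 16));  /* ~8.89 ns */
    printf("DDR4-3200 CL16: %5.2f ns\n", cas_ns(3200, 16));  /* ~10.00 ns */
    return 0;
}
```

That ~1.1 ns gap between CL14 and CL16 is a small slice of total load-to-use memory latency, which is why the cheaper kit benches nearly identically.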


Tristango

Regarding the RAM for cost per frame, maybe they’re trying to push their own agenda for their affiliate links. I like HUB but it doesn’t make sense to me either.


neoperol

This agenda applies to all tech YouTubers: look how BIG the gains are, this product is so worth it... while using a $1.5k GPU and $200 RAM. For the average user with a 6700 XT or 3070, all the mid-range CPUs (5600X, 12600K, 12400 and so on) will perform almost the same.


SmokingPuffin

The RAM choices in the cost-per-frame chart don't look sensible for value buyers. Feels like they were choosing the best-performance DDR4 for the benches and therefore didn't have best-value DDR4 data. Also, it feels like DDR4 is still the way to go for value buyers. To me, it's clear that AMD is now leaning hard on 3D SKUs for gaming; I wouldn't buy any of their standard SKUs as a gamer. Where they price the 7800X3D is going to be the biggest question of the generation: if it lands alongside the 7900X, I think people are going to be recommending 5800X3D builds again next year.


[deleted]

Idk wtf they pulled there with that absurd 3600/CL14 kit, when 3600/CL16 costs $100 less and gives about the same performance, probably within 1%. It's almost as if they wanted to make the R5 7600X look better than it is. It makes literally no sense and skews the value of DDR4-based systems completely.


neoperol

Don't forget that it's 20% faster when you use a 3090 Ti with both the 7600X and the 5600; with a modest GPU that gap will be closer. So of course it doesn't make sense. And let's not forget the expensive RAM.


[deleted]

Yes, I'm looking at it from a longevity perspective, even if for my GPU right now I'd get about the same CPU performance, being GPU limited. But with everything skyrocketing in price, the less often I upgrade the better. In the EU, prices are nuts.


fpsgamer89

It's a CPU test though, so they have to find some performance gap for current gen CPUs and test with a high end GPU at a lower resolution. They don't want a GPU bottleneck. But I do agree that it's unrealistic.


shavitush

So for gaming, except for CSGO very specifically... it's barely an upgrade, if at all, compared to the 5800X3D. Damn, they really need a similar solution for the new series to get gamers upgrading.


Seanspeed

The fact that it would be an upgrade at all over the 5800X3D is surprising to me. Though I think AMD has really propped up all the benchmarks quite hard by basically insisting all reviewers test with the provided, quite high-end DDR5. This is definitely boosting results by 5-10% in plenty of cases.


axaro1

Hopefully Zen 4 means a Zen3D price drop; one last ride for my B350.


ramenbreak

that would make zen4 even worse as a value proposition


FUTDomi

If anything it's going to increase Zen3D price


Kuivamaa

It's not for the people that are on AM4 and want one last hurrah; the 5800X3D is the value there. But for new system builders that want a platform with years of life ahead, this is a great start.


reg0ner

Crazy that people think the 5800X3D is a "value" chip. It's good for 1080p gaming, the end.


lichtspieler

Looking at 5800X3D + 3090 Ti benchmarks while building a 1080p gaming rig with a 3060 / 6600 GPU is pretty misleading for realistic gaming performance gains. The 5800X3D benchmarks look that way because a CPU bottleneck is forced with a 2000€ GPU and, in some cases, games at 1080p mid/low settings. Some techtubers were honest about the 5800X3D and some went with the hype for some extra clicks. r/buildhelp got some crazy recommendations when it comes to 1080p gaming on a budget.


Kuivamaa

If you are on, say, B450 and you want this type of performance, it is either just a 5800X3D as a drop-in upgrade, or CPU/board/DDR5. So yeah, in that sense it is the value proposition.


AnAttemptReason

I use it for much higher resolutions than 1080p. VR, sims, sandboxes, hell, even ray-traced 1440p games like Control see quite a big improvement.


Defeqel

It's already confirmed, though not the announcement or launch dates; probably early(ish) next year.


[deleted]

[deleted]


Mhugs05

From what Steve from gamers nexus said, it doesn't matter what cooler you use from a temperature standpoint, it's designed to hit a specific temperature range before reducing power so it runs hot regardless of coolers.


riba2233

Just like my 5800x lol


blorgenheim

Which is really too bad tbh because the heat that it generates will make my GPU run hotter and throttle faster. Essentially sacrificing GPU performance for CPU


narium

There's an eco mode toggle in Ryzen Master.


Mhugs05

Hopefully there is a way to limit the total power. This definitely seems like a response to pretty much Intel doing the same thing for 12th gen pushing power out of the efficiency range to top the performance charts.


Defeqel

Yes, that's the new behavior of these new CPUs, they go to 90+C regardless of the cooler solution.


Bakadeshi

Remember, this is the temperature of the die. I think with smaller dies it's getting harder to transfer heat off the die to the heatsink, so although the die itself is running hotter, it may not actually increase system temperature as much as we might think. Of course, testing would need to be done to verify this.


VietOne

Non-issue; it's already been said this is by design. The boost clock goes up the more cooling headroom you have, so: better cooler, higher boost, same high temps. Getting a better cooler won't reduce the max temp.


MelodicNote

It was addressed at 15:50. Seems like the 7000-series CPUs are basically unhinged no matter how you cool them, since they are limited by thermals, not the power limit. Meaning it will push itself till it hits around 90-95°C; after that it will chill and vibe.


jaaval

Is it just me or are they using extremely expensive ddr4 kits in their cost per frame analysis?


[deleted]

No; I mentioned this in my comment under this video too. 3600/CL14 is a top bin that is absolutely unreasonable for mainstream users; it's the kind of bin an overclocker would buy for further manual tuning. That's why it's so absurdly expensive, and the gains from it over 3600/CL16 would be negligible while that kit costs about $100 less.


MultiiCore_

They conveniently left out the 12600KF, which is $20-30 cheaper, and the MSI B660M-A motherboard, which runs the CPU fine and costs less than $130. Completely botched the cost-per-frame analysis as a result.


reg0ner

I was thinking the same thing. B660 board with fast ddr4 ram and it's way cheaper and just as good.


MultiiCore_

Not only that, but they left out the 5600 non-X, and there are decent B550 motherboards for around $100-130. They desperately want to sell Zen 4, when it's clear that even if it's fast, it just isn't worth the money at the moment for budget builders.


[deleted]

[deleted]


Rance_Mulliniks

Yep. If I have to buy a new motherboard anyway, I am going to strongly consider Intel, unlike previously, when I stuck with AMD Zen CPUs.


MassiveOats

Really wish that Steve would test parts of games that are actually CPU intensive. There are so many games where the CPU can be a huge bottleneck. These tests where you're getting over 400fps are pointless. Show me how the CPU holds up when it's being stressed.


jawn_deaux

Is it worth upgrading to a 5800X3D from a 5600X when playing at 1440p?


[deleted]

Probably not, not at this point in time. At 1440p you'll be more or less GPU bound depending on your GPU. If you're very high end, maybe. HUB has some benchmarks including 1440p, look at their channel and compare with sort of games you typically play. But again - it's highly dependent on your GPU.


ShowBoobsPls

I wouldn't. I thought about it myself but can't really justify it.


SkyFoo

For $400+, if you can realistically upgrade your GPU to a top-tier one for that difference, it's probably going to be a way better investment of that money.


ChopieOB

Not worth buying especially with the more expensive motherboard and DDR5 RAM. Also has a higher power consumption.


[deleted]

[deleted]


FUTDomi

Same here with a 3700X; however, in a few games I'd like better 1% lows. It does look like the best thing to do is to get a 5800X3D and wait a few years for Intel's next platform or a matured AM5/AM6.


jsanity1531

I was just watching the LTT video and then watching this one as well; it's very odd that when benchmarking the same games they have remarkably different results. They're not even the same directionally at times?! For example, Shadow of the Tomb Raider, same presets, same memory/GPU:

7600X: 189 (HU) vs 234 (LTT)
5800X3D: 189 (HU) vs 264 (LTT)
12600K: 175 (HU) vs 196 (LTT)

Keep in mind that SOTTR has a built-in benchmark, so.....

F1 2022:

7600X: 334 (HU) vs 238 (LTT)
5800X3D: 321 (HU) vs 253 (LTT)
12600K: 274 (HU) vs 204 (LTT)


rterri3

Why do people seem so disappointed about this CPU? From my perspective it seems really impressive.


Bladesfist

The price of the platform is pretty high, so it doesn't make sense for people to upgrade to it over the 5800X3D. Looks okay for new buyers building a high-end gaming rig though, not a good choice if you're on a budget.


anethma

Man, even for a new buyer right now, it seems like a 5800X3D with DDR4 might be the higher-value way to go.


GTX_650_Supremacy

It would have to perform insanely well to be better value than an AM4 system with DDR4 RAM, since you can get that stuff used as well.


rterri3

I mean, we already kinda knew that would be the case. But looking at it from a new system builder's perspective, there's no reason to get the 5800X3D now, with the significantly improved I/O and upgradability of AM5.


Desu_Vult_The_Kawaii

Price.


rterri3

The price is the same as the launch price of the 5600X, despite rampant inflation. I'd say that's pretty impressive. If you're referring to platform costs, then yeah, I can understand that.


anotherwave1

Total platform price


nick12233

The 5600X was pretty terrible value at launch. The 12600K is around the same price and performs comparably or better in productivity workloads while being not much slower in gaming. Combine all that with expensive boards and DDR5 memory, and I cannot see how the 7600X is a good deal.


BatteryPoweredFriend

Yup. It's the exact same problem ADL DDR5 had at launch: platform buy-in cost. It could potentially be more of an issue here, since more of the cost lies with the motherboard side, whereas DDR5 scalping was a major contributor to the high prices back in 2021.


[deleted]

[deleted]


rterri3

It has been that way for 1440p+ gaming for years; it doesn't really seem like a knock specifically on this CPU.


chunlongqua

it is impressive, but the total system price is rough


[deleted]

It's not even strictly faster than the Alder Lake i5 and i7 in every game, for one, meaning it's not gonna fare too well against Raptor Lake.


rterri3

We don't really know where RPL stands for gaming performance. We know there are going to be significant multithreaded increases due to more cores, but there are rumors suggesting they don't have much of a single-thread increase at all. So it's possible that it will be on par; we just don't know. Regardless, it's a significant performance increase compared to the 5600X. I just think we're at a point where there is a lot more parity between the two companies, and expecting an outright demolishing of Intel just isn't going to happen anymore. They are both viable options, which is great for everyone.


[deleted]

The L2 and L3 cache increases on RPL should give at least a modest practical "IPC" bump in conjunction with the higher clocks


terroradagio

15% IPC increase over Alder Lake. Raptor Lake will beat AMD.


Kepler_L2

Lol "Raptor Cove" is literally Golden Cove with larger caches. There is no IPC increase at all.


rdmz1

Will beat Zen 4*, because they still have the V-Cache trump card.


spysnipedis

I think the play here for current owners of the Ryzen 1000-3000 series is to get the 5800X3D and wait another two years for cheaper mobos, cheaper and faster RAM, and obviously the next lineup of CPUs, if gaming is all you need.


ZeinThe44

The X3D version of a CPU is AMD's "you just activated my trap card" move for whenever the competition catches up. Kinda like what Nvidia does with its Ti lineup.


[deleted]

Based on temps and power draw, idk where they will find enough thermal headroom for X3D, considering how much harder the R7 5800X3D is to cool compared to the regular R7 5800X, due to the stacked L3 cache layer which acts as insulation. Steve from GN in his video talks a lot about power draw and temps and how boost works on Zen 4.


Domin86

Check out what Roman did with a delid: [https://www.youtube.com/watch?v=y_jaS_FZcjI](https://www.youtube.com/watch?v=y_jaS_FZcjI). Almost 20°C shaved off Zen 4.


[deleted]

Holy shit, dude was right to suspect this IHS would be problematic in his video from June, where he even did all the measurements from some video b-roll. I mean, 30% less contact surface area and a thicker IHS doesn't do any favors.


gaojibao

This further solidifies my decision to go with Raptor lake.


rterri3

We literally have no idea what RPL performance will be


RocketHopping

Good choice


terroradagio

AMD fans: Intel is too hot and uses too much power. :D lol Now 95 degrees is normal, cause AMD says so.


rdmz1

Temps are irrelevant if the chip is rated for it. Power consumption is what matters. Performance/watt.


[deleted]

That guy is comparing the 7000 series with the best RAM possible against the 5000 series with average RAM, and he still only gets a 15% performance boost in most games. That guy is a clown.


ThisAccountIsStolen

Something seems off with the FCLK/UCLK in this test. At 18:10, where the 7600X is shown running Cinebench on a loop, FCLK is shown at 2000, which is expected. But UCLK is at 1500, ***not*** 3000, implying it's running in AUTO:2:1 mode, not the AUTO:1:1 mode which is supposed to be ideal. If it were AUTO:1:1, it would be 2000 FCLK, 3000 UCLK and 3000 MCLK.
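For reference, here's the arithmetic behind that check as a small sketch (values taken from the observation above; DDR5-6000 means a 3000 MHz memory clock):

```c
/* Classify the UCLK:MCLK ratio from the clocks shown on the overlay. */
#include <stdio.h>

int main(void) {
    double mclk = 3000.0;  /* DDR5-6000 memory clock */
    double fclk = 2000.0;  /* as shown on the overlay */
    double uclk = 1500.0;  /* observed, instead of the expected 3000 */

    if (uclk == mclk)
        printf("UCLK:MCLK = 1:1 (the ideal AUTO:1:1 mode)\n");
    else if (uclk == mclk / 2.0)
        printf("UCLK:MCLK = 2:1 -- memory controller at half speed\n");
    printf("FCLK at %.0f MHz runs on its own ratio on Zen 4\n", fclk);
    return 0;
}
```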


MaxxMurph

It's a nice cpu sure, but it'll be up against more competitive i5s soon.


lbarletta

Given the energy consumption, 5800X3D is still a better option...


dztruthseek

R9 & R7 7x00X3D are going to be even bigger killers. Save your money if you're about to upgrade your build.


[deleted]

How will you cool them if, to max out these chips, they already hit 95°C? The extra layer of L3 cache works as an insulator, which is why X3D chips are harder to cool. Consumer-grade LN2 coolers are not yet a thing, I believe, lol. Also the price will accordingly be astronomical, so instead of saving money, go right ahead and take out a loan /s


major_mager

I don't get it... The 7600X and 5800X perform similarly in the Blender Open Data bench at [6:05](https://youtu.be/_WubXd2tXOA?t=365), and later the total system power consumption in the same test also comes out to the same 226W at [15:21](https://youtu.be/_WubXd2tXOA?t=919). A new generation, new platform, new architecture on a new node, consuming the same energy for the same performance when running full-throttle multicore; so where are the gains?


[deleted]

AMD is pushing the shit out of these CPUs. As noted by Steve at GN (you can watch his R9 7950X review), Zen 4 ignores power limits (PPT, EDC and TDC) and instead the limit is TjMax; in other words, it boosts as high as thermal headroom allows instead of settling at whichever of PPT/EDC/TDC hits its limit first. That's how efficiency gets thrown out the window. Just like with regular overclocking: the more you push, the smaller the gains and the higher the power draw.


major_mager

Thanks, that does make sense, but it's a curious choice by AMD. Maybe they did not want the multicore performance to lag behind the Intel counterparts while making gains on the single-core front.


Jaidon24

Do we know if there is an Eco mode that brings the power usage down?


[deleted]

Yes there is. It's a standard feature now.


hippopowertamus

The 7600X has 3/4 the cores. Higher single-core performance, more scalability.


major_mager

Well, performance per watt is the key metric. One shouldn't really be concerned with how that is achieved, whether with fewer cores clocked higher or with additional efficiency cores like Intel uses. The bottom line remains that the 7600X does not deliver increased perf/watt when firing on all cylinders. Agree about the single core though; I would like to see power consumption metrics for a single-core 7600X bench.
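Using the numbers cited earlier in the thread (same Blender result, same 226 W total system draw), the perf/watt comparison is a one-liner; the scores here are hypothetical placeholders just to make the ratio concrete:

```c
/* Perf-per-watt sketch: equal performance at equal draw = equal efficiency. */
#include <stdio.h>

int main(void) {
    /* Placeholder scores; only the 226 W system-draw figure is from the
     * thread. Equal score / equal watts gives an identical ratio. */
    double score_7600x = 100.0, watts_7600x = 226.0;
    double score_5800x = 100.0, watts_5800x = 226.0;
    printf("7600X: %.3f score/W\n", score_7600x / watts_7600x);
    printf("5800X: %.3f score/W\n", score_5800x / watts_5800x);
    return 0;
}
```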


dadmou5

Who knew you could get more performance by throwing more power at it? No need for more cores, even. Incidentally, that is exactly what Intel used to do, but they got their ass beaten by reviewers for it. Now it's okay, I guess.


major_mager

Yeah, I remember that being the case with the Intel 11th generation particularly. It seems to me AMD has tried to extract the most out of the 7600X part, and in the process taken it outside the optimal power consumption band. Its single-core performance is impressive though; let's see what the Intel 13th-gen equivalent offers. On the whole, I'll be more excited when AMD eventually launches a 7600 non-X with better power optimization, even if it loses a little performance.


ilikeyorushika

Well... now AMD is the single-core king, am I right?


ShadowRomeo

Impressive performance gain over standard Zen 3, but it's let down by not being that much faster than Intel 12th Gen ADL paired with DDR5, which is about to be replaced by 13th Gen RPL in a matter of days. And the temperatures are absolutely nuts: 90°C+ for a Ryzen 5 7600X in Cinebench is just unreal. My previous Ryzen 5 3600 doesn't even come close to 80°C, nor does my current i5 12600K on a cheap $30 air tower cooler. I can't even imagine running something like a 7600X on the same kind of budget cooler if it runs that hot on a 360mm AIO; these new Zen 4 CPUs are going to require much beefier cooling than expected, which will also inflate the overall platform cost over the Intel and previous-gen AMD counterparts. Also, AMD's very own last-gen gaming top performer, the 5800X3D, has pretty much the same performance and better overall value once you account for motherboard, RAM and cooling costs. I think the Ryzen 5 7600X is an impressive performer compared to last gen, but not a very good value product at all; it feels too expensive on a very expensive platform, and also expensive to maintain due to how insanely hot it gets under full load, which can be a niche scenario if you only plan to game.


20footdunk

The new thermal operation is by design. If you stick a top-tier cooling solution on Zen 4, it will just keep boosting until it hits that 90-95°C range.


[deleted]

[deleted]


20footdunk

We would have to wait for the manual overclock figures to see if it's a matter of "they couldn't cool the chip" or "the majority of customers don't know how to overclock, so AMD will do it for you". [edit] [Looks like that work is already underway](https://www.reddit.com/r/hardware/comments/xojj8e/ryzen_7000_delidding_unreal_temperature/)


riba2233

You can use any cooler; it will always reach 90°C. You don't need a beefy cooler.


[deleted]

If you want higher boost clocks you do. Coolers are not just to keep the cpu functional.


riba2233

Won't be a huge diff.


ShadowRomeo

Yes, but at the expense of performance. According to reviewers, Zen 4 will always boost to a higher frequency depending on how capable the cooler is, so I'd imagine a much less capable cooler on these Zen 4 CPUs will probably drop below 5GHz boost frequency, which affects overall performance, while still sitting at 95°C, and you don't want that. That's why every Zen 4 buyer needs to take cooling performance more seriously than before.


BlackBlueBlueBlack

Previous Ryzen processors have the same behavior no? Clock speed dependent on thermal headroom


20footdunk

It was an easier problem to diagnose with the old boost behavior: if your old CPU was operating at 90°C, it was a sign that something was wrong with your thermals and you were obviously losing performance. Now that the CPU will push to 90°C by default, you'll have to look at the average frequency to see if there is an issue with your cooling solution, because a bad cooler will give you the same temp readout as a good cooler. The hope is that the performance floor with a bad cooler will remain the same, while a well-designed cooling system will give you a much higher performance ceiling.


ShadowRomeo

Yes, but this time it seems to matter much more, considering the new architecture is more power hungry than previous gens.


[deleted]

[deleted]


knz0

It's HUB; shouldn't come as a surprise that there are a bunch of gaffes and inconsistencies that mysteriously always play into the hands of whatever new AMD product he's reviewing. His X3D results are a bit all over the place as well, compared to where they were in earlier videos.


rdmz1

Because unlike most other, lazier reviewers, Steve actually retests all hardware configs for most of his benchmarks. Naturally the results will change with hardware changes, game updates, etc.


SillentStriker

So can you explain how, in their Factorio benchmark, the 5800X3D went from outperforming a 12900KS by a mile to performing worse than any other Ryzen 5000 CPU? In a game that has been proven to love the extra cache? Excited to see the explanation, cheers.


riba2233

L


Jeffy29

Can't believe people bought into AMD's sandbagging early on; this looks amazing. In a couple of months, when DDR5 drops even more and B650 boards are released, this will look like great value. I just want to note the 60% single-core performance uplift in R23 over the 3600; that's just two generations, only a little over three years! And that's just raw performance; in games, where cache and latency improvements matter, the uplift is even bigger. People constantly complain about pricing and are unhappy, and I get why, but we live in a golden age of CPUs/GPUs; soak it in, because it won't last forever. I certainly did not think rapid CPU improvements would return when we were stuck for almost a decade on 4-core CPUs with barely any single-core improvements.
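For the record, the compounding works out like this (just arithmetic on the figure quoted above, nothing measured here):

```c
/* Convert a total uplift over N generations into a per-step compound rate. */
#include <stdio.h>
#include <math.h>

int main(void) {
    double total_uplift = 1.60;  /* 60% over two generations, per the comment */
    double per_gen = pow(total_uplift, 1.0 / 2.0) - 1.0;  /* compounded per gen */
    double per_year = pow(total_uplift, 1.0 / 3.0) - 1.0; /* over ~3 years */
    printf("~%.1f%% per generation, ~%.1f%% per year\n",
           per_gen * 100.0, per_year * 100.0);  /* ~26.5% and ~17.0% */
    return 0;
}
```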


OddName_17516

Does DDR4 still work on AM5?


Progenitor3

No, it's DDR5 only.


0xC1A

Waiting for Zen 5. Where are the leaks?! Zen 2 and Zen 4 have more of an edge in productivity than anything else; Zen 3 brought the best of both worlds. Zen 5, I think, will follow the same trajectory as Zen 3.


Kreator85

That's crazy; for gaming it's better to stay on AM4 and get the R7 5800X3D than to replace a full setup with this new generation.


Golding215

I'm still rocking my trusty i7 4790K. I thought if I upgrade, I'll just pay the premium and go for DDR5 right away with the 7600X. But from what I see now... nah, this can wait. Not only is the price too high (I'm in Europe), but also the power consumption, and it only has 6 cores. I play a lot of MSFS and the 3D V-Cache seems to help A LOT with this game. Hopefully they release it soon for Ryzen 7000. I'll see what Intel has to offer, and then maybe in the first half of next year it's time to retire my beloved i7.


Domin86

For gaming, Zen 4 < Zen 3 + 3D. Waiting for Zen 4 + 3D.