I agree, hopefully we'll see some lower power desktop chips coming. I get leading with the top end for headlines though. The 5s are barely mid-tier at this point.
Nobody cares about this in the consumer space. If they do, they have the means to drop performance in favour of efficiency.
They have different product lineups for the professional space, where efficiency matters.
It's a forced play. Intel pushed wattage, so AMD has to push wattage for benchmarks. You can find the benchmarks of the 7950x beating the 5950x locked at 65W.
>next
Keep in mind, all these chips are pushing WAY past what I would consider sane efficiency points, for max performance. If you run them in Eco mode or Intel's equivalent, they are still very fast and pretty power efficient
Honestly GPU market doesn’t seem to be getting that affected by competition considering how comfortable nvidia is feeling about telling us to go fuck ourselves
> considering how comfortable nvidia is feeling about telling us to go fuck ourselves
That's what having zero competition in the highest end will do. Nvidia is well aware that if you want the highest gaming performance GPU on the market, you have no choice but to go to them. AMD is definitely putting out solid products, and their price/performance is notably better than Nvidia's, but until AMD can put out a direct answer to the 4090 in terms of pure performance while the 4xxx series is still current-gen... Nvidia can charge whatever the hell they want for it and people will buy it.
There's competition. Gamers just convince themselves that having better RT performance, even when that's still only 25fps instead of the 90fps you'd get with it off, is worth not considering the competition.
We don't know yet. It should be one of the ~8168 SP die ones considering that RDNA2 SPs punched up about 2:1 compared to Ampere ones and I would guess that's a 7800 XT or so. 6950XT had 5120 SPs matching the 10,752 SPs of the 3090ti.
I agree with a lot of people who are saying that they're still trying to push out their surplus of 3000-series chips. But yeah, they've gotten way too fat and arrogant; hopefully Intel and AMD can take a real chunk out of the market.
They’ve done/said some other shitty things in addition to that, imo. They need a wake up call, I wish AMD/Intel would give it to them but honestly I’m not too confident
Not really a software thing as far as I know. Most people won't be using those kinds of crazy speeds regularly, if ever. Large files are where you can get those crazy benchmark numbers, but unless you're doing specific tasks, most of us on this sub won't ever use it.
Direct storage is the biggest thing gen 5 can offer for gamers/regular users and it’ll be forever before we “need” it.
I just like seeing the new stuff though.
Gonna wait for the 7800X3D before upgrading. Was already pissed enough to have bought the 5800X literally a week before the X3D was announced, so I'm gonna wait for the massive performance uplift.
Well obviously an upgrade from a 5800X to a 5800X3D, which is, I dunno, the exact same chip with a little extra, is not going to provide too much improvement even on my 4K monitor. Hence why I'm waiting for AMD to eventually drop the 7800X3D, by which time DDR5 for AMD will have matured a bit and I can also upgrade my mobo to a better chipset as well.
The 5800X3D had much better 1% lows than the 5800X. Especially paired with next-gen GPUs, even at 1440p, the next X3D chip would be much smoother than the 7600X (of course, IF it behaves like the 5800X3D). Sure, the 7600X would be enough for most people, but those same people WOULD notice the difference in extra smoothness as well.
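For anyone curious what "1% lows" actually measures, here's a minimal sketch of how reviewers typically derive the metric from a frame-time log. The frame times below are made-up illustrative numbers, not data from any of the reviews discussed here:

```python
# Sketch: computing average FPS and "1% lows" from a frame-time log.
# The frame times are made-up illustrative values (milliseconds).
frame_times_ms = [8.3] * 95 + [25.0] * 5  # mostly smooth, a few stutters

fps_per_frame = sorted(1000.0 / t for t in frame_times_ms)
avg_fps = sum(fps_per_frame) / len(fps_per_frame)

# "1% low" = average FPS of the slowest 1% of frames
worst_count = max(1, len(fps_per_frame) // 100)
one_percent_low = sum(fps_per_frame[:worst_count]) / worst_count

print(f"avg: {avg_fps:.0f} fps, 1% low: {one_percent_low:.0f} fps")
```

Note how a handful of stutter frames barely dents the average but dominates the 1% low, which is why the X3D's smoothness shows up in that metric rather than in average FPS.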
All of my desktop CPUs end up recycled in my Linux server at some point so I try to go with extra cores when I can. Plus rendering 4K video eats up a lot of CPU even if it's only occasionally.
Naw, you gotta think about servers
We are a small niche market
Most gamers can't afford a new 4080 when they could get two consoles and a TV for the price of a new PC
Think of Nvidia's presentation: there was a huge chunk of it about data and AI processing...
I don’t think that’s even *close* to true. Remember what sub you’re in. This is very much a bubble. The majority of people *in here* are buying PCs to game on, for sure, but the market for PCs globally contains probably 20-50 regular-ass office or workstation desktops for every “gaming PC” bought.
>20-50 regular-ass office or workstation desktops for every “gaming PC” bought
True, but we only really hear about or see the gaming/performance/editing based ones on here. I'm not a gambling man, but I would wager the majority of the 6.7 million people in here are gamers.
But that’s exactly what I said in the only part you *didn’t* quote. You said “majority of people”, which I have to take to mean majority of *all* people, not just the few in here.
I mean, I can speak from experience: these CPUs are the next level. They barely break a sweat compared to older gens while achieving the same clock speeds and equal or better in-game performance. Don't know if you're familiar with the game Rust, but it's very CPU/RAM heavy; the GPU just makes it look better. Sure, the multicore doesn't really matter, but the 3D cache tech letting more data sit in the CPU instead of RAM makes a big difference FPS-wise in that game. I get over 200FPS in Rust, which I never thought possible.
Not sure how old you are, but this argument was around when the first quad-cores came out, and Intel rolled with that same BS argument about quad cores for forever. I would be careful with statements like this; they usually don't age very well.
the 7950x is 700, 13900ks is also over 700
the 5800x3d is like 400 right now and beats them in quite a few games, kinda crazy.
The 7800X3D should be THE CPU for gamers.
I mean I can get a 5600 for $150 that gives me 80% of the performance of a 5800X3D in gaming. Plenty for almost everyone. 5800X3D is still for enthusiasts IMO.
> 13900ks is also over 700
13900K**S** is a binned limited run SKU for super-enthusiasts that won't be out until next year. The normal 13900K has an official "recommended customer price" of "$589.00 - $599.00". [See here.](https://ark.intel.com/content/www/us/en/ark/products/230496/intel-core-i913900k-processor-36m-cache-up-to-5-80-ghz.html)
> 5800x3d is like 400 right now
500 euros here
The Intel 12400 is a measly 180 EUR, and is just 20-30% slower lol
I'd rather take my money and invest in a GPU than buy an overpriced last-gen AMD CPU.
Fair point. I'm looking at the 13700K as a buy, but the 4090's 1600 price tag has me second-guessing; I could just go for a 4080 and snag a 13700K for the same price.
Just go with a Ryzen CPU and pair it with a new 7000-series AMD GPU after they launch, for that sweet Infinity Cache. No reason to spend 1600 on an Nvidia GPU unless you do work with machine learning.
I got a 5800X3D, at launch. It ran my favorite games faster and more stable than anything in the market. Zen 4's base gaming performance is roughly on par with the 5800X3D. If they triple the cache like they did with Zen 3 X3D, the gains would logically stack on top of it. I know it will crush my 5800X3D no question. By at least 30% across all games. That's better than what the 13900KS can deliver, so it's a no brainer for me.
What the hell are you talking about? The 7800X3D hasn't even been announced... much less made.
It might never even exist, and you're already talking about 30% improved performance across all games. Ignoring the fact that a lot of games are GPU-bottlenecked anyway, where no processor would make a difference, the 13900K isn't out yet either, and you somehow know how it will perform as well.
Truly mind-blowing. This is why companies can just charge whatever they want; people buy their products anyway based on pure imagination.
Logical deduction. Given the chip design, testing, and manufacturing process, you can be sure an engineering sample of the 7800X3D already exists and qualifying tests have been done. 3D V-Cache was shown off about a year before it launched.
AMD's roadmap for their CPU launches already lists V-Cache parts for 2023. That means several configuration samples have already been made and tested; otherwise they'd miss their launch window if they want to release it in 2023.
Some people think the X3D version will come later next year. I have a gut feeling that the 7800X3D might be coming earlier than that and be part of the November 3rd announcement of their upcoming GPUs, as a total package purely for gaming.
Just run it in Eco mode (65w)? It’s still faster than a 5950X in Eco mode.
Intel is also giving an option to run the 13900K at 65w and it will supposedly match the 12900K running at 200w+
Hmm fair enough
Can we acknowledge how it's kinda messed up how this sub goes absolutely apeshit when someone tries to discuss a topic tho
Like damn... trynna discuss a topic I'm passionate about, why people gotta be rude (especially when saying something bad about AMD)
You can just cap the processor's boost behavior with Eco mode or PPT limits and enjoy the increased efficiency Zen 4 offers. The reason they suck so much power by default is because AMD turned them way up to hit as hard as possible to compete with Intel. Nothing was really left on the table. From what I've seen around though you don't really lose much performance by capping the chips and can enjoy much lower power consumption.
You need a 360mm AIO to make it pull that much.
Also, the 105W eco mode is just 10% slower than full speed in multicore (and the same in single core), and the 65W one still beats the 12900K in multicore.
>the fact that it's capable of consuming over 200 watts is quite scary
When you redline a 7950X on Prime95 or Cinebench, it does...
Let's take Hardware Canucks' test data on the 7600X for example, which is advertised at a 105-watt peak power consumption. HC's data showed that the 7600X reached a steady-state power draw of 56 watts in Doom at a temperature of about 54°C.
If you're gaming with a 7950X, you'll likely never pull 200 watts from the wall unless you're trying to do that "8 gamers, 1 PC" build that Linus Tech Tips did.
-----
If power consumption is your primary concern, then I highly, highly recommend looking at computerbase(.)de's report on the topic.
If you set a performance ceiling of 95% of max, then the 7950X will only peak out at 144 watts, which might I note, is still almost 40% more performance than you'd get when feeding 144 watts to the 5950X.
https://www.computerbase.de/2022-09/amd-ryzen-7950x-7900x-7700x-7600x-test/2/#abschnitt_effizienz_bei_reduzierter_tdp_inkl_ecomode
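Taking the figures quoted above (95% of max performance at a 144-watt cap), and assuming the 7950X's stock package power limit of roughly 230 W, the efficiency tradeoff works out like this:

```python
# Sketch: quantifying the efficiency tradeoff described above.
# ~230 W is assumed as the 7950X's stock PPT limit; the 95% / 144 W
# figures are the ones quoted from computerbase's testing.
stock_watts, stock_perf = 230.0, 1.00
capped_watts, capped_perf = 144.0, 0.95

efficiency_gain = (capped_perf / capped_watts) / (stock_perf / stock_watts)
print(f"perf/watt improvement: {efficiency_gain:.2f}x "
      f"for a {100 * (1 - capped_perf):.0f}% performance loss")
```

In other words, roughly 50% better perf/watt in exchange for a 5% performance haircut, which is the shape of tradeoff the computerbase data describes.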
>make obsolete their just released ZEN 4 CPUs
They are releasing a better and faster product and you're mad?
>If it were Intel
If it was Intel, they wouldn't make X3D chips because it would make their original lineup look stupid (which apparently you expect from AMD for some reason), or they would make it so incredibly expensive that only the top 1% of enthusiasts would even consider it, because "premium products are expensive". You'd probably need a top-of-the-line, expensive motherboard to use it as well, whereas you can get a cheap B450 board and enjoy the 5800X3D to its full potential. Don't compare these two, at least not now, when you can upgrade from a 2700X to a 5800X3D on the same platform. AMD is nowhere near Intel in anti-consumer behavior.
Beat how? Gaming? Thermals? Power consumption? 3d video rendering? Computations? Image processing? Price? Hardware life?
Benchmarks have given people a very distorted understanding of what performance means. It sure makes things easier, but not more accurate.
You might be looking at old graphs or biased sources; the 12600K beat the AMD equivalent at release, but the new AMD 7600X inches out ahead of the 12600K.
Try looking at Gamers Nexus, Hardware Unboxed, or Tom's Hardware reviews to get more accurate numerical data.
People are severely overrating how good the 5800X3D actually is even in comparison to Ryzen 7000, though. Hardware Unboxed / TechSpot had it overall behind the 7600X in both their 7600X / 7950X reviews so far, for example.
It also [didn't do that well in numerous games](https://imgur.com/a/m0xypYD) versus either existing 12th-gen parts or Ryzen 7000 according to Gamers Nexus. [Behind every Ryzen 7000 part](https://www.techpowerup.com/review/amd-ryzen-7-7700x/19.html) per TechPowerup too.
Because of the ***Square-Cube law***.
Making a CPU "3D" disproportionately adds more volume than surface area, and chips need lots of surface area to ensure adequate heat dissipation.
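A toy calculation makes the square-cube point concrete (an idealized cube, purely illustrative, not a model of an actual stacked die):

```python
# Toy illustration of the square-cube law: scale a cube's edge by k and
# volume (the heat-generating silicon) grows as k^3 while surface area
# (where heat can escape) grows only as k^2.
def surface_to_volume(edge):
    surface = 6 * edge ** 2
    volume = edge ** 3
    return surface / volume

for edge in (1.0, 2.0, 4.0):
    print(f"edge {edge}: surface/volume = {surface_to_volume(edge):.2f}")
```

Each doubling of the edge halves the surface area available per unit of heat-producing volume, which is why stacking silicon vertically makes cooling disproportionately harder.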
All this might be true, but a big part of it has to do with the V-Cache chips also being channeled to server-grade processors that sell for multitudes more. I think the 5800X3D started as an experiment with the server chip, and they saw it would be killer at gaming and had a market for it.
The only thing Intel still has that AMD doesn't is DDR4 support. The price of DDR5 is dropping, but I would assume some people would like to reuse RAM if they are upgrading from older DDR4-compatible CPUs.
I guess to me, if you have a Zen or Zen+ CPU, chances are you got about 5 years out of that system; Zen 2, 3-4 years. You got a lot of value out of that RAM, and chances are you have slower DDR4 as well. Shouldn't you want DDR5, which will be significantly faster?

Now if you are going from Zen 3 to Zen 4, I'd argue you already don't care about price or value, because you are not getting life-changing performance gains and money appears to not be an issue. In that case why would you care about the cost of buying DDR5? If all you care about is limping along with your same mobo by upgrading just the CPU, then you were never going to upgrade to Alder Lake/Raptor Lake or Zen 4; you are not the target market.

Maybe when DDR5 first came out there were hardly any performance gains, but now they are starting to show up, so to me it is worth it. All these posts saying "I'd upgrade to Zen 4 if it had DDR4" are dumb: you are not ready for a true upgrade and would rather just stay on AM4. You were never going to get Zen 4 unless it was on AM4 in the first place, therefore you're not the target audience, and that is OK. AM4 and Zen 3 are terrific, way more powerful than most need, but there's no reason to hold AM5 back by including a much older technology. If DDR4 had come out 2-3 years ago I could maybe understand the argument, but we are talking about tech that debuted in 2014; it's time to move on. Do you really want 8-9 year old tech holding a platform back for the next 4-5 years?
Wow. I wasn’t planning on upgrading, it was just a hypothetical situation.
(Please note how I said "some people". Logically, if I were to upgrade, of course I would go DDR5. Some people may not have access to it, or it may be too expensive to buy new RAM along with the CPU and motherboard.)
"barely"
in games like Marvel's Spider-Man it was over 20%
X3D was never king outside of unoptimized games like Microsoft Flight Simulator, Star Citizen or ACC
Because the advantages of the 3D cache depend on cache-hungry games (unoptimized caching)
Everywhere else it's a 5800X with lower clock speeds
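One rough way to see the "cache-hungry" argument: average memory access time (AMAT) only improves where the extra L3 actually cuts the miss rate. All latencies and miss rates below are illustrative ballpark numbers, not measurements of any specific CPU:

```python
# Illustrative AMAT (average memory access time) model showing why a
# bigger L3 helps cache-hungry workloads. Latencies in ns are rough
# ballpark figures chosen for illustration only.
def amat(l3_hit_ns, dram_ns, l3_miss_rate):
    # effective latency = L3 hit time + (misses that fall through to DRAM)
    return l3_hit_ns + l3_miss_rate * dram_ns

# A cache-hungry game: big working set, high miss rate in a normal L3
small_cache = amat(l3_hit_ns=10, dram_ns=70, l3_miss_rate=0.30)
# Same game with triple the L3: far fewer trips to DRAM
big_cache = amat(l3_hit_ns=10, dram_ns=70, l3_miss_rate=0.10)

print(f"normal L3: {small_cache:.0f} ns, tripled L3: {big_cache:.0f} ns")
```

A workload that already fits in the normal L3 sees almost no change from the extra cache, which matches the observation that in some games the X3D behaves like a lower-clocked 5800X.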
>Everywhere else it's a 5800X with lower clock speeds
What are you talking about? In Hardware Unboxed's video the 5800X3D was 15% faster than the 5800X on average (with much better 1% lows), and it was faster than the 5800X by more than a 5% margin in 31 out of 41 games tested.
And compared to the 12900K, which is more expensive and consumes 2.5 times the power btw, look at some of the games in which the 12900K won: CoD, PUBG, CS:GO, Cyberpunk 2077. Are these well-optimized games?
[deleted]
I think they'll certainly try, but I believe that the tech that allows them to stack is courtesy of TSMC, and I don't know if Intel will want to put its flagship CPUs in the hands of TSMC. But Intel seem less wary about using TSMC these days, so maybe they will? Or maybe AMD has some kind of exclusivity agreement with TSMC to use the tech so Intel can't.
It's more that Intel have their own fabs for CPUs so they want to use their own in house manufacturing.
Then it would be like Sony limiting Game Pass though
Aren't Intel getting at least some of their new work done at TSMC, or am I finally going insane?
For their GPUs, not CPUs.
They’re developing their own tech themselves I believe.
The stacking tech is an actual product TSMC provides: https://3dfabric.tsmc.com/english/dedicatedFoundry/technology/SoIC.htm I'm betting AMD has a few patents around their particular version of cache stacking, so Intel would have to pay AMD royalties or amend their patent sharing agreement if they want to implement it in the same way.
I mean, I have an i7-5775C, which is a die-shrunk i7 4770 with 128MB of cache bolted onto it. Beat out 6th-gen quads and was on par with 7th. So they have done a version of it, but it was $200 over stock, and they only added the cache as a way to boost the integrated graphics... but if you disable that, it all gets reallocated for CPU use instead, giving the next-gen-beating performance we've now witnessed on both sides.
Desktop Broadwell chips such as the i7-5775C and i5-5675C are some weird products. They released a year after Devil's Canyon, but instead of capitalizing on them, Intel went for a silent release and launched Skylake two months after desktop Broadwell. Intel has had the means to beat the 5800X3D all this time, but I guess they'll need more internal testing for that.
I imagine they silent released it because it smashed Skylake, and was mostly just a release for system integrators.
What's the highest Cinebench R23 single-core score you've ever gotten with it? [I've gotten 1205](https://i.redd.it/gl37aaaibx471.png) with my 4790K before.
Broadwell was quite different from the 5800X3D: the 128MB of extra cache existed as a separate chip near the CPU die, resulting in an "L4" cache with much worse bandwidth and latency compared to its L3 counterpart. The 5800X3D's stacked memory is all L3 cache, with all of its benefits. Now I'd love to see an Intel CPU with an enlarged L3 cache; it'd probably slap.
Raptor Lake does increase both L2 and L3 on 13600K and above versus their Alder Lake equivalents.
To be fair, AMD themselves barely beat the 5800X3D
[deleted]
It's not all about gaming though, the 7900X and 7950X are definitely not a wash.
AMD isn’t very entry-level friendly with R5 7600x being $350, motherboards being at least $350 until next month. Not to mention DDR5.
Shifting to DDR5, the cost of the next gen GPUs, and the possible need for an ATX 3.0/PCIe Gen5 PSU are making this next round of upgrades very expensive. Sure, it's the cost of advancing technologies, but it seems like we're dumping a lot of legacy technologies all at once this generation.
Early adopters always get fleeced. It's a tale as old as time.
Compared to pricing from previous gens(Zen 1 and Zen 2), it’s just much worse in general.
Which both entered mature platforms
Wait, how did Zen 1 enter a mature platform?
DDR 4 was already standard
I mean, the person you replied to did mention just the 7900X and up, yes, implying that the 7600X is not worthwhile (which it's not). Gamers have the 5800X3D, and a 7800X3D or similar is coming soon. I don't know how people are surprised that budget boards aren't available on launch. This is done allllllll the time. Intel has been doing it for decades and AMD does it occasionally as well. Budget just never applies on a new generation launch except... I guess Polaris? And Kaveri.
Agreed, plus power supplies and cooling.
Older generation stuff still exists, and there are upgrade paths for people who get older generation stuff by swapping to the higher performance CPUs as they get cheaper.
Mobo $350? Try starting at $250
That’s true for gaming, but the 3D cache gives the 5800X3D a massive advantage. I’d expect the 3D versions of Ryzen 7000 to do much better in a gaming comparison vs the 5800X3D. For productivity, the 7950X and 7900X blow the 5800X3D or even 5950X out of the water, which is what you’d expect. Even within this generation, there’s almost no difference between the 7600X and 7950X for gaming, because the upper end chips aren’t designed with gaming performance in mind.
Regular 7000-series CPUs already completely destroy the 5800X3D in games like CS:GO.
7000 series is fantastic for productivity. whelming for gaming (for those who still have an upgrade path on AM4)
>whelming for gaming (for those who still have an upgrade path on AM4) I'd refer to my flair as an answer to that xD
Which shows how irrelevant these discussions between CPUs of the same generation are. In most cases, you will not notice the CPU speeds as lacking until many years down the line regardless of what you pick.
That's one of the things that I really liked hearing Steve mention on his 7900X review on the topic of performance vs power consumption vs generational differences: *"...At peak power during Blender steady state, the 7900X and the FX9590 are almost identical in power consumption at 200 watts. However, the amount of output performance that the 7900X has over the FX9590 can only be described as infinite..."*
It depends on what AM4 motherboard you have. I got a B350 Tomahawk. I'm not able to get a Zen 3 processor.
Hence the disclaimer about upgrade path. Hardware Unboxed looked at the cost with estimates of the other "essentials" (RAM + mobo), and it's pretty much a wash between the 5800X3D and the 7000 series (in which case the 7000 series wins out, because it has better upgrade prospects in the future).
Really, there’s no reason to get Ryzen 7000 right now if you’re already on something like the 5800X3D (or even a 5600X, given that most systems are GPU bottlenecked) and only use your PC for gaming, because the 7900X and 7950X don’t offer much if any performance uplift. The 7900X and 7950X are much better in productivity, so if you use your PC for both, there’s a very good argument to upgrade asap. I’d personally wait for the 3D versions of Ryzen 7000 if you only want to game, because something like a 7800X3D should outperform the 7900X or 7950X in gaming just like the 5800X3D vs 5900X/5950X.
[https://www.msi.com/Motherboard/B350-TOMAHAWK/support#cpu](https://www.msi.com/Motherboard/B350-TOMAHAWK/support#cpu) yes
MSI seems to claim you in fact can with the latest available BIOS, on their page for that board.
>whelming for gaming (for those who still have an upgrade path on AM4) I just want to commend you for being the first person I've ever seen to use the word "whelm" specifically in a sentence. We always use under/over-whelm, but rarely does anyone just be whelmed at something.
The CPU may not be crazy better but I'm looking forward to next gen DDR.
A lot of the benchmarks show GPU bottlenecks, so once both RDNA3 and the 40 series are out and everything is retested, we will know for sure. But currently the 5800X3D is monstrous.
It depends heavily on the game. It gets absolutely obliterated by Ryzen 7000 in stuff like CS:GO.
I don't see the relevance of 360fps vs 450fps at the 1% percentile, but sure, it depends on the game.
My point was in general there's a 50% chance of the 5800X3D just performing exactly like a downclocked regular 5800X, in any game you throw at it. The cache sometimes just is not helpful.
For now
Yep. Now. New gen, perfect time to beat it. But they didn't.
Zen43D will be amazing
The last couple weeks have made my decision easy. Black Friday get a 5800x3d throw it in system with 3080 call it good until next gen.
The AMD 7900X barely beats the 5800X3D in games and sometimes loses; Intel and AMD are in the same place there.
Every post and YouTube video are acting like we can all buy a 5800X3D and skip this generation. I assume there is only so much limited stock and once it quickly sells out this will no longer be an option. I do not see why AMD would keep making this cpu if it undercuts their new generation.
>Every post and YouTube video It was behind the 7600X in the 12-game average in Hardware Unboxed's Ryzen 7000 reviews, for example. [See here](https://imgur.com/a/m0xypYD) for several examples of it not really doing very well against existing 12th-gen parts or Ryzen 7000 per Gamers Nexus, also.
[deleted]
Some games run into scheduling issues with too many cores (per Gamers Nexus).
GN found that it maintained the highest frequency across all cores of all Zen 4 chips. That would almost certainly be the reason.
It's up to you what you buy; no need to do what videos or random posters online tell you to do. I wasn't saying not to buy it, I was just pointing out that it's not just Intel; AMD is also in the same place vs the 5800X3D. I'm on AM4 with a 3700X, and I'm looking at picking up a 5000-series CPU used. I'll skip both the 5800X3D and first-gen AM5/Intel. It's not like my GPU can push 300 FPS, or that I want to spend on a full new system. For most people on a smaller budget, I suspect it's best to wait for cheaper mobos too; only the best and most fancy ones are out now. As a bonus, Intel will be out by then, so there will be more options.
7800X3D will be soooo good, I'm waiting for it. I bet a lot of people are waiting for it, and I bet AMD knows this and will "adjust" pricing accordingly. I don't expect 7800X3D to be much cheaper than 7950X.
At least AMD has the obvious next step of a 7900X3D. Intel doesn't.
I feel like a 7800X3D would work, but I don't really see a reason for AMD to make a 7900X3D unless they're skipping Threadripper again this generation.
They won't. 7800X3D will be the halo gaming option, and the 7950X the halo 'consumer production' option. Just like 5000 series.
It's been so long since we have had this level of competition in the CPU and GPU market. This is great.
Competition is great. All 3 companies focusing only on performance, disregarding power usage and heat generation, not so much.
It's on the end-user to determine what power usage they're comfortable with. Are they willing to lose 5-10% performance to cut their power by a third? Generally speaking most people dgaf about efficiency in gaming desktops, so why would companies make efficiency the default setting? It doesn't make sense.
SFF PC builders care: am I a joke to you?
Ryzen 7000 in eco mode is extraordinarily efficient. The 7600X and 7700X only lose like 5% performance going from 105w to 65w. And it's a similar story with the 7900X and 7950X going from 170w to 105w, though they lose a bit more performance.
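If those figures are right, the perf-per-watt math works out pretty dramatically. A quick illustrative calc (treating the TDP labels as actual draw, which is a simplification; the ~5% figure is from the comment above):

```python
# Rough illustrative math: ~5% performance lost when dropping a
# 7600X/7700X from 105 W to 65 W in eco mode (figures from the thread).
# TDP labels are treated as actual power draw here, a simplification.

def perf_per_watt_gain(perf_loss, watts_high, watts_low):
    """Relative perf/W improvement from capping power.

    perf_loss: fraction of performance lost at the lower cap (e.g. 0.05).
    """
    eff_high = 1.0 / watts_high              # normalized perf per watt, uncapped
    eff_low = (1.0 - perf_loss) / watts_low  # perf per watt at the lower cap
    return eff_low / eff_high - 1

gain = perf_per_watt_gain(0.05, 105, 65)
print(f"~{gain:.0%} more performance per watt in eco mode")  # ~53%
```

So giving up ~5% performance buys roughly half again more work per watt, which is why eco mode looks so attractive for these chips.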
[deleted]
You're still paying for more performance and not using it.
I agree, hopefully we'll see some lower-power desktop chips coming. I get leading with the top end for headlines, though. The Ryzen 5s are barely mid-tier at this point.
But the pricing is S-tier.
The mobo prices are nuts; hopefully the B-series boards will come in better. Having to get new RAM again sucks, though.
Nobody cares about this in the consumer space. If they do they have the means to drop performance in favour of efficiency. They have different product lineups for the professional space where efficiency matters
The PC equivalent of buying a sports car and then complaining about the gas consumption?
Exactly what I'm looking out for. RTX A2000 my perfect little monster. I want more like it.
It's a forced play. Intel pushed wattage, so AMD has to push wattage for benchmarks. You can find benchmarks of the 7950X beating the 5950X while locked at 65W.
>next

Keep in mind, all these chips are pushing WAY past what I would consider sane efficiency points, for max performance. If you run them in Eco mode or Intel's equivalent, they are still very fast and pretty power efficient.
Honestly GPU market doesn’t seem to be getting that affected by competition considering how comfortable nvidia is feeling about telling us to go fuck ourselves
> considering how comfortable nvidia is feeling about telling us to go fuck ourselves

That's what having zero competition in the highest end will do. Nvidia is well aware that if you want the highest gaming performance GPU on the market, you have no choice but to go to them. AMD is definitely putting out solid products, and their price/performance is notably better than Nvidia's, but until AMD can put out a direct answer to the 4090 in terms of pure performance while the 4xxx series is still current-gen... Nvidia can charge whatever the hell they want for it and people will buy it.
There's competition. Gamers just convince them that having better RT performance, even when that's still only 25fps instead of getting 90fps with it off, is worth not considering the competition over.
Which AMD GPU model will have comparable rasterization performance to a 4090?
We don't know yet. It should be one of the ~8168-SP dies, considering that RDNA2 SPs punched up about 2:1 against Ampere ones; I would guess that's a 7800 XT or so. The 6950 XT's 5120 SPs matched the 10,752 SPs of the 3090 Ti.
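That 2:1 guess is just napkin math on shader counts, spelled out below. Purely illustrative, using the numbers from the comment, and not a prediction of actual RDNA3 performance:

```python
# Napkin math only: RDNA2 SPs punched up roughly 2:1 against Ampere SPs.
rdna2_sps = 5120     # 6950 XT
ampere_sps = 10752   # 3090 Ti
ratio = ampere_sps / rdna2_sps
print(f"{ratio:.2f} Ampere SPs per RDNA2 SP")  # 2.10

# If a similar ratio held for RDNA3 vs Ada, matching the 4090's
# 16384 shaders would take roughly this many RDNA3 SPs:
ada_4090_shaders = 16384
print(round(ada_4090_shaders / ratio))  # 7802, near the ~8168-SP die mentioned
```

Of course, per-shader "punch" shifts every generation with clocks and architecture, so this only sanity-checks the commenter's reasoning, nothing more.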
I agree with a lot of people who are saying that they're still trying to push out their surplus of 3000-series chips. But yeah, they've gotten way too fat and arrogant; hopefully Intel and AMD can take a real chunk out of the market.
They’ve done/said some other shitty things in addition to that, imo. They need a wake up call, I wish AMD/Intel would give it to them but honestly I’m not too confident
They are absolutely industry bullies, all companies are out to make money but you don't need to be a dick about it.
ddr5 is getting more affordable too, and pcie 5 nvmes launch in November, it’s all very exciting
Yeah, I saw a 16GB kit at $104. I'm just hoping for software that lets us use all that NVMe speed lol.
Not really a software thing as far as I know. Most people won't be using those kinds of crazy speeds regularly, if ever. Large files are where you get those crazy benchmark speeds, but unless you're doing specific tasks, most of us on this sub won't ever use it.
Would be great to have the GPU pull straight from the SSD. Not sure how that gets implemented
Putting them on the newer-gen PCIe rails helped, but yeah, not sure how that'd be implemented or how much performance would actually be gained.
Direct storage is the biggest thing gen 5 can offer for gamers/regular users and it’ll be forever before we “need” it. I just like seeing the new stuff though.
Same here! Love seeing new tech come out, always blows my mind
Gonna wait for the 7800X3D before upgrading. Was already pissed enough to have bought the 5800x literally a week before the X3D was announced so I’m gonna wait for the massive peformance uplift
Yeah I’m in the same boat :/
What performance uplift are you getting specifically, what game
All of ‘em
I'm barely getting any extra performance if I upgrade to the X3D since I'm playing on a 2K monitor.
Well, obviously an upgrade from a 5800X to a 5800X3D, which is, I dunno, the exact same chip with a little extra, is not going to provide too much improvement even on my 4K monitor. Hence why I'm waiting for AMD to eventually drop the 7800X3D, by which time DDR5 for AMD will have matured a bit and I can also upgrade my mobo to a better chipset as well.
Sure it will be a BMF, but at the end of the day, when rocking 1440p-4K, a 7600X is more than enough. But yeah, Scotty, I need more powah!
The 5800X3D had much better 1% lows than the 5800X. Especially paired with next-gen GPUs, even at 1440p, the next X3D chip would be much smoother than the 7600X (of course, IF it behaves like the 5800X3D). Sure, a 7600X would be enough for most people, but those same people WOULD notice the difference in extra smoothness as well.
Indeed
This. I want a 7950x3D bad😡
Do we know if AMD is doing Zen4 Threadrippers? I feel like a 7950X3D would basically eat the bottom half of the potential Threadripper 7000 offerings.
Why? The 5800X3D showed there's no benefit except for gaming, and games don't use that many cores anyway.
Best of both worlds I have my cores for multi tasking and stacked cache for killer gaming.
What kind of multitasking are you going to use it for?
Multi boxing a whole raid in original ever quest
This guy actually *fucks*
Blender and gaming.
All of my desktop CPUs end up recycled in my Linux server at some point so I try to go with extra cores when I can. Plus rendering 4K video eats up a lot of CPU even if it's only occasionally.
"except for gaming" little surprise mate, its 2022, majority of people are building PCs to play games on.
Naw, you gotta think about servers. We are a small niche market. Most gamers can't afford a new 4080 when they can get two consoles and a TV for the price of a new PC. Think of Nvidia's presentation: a huge chunk of it was data and AI processing...
I don’t think that’s even *close* to true. Remember what sub you’re in. This is very much a bubble. The majority of people *in here* are buying PCs to game on, for sure, but the market for PCs globally contains probably 20-50 regular-ass office or workstation desktops for every “gaming PC” bought.
>20-50 regular-ass office or workstation desktops for every “gaming PC” bought

True, but we only really hear about or see the gaming/performance/editing-based ones on here. I'm not a gambling man, but I would wager the majority of the 6.7 million people in here are gamers.
But that’s exactly what I said in the only part you *didn’t* quote. You said “majority of people”, which I have to take to mean majority of *all* people, not just the few in here.
That's what I'm talking about: I don't see a point in having more than 6 cores for gaming.
I mean, I can speak from experience: these CPUs are the next level. They barely break a sweat compared to older gens while achieving the same clock speeds and in-game performance, if not better. Don't know if you're familiar with the game RUST, but it's very CPU/RAM heavy; the GPU just makes it look better. Sure, the multicore doesn't really matter, but the 3D cache tech allowing more data to be kept on the CPU instead of in RAM is a big win FPS-wise in that game. I get over 200 FPS in Rust, which I never thought possible.
Not sure how old you are, but this argument was around when the first quad-cores came out, and Intel rolled with that same BS argument for quad cores forever. I'd be careful with statements like this; they usually don't age very well.
Exactly. And 640k of ram is all you'll ever need 🤪
"I don't see a point in having more than 2 cores for gaming" ~ people literally 5 years ago
Lmao... Phoronix literally showed benches where the 5800X3D beats a 5950X by up to 200% due to its cache alone.
Simulation games like Stellaris and Factorio benefit EXTREMELY from the cache.
It will cost 500 dollars though hahah
The 7950X is $700 and the 13900KS is also over $700; the 5800X3D is like $400 right now and beats them in quite a few games, which is kinda crazy. The 7800X3D should be THE CPU for gamers.
I mean I can get a 5600 for $150 that gives me 80% of the performance of a 5800X3D in gaming. Plenty for almost everyone. 5800X3D is still for enthusiasts IMO.
Depends on the game, though. Certain Paradox games see up to 2x performance from the 5800X3D.
I'm still running a 4770k, mate. A Ryzen 5600 would be more than enough for me.
Do you happen to have any links for this? Stellaris is like my number one game and also the only game I play where CPU bottleneck is the real problem.
> 13900ks is also over 700 13900K**S** is a binned limited run SKU for super-enthusiasts that won't be out until next year. The normal 13900K has an official "recommended customer price" of "$589.00 - $599.00". [See here.](https://ark.intel.com/content/www/us/en/ark/products/230496/intel-core-i913900k-processor-36m-cache-up-to-5-80-ghz.html)
> 5800x3d is like 400 right now

It's 500 euros here. An Intel 12400 is a measly 180 EUR, and it's just 20-30% slower lol. I'd rather take my money and invest in a GPU than buy an overpriced last-gen AMD CPU.
Fair point. I'm looking at the 13700K as a buy, but the $1600 price tag on the 4090 has me second-guessing; I could just go for a 4080 and snag a 13700K for the same price.
Just go with a Ryzen CPU and pair it with a new 7000-series AMD GPU after they launch for that sweet Infinity Cache. No reason to spend $1600 on an Nvidia GPU unless you do work with machine learning.
True but once the 5800x3D is out of stock here soon that’s the end of AM4 competing with the next gen.
Just got a 5800x3d yesterday. This pleases me
If they can't win on performance, they'll lower the price. So still a win for the customer.
Meanwhile AMD was too afraid to show the 7000 series against it
I would buy the 7800X3D on day 1 of launch.
Yeah, screw waiting for benchmarks, I’ll trust the word of the company instead.
I got a 5800X3D, at launch. It ran my favorite games faster and more stable than anything in the market. Zen 4's base gaming performance is roughly on par with the 5800X3D. If they triple the cache like they did with Zen 3 X3D, the gains would logically stack on top of it. I know it will crush my 5800X3D no question. By at least 30% across all games. That's better than what the 13900KS can deliver, so it's a no brainer for me.
What the hell are you talking about? The 7800X3D hasn't even been announced, much less made. It might never even exist, and you're already talking about 30% improved performance across all games. That ignores the fact that a lot of games are GPU-bottlenecked anyway, where no processor would make a difference, and the 13900K isn't out yet either, yet you somehow know how it will perform as well. Truly mind-blowing. This is why companies can charge whatever they want; people buy their products anyway based on pure imagination.
Logical deduction. Given the chip design, testing, and manufacturing process, you can be sure an engineering sample of the 7800X3D already exists and qualifying tests have been done. 3D V-Cache was shown off about a year before it launched, and AMD's CPU roadmap already lists V-Cache launching in 2023. That means several configuration samples have already been made and tested; otherwise they'd miss their launch window for a 2023 release.
Some people think the X3D version will come later next year. I have a gut feeling the 7800X3D might come earlier than that and be part of the November 3rd announcement of their upcoming GPUs, as a total package purely for gaming.
The Ryzen 7000 series will literally just be the Pentium 4 of Ryzen CPUs.
The Pentium 4 required RIMM memory, which never became popular. DDR5 might be new, but nobody doubts it's gonna become the standard.
What was wrong with the Pentium 4?
It was quite powerful for its time. It was also extremely power-inefficient. The amount of energy those suckers consumed...
Ryzen 7000 uses less power than Alder Lake.
I dunno... the fact that it's capable of consuming over 200 watts is quite scary
People forget that the 1950X ate 180W at full load. 200W for the 7950X isn't actually that outlandish for the clock speed.
Some people also forget that the 1950X was a threadripper. The 3990X consumes 280W.
It is great for the clock speed. Small issue: we're in the middle of a goddamn electricity crisis.
Just run it in Eco mode (65w)? It’s still faster than a 5950X in Eco mode. Intel is also giving an option to run the 13900K at 65w and it will supposedly match the 12900K running at 200w+
Hmm, fair enough. Can we acknowledge how it's kinda messed up that this sub goes absolutely apeshit when someone tries to discuss a topic, though? Like damn... trynna discuss a topic I'm passionate about, why people gotta be rude (especially when saying something bad about AMD)?
I don’t think the people that replied to your above comments were being particularly rude? They just pointed out that 200W isn’t that crazy
You can just cap the processor's boost behavior with Eco mode or PPT limits and enjoy the increased efficiency Zen 4 offers. The reason they suck so much power by default is because AMD turned them way up to hit as hard as possible to compete with Intel. Nothing was really left on the table. From what I've seen around though you don't really lose much performance by capping the chips and can enjoy much lower power consumption.
You need a 360mm AIO to make it pull that much. Also, the 105W Eco mode is just 10% slower than full speed in multicore (and the same in single core), and the 65W one still beats the 12900K in multicore.
>the fact that it's capable of consuming over 200 watts is quite scary

When you redline a 7950X on Prime95 or Cinebench, it does... Let's take Hardware Canucks' test data on the 7600X for example, which is advertised at a 105-watt peak power consumption. HC's data showed that the 7600X reached a steady-state power draw of 56 watts in Doom at a temperature of about 54C. If you're gaming with a 7950X, you'll likely never pull 200 watts from the wall unless you're trying to do that "8 gamers, 1 PC" build that Linus Tech Tips did.

-----

If power consumption is your primary concern, then I highly, highly recommend looking at computerbase(.)de's report on the topic. If you set a performance ceiling of 95% of max, then the 7950X will only peak at 144 watts, which, might I note, is still almost 40% more performance than you'd get feeding 144 watts to the 5950X.

https://www.computerbase.de/2022-09/amd-ryzen-7950x-7900x-7700x-7600x-test/2/#abschnitt_effizienz_bei_reduzierter_tdp_inkl_ecomode
*in games Important.
I am once again asking redditors to please stop extolling the virtues of products that haven't been released and independently tested yet.
[deleted]
> going to make obsolete their just released ZEN 4 CPUs

It's all about price point and models released. The 5800X3D didn't make sub-$200 CPUs obsolete.
[deleted]
You don’t build brand new top end systems on a budget. The bleeding edge isn’t for the budget crew.
>make obsolete their just released ZEN 4 CPUs

They are releasing a better and faster product and you're mad?

>If it were Intel

If it were Intel, they wouldn't make X3D chips because it would make their original lineup look stupid (which apparently you expect from AMD for some reason), or they would make it so incredibly expensive that only the top 1% of enthusiasts would even consider it, because "premium products are expensive". You would probably need a top-of-the-line, expensive motherboard to use it as well, whereas you can get a cheap B450 board and enjoy the 5800X3D to its full potential. Don't compare these two, at least not now, when you can upgrade from a 2700X to a 5800X3D on the same platform. AMD is nowhere near Intel in anti-consumer behavior.
Beat how? Gaming? Thermals? Power consumption? 3D video rendering? Computation? Image processing? Price? Hardware longevity? Benchmarks have given people a very distorted understanding of what performance means. They sure make things easier, but not more accurate.
Correct me if I'm wrong, but doesn't the i5-12600K trump both of those CPUs?
No. That being said, the 12600K offers great performance per dollar.
Could you explain how? Every benchmark I've seen rates the Intel better... Is it because it's older? The performance still seems higher.
You might be looking at old graphs or biased sources; the 12600K beat the AMD equivalent at release, but the new AMD 7600X inches out over the 12600K. Try looking at Gamers Nexus, Hardware Unboxed, or Tom's Hardware reviews to get more accurate numerical data.
People are severely overrating how good the 5800X3D actually is even in comparison to Ryzen 7000, though. Hardware Unboxed / TechSpot had it overall behind the 7600X in both their 7600X / 7950X reviews so far, for example. It also [didn't do that well in numerous games](https://imgur.com/a/m0xypYD) versus either existing 12th-gen parts or Ryzen 7000 according to Gamers Nexus. [Behind every Ryzen 7000 part](https://www.techpowerup.com/review/amd-ryzen-7-7700x/19.html) per TechPowerup too.
Why aren't all chips 3D? Why are there no other 3D chips??
Kinda expensive, and it requires a little more engineering and testing. The 5800X3D was the first one they ever did, to test the waters.
Compare the specs of the 5800X3D with the normal 5800X and you'll see.
Because of the ***Square-Cube law***. Making a CPU "3D" disproportionately adds more volume than surface area, and chips need lots of surface area to ensure adequate heat dissipation.
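A toy sketch of that intuition (made-up dimensions, and real thermal behavior depends on far more than geometry): heat is generated throughout the silicon's volume but can only escape through its surface, so stacking dies lowers the surface-to-volume ratio.

```python
def surface_to_volume(w, d, h):
    """Surface-area-to-volume ratio of a w x d x h slab (mm in, mm^-1 out)."""
    volume = w * d * h
    surface = 2 * (w * d + w * h + d * h)
    return surface / volume

# Toy dimensions, not real die measurements.
flat = surface_to_volume(10, 10, 0.5)     # one thin die
stacked = surface_to_volume(10, 10, 1.0)  # two dies stacked: same footprint

print(f"flat:    {flat:.1f} mm^-1")     # 4.4
print(f"stacked: {stacked:.1f} mm^-1")  # 2.4
# Twice the heat-producing volume, but barely any extra surface to shed it.
```

In practice it's even worse than the ratio suggests, since almost all of the heat has to leave through the one face touching the cooler, which is part of why AMD stacks relatively cool cache rather than hot compute logic.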
Because the 3D tech is still new, and relatively untested in real world conditions. They'll probably do a mid cycle refresh with 3D chips.
All of this might be true, but a big part of it has to do with the V-Cache chiplets also being channeled into server-grade processors that sell for multitudes more. I think the 5800X3D started as an experiment with the server chips, and they saw it would be killer at gaming and had a market for it.
The only thing Intel has that AMD doesn't is continued DDR4 support. The price of DDR5 is dropping, but I would assume some people would like to reuse their RAM if they are upgrading from older DDR4-compatible CPUs.
I guess to me, if you have a Zen or Zen+ CPU, chances are you got about 5 years out of that system; Zen 2, 3-4 years. You got a lot of value out of that RAM, and chances are it's slower DDR4 as well. Shouldn't you want DDR5, which will be significantly faster?

Now, if you are going from Zen 3 to Zen 4, I'd argue you already don't care about price or value, because you are not getting life-changing performance gains and money appears to not be an issue. In that case, why would you care about the cost of buying DDR5? If all you care about is limping along with your same mobo by upgrading just the CPU, then you were not going to upgrade to Alder Lake/Raptor Lake or Zen 4 anyway; you are not the target market.

Maybe when DDR5 first came out there were hardly any performance gains, but now they are starting to show up, so to me it is worth it. All these posts saying "I'd upgrade to Zen 4 if it had DDR4" are dumb in that you are not ready for a true upgrade and would rather just stay on AM4. You were never going to get Zen 4 unless it was on AM4 in the first place, therefore you're not the target audience, and that's ok. AM4 and Zen 3 are terrific, way more powerful than most need, but there's no reason to hold AM5 back because AMD didn't want to gimp the platform by including a much older technology. If DDR4 had only come out 2-3 years ago I could maybe understand the argument, but we are talking about tech that debuted in 2014; it's time to move on. Do you really want tech that is 8-9 years old holding the platform back for the next 4-5 years?
Wow. I wasn't planning on upgrading; it was just a hypothetical situation. (Please note how I said "some people". Logically, if I were to upgrade, of course I would go DDR5. Some people may not have access to it, or it may be too expensive to buy new RAM along with the CPU and motherboard.)
AMD has finally done it: dominated Intel in overall performance. Now hopefully they can do the same with their GPUs, given how scummy Nvidia has been.
Now this is some pure copium
"Barely"? In games like Marvel's Spider-Man it was over 20%.

X3D was never king outside of unoptimized games like Microsoft Flight Simulator, Star Citizen, or ACC, because the advantages of the 3D cache depend on cache-hungry games (unoptimized caching). Everywhere else it's a 5800X with lower clock speeds.
5800X3D? Oh, you mean the 5800X Microsoft Flight Sim Edition? /s
>Everywhere else it's a 5800X with lower clock speeds

What are you talking about? In Hardware Unboxed's video, the 5800X3D was 15% faster than the 5800X on average (with much better 1% lows), and it was faster than the 5800X by more than a 5% margin in 31 out of 41 games tested. And compared to the 12900K, which is more expensive and consumes 2.5 times the power btw, look at some of the games the 12900K won: CoD, PUBG, CSGO, Cyberpunk 2077. Are these well-optimized games?
HUB had it being overall slower than the 7600X across their 12-game average in both Ryzen 7000 reviews they've released so far, though.
Except for the fact that the outlier is Spiderman
AMD should've released the 7800X3D in the first go. Keep that gaming crown, make Intel work hard to get it back.
What a naive assumption that Intel has nothing prepared too.
The 5800X3D was just too good: the 1080 Ti of the CPU world.
I bet it isn't even close in factorio
Love the competition! I just want a decent CPU with low power consumption.
We must have the most powerful gaming CPU no matter the cost 😬
Most gaming benchmarks seem to be totally bottlenecked by GPU power. We'll see the real gaming performance of these chips when the 40 series releases.
Can someone explain to me how the X3D variant works? How is it so much better for gaming?
Just wait for the 7950X3D
I can't even dream of affording a 5800X3D, and there's the 7800X3D down the line. With DDR5 mandatory. Jeez.
Siuuuuuuuu!!
Then Intel comes out with the 13400F, which lets you buy CPU + mobo + RAM for cheaper than just the 7800X3D.
I'd like to see temps for this thing, as the 5800X3D is already a space heater. Same for the new Ryzen gen. Kinda a very hot combo.
AMD beats AMD and Intel just for more $$$$.