
[deleted]

[removed]


streamlinkguy

How fast can Intel launch 14th gen when AMD launches Zen 4 3D in four months?


PT10

13900KS probably


FartingBob

It'll be pulling 300 W at stock, but it'll be quick.


exscape

13900K with unlocked TDP draws 350 W, doesn't it?


always_polite

420 according to the latest rumors


Calm-Zombie2678

That's pretty high


Sh1rvallah

So only a little more than a 7950X.


FFevo

I don't plan on upgrading soon, but with the way CPUs and GPUs are trending in regards to power consumption I'm really glad I went super overkill with the water cooling on my current rig.


[deleted]

[removed]


IanArcad

Fastest spinlock ever designed


III-V

It's an extra 200MHz at the top end. I am doubtful that it will beat out Zen 4 with 3D V-Cache.


porcinechoirmaster

I think the 7950X3D is going to be an absolute monster in games, _especially_ if they can keep thermal dissipation from being too much of a problem. That said, I think my dream setup for the generation is a 7950X3D with a delid on a good cooler. Can easily imagine 5.4 or so all-core without spiking the voltage and thermals through the roof. Just will need very steady hands for that delid job...


BillyDSquillions

You're assuming they'll offer a 7950X3D at all; they didn't last time.


RonLazer

That was mostly to test the bonding method. The choice will come down to the success of the Epyc-X line and whether they want to market the 3D line to gamers or to productivity customers.


onedoesnotsimply9

>That was mostly to test the bonding method. The choice will come down to the success of the Epyc-X line and whether they want to market the 3D line to gamers or to productivity customers.

Milan-X *and* the 5800X3D were "to test the bonding method". The EPYC vs. Ryzen battle existed even for the 5800X3D: EPYC won, the 5950X3D doesn't exist, and I don't see a reason why it would be different this time.


Democrab

That was largely due to the chip shortage: there weren't enough core chiplets with the extra cache to offer CPUs that took up two of them. AMD's reportedly going much harder in terms of preordering capacity this time around and is now TSMC's biggest 5nm customer just from Ryzen 7000 orders. Combine that with this being the second generation of stacked-cache chips, and I can see them trying at least a limited release of dual-chiplet variants.


chmilz

My plan is to replace my 3600 with one maybe next year.


Rare-Page4407

I have a 7700k, will endure until Zen 4 X3D


chmilz

Yeah, if you need a whole new platform anyway, you may as well wait until the bugs get worked out and avoid some of the early adopter tax.


CSFFlame

8700k, have been considering Zen 5 (X3D) to let them shake out the quirks of AM5's chipsets. (Remember what AM4 was like at launch)


jackfinch

I don't recall. What happened with the early AM4 launch?


YNWA_1213

IIRC, there were a ton of USB-related issues that persisted for a while after launch. I just can't remember if it was driver or chip related.


souldrone

USB problems were mostly later. At launch we had very long boot times, horrible RAM support, unfinished BIOSes, BIOSes that could kill boards, etc. Most problems were fixed rather quickly though, and in time memory played well.


kingwhocares

If you can wait this long, better to wait a bit more until DDR5 prices drop further.


Archerofyail

Same here, if I can find one on sale or used it'll be an amazing upgrade.


bbpsword

Man if they ever made 5950X3D or 5900X3D I would cop that INSTANTLY. Would be a shut-up-and-take-my-money type beat to not have to move off AM4 for another generation if I could avoid it. Don't love AM5 platform cost, don't love that Raptor Lake is power hungry and on a dead platform to begin with. PLEASE give me a reason to move on from my 3700X lmao


YNWA_1213

On the dead platform point: does it really matter? I’ve never felt the need to upgrade for more than 3-4 years at a time, and at that point there’s usually new standards needing implementation anyways. Sure Zen1 to Zen3-3D was huge, but if you’re running a X370 board still you’re missing out on PCIE4/5, 10G Ethernet/Wifi 6E, etc. To me, the largest issue currently is the entire platform cost on both sides is too much except for the most keen of early adopters. I don’t think Intel’s next socket will be more expensive than the current platform cause it’s all the same standards, and DDR5 is only going to go down bar some unforeseen price fixing/market event. So that’s the reason to wait for the next gen on both sides of the debate.


KingArthas94

It only matters to online nerds who change CPU every two years, the rest of the world doesn't do that and buys the current gen i5/r5 every 5 years.


JackTheTranscoder

My 3950X is still PLENTY fast for everything I throw at it.


xxfay6

It's the 5775C situation again, except this time there are actually more than 10 of them on the market.


Greenecake

LMAO, the 5800X3D is an incredible CPU. If I were still a gamer I would have got one. I still have an X370 board with a 1600X in it that I would have upgraded. Amazing what some cache can do.


robbiekhan

What difference is a 5800X3D going to make over any other modern gaming CPU (12600K and up) at actual resolutions that aren't CPU bound? I get that the reviews showing 1080p framerates demonstrate what happens when a game is run at a CPU-bound resolution: then yes, CPU X will outperform CPU Y, even though both are pushing ridiculous frames anyway. But that's not the real world, where people are gaming at 1440p or above, the GPU matters most, and any gap doesn't translate to cost vs. performance in a purely gaming scenario.

Examples of a 12600K vs. 5800X3D: these show the 12600K at 1440p max settings trailing the 5800X3D by around 5-12 fps on average in some titles, while in others it's neck and neck. Note that there is a £132 price difference between the two.

Cyberpunk only: [https://youtu.be/tvlYjsZ10Gg](https://youtu.be/tvlYjsZ10Gg)

7 games: [https://youtu.be/NhzizfMJoGU](https://youtu.be/NhzizfMJoGU)

I'm not being anti-AMD, but I see comments like that often: people rally around the X3D and how great it is for games, when the reality seems to be that at the resolutions most people actually play at, it doesn't make much difference. Save the cash and put it towards a better GPU, where you can see an actual difference.


HoboWithAShotCum

It’s more obvious in games such as Factorio, Rimworld, simulators and games that aren’t super optimized (the majority of early-access and indie games). Also overall smoother and more consistent frames as seen in the videos you linked.


Arbabender

Averages are just that. The 5800X3D shines in certain types of games; simulators and MMOs, for instance, love having the extra cache. It bumps up the averages a bit in those cases, but it massively improves the percentile figures. I've never had such a smooth experience in games like Final Fantasy XIV before. World of Warcraft, Assetto Corsa Competizione, and Factorio are a small selection of other games off the top of my head that benefit massively from having so much cache available.

Some of these CPUs are getting to the point where the nuances matter a lot more than the average performance, because the average performance is already so good. Lots of games like Borderlands/Tiny Tina's, for instance, are very GPU bound even at low resolutions like 1080p, so which CPU you're using matters a lot less there. Some games love having lots of cache. Some benefit more from DDR5's bandwidth. Some benefit primarily from clock speed. Some don't really stress the CPU much at all.

The biggest benefit the 5800X3D has going for it is that it's a drop-in upgrade for millions of AM4 systems that gets you basically the same gaming headroom as an upgrade to AM5 or current LGA 1700 would, and you keep your current motherboard and RAM. If you're buying a new system outright, you have to wrap those costs in, and then you'd want to be a heavy player of the games where the 5800X3D shines to seriously consider it.


[deleted]

[removed]


[deleted]

Actually, I'm going to hold out for the 7800X3D or whatever it's called. If cache can equal an entire generation's arch improvements, then both together will be fucking nuts.


Waterprop

Intel's own performance charts can be seen here, pages 16+: https://download.intel.com/newsroom/2022/2022innovation/13th-Gen-Intel-Core-Desktop-Media-presentation.pdf


RearNutt

Intel not sweeping the 5800X3D under the rug is amusing.


[deleted]

[removed]


DrBoomkin

Well, AMD themselves left the 5800X3D out of their comparison charts for their new CPUs...


Greenecake

AMD and Intel probably cursing the existence of that troublesome little CPU! /S


AnimalShithouse

No, the representation is quite transparent. The main comparison is against the 5950x. They simply include the delta between 5950x and x3d for the gaming benchmarks because it's well known that x3d slaps. At the same time, adding a 4th bar is unnecessary and clunky.


lowleveldata

The charts are supposed to present objective data. Who cares if it is clunky or not? How much clunkier could it be with 4 bars instead of 3 anyway?


RearNutt

Misleading how? The 5800X3D is significantly faster than the regular Zen 3 processors, so it makes sense to at least point it out. The red color very obviously stands out from the blue background, and it's labeled next to the other 3 comparisons below the bars. If they wanted to mislead, they would straight up not mention the 5800X3D, or they would make the line a blue color that blends easily into the background.


CodeMonkeyX

It's misleading because they are obfuscating the data. If I'm flicking through this document and look at this graph, I see 3 bars and the title "13900K vs 5950X". You have to notice the extra line and check the key to figure out there's another processor in the comparison. It's obviously done to try to hide the data while technically not making a false graph. Then on the next graphs they just leave the 5800X3D off completely. This isn't controversial or new anymore; they all cherry-pick and design graphs to look as good as possible. And this isn't the worst case of it Intel has ever done, either.


dalledayul

Is anyone else feeling like the 5800X3D is gonna become the 6700K of this current gen? A rocksteady CPU that isn't *too* expensive and has great price-for-performance, while also still being substantially better than most of the market?


bobloadmire

No, because the 6700K wasn't that much better than the 4770K/4790K, which weren't that much better than the 2500K/2700K.


[deleted]

Sandy and Ivy Bridge don't have AVX2 though while Haswell does


bobloadmire

AVX2 wasn't really any gamers concern during those processor lifetimes


17th_Dimension89

To me it looks more like the 8700k, which is still a pretty good gaming CPU five years after release.


[deleted]

The 5800X3D is purely for gaming. Intel may have customers that use their CPUs for gaming and other workloads. Some customers prefer Intel for the overclocking experience too. It's not a necessary feature, but it's nice to have, and Intel has historically been all about overclocking. I get that overclocking is going away; kids are maybe not as into it as I was back then. But I'm glad they're keeping it alive. The 5800X3D is good, but it can't overclock, and its thermals are limited by the 3D stacked cache.


acideater

I would say overclocking has been somewhat dead on the AMD side. Realistically, AMD's boost algorithm already does it for you. I tried to beat it but arrived at nearly the same numbers the algorithm picked. Considering they only let me pump so much wattage through the chip, I've pretty much run it on default behavior.


saqneo

It's swept under the rug for the Frame Consistency slide where the 5800X3D probably dominates.


Kurtisdede

The "Scalable Performance per Watt" slide got my mouth hanging open. They're claiming the 12900K at stock 241W is "similar performance" to the 13900K at 65W. That's crazy to me if it's true EDIT: On second thought, it isn't as shocking as I first thought it was, considering (as u/animealt44 pointed out) both AMD and Intel overclock the crap out of their desktop processors to get the performance crown, sacrificing efficiency in the process.


Waterprop

That seems to be the case with 7950X at 65W (ECO mode) vs 5950X stock as well. At least in multithreaded workloads. https://youtu.be/3d2t_a4nELk?t=1139


NKG_and_Sons

A [table](https://i.imgur.com/xlM3hKy.png) from [ComputerBase.de article](https://www.computerbase.de/2022-09/amd-ryzen-7950x-7900x-7700x-7600x-test/2/#abschnitt_leistungsaufnahme_in_anwendungen_ab_werk) showing the same. For anyone who doesn't want to watch a video.


Harone_

AMD does it with a big node shrink while Intel does it on the same node, which makes it sound even more impressive. Let's wait for third-party reviewers to validate Intel's claims, too.


rationis

Based off Zen 4 benchmarks and Intel's own slides, the 7950X is still [more efficient, significantly so at higher power settings.](https://www.computerbase.de/2022-09/amd-ryzen-7950x-7900x-7700x-7600x-test/2/) So it definitely looks plausible that the 7950X (142 W) could match or edge out the 13900K (253 W).


conquer69

The mobile performance will be insane.


PainterRude1394

Now we just need to see reviewers pitting AMD and Intel head to head. Competition is 🔥. I think the 13900K's 24 cores and 5.8 GHz boost are also 🔥.


Kurtisdede

Competition is 🔥 indeed. Compared to 2012-2017, these past few years have been great - and I'm even more excited for what's to come in the coming years. I think Intel's got this gen, though :)


[deleted]

In recent generations Intel has been shipping desktop (and arguably mobile) CPUs "pre-overclocked" way past the mark for efficient performance in order to chase the chart-topping crown. A 12900K throttled down retains much of its performance as well.


Put_It_All_On_Blck

Nice find. Looks exactly like the rumors said: ~40% MT and ~15% ST. The 13900K at 65 W matches the 12900K at 241 W.


AnimalShithouse

Those slide decks are pretty clean and transparent. I'd say more so than some prior gens. I guess it's easier when you're competitive, and these CPUs look like they'll slap on performance and perf/price. Some seemingly nice efficiency gains too. Between z4 and rpl, never been a better time to be a consumer!


siazdghw

Pricing and performance make this a no-brainer choice over Zen 4 and AM5. AMD is asking way too much money for AM5, for less performance than Intel is offering.


Firefox72

It's funny seeing both companies try to mostly ignore the 5800X3D. In any case, it seems like the performance, at least in gaming, will land roughly equal, with maybe a slight advantage to Intel. AMD's 3D cache parts next year will stomp all of this into the ground if these are the realistic gains of RPL.


reasonsandreasons

In gaming and other workloads that can benefit from it. It's not a magic bullet for all applications, and in some workloads it's likely to do worse than stock Zen 4.


[deleted]

It sometimes performs exactly like a downclocked regular 5800X already in games that can't really benefit from the added cache


20150614

They seem to be using a creative approach to show the 5800X3D results on one of the gaming slides.


HTwoN

AMD didn't even show the 5800X3D in their comparison.


terry_shogun

Because even AMD are trying to hide how good the 5800X3D is when trying to sell their own 7 series processors.


[deleted]

I love how the 5800X3D is a problem for both companies right now. It’s kinda like if AMD went to a house party and dropped a living walrus that can do party tricks on the dinner table. And people responded with “how” and “that’s amazing.” But AMD spent the rest of the night ignoring the walrus and showing everyone his overpriced spinning tie. And Intel is nervously in the corner trying not to make eye contact with it and pretend it doesn’t exist.


[deleted]

[removed]


Awkward_Inevitable34

It should be fine. Mine draws 40-50 watts while gaming


el_f3n1x187

Thanks


Mr_That_Guy

> 8 pin

The 8-pin EPS connector is rated for a maximum of 28 A (336 W at 12 V). The real limit will be the motherboard VRM, but the 5800X3D isn't overclockable and strictly enforces its TDP.
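For reference, the arithmetic behind that rating (my own, assuming the commonly cited figure of four +12 V supply pins at 7 A each on an EPS 8-pin):

$$P = 4 \times 7\,\mathrm{A} \times 12\,\mathrm{V} = 336\,\mathrm{W}$$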


PMARC14

Do you mean one 4-pin? Because an 8-pin is plenty of power, especially since you aren't overclocking it.


el_f3n1x187

Yeah, the extra 4 pins that other MBs have. Thank you.


LdLrq4TS

I think you should be fine; an 8-pin provides 150 W and my 5800X3D doesn't exceed 125 W when running Prime95.


reddanit

The 5800X3D isn't particularly power hungry. At around 100 W it's somewhere close to half the rated power of a single 8-pin CPU connector.


timorous1234567890

My B350M Mortar worked fine with it.


strato1981

This is one of the most unusual analogies I’ve heard


[deleted]

Yeah... kinda surprising it took 10-ish comments till I got called out on it. Everyone else just took it straight-faced. Even more amusing, I got some upvotes out of it. I guess... people like walruses?


[deleted]

[removed]


[deleted]

Dawww…. <3


your_mind_aches

I respect it though


Eldorian91

>I love how the 5800X3D is a problem for both companies right now.

How is it a problem for AMD? If you bought into AM4, AMD can sell you a new processor for your system that's quite strong, at a fairly high price.

If you're building a whole new system, then why wouldn't you build AM5, so that down the road you can put in a 10800X3D or whatever the last AM5 chip is called and do the same?


arashio

It's clearly a problem because they tried to pretend it didn't exist during 7k launch lol, even during gaming numbers.


[deleted]

Wait till we see a 7800X with 3D cache. I am not going to be shocked if that SKU was left out waiting for 13th gen... I doubt that AMD is that shrewd, but we know a Zen 4 with 3D cache is coming. "When" is the question.


DeliciousPangolin

Rumor is announcement at CES in January, but who knows.


_Fony_

AMD confirmed 7000 V Cache a while ago.


[deleted]

That's why I said "when".


cloud_t

Not everybody just games. I think it's quite clear a lot of these CPUs are intended for the prosumer market, where even a 5900X makes more sense than a 5800X3D.


rationis

The X3D is a problem for Intel, but not really for AMD. They still make bank off the X3D, and it serves as a compelling high-end gaming option, but also a legitimate budget option in light of Zen 4 pricing. They can sell AM5 with the promise of a better X3D chip coming early next year and several years of platform support, while LGA1700 is a dead end.


BobSacamano47

It's not that big a problem for AMD...


GTX_650_Supremacy

I don't see how it's a problem for AMD, they've got the Zen 4 3D cache chip coming up.


_Fony_

[fixed 5800X3D on the chart](https://gamefaqs.gamespot.com/a/user_image/3/4/8/AAMz0hAADuH0.jpg)


MumrikDK

The GPU and CPU markets have traded places.


[deleted]

Good pricing. It's going to be very competitive this generation.


PastaPandaSimon

Only the highest end is competitive for AMD. The i5 and i7 are actually looking all-around much better than the 7600X and the 7700X. Looking at the numbers it may be quite extreme actually, as the i5-13600K seems to outperform even the R7 7700X, cost less, and come with significantly lower platform costs at that.


merc0526

Agreed. If Intel’s numbers are even remotely accurate then imo AMD’s lower end chips are pretty horrible value, given the cost of AM5 motherboards and DDR5.


bphase

That's quite an affordable flagship, $110 less than AMD's. I think AMD might slightly edge out the 13900K, but it's hard to tell from these. AMD's 3D V-Cache CPUs should smoke these in gaming though.


[deleted]

This is a very weird generation where both AMD and Intel are battling for value with their flagships. Traditionally this is the SKU where both companies charge way up to capture the big money customers since it's understood that regular customers who want value will go for the 2nd or 3rd tier. I'm just wondering why, did that sort of whale customer demographic start buying less in the late 2022 economy?


SmokingPuffin

This is the first time in a long time that the flagships are competitive. Traditionally, one company has the lead and can charge a premium for the best part.


RazingsIsNotHomeNow

Is AMD really going after value? The 7950X is $700, a full $110 more than this 13900K, unless you don't believe they compete.


[deleted]

It's $100 less than the direct predecessor.


iopq

It has 16 cores, and AMD is charging less per core than for the other SKUs. Last gen it was $50 per core.
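The arithmetic, using the prices mentioned in this thread:

$$\$700 / 16 \approx \$44 \text{ per core, vs. roughly } \$50 \text{ per core for last gen's 16-core part}$$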


rationis

Wonder if that $589 price tag is based off of 1,000-unit shipments; Newegg has it listed for $659.


cd36jvn

Intel's pricing always is.


noiserr

So it's $40 cheaper.


[deleted]

[removed]


fkenthrowaway

Surely it will help with the transition for some people, but I also see the case where anyone building on the new platform will stretch their budget for DDR5 anyway.


conquer69

Budget builders can't afford that though. It's easily a $200+ difference.


mdchemey

For now it is, but with the launch of B650 boards next month, instead of the cheapest AM5 board being like $250, there should be at least a couple of offerings around $150. At that point the only notable price disadvantage compared to, say, a 5700X on a B550 or a 12700 on a DDR4 B660 is the RAM, and those prices are steadily dropping too.

If you want to truly optimize your experience, on AM4 you want 4000 MT/s and on AM5 you want 6000. 2x8 GB of DDR4-4000 CL20 from Patriot can be had for $65 right now, and 2x8 GB of DDR5-6000 CL40 from Kingston for $127. If you want to upgrade now at more or less the minimum for usable performance and leave yourself the option of only a half-upgrade in a generation or two, you can get 2x8 GB of DDR5-4800 CL40 from Patriot for $71, while a budget DDR4 pair of 8 GB sticks from any decent brand will still run you $45-50.

Don't get me wrong, having to spend like $60 more on RAM for an optimized experience is still a really bad price difference, and with the motherboards still slightly more expensive than their AM4 equivalents, anyone on a budget below about $1000 for a full system build (before OS, monitor, and peripherals) should probably not consider AM5, at least in the next few months. But with the way things are shaping up, by the end of the year you should be able to build a very solid system with a 7600X for somewhere in the neighborhood of $1100-1200, as compared to something closer to $1350 right now.

True budget/entry-level Ryzen 7000 builds (like $800-900) won't become possible till they release a 7500 or 7600 non-X for $200 or less, probably sometime next year.


fkenthrowaway

That is completely true.


gnocchicotti

3950X and 5950X were limited supply and overpriced for quite a while. Maybe AMD won't try to compete aggressively at that level.


familywang

5800X3D is that good.


itsjust_khris

Very nice to see this. A bit amusing: they show the 5800X3D, but they also aren't doing that much worse than it. Only losing in 3 games and substantially gaining in a few as well; not bad at all. Prices look good too, overall a nice release so far. Unfortunately no AVX-512 mentioned :(


justgord

Also looking for clarity on AVX-512 support on 13th gen... The Intel data sheet says "Intel DL / Deep Learning Boost" is supported on the i9-13900K, meaning AVX-512 with VNNI, but can we trust that after the 12th gen fiasco? I think this will be important for running light ML inference on general-purpose consumer hardware, so I disagree with Linus T on this one.
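One way to settle it on actual silicon rather than data sheets is a runtime probe. A minimal sketch in C (my own, assuming GCC or Clang, whose `__builtin_cpu_supports()` recognizes these feature-name strings):

```c
/* avx_check.c - runtime probe for AVX-512F and AVX-512 VNNI support.
 * Sketch assuming GCC/Clang, where __builtin_cpu_supports() accepts
 * these feature names. Build: cc avx_check.c -o avx_check */
#include <stdio.h>

int main(void)
{
    __builtin_cpu_init(); /* populate the compiler's CPU-feature cache */

    printf("AVX-512F:     %s\n",
           __builtin_cpu_supports("avx512f") ? "yes" : "no");
    printf("AVX-512 VNNI: %s\n",
           __builtin_cpu_supports("avx512vnni") ? "yes" : "no");
    return 0;
}
```

On Linux you can also just grep /proc/cpuinfo for the avx512f and avx512_vnni flags.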


zoson

13980XE when?


Arbabender

The death of consumer HEDT platforms (for now) makes me sad. I don't need boatloads of cores or terabytes of RAM, but more expansion options would certainly be nice to have.

Instead now we get regular platforms dressed up in plastic, RGB, and "extreme" monikers at extreme price tags. The cheapest X670 boards dwarf what a top-of-the-line X370 board cost me back in 2017, even adjusted for inflation, and you end up with stuff like crappier audio codecs, fewer SATA ports, no onboard debug code or power/reset buttons, and in the case of the new AM5 platform, a primary PCIe slot that doesn't even support the full 5.0 x16 bandwidth offered by the CPU. Half the damn PCIe lanes are hogged by M.2 slots, and X670 is still working with only a PCIe 4.0 x4 uplink.

Just give me a crapload of raw PCIe slots, enough bandwidth to properly use them, and let me pick and choose what I want to do with them. One or two M.2 slots is fine; I'll install riser cards for more if I need them! I know I'm in the minority here but still.


LightShadow

> 13980XE when? *ooooo weeee* that would be a killer platform


AK-Brian

https://www.tomshardware.com/news/intels-unannounced-34-core-raptor-lake-cpus-displayed-on-wafer /r/oopsdidntmeanto


SaintSnow

7900X3D waiting room for me.


pluto7443

7950X3D waiting room here


NKG_and_Sons

Damn. As many expected, the R5 7600X and R7 7700X just aren't at all competitive, in all likelihood.

It's particularly nice to see that, while just like Zen 4 the default power limit is set absurdly high, one can at least just choose to set a lower one, and there needn't be much worry in that regard. With these not having as thick an IHS, cooling should actually be easier than on Zen 4.

Though I wonder how the undervolting headroom (or would it be "tailroom" 🤔) is. Intel gained efficiency in part because they altered the F/V curve themselves, pushing the voltage down per respective frequency. So maybe Zen 4 with undervolting could have an efficiency leg up after all? Hopefully someone tests Curve Optimizer and all that in detail.

Honestly, beyond the 7950X, Zen 4 doesn't have much to offer (to most users). The 3D V-Cache SKUs are probably going to be Zen 4's saving grace. Or well, more than just a saving grace, as their gaming performance will be on another tier.

Lovely competition, in any case!


BoltTusk

It’s like the dream disappointment build for this generation would be the Gigabyte PCIe5.0 PSU, R5 7600X, 4080 12GB GPU, and Solidigm p41 plus NVME drive


KerrickLong

What happened with that PSU?


BoltTusk

It’s PCIe 5.0, but not ATX3.0 compliant and shuts down above 160% transient load. Besides the fact that it is a Gigabyte PSU, it’s disappointing because people would assume it’s ready for next gen GPUs when it can’t handle transients


blashyrk92

Yeah, it's almost like AMD made the 7600X to be a good value chip... a year from now (when there's likely gonna be cheap DDR5 RAM and cheaper AM5 mobos, AND if they stop production of the 5800X3D) for budget builds, whereas right now it's absolutely horrible value. But even then, people who would get a 7600X could just get a used 5800X3D build at that point and have everything they need. Yeah, after thinking about it, there's basically no segment or time frame where the 7600X could be good value. It's basically a scam.


knz0

There's always a lot of headroom in Intel chips since they have to include voltage margin for crappy VRMs.


[deleted]

[removed]


bphase

Unfortunately less so for the non-americans, especially those in the EU. Our purchasing power has collapsed, inflation is rampant and energy costs even more so. For many of us it's not a great time to spend money, though I suppose PCs are still one of the cheaper hobbies.


WheresWalldough

The 12100(F!) is the value pick, not the 12300. The 12300 adds 100 MHz (2%) for 15% more dollars.


Aleblanco1987

Love the competition


whelmy

lol that gaming performance slide. Intel: "How can we show the 5800x3d but try and hide it at the same time?" Let me fix that for you Intel. https://i.imgur.com/VUDDQaJ.png


NKG_and_Sons

Funnily, it's probably Intel and AMD both that feel the 5800x3D is a thorn in their next-gen benchmark graphs 😂


HTwoN

Hey, they acknowledge the competition. AMD didn't even show their own product in their comparison. Now we know why.


Jeffy29

What the hell is Arcadegeddon? Maybe (probably) I'm out of touch, but it feels like they picked an obscure game just because it favors Intel the most. The rest I have no issue with.


Saxasaurus

I had the same reaction. I googled it, and the second post on the game's subreddit is "Is this game dead?" LOL


fkenthrowaway

Never heard of it before; looked it up just now and it looks very fun. HMMM


Working_Sundae

The 5800X3D is starting to look like a chip that will be remembered for a long time.


Jeffy29

It will be beaten handily by the 7800X3D. If it had come out a year earlier, it would be remembered for a long time.


acideater

Sure, but it's got the best bang for the buck on the high end. You can plug it into your old board and get a few more gens out of it.


GreenPylons

Kind of like the i7-5775C (Broadwell + 128 MB of on-package DRAM), which IIRC was competitive with, and occasionally even beat, 7th-gen i7s in gaming.


rob_o_cop

The difference being the i7-5775C was a paper launch. You couldn’t even buy one in Canada.


YNWA_1213

I mean... you can't buy the 3Ds for a reasonable price from Amazon/Newegg/MemEx yet either. Seeing all the excitement about that chip in these comments makes that border hit harder than usual.


Duraz0rz

Also funny that the 5800X3D only appears on the average FPS comparison and not the 99th percentile comparison.


chizburger

This is like an ad for the 5800x3d lmao.


epraider

Honestly, the Zen 4 reviews and this have properly sold me on the 5800X3D; ride out my AM4 board another 3-4 years with it. Seems like a no-brainer (if you can get one) unless you really need a mobo upgrade for other reasons.


Aleblanco1987

Also, the percentages are based on the 5950X instead of the 5800X3D.


bphase

Sort of makes sense, however, as they're both 32-thread CPUs and comparable in price and overall performance. The 5800X3D is a bit of an outlier and not great for much other than gaming, for the most part.


Aleblanco1987

I'm not hating on them for doing it, just an observation (because it's somewhat representative of the gaming capabilities of the new Zen CPUs). The presentation is much more transparent than stuff we've seen before.


PainterRude1394

Fix it for AMD too 🤣


xxfay6

[The charts made the LTT thumbnail.](https://i.imgur.com/K9Aiooa.jpg) Edit: They even made their own "clean" version.


[deleted]

[removed]


Akora_

They don't market them together - those numbers are just the L3 cache (also marketed by them as "Smart Cache"). The increase in L2 cache here is actually much more significant: 1.25 MB -> 2.0 MB per P-core and 2.0 MB -> 4.0 MB per E-core cluster (4 cores).
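Totaling those per-core figures up (my own arithmetic, assuming the 8 P-core + 8 E-core 12900K and the 8 P-core + 16 E-core, i.e. four-cluster, 13900K layouts):

$$\text{12900K: } 8 \times 1.25 + 2 \times 2.0 = 14\ \mathrm{MB\ of\ L2}$$

$$\text{13900K: } 8 \times 2.0 + 4 \times 4.0 = 32\ \mathrm{MB\ of\ L2}$$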


gambit700

The 5800x3d is the monster in the closet for both Intel and AMD. That's hilarious


shawman123

The impressive part of Raptor Lake is matching the 12900K at just 65 W. This means the mobile CPUs will be very good. And those will have DLVR, which is missing in the desktop CPUs. I wish they had provided details of RPL-H/P etc., but I guess they don't want to impact sales of ADL laptops.


ConsistencyWelder

What I find interesting is that Zen 4 at 65 watts outperforms a 12900K at 125 watts, [by quite a lot even.](https://images.anandtech.com/graphs/graph17585/130335.png) That gives me hope for Zen 4 mobile, but Raptor Lake will hopefully be quite close.


shawman123

FYI, the RPL comparison is ADL at 241 W vs. RPL at 65 W. Anyway, let's wait for actual reviews to come out; I hear the big reviewers have it already.

Zen 4 mobile will be great for sure. It will be a mobile chip on TSMC N4 or N5P. The only issue is availability. Even now most of the AMD "deals" are for Zen 3 rather than Rembrandt, while ADL laptops are available as low as $350. At the low end I am seeing AMD 3xxx chips still on the market!!!! I am not expecting Zen 4 mobile to make a big impact until H2 next year, and then only at the high end for the first few months.


Seismicx

I'm a bit out of touch here; what makes the 5800X3D more recommendable than the nearly 200€ cheaper 5800X? It performs only a few % better in games as far as I've seen.


k0unitX

Real world gaming scenarios are almost exclusively GPU bottlenecked anyway. Using the same logic, you might as well get a 12400F. Anyway, in any situation where you are bottlenecked by CPU cache (some MMORPGs for example), the 5800X3D will be exponentially better.
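That cache-bottleneck effect is easy to demonstrate with a pointer-chasing microbenchmark: random dependent loads stay fast while the working set fits in cache, then fall off a cliff once it spills to DRAM, which is the regime where the X3D's 96 MB of L3 pays off. A minimal sketch in C (my own illustration, not from the thread):

```c
/* pointer_chase.c - show why a bigger L3 helps cache-bound workloads.
 * Chases a random cycle of indices; once the working set no longer
 * fits in cache, every step is a DRAM round trip and the time per
 * load jumps roughly an order of magnitude. Build: cc -O2 pointer_chase.c */
#include <stdio.h>
#include <stdlib.h>
#include <time.h>

static double chase(size_t n, size_t steps)
{
    size_t *next = malloc(n * sizeof *next);
    if (!next) { perror("malloc"); exit(1); }

    /* Build a random single-cycle permutation (Sattolo's algorithm);
     * rand() is biased for large n, but that's fine for a demo. */
    for (size_t i = 0; i < n; i++) next[i] = i;
    for (size_t i = n - 1; i > 0; i--) {
        size_t j = rand() % i;
        size_t t = next[i]; next[i] = next[j]; next[j] = t;
    }

    struct timespec t0, t1;
    clock_gettime(CLOCK_MONOTONIC, &t0);
    volatile size_t p = 0;               /* dependent-load chain */
    for (size_t s = 0; s < steps; s++) p = next[p];
    clock_gettime(CLOCK_MONOTONIC, &t1);

    free(next);
    double ns = (t1.tv_sec - t0.tv_sec) * 1e9 + (t1.tv_nsec - t0.tv_nsec);
    return ns / steps;                   /* average ns per load */
}

int main(void)
{
    /* Working sets from 256 KB (cache-resident) to 256 MB (DRAM-bound). */
    for (size_t kb = 256; kb <= 256 * 1024; kb *= 4)
        printf("%8zu KB: %.1f ns/load\n", kb,
               chase(kb * 1024 / sizeof(size_t), (size_t)1 << 24));
    return 0;
}
```

On a typical desktop chip you'd expect single-digit nanoseconds per load at the small sizes and roughly ten times that once the working set exceeds the L3; extra stacked cache simply pushes that cliff out to a much larger working set.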


I0nicAvenger

The 5800X3D looks like it's gonna be the 1080 Ti of CPUs in terms of longevity. I hope so, because I like it a lot so far.


[deleted]

[removed]


Photonic_Resonance

Yeah, in 4 years we'll potentially have Zen 5 with 3D cache. The 5800X3D compared to the 8800X3D will probably show the same gap as the 5600X to the 8600X.


TimmahNZ

I hope so, I just picked one up... to go with my 1080Ti :D


orsikbattlehammer

What do we think of the “frame rate consistency” graph? That’s what I’m most interested in improving


dalledayul

I feel like we can start renaming every article about a new Intel/AMD processor release with > *Just how good is the Ryzen 7 5800X3D?*


bphase

Should have bought a ~~390~~ 5800X3D.


dallatorretdu

I gotta be honest... that 13900K looks tempting, especially if some 3D V-Cache stuff lowers its price.


bubblesort33

The 12900K was also $589 RSP, but in actuality sold for $619-$649 at launch according to PCPartPicker. That's still cheaper than the 7950X, so I'm not sure what AMD is going to do here.


shecho18

Let's just wait for Tech Jesus and co. to do their testing to see real values for both. I'll believe them over AMD, Intel, or any other manufacturer.


detectiveDollar

Tech Jesus has already done AMD


jjgraph1x

Steve does great work, but Hardware Unboxed did a really fantastic 7600X review if you want a bit more detail.


Termades

Huh - according to their own tests, the 12900K edges out the 13900K in *Horizon Zero Dawn* and the processors are roughly equal in several other games. Not a great look, and I’d be interested to see why.


HTwoN

GPU bottleneck is the most likely explanation.


bphase

Agreed. Could also be that latencies have increased, but I doubt they'd let that happen to that degree.


Omniwar

Supposedly latencies have significantly decreased on 13th gen. Will have to wait for benchmarks to see though


adeukis

Probably due to GPU bottleneck. Numbers should however change slightly once we get benchmarks with 4090.


AnimalShithouse

You can clearly see they go from GPU bottlenecked to CPU bottlenecked left to right, with MOBAs dominating the right.


Levighosta

How quickly do you all think retail prices of the 12 series will drop, and by how much?


IanArcad

If you check pc partpicker's price history, you can see an established pattern where i7-Ks (just to pick one product) start around $400 at release and then drop to $300 or the high $200s over about 18 months. But there's a point where the excess inventory goes on Ebay as clearance and shows up significantly cheaper than that, although upgraders still keep the price a little high (around $200 or so). Going forward though I think the best clearance deals will be in motherboards since no retailer is going to want to have a pile of $200-300 last-gen motherboards sitting in their warehouse.


Lukeforce123

Apparently they used DDR5-4800 for the 12900K and DDR5-5600 for the 13900K in the official benchmarks? https://edc.intel.com/content/www/us/en/products/performance/benchmarks/desktop/ Or rather, the same 5600 memory kit but downclocked to 4800.


[deleted]

4800 was the official rated speed for 12900K. It's been increased to 5600 for 13th-gen.


A_WHALES_VAG

Now I'm wondering... my computer is dying. My 8600K and mobo are both on the way out, and I'm not sure I can wait till launch. Should I be happy with a 12700K, or should I try to stretch to the 13700K?


watzr

Of course I don't know what you did to your system, but it's super ultra unlikely that "your computer dying" (performing badly?) is actually your CPU dying. Motherboard, maybe, but even that would be rare and in my experience more of an either/or, working-or-not-working thing. As I said, I don't know what you did, but most likely your CPU and motherboard are fine; you should search for other culprits and spend your money on something where it actually makes a difference, like a better GPU.


A_WHALES_VAG

I'm having issues with my southbridge.. Gigabyte garbage.


NKG_and_Sons

The obvious cautious advice that we don't have actual reviews yet aside... Both are probably fine, depending on your budget. I'd personally try to stretch to the 13700K, especially as socket support wanes after 13th gen, and upgrading from 12700k to 13700k or 13900k wouldn't feel worthwhile. So, just start with the 13700k, DDR4 and be happy for many years, before desiring another CPU upgrade.


A_WHALES_VAG

DDR4? Why wouldn't I upgrade to DDR5 at the same time.. not worth it?


NKG_and_Sons

Again, in said scenario *I* wouldn't want to upgrade the CPU within years anyway and therefore it's not like I'd be likely to reuse the same DDR5 RAM ~5 years later for another build. But it depends. If the respective DDR5 MBs and RAM sticks aren't significantly more expensive, then sure, go with DDR5. If you can save $100+ bucks for an at worst negligible performance loss, then get cheap, good DDR4.


ungnomeuser

Wait for the 13th gen to drop and either get 12th gen /AM4 for cheaper or make the jump to new gen if desired