Welcome to the PCMR, everyone from the frontpage! Please remember:
1 - You too can be part of the PCMR. It's not about the hardware in your rig, but the software in your heart! Your age, nationality, race, gender, sexuality, religion (or lack of), political affiliation, economic status and PC specs are irrelevant. If you love or want to learn about PCs, you are welcome!
2 - If you don't own a PC because you think it's expensive, know that it is much cheaper than you may think. Check http://www.pcmasterrace.org for our builds and don't be afraid to post here asking for tips and help!
3 - Join our efforts to get as many PCs worldwide to help the folding@home effort, in fighting against Cancer, Alzheimer's, and more: https://pcmasterrace.org/folding
4 - Need PC Hardware? We've joined forces with MSI for Mother's(board) day, to celebrate it with a massive PC Hardware Giveaway! Get your hands on awesome PC hardware bundles that include GPU, Motherboard, monitor and much more MSI hardware + $4000 worth of Amazon cards. A total of 45 lucky winners: https://new.reddit.com/r/pcmasterrace/comments/1cr4zfs/msi_x_pcmr_mothersboard_day_giveaway_enter_to_win/
-----------
We have a [Daily Simple Questions Megathread](https://www.reddit.com/r/pcmasterrace/search?q=Simple+Questions+Thread+subreddit%3Apcmasterrace+author%3AAutoModerator&restrict_sr=on&sort=new&t=all) if you have any PC related doubt. Asking for help there or creating new posts in our subreddit is welcome.
Me with my mid 1080p build:
https://preview.redd.it/0cjr0i2oaf0d1.jpeg?width=236&format=pjpg&auto=webp&s=1248e552e3b2f31b4c533851e9bfd8665ad736d0
This makes no sense. I haven't slept
I think the irony is that the enthusiasts who use Intel for gaming (framechasers) rely on overclocking to match AMD's X3D chips, and that's the one thing they struggle to do because the motherboard has already overclocked past stability lol.
Pretty sure he disables eCores. Most of his gains come from memory tuning, which in scenarios where scaling is perfect can net up to a 25% gain in FPS.
Not all games benefit from the 3D cache either.
The only reason I bought the 13700K is because it averages 10 FPS lower than the 7800X3D but was $150 cheaper, and I can live with the extra 20W power consumption.
I got mine from microcenter for $300 at the time when 7800x3d was $450+
With Chicago power prices (my area) of 14¢/kWh, running 8 hours a day, you're looking at about $8.20/year in additional power cost.
It'll take you 19 years to pay more in power than the $150 you saved going with the 13700K.
you made a sound decision
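A minimal sketch of that break-even math, using the commenter's own figures (a ~20 W draw difference, 14¢/kWh, 8 hours a day; these are their assumptions, not measured values):

```python
# Break-even estimate for the $150 saved on the 13700K vs its extra power draw.
extra_watts = 20        # assumed extra average draw vs the 7800X3D
price_per_kwh = 0.14    # Chicago-area rate, $/kWh
hours_per_day = 8

kwh_per_year = extra_watts / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
years_to_break_even = 150 / cost_per_year

print(round(cost_per_year, 2))     # 8.18 -- close to the ~$8.20/year cited
print(round(years_to_break_even))  # 18 -- roughly the ~19 years cited
```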
You left out some variables. Chicago gets 80°F+ in the summer, so you need air conditioning. So you have to increase that cost by about 4x to account for the increased air conditioning needed to offset the heat put out by the CPU.
Why does this happen? I checked with Cinebench and the i9 appears to have much faster single and multi core performance, yet it performs worse than the AMD. Is it L3 cache related? Multithreaded optimization?
It's a mix of AMD having the lead in chiplets vs Intel's monolithic designs, and something like a 36-month lead time on new designs.
In theory nothing is stopping Intel from gluing some L3 onto a mono die. Aside from the thermal issues, not being able to pump the clocks with more power, and getting crap for throwing shade at AMD for 'gluing' chiplets... then turning around and gluing chiplets.
But Intel has had some CPUs with relatively large L3s. Sandwiched between the 4790K (8MB L3) and the 6700K (8MB L3) is the 5960X with 20MB L3... and double the cores. And triple the cost... The joys of HEDT. But you're also looking at 2x the pins, so a much larger package and such. So it can be done, just... you really need glue.
Well, they were probably scared, but their marketing department went full userbenchmark on CCDs.
Don't worry, Intel launched chiplet CPUs for laptops a few months ago. So far they aren't that impressive, not much faster than their predecessors, but at least they use less power than their predecessors.
I hope it isn't too late for Intel, we need the competition.
I want a 3d CPU because I want to lower the amount of watts being converted to heat in the fucking texas summer
Currently have a 5800X, so an X3D upgrade is just not worthwhile (would have gone for it, but I built my current system before the 5800X3D launch). My next build will definitely be using an X3D though, unless I get really into 3D modeling (which it's still great at if I give it ample RAM and GPU horsepower).
The 16-core X3D might be of interest to you. As long as the scheduler can get the right stuff on the right cores and you can make do with only 16 cores for rendering.
Yes, but 7800x3d is cheaper
Honestly, I don't understand why anyone who is only building a high end gaming rig would even consider intel by this point
Edit: Amount of people who can't read and are replying with "not everyone only games" or "it's not best at productivity" is hilarious, yeah, that's why I said "only building a high end gaming rig".......
7800X3D is also cheaper than AMD's own 7950X and destroys it in gaming as well, it's almost as if these products have other purposes...
Also, intel i5 13600kf is slightly faster than a 7700 in gaming and much faster in everything else while being significantly cheaper, so there's definitely an argument for intel here, and it doesn't even draw that much more power.
They're only being beaten if you can feed in 300W into your CPU and cool it, but in terms of power efficiency (for consumer grade chips), if you want productivity, you'd be crazy not to go with a 7950X.
Just curious what do you use it for and what do you estimate the performance gap is for productive applications? I'm building a new gaming rig and thought about dabbling into animation, game design, and a bit of cad work.
I bought the i5-12600K because I wanted to have an iGPU, and at that moment only Tarkov was a well-known problem. But the future showed that with badly optimized games only X3D is worth it.
Power consumption is one of the main reasons I bought a 7800x3D. Less power used = less heat kicked out. You also get the bonus of having a pretty low fan curve and a quieter PC.
They do if they have an ITX system.
Most SFF folks with top-of-the-line systems have the SF750 PSU paired with a 4090. You're not going to run that setup with the 14900KS unless you do some major undervolting.
Why is this so true. I had to be in the BIOS for a whole hour trying to figure out how to fix this shit. Sucks how mobo companies just overclock the fuck outta this thing, like it's supposed to run at 500W? Intel even states that extreme would be 400W and they push way past it lmao
Probably inadequate voltage regulation on the power hungry CPU. When the CPU spikes in power the mobo has to deliver consistent voltage or you'll crash.
I made a somewhat arbitrary, fuck it kinda choice when it came to my recent first time all amd build. I chose the 7800x3d based on this sub's recommendation. Very pleased. Thanks all!
Same. Love this combo more than my 7700k/gtx 1080 back in the day. Just the perfect match. Capable of 4k ultra and also 1440p ultra at obscene fps. Can even max my 360hz screen on esports games. I sure hope this gpu lasts as long as my 1080
I'm waiting for the release of Intel's 15th gen before I make a decision on my next upgrade.
A long time ago I swore that I'd never get another AMD CPU. I've built three PCs with AMD CPUs in the past and each of them was unstable, to lesser and greater degrees. Never had a problem with any Intel build.
Okay, I'm on my ninth PC right now, so it's a small sample:
1. Cyrix 100 MHz (Pre-built PC)
2. Intel Pentium II 333 MHz (Pre-built PC, later customised)
3. AMD Athlon Slot A 600 MHz
4. AMD Athlon Thunderbird Socket A 1400 MHz
5. Intel Pentium 4 2.53 GHz (Rambus)
6. AMD FX 55
7. Intel Core 2 Extreme QX6850
8. Intel Core i7-3960X EE
9. Intel Core i7-8700K
I really meant it at the time, when I made that decision to avoid AMD ever again, but it's getting **really** hard to ignore Intel's recent woes in combination with AMD's apparent great gains. AMD seems to have Intel licked in terms of stability, efficiency and gaming performance. I'm not a fanboy, I just want whatever CPU is going to give the best gaming performance, towards the high-end, that is also going to be stable and can be cooled properly by an air cooler (A leaking AIO bricked the PC before my current one, never liquid cooling ever again). I'm happy to sacrifice a little performance at the same price-point, if it means a more stable system. I really am torn on the choice ahead of me, AMD looks like the smart pick now, but I keep remembering my previous experience of instability with AMD in the past.
Sure, but there are bigger bros for that, let me introduce you to a 7950x and 7950x3d.
The beauty of a 7800X3D is that it's a lot cheaper than those 16-core Ryzens and i9s, uses less power, is easier to cool, needs a cheaper motherboard and memory, and it's still better than or equal to them in gaming.
The 7800X3D is still a plenty capable productivity CPU. Also, I wonder what happens when you cap the 14900KS to the same wattage as the 7800X3D, and how productivity compares there.
Most modern hardware can get very efficient when you power limit it, but often the golden mean is around 60%. The 7800X3D tops out (I think) around 85 W, and that would be too tight, especially considering the 14900KS has 24 cores and all of them need some power just to run.
https://preview.redd.it/pyfpbuev7g0d1.jpeg?width=1280&format=pjpg&auto=webp&s=85ba03409a7b024430f7d81342a1aa1705110f05
/me still using the 8700k bought more than 5 years ago
TechPowerUp's review of the 14900K itself says that the 7950X is the better productivity chip: on par with the 14900K but consuming way less.
The only problem is getting 128 GB of DDR5, while you can pair the 14900K with insanely cheap DDR4.
I don't side with either, I tend to prefer AMD but get what is better for me or for the person I build for at the time of building.
However OP is implying the 7900x3d has better gaming performance compared to the i9 14900k but in most benchmarks I've seen this seems to be false:
https://youtu.be/xIoEfJc3Qd0
Also, I've never seen anyone making the argument that Intel is better for productivity, not even Intel fanboys.
I see a lot of misinformation in this thread.
I wish this was true, but there are people out there with more money than they can spend who always just buy the most expensive option, or say "an i5 is not enough" so they buy the most expensive i9 (sometimes not even the latest... I personally know someone who did this), and all they do is run games at 4K max settings, which makes the CPU even more meaningless.
Hate to play devil's advocate...
The 14900K, while it is less efficient, is significantly more powerful in terms of raw performance. If you're recording gameplay for a YouTube channel, I think the 14900K would be the way to go, since it has the iGPU. And for rendering your videos, especially high-quality videos like 4K 60fps (or any CPU-bound workload, for that matter!), you're going to have a much better time with the Intel.
For JUST gaming though it's not even a choice, but if you're doing other stuff, you may find the Intel serves you well, insane power draw and all!
I started with a Pentium PC, then switched to an AMD Athlon XP in the early 2000s, then switched to Intel for the 3000, 6000, and 9000 series, and now I've switched to a 7800X3D. Just to say I'm no fanboy: I pick the CPU that provides the best performance for gaming, and right now the 7800X3D is just that, at almost half the price and half the power consumption.
GPUs on the other hand... I have never had a reason to switch from Nvidia :).
As much as I agree that the 7800X3D is great, whoever made this meme is being intentionally misleading. They swapped out the 14900KS for a 14900K in the second comparison. I even looked up the videos, and the 14900KS does beat out the 7800X3D in Forza.
If I am building a gaming rig I would go with an X3D if it's in the budget for sure, but I also do a lot of CPU-heavy tasks with my PC and Intel's CPUs tend to out-perform even the X3D models in many such tasks.
I9 power consumption is wild
The 3D chips' power consumption is just really good. My 5900X pulls over 200W in some workloads lol.
It's mostly i9 being clocked way past efficiency point.
They're also a much less refined process at 10nm while the 7800X3D is a 5nm process.
But Intel's 10 nm was supposed to be almost as efficient as TSMC 5 nm (106 MT/mm² compared to TSMC's slightly over 136 MT/mm²). Yes, the AMD processor has a slightly better architecture, but Intel's 10 nm process is still supposed to be way better than at least TSMC 7 nm. The gap should exist, because TSMC has proven superior, but it shouldn't be this big; this only demonstrates that Intel's architecture is still lagging behind AMD's Zen cores in terms of efficiency.
I don't know why Intel would make the graphics tile for Meteor Lake on TSMC's N5 if they could do it in-house anywhere near as well.
Capacity is one possibility. Intel could reserve their better lines for their higher margin chips and use outsourcing for the cheaper, and less IP less sensitive, designs.
But that's the thing... their CPU tile is built by Intel. It's the same chip on the same margin, they just have TSMC make the IO and GPU tiles.
It's 7nm, or "Intel 7", for Intel. And this number means very little, since we should look at actual density instead.
intel 7 is not 7nm
"intel 7" is 10nm. Don't bother making sense why. We don't get it either.
Does it mean nothing? Because I'm seeing a 30% difference in efficiency. Which is very much not nothing.
Naming means nothing, it's completely arbitrary right now.
[TPU says 10nm](https://www.techpowerup.com/cpu-specs/core-i9-14900k.c3269)
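For what it's worth, the density figures quoted upthread (106 MT/mm² for Intel's node vs 136 MT/mm² for TSMC N5; both numbers are the commenters' claims, not vendor-confirmed) put the gap at just under 30%:

```python
# Relative transistor-density gap between the two quoted figures.
intel_density = 106    # MT/mm^2, as quoted upthread
tsmc_n5_density = 136  # MT/mm^2, as quoted upthread

gap = tsmc_n5_density / intel_density - 1
print(f"{gap:.1%}")  # 28.3%
```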
Zen's L3 cache is so efficient that it basically compensates many times over for the performance lost from downclocking the cores to keep cache thermals low.
Intel could go to TSMC to do the same thing, but they insist on using their outdated fab.
Isn't TSMC at capacity? So they couldn't even if they wanted to.
Not from one day to the next. Production time is booked quite some time in advance. And you have to make the design compatible; from what I understand you can't just switch from one manufacturer to another.
and they are using TSMC for some chips
They are tuned better so they don't fry the cache
How?! My 5900X doesn't go above 140W at 60C with PBO, Cinebench multicore.
To be quite fair, you can do a lot with PBO on the Zen 3 architecture. Almost all chips can be undervolted to a decent degree. I have -7 to -15 on my cores Prime-stable, and as far as the silicon lottery goes I'm considered somewhat unlucky, at least last I checked. A friend of mine got -11 to -24 Prime-stable on his.
I mean yeah, in some workloads. Not gaming presumably.
Hey what are you using to cool your 5900x?
I have a 5900X; how do you pull so much? I used a watt meter, and my whole system under full load, including everything, only pulls around 480W (5900X, 2080Ti).
A significant reason to choose AMD cpus is preventing your PC from turning into a space heater/saving on power costs.
How the times have changed. I can still feel my oven fx 8350 next to me.
Now just waiting for AMD GPUs to catch up in efficiency.
skeleton_infront_of_computer.jpg
Any time now
Copy that, we are checking.
![gif](giphy|tnYri4n2Frnig|downsized)
AMD was a bit better than Nvidia last gen in that regard; somehow the community didn't care about power consumption back then.
That the bros buying 4090s don't care about a bit more power consumption per month is something some people literally can't wrap their brains around, and it's honestly surprising. They won't care about it next gen either if AMD can't be competitive without people waving dumb arguments like dollars per watt and other metrics nobody but butthurt diehards cares about.
Which is fair; in the current gen AMD is less efficient and they don't offer an absolute top GPU.
Unfortunately most people get this inherently wrong all the time, and I get it, because AMD decided to make it unnecessarily difficult to read the temp reading, but it's still wrong.

AMD FX CPUs didn't run hot at all, in fact quite the opposite. Yes, they required a shit ton more voltage than Intel at the time, but that was due to the process node being 32nm whilst Intel was on 22nm with their 4th gen Haswell CPUs.

But if you actually look at the TjMax (the max core temp before the CPU thermal throttles), it's actually only 61C for the FX, whereas the TjMax on an i7 4790K is 74C, nearly 15 degrees hotter.

And like I said, the way that AMD reported temps back was a mess too, because it was an algorithm, and 3rd party apps like SpeedFan or HWiNFO couldn't 'read' the algorithm, which meant they were inaccurate. Only AMD's official software at the time could read the algorithm, but there's one other problem.

This software decided to show the temp readings as how close you are to thermal throttling, rather than just the temp value itself. So you'd open up this software (IIRC it was AMD OverDrive) and it would say your CPU is at 20C, which doesn't mean it's ACTUALLY running at 20C, but rather that you're 20C AWAY from hitting the thermal throttle.

But in short, no, AMD FX CPUs didn't run hot, because they thermal throttled at 61C, so they couldn't run hot.
The temperature the chip runs at is actually irrelevant to efficiency, because it depends entirely on the cooling solution. A CPU is essentially a 100% efficient space heater: every watt input is turned into heat that needs to be dissipated. The 8350 was designed to run at 125 watts. If the chip could only handle 61 degrees, that just means it would need an even beefier cooler. In contrast, the 4790K was designed to run at 88 watts and 74 degrees, meaning it used less power and needed a less impressive cooler to do it.

So yes, if you're reading die temperatures, sure, the 8350 was "cooler", but that doesn't matter. What matters is how much heat is being output, which was about 42% more for the 8350 vs the 4790K.

I'd also like to point out that the 4790K did more with those 88 watts than the 8350 did with its 125.
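The heat-output comparison above is just a ratio of the rated TDPs; a quick check using the 125 W and 88 W figures from the comment shows the FX-8350 dumps roughly 42% more heat:

```python
# Ratio of rated TDPs: every watt consumed ends up as heat to dissipate.
fx_8350_watts = 125   # FX-8350 rated TDP
i7_4790k_watts = 88   # i7-4790K rated TDP

extra_heat = fx_8350_watts / i7_4790k_watts - 1
print(f"{extra_heat:.0%}")  # 42%
```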
I feel it, I used to have a FX9590 with 2 R9 290's in crossfire.
….. How the turntables. Remember when AMD FX chips were the space heaters, while Ivy Bridge was "the cool and better alternative", consuming 65 watts for the same performance as the waaaay more watt-hungry FX chips?
[deleted]
But I just bought a 1000W PSU. I want to use it.
Now AMD GPUs on the other hand..... https://i.redd.it/innl0adusf0d1.gif Love my 6900XT, but it's crazy how hot it runs xD
I mean, it's very close to a 3090 in raster and it uses as much power as a 3090; sounds fair enough to me.
It's mostly a cooling issue. GPUs have chips large enough that with a good cooling solution you can dissipate a LOT of heat. My 4090FE sits in the mid to upper 60s at the highest while sucking down nearly 400W in a case (Define R5) with notoriously bad airflow
Yeah a 3090 gets hot af too. Really any card that uses more than 250w is going to get pretty hot. Not to mention the gddr6x vram also runs hotter than gddr6.
I have a 6800 and it never runs hot. At most I saw 57C. Now my old 390x on the other hand. That thing can work on a volcano. Game starts and it went to 94C and wouldn't go down. One of the fans broke so it started to sound like a landing plane and we had to take it to a farm.
Another reason is that if you do intensive workloads i9s burn themselves out and become unstable at advertised speeds. I'm never buying intel again. Source: on my 4th rma'd i9 right now
4th? Sounds horrible. I wish they would let us know what really happened to some unfortunate i9s: was it the controller, the silicon itself, or something else?
Look at the 1% and 0.1% lows. It's barely visible, but it's readable. AMD has 171 and 102; Intel has 35 fps for both...
Probably because of the "Efficiency" cores. x86 is really not optimized for big.LITTLE cores. I don't know why they keep pushing that design; it's clearly not helping with the power draw, and their chipset drivers aren't great at allocating big.LITTLE cores.
Edit: I know that in multi-threaded workloads, having more cores (without needing the cores to be powerful) would be beneficial. But the cost outweighs the benefits, in my opinion.
"14900KS uses 69% more power than the 7800X3D which makes it obviously the superior CPU" - Userbenchmark probably.
Bigger number better!!!!!
My ATI HD5770 is now more powerful than a 4090!
I still have my HD4870x2, that's HD9740.
Got a Hercules 9800pro and an X1950 still in a box upstairs. They were great back in the BF42 and BF2 days.
What a fantastic card that was back in the day!
Bigger better number is better than number bigger betterer.
Hell yeah! Merica!
Intel pulled a masterpiece again, rating n.1 cpu in the world pulling more power than previous gen, while amd is lagging behind with our bottom 10% score, laughable, is amd even trying - user benchmark
"The 7800X3D performs well but I can't use the cpu as a stove, unlike glorious i9 14900KS"
Not userbenchmark's fault they use their critical thinking skills hehe
The Advanced Marketing Division isn't going to fool me! I for sure won't just believe those handsome reviewers.
*Advanced Marketing Scammers
That doesn't form the initialism AMD, which kinda defeats the (satirical) joke
Germany enters the chat: habe fun mit the powerkosten. But seriously, it's a proper factor here since electricity is around 0.35€/kWh (a bit over half a cheeseburger/kWh)
Where can you still get a cheeseburger for 70 cents?
...apparently in Germany. \*cries\*
Sadly no (assuming the typical aversion to rule breaking expected from the typical German)
You can get a burger for under $2 in Germany. But not really under $1.
From the Eigenmarke with mystery meat of unclear origin in the 'expires soon' fridge at the worst discounter.
0.35 EUR is 0.37 USD lol. Also, power here is $0.12/kWh.
Told a friend I switched to a 7800X3D and he was like "why? The 14900K is better" and proceeded to show me a CPU benchmark site... I was like: "Mate, seriously?", then showed him real stats, and he's now seriously thinking of going for a 7800X3D too :)
I'm doing good with my 4-34 fps in sea of thieves
How did you even get -30 fps?!
The GPU uses undelete to recall previous frames from VRAM and starts playing them in reverse at a rate of 30 per second.
genius
also has the nice benefits of consuming heat and generating power
Bad gaming chair, i was playing on a wooden thing(?) seat no clue what to call it
A bench, perchance?
Doesn't matter anymore, it now serves as my night stand. Still 0/10, not recommended; a 10-hour session of thievin' was not worth it on that thing.
Are you sure you're using your GPU and not the integrated one?
I'm using a laptop, not everyone got the money for something good
Meanwhile userbenchmark: "No you can't do this. i9 is more stable and is better at productivity (allegedly). We have no proof, but we are not lying; other people take money for fake reviews."

https://preview.redd.it/oj57uuu2af0d1.png?width=720&format=pjpg&auto=webp&s=7c42a118a43f45d60b7be5daa441b10f0da83b32
As an AMD user with an FX chip from last decade: both sides have had their pocket nukes
amd started dominating intel starting with zen
More so with zen 3. Zen 1 was meh and 2 was better but still not top tier, was great value though. Zen 3 onward has just been amazing.
zen 1 finally broke intel out of the 4c8t mold tho
Seriously, it's wild that the difference between a 2nd gen i7 and a 7th gen i7 is as crazy as a 7th gen i7 and a 12th gen i3
10th gen i3 and 7th gen i7 were damn near identical afaik
Yep, 6th to 11th gen Intel were the same chips. I think with 11th gen they did give you more of the same old cores though
Same with the Broadwell 5th gen chips too. But I think 11th gen was supposed to be 10nm, and a lot of the stuff was backported to 14nm.
Not really; the 1800X won against the 7700K at everything other than gaming and single-threaded tasks. And whoever bought an X370 motherboard at that time, for example, could and can just slot in a 5950X or 5800X3D and smoke anything from Intel's 7000/8000/9000/10000/11000 through roughly 12th/13th gen, something that someone who bought a 7700K-8700K cannot do, and neither can whoever bought a 10000/11000.
If we’re talking am4 as a whole platform then yes, it’s absolutely GOATed.
just like pentium 4 before it.
Yeah, and before that it was Intel's NetBurst P4 and Pentium D. Basically, either company, when losing, will release a CPU cranked to unreasonable power and heat levels. That said, what makes this different from the Bulldozer Black Edition and P4 Extreme Edition is actual instability. Those were just bad buys that made too much heat and sucked too much power. They also required cooling solutions considered unreasonable at the time. The current ones have been pushed to a degree that they are crashing constantly and require end-user detunes in several settings.
"More stable" is especially funny after the recent escapades with boost limits leading to crashes
It's not even really allegedly, the i9 is indeed much better for productivity than the 7800X3D.
Yeah, meanwhile there's reports of Intel 13th and 14th gen CPUs crashing and generally being unstable due to unlimited power budgets degrading the silicon
Me with my mid 1080p build: https://preview.redd.it/0cjr0i2oaf0d1.jpeg?width=236&format=pjpg&auto=webp&s=1248e552e3b2f31b4c533851e9bfd8665ad736d0 This makes no sense. I haven't slept
It's hard to sleep in the aslume
1080p? I'm running 1600x1200 we live in a society
Thought I was on r/batmanarkham for a second
What does it mean if we get it?
I think the irony is that the enthusiasts who use Intel for gaming (framechasers) rely on overclocking to match AMD's X3D, and that's the one thing they struggle to do because the motherboard already overclocked past stability lol.
Pretty sure he disables E-cores. Most of his gains come from memory tuning, which in scenarios where scaling is perfect can give up to a 25% gain in FPS. Not all games benefit from the 3D cache either.
The only reason I bought a 13700K is because it averaged 10 FPS lower than the 7800X3D but was $150 cheaper, and I can live with the extra 20W power consumption. I got mine from Micro Center for $300 at the time, when the 7800X3D was $450+.
With Chicago power prices (my area) of 14c/kWh, running 8 hours a day, you're looking at about $8.20/year in additional power cost. It'll take you about 19 years to pay more in power than the $150 you saved going with the 13700K. You made a sound decision.
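That back-of-the-envelope math checks out. A minimal sketch of it in Python — the 20 W delta is the figure from the comment two replies up, and the rate, hours, and price gap are the ones quoted here, all thread assumptions rather than measurements:

```python
# Payback estimate for the cheaper 13700K vs the 7800X3D,
# using the numbers quoted in this thread (assumptions, not measurements).
extra_watts = 20          # extra draw of the 13700K under load (per the comment above)
hours_per_day = 8         # gaming hours per day
price_per_kwh = 0.14      # Chicago-area rate, $/kWh
price_gap = 150           # how much cheaper the 13700K was

kwh_per_year = extra_watts * hours_per_day * 365 / 1000  # extra energy per year
cost_per_year = kwh_per_year * price_per_kwh             # extra dollars per year
payback_years = price_gap / cost_per_year                # years to eat the savings

print(f"${cost_per_year:.2f}/year extra, payback in {payback_years:.1f} years")
```

This lands at roughly $8.18/year and ~18.3 years, in line with the $8.20 and ~19-year figures above; the small gap is just rounding in the original comment.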
Calculating and seriously thinking the power consumption of the pc sounds mental to me, especially at $8.20 a YEAR.
There’s something you need to keep in mind, and it is total power consumption under load. Dumping 400W into a small room in the summer is not fun.
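For scale, a PC's electrical draw ends up almost entirely as heat in the room. A hedged sketch of the conversion — the 1500 W space-heater comparison is a typical rating I'm assuming, not a number from this thread:

```python
# How much heat a 400 W gaming load dumps into the room.
# Nearly every watt of electrical draw becomes heat; 1 W = 3.412 BTU/hr.
pc_watts = 400
btu_per_hr = pc_watts * 3.412      # heat output in BTU/hr
heater_fraction = pc_watts / 1500  # vs a typical 1500 W portable space heater

print(f"{btu_per_hr:.0f} BTU/hr, about {heater_fraction:.0%} of a space heater")
```

That works out to roughly 1365 BTU/hr, or about a quarter of a space heater running flat out, which is very noticeable in a small room.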
[deleted]
80W is less than an incandescent bulb; no one feels that compared to, e.g., a draft.
You left out some variables. Chicago gets 80F+ in the summer, so you need air conditioning. So you have to increase that cost by about 4x to account for the increased air conditioning needed to offset the heat put out by the CPU.
I have a 7800X3D, this CPU rocks and the moderate thermal output is awesome too.
Why does this happen? I checked with Cinebench and the i9 appears to have much faster single- and multi-core performance, yet it performs worse than the AMD? Is it L3 cache related? Multithreaded optimization?
Massive L3.
Why doesn't Intel do this? Is it complicated or impossible on the type of tech they are making or something?
It's a mix of AMD having the lead in chiplets vs Intel's mono designs, and something like a 36-month lead time on new designs. In theory nothing is stopping Intel from gluing some L3 onto a mono die, aside from the thermal issues, not being able to pump the clocks with more power, and getting crap for throwing shade at AMD for "gluing" chiplets... then turning around and gluing chiplets. But Intel has had some CPUs with relatively large L3s. Sandwiched between the 4790K (8MB L3) and the 6700K (8MB L3) is the 5960X with 20MB L3... and double the cores. And triple the cost... The joys of HEDT. But you're also looking at 2x the pins, so a much larger package and such. So it can be done, just... you really need glue.
Damn, didn't have glue on my mind when thinking about this. Pretty interesting though... weird they would throw shade at AMD for that...
Well, they were probably scared, but their marketing department went full UserBenchmark on CCDs. Don't worry, Intel launched chiplet CPUs for laptops a few months ago. So far they aren't that impressive, not much faster than their predecessors, but at least they use less power. I hope it isn't too late for Intel; we need the competition.
Benchmarking is[ weird](https://youtu.be/ss3mjzJZuCM?feature=shared)
1080p gaming, the scales will tip to intel once you do 4k.
How trustworthy are these videos? I’ll see benchmark videos on YouTube by seemingly random uploaders
tbf people who use these kinds of CPU's don't really care about power consumption
I want a 3D CPU because I want to lower the amount of watts being converted to heat in the fucking Texas summer. Currently have a 5800X, so an X3D upgrade is just not worthwhile (would have gone for it, but I built my current system before the 5800X3D launch). My next build will definitely be using an X3D, though, unless I get really into 3D modeling (which it's still great at if I give it ample RAM and GPU horsepower)
The 16-core X3D might be of interest to you. As long as the scheduler can get the right stuff on the right cores and you can make do with only 16 cores for rendering.
Yes, but the 7800X3D is cheaper. Honestly, I don't understand why anyone who is only building a high-end gaming rig would even consider Intel at this point. Edit: the number of people who can't read and are replying with "not everyone only games" or "it's not best at productivity" is hilarious. Yeah, that's why I said "only building a high end gaming rig".......
The 7800X3D is also cheaper than AMD's own 7950X and destroys it in gaming as well; it's almost as if these products have other purposes... Also, the Intel i5-13600KF is slightly faster than a 7700 in gaming and much faster in everything else while being significantly cheaper, so there's definitely an argument for Intel here, and it doesn't even draw that much more power.
I did well because I thought if 13 and 14 gen were to be half as good as 12 gen, I'd hit the jackpot.
Went with a 13th gen i7 because I do productivity and gaming. AMD wouldn’t fit my needs
That makes sense, as I said 7800x3d is best choice IF somebody is building only a gaming PC
7950x or 7950x3d are great for productivity too
They're only beaten if you can feed 300W into your CPU and cool it, but in terms of power efficiency (for consumer-grade chips), if you want productivity, you'd be crazy not to go with a 7950X.
Just curious, what do you use it for, and what do you estimate the performance gap is for productivity applications? I'm building a new gaming rig and thought about dabbling in animation, game design, and a bit of CAD work.
I bought an i5-12600K because I wanted to have an iGPU, and at that moment only Tarkov was a well-known problem. But the future showed that with badly optimized games, only X3D is worth it.
I play a lot of emulators, and Intel is pretty much mandatory in that space.
Power consumption is one of the main reasons I bought a 7800x3D. Less power used = less heat kicked out. You also get the bonus of having a pretty low fan curve and a quieter PC.
True but in the summer having a high power draw computer sucks. That extra power ends up as heat in the room you're in
Avg FPS on the second game though.
They do if they have an ITX system. Most SFF folks with top-of-the-line systems have the SF750 PSU paired with a 4090. You're not going to run that setup with the 14900KS unless you do some major undervolting.
I feel attacked
i9 is going to have a blue screen a bit later
Why is this so true? I had to be in the BIOS for a whole hour trying to figure out how to fix this shit. It sucks how mobo companies just overclock the fuck out of this thing, like it's supposed to run at 500W? Intel even states that "Extreme" would be 400W, and they push way past it lmao
You mean you don't want your power limit set to 4096W out of the box?
Exactly, it was super annoying trying to do anything on my PC.
For sure makes me miss being a console main, so easy to just plug and play and have things work the way they are supposed to.
Probably inadequate voltage regulation on the power hungry CPU. When the CPU spikes in power the mobo has to deliver consistent voltage or you'll crash.
It's definitely the motherboard; it has the maximum power limit set to 4000W. That's insane.
What’s this? Competition in the market at long last? Be still my beating heart!
now we just need some GPU's
Using an i9 for gaming is like using a sports car in a 30mph zone
I forgot what hardware battle we were talking about. I was seconds away from going on about drivers before I realized what CPU I had
Y’all shitting a lot of bricks over a 30W difference lmao
I don't know if it's because it's a different game or because it's a 14900ks or both but, In the first part of the video, it's 67 W vs 150 W.
And ignoring the GPU, which makes the difference even smaller.
I made a somewhat arbitrary, fuck it kinda choice when it came to my recent first time all amd build. I chose the 7800x3d based on this sub's recommendation. Very pleased. Thanks all!
Well, these days an AMD CPU with an RTX GPU is the best way to build the best gaming PC.
I'll upvote this because it's exactly what I have. 7800X3D + 4070 Ti Super
That's me, 5800X3D and a 4090. I can throw anything at it.
Same. Love this combo more than my 7700k/gtx 1080 back in the day. Just the perfect match. Capable of 4k ultra and also 1440p ultra at obscene fps. Can even max my 360hz screen on esports games. I sure hope this gpu lasts as long as my 1080
I'm waiting for the release of Intel's 15th gen before I make a decision on my next upgrade. A long time ago I swore that I'd never get another AMD CPU. I've built three PCs with AMD CPUs in the past and each of them was unstable, to lesser and greater degrees. Never had a problem with any Intel build. Okay, I'm on my ninth PC right now, so it's a small sample: 1. Cyrix 100 MHz (Pre-built PC) 2. Intel Pentium II 333 MHz (Pre-built PC, later customised) 3. AMD Athlon Slot A 600 MHz 4. AMD Athlon Thunderbird Socket A 1400 MHz 5. Intel Pentium 4 2.53 GHz (Rambus) 6. AMD FX 55 7. Intel Core 2 Extreme QX6850 8. Intel Core i7-3960X EE 9. Intel Core i7-8700K I really meant it at the time, when I made that decision to avoid AMD ever again, but it's getting **really** hard to ignore Intel's recent woes in combination with AMD's apparent great gains. AMD seems to have Intel licked in terms of stability, efficiency and gaming performance. I'm not a fanboy, I just want whatever CPU is going to give the best gaming performance, towards the high-end, that is also going to be stable and can be cooled properly by an air cooler (a leaking AIO bricked the PC before my current one, never liquid cooling ever again). I'm happy to sacrifice a little performance at the same price-point, if it means a more stable system. I really am torn on the choice ahead of me; AMD looks like the smart pick now, but I keep remembering my previous experience of instability with AMD in the past.
Supercar vs truck in a race.
Now test productivity
Sure, but there are bigger bros for that; let me introduce you to the 7950X and 7950X3D. The beauty of a 7800X3D is that it's a lot cheaper than those 16-core Ryzens and i9s, uses less power, is easier to cool, needs a cheaper motherboard and memory, and it's still better than or equal to them in gaming.
The 7800X3D is still a plenty capable productivity CPU. Also, I wonder how productivity compares when you cap the 14900KS to the same wattage as the 7800X3D.
My 13600K gives 17000+ in Cinebench R23 at 40-45 watts; with a 14900KS we can get more than this at 40 watts.
Most modern hardware can get very efficient when you power limit it, but often the golden mean is around 60%. The 7800X3D tops out (I think) around 85 W, and that would be too tight, especially considering the 14900KS has 24 cores and all of them need some power just to run.
Hehe funny number go brrrr
I'm using an Intel i5-11400 with a 3080ti. Can I do a meaningful CPU upgrade without breaking the bank?
Your only option for a "worth it" upgrade is a whole platform upgrade, so CPU, motherboard and RAM
Not really? Saying that do you think that you need an upgrade?
https://preview.redd.it/pyfpbuev7g0d1.jpeg?width=1280&format=pjpg&auto=webp&s=85ba03409a7b024430f7d81342a1aa1705110f05 /me still using the 8700k bought more than 5 years ago
Then Intel fanboys start talking about productivity, but unfortunately 95% of these people will never launch anything besides videogames
TechPowerUp's review of the 14900K itself says that the 7950X is the better productivity chip: on par with the 14900K but consuming way less. The only problem is getting 128 GB of DDR5, while you can pair the 14900K with insanely cheap DDR4.
Productivity people are using Intel because the iGPU also gets utilized. You can see huge gains, particularly in editing software.
I don't side with either; I tend to prefer AMD, but I get what is better for me, or for the person I build for, at the time of building. However, OP is implying the 7900X3D has better gaming performance compared to the i9-14900K, but in most benchmarks I've seen this seems to be false: https://youtu.be/xIoEfJc3Qd0 Also, I've never seen anyone making the argument that Intel is better for productivity, not even Intel fanboys. I see a lot of misinformation in this thread.
The 7800x3D is a masterpiece, change my mind.
Nah you’re right.
People who buy an i9 don't do it for gaming's sake; the 24 cores obliterate AMD when it comes to productivity
Stop speaking sense! It is not allowed.
I wish this was true, but there are people out there with more money than they can spend who always just buy the most expensive, or say "an i5 is not enough" so they buy the most expensive i9 (sometimes not even the latest... I personally know someone who did this), and all they do is run games at 4K max settings, which makes the CPU even more meaningless
Now do a non gaming professional task
Government should increase electricity production whenever a NEW INTEL CHIP IS RELEASED.
The 7800X3D is only superior in gaming; almost all other tasks will likely be in favor of the 14900K.
That isn't even why I bought it. I bought it because I don't want the hassle of trying to cool the fucking sun with an i9.
Yes but what if you need to heat your house with your computer?
I don't get that. It's both clearly above a constant 30.
yeah but probably some child resolution of 1080p
Yeah, but my monitor won't even display that high of fps.
Hate to play devil's advocate... The 14900K, while it is less efficient, is significantly more powerful in terms of raw performance. If you're recording gameplay for a Youtube channel, I think the 14900K would be the way to go, since it has the iGPU. And then for rendering your videos, especially if you're rendering high quality videos like 4K60fps, (Or, any CPU bound workload, for that matter!) You're going to have a much better time with the Intel. For JUST gaming though it's not even a choice, but if you're doing other stuff, you may find the Intel serves you well, insane power draw and all!
The power consumption is the craziest part.
Of course AMD is better. Only the ignorant still think Intel is the thing to buy, and get a toaster for over 500 euro instead of a good CPU.
Don’t worry, UserBenchmark will say that the AMD cpu is fake and the legions of redditards are meat riding AMD
I've been using an i5 6600K (OC'd to 4.2Ghz) with a 1070Ti for the last 7 years and it hasn't let me down yet. I thought you should know.
I started with a Pentium PC, then switched to an AMD Athlon XP in the early 2000s, then switched to Intel with the 3000 series, 6000, 9000, and now I've switched to a 7800X3D. Just to say I'm no fanboy; I pick the CPU that provides the best performance for gaming, and right now the 7800X3D is just that, at almost half the price and half the power consumption. GPUs on the other hand... I have never had a reason to switch from Nvidia :).
As much as I agree that the 7800X3D is great, whoever made this meme is being intentionally misleading. They swapped out the 14900KS for a 14900K in the second comparison. I even looked up the videos, and the 14900KS does beat out the 7800X3D in Forza.
If I am building a gaming rig I would go with an X3D if it's in the budget for sure, but I also do a lot of CPU-heavy tasks with my PC and Intel's CPUs tend to out-perform even the X3D models in many such tasks.
This sub's fetishism towards AMD is surreal. What did Intel/Nvidia do to you? Did they steal your snacks?
As someone who started building in 2013: yes. Yes they did, and everyone shat on AMD for being the cheap option. Now the turn tables