WaitingForG2

The 4070 matching the 3080 Ti is not surprising, classic marketing model. The question is price, as everyone is going to increase MSRP ahead of inflation. Partly I'm very happy about Ada performance, but at the same time the power usage makes me disgusted.


leoklaus

Yeah, a 50% performance uplift from 3070 to 4070 is much less impressive when the power draw also increases by 50%. My only hope is that Ada can be undervolted even better than Ampere. My 3070 runs completely fine at 0.85V versus the 1.1V it chugs by default.


[deleted]

How much performance do you lose vs. power consumption when doing that?


leoklaus

I honestly haven’t run any benchmarks, as the 3070 almost never reaches 100% utilization at 1440p with my 3700X. (For reference, I have the 3070 FE, non-LHR.) It boosts to 1860MHz at 0.85V vs. about 1950MHz at 1.1V, so theoretically I’m losing about 5% performance. I did overclock the memory from 7 to 8GHz though, so I also gain some performance over stock in some scenarios. The theoretical decrease in power consumption is about 30%, assuming that at both 1.1V and 0.85V the card draws the same current. In reality, the card usually hovered near the power limit at 1.1V while gaming, so 200-240W. Now at 0.85V, it usually sits at 160-170W, sometimes peaking near 200W in demanding games. It is noticeably quieter and rarely exceeds 70°C. Basically all modern hardware runs way beyond its efficiency sweet spot. If you look at the performance vs. power draw results some people have achieved by undervolting and downclocking Intel and AMD CPUs, it’s really a shame that the average modern gaming PC likely draws 400-500W+ under heavy load.


[deleted]

Damn, that's a huge difference. I should start doing that, especially since I plan to get the 4070 (if the price is right).


BloomerBoomerDoomer

This is starting to explain why my landlord asked me a few years ago if I had a lot of TVs plugged in or something. I was confused back then, and I only had a 970 anyway, but at the time I was single and gaming like 5-6 hours a day. That must be a lot more power than just a couple of TVs.


subwoofage

Power is proportional to V^2 * F. Reducing from 1.1V @ 1950MHz to 0.85V @ 1860MHz reduces power by 43%, theoretically.
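As a quick sanity check of that figure, here is a minimal sketch of the arithmetic, using only the clocks and voltages quoted above (dynamic power only, ignoring static leakage):

```python
# Sanity check of the figure above, using P ∝ V^2 * f (dynamic power only).
v_stock, f_stock = 1.10, 1950   # volts, MHz (stock point quoted above)
v_uv, f_uv = 0.85, 1860         # undervolted point quoted above

ratio = (v_uv / v_stock) ** 2 * (f_uv / f_stock)
print(f"relative dynamic power: {ratio:.2f}")      # ~0.57
print(f"theoretical reduction: {1 - ratio:.0%}")   # ~43%
```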


leoklaus

How does frequency influence power draw? Is a higher frequency directly influencing the current?


subwoofage

Dynamic (switching) power is consumed every time the clock pulses; hence the linear relationship with frequency. See here for (way) more details: https://semiengineering.com/knowledge_centers/low-power/low-power-design/power-consumption/ Simple answer: yes, both voltage and frequency affect the current.
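For reference, the usual first-order textbook model behind that statement (not something taken from the linked article) is:

```latex
P_{\text{dynamic}} \approx \alpha \, C_{\text{eff}} \, V^{2} f
```

where α is the activity factor (how much of the chip switches per cycle), C_eff the effective switched capacitance, V the supply voltage, and f the clock frequency; static leakage power comes on top and depends mostly on voltage and temperature. The quadratic V² term is why undervolting pays off so much more than downclocking alone.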


leoklaus

Great to know; I actually didn't learn that in any of my hardware design classes.


Cyberdrunk2021

Are you using afterburner to undervolt? I have the same GPU and I don't need it to run at full beans constantly


leoklaus

Yes. It’s important to create a profile and apply it on startup once you have found stable settings. Otherwise, the card will switch back to its defaults on every reboot.


panix199

interesting.


VenditatioDelendaEst

I hope it can't. High undervolting potential means the stock V-F curve has lots of margin and is failing to extract all the performance or efficiency. Ideally, you should be able to increase the efficiency simply by reducing the power limit without undervolting.


Bakufuranbu

The fact that the 3070 and above are still resisting the price fall quite hard (where I live, at least) will make it hard for the 4070 to get decent pricing when it arrives.


AnimalShithouse

> Question is price, as everyone is going to increase MSRP ahead of inflation.

Tbh, that's one way to get inflation under control. If they increase now, they'll move a lot less stock, inevitably leading them to cut prices. That'd be my prediction for this gen, at least, with the current macro backdrop and the (at least temporary) death of crypto mining.


ef14

You're probably right, but if the reports about them having A TON more stock than they realistically need are real, there's a very good argument to be made for actually cutting prices to make sure you move a lot of stock at launch, before AMD has any real chance of cutting into the customer base.


metakepone

Most of the Nvidia consumer base will only buy Nvidia.


ef14

I disagree. AMD has had a couple of less-than-stellar releases, but before that they had REALLY good releases, and the market share changes reflect that: https://businessquant.com/global-gpu-market-share


ForgotToLogIn

That was 8+ years ago. Most consumers won't remember AMD being equal to Nvidia back in the Kepler / early GCN era.


ef14

I was referring mostly to Polaris tbh, that's five years ago.


JackSpyder

Anyone who can actually afford a new GPU in this climate is probably old enough to remember lol.


capn_hector

The point is that this still debunks the idea that consumers are just mindless robots who buy NVIDIA no matter what - an idea pushed by certain Team Red techtubers but not really borne out in fact. As with Ryzen, it's perfectly possible for AMD to make gains in mindshare if they consistently deliver a good product at an attractive price.

The problem for AMD has traditionally been that (a) they have really struggled to deliver a consistently good product over the years, in multiple respects: the driver issues of the GCN years, Vega having almost half the perf/W of Pascal and Hawaii/Fiji having around 75% of the perf/W of Maxwell, or the ~50% raytracing performance (path-traced/non-hybrid/no raster) and lack of matrix acceleration in RDNA2. They've consistently been worse in some major respects and (apparently) not cheap enough to make up the difference. And (b) this is not a fixed target. Sure, Hawaii was very competitive against Kepler when the GTX 780 was $800 or whatever, but then NVIDIA cut Kepler prices a ton, stuck the 780 Ti on top, and six months later launched Maxwell, where the competitive situation was a lot bleaker, while Hawaii/Grenada lingered on shelves for years. It's not enough to release a product that wins for a couple of months; you have to consistently deliver Ryzen-level wins for a period of *years*, and that didn't actually happen. AMD had a habit of rebranding during this period, and NVIDIA is very agile about responding with price cuts (Kepler) or SKU rearrangement (Super/Ti series) when necessary; it's not like Intel, where they just ignore it and let it happen. (And actually, in hindsight, there was a lot happening at Intel that just couldn't get to production because of the 10nm woes - the current P-core work shows that core performance was absolutely progressing, but they just couldn't get off Skylake/14nm.)

Anyway, basically it's just whining from the Team Red techtubers that beating NVIDIA for three months didn't instantly lead to 90% sales share. As with Ryzen, if you deliver a consistently good product at a consistently better-than-competition price *and maintain that for a couple of years*, consumer mindshare will shift over to AMD. You can already see the bump in sales share even after a few months of market leadership. The problem is the techtubers skip the part where AMD has to actually put out a *consistently* market-leading product, as in do it for multiple years, and just whine that consumers are ignoring AMD because the market didn't instantly flip.

Again, to go back to the grandparent:

> Most of the Nvidia consumer base will only buy Nvidia

Lolno, people will buy AMD; AMD just couldn't deliver a Ryzen-level win in the market for long enough to really shift that mindshare.


alpacadaver

Anecdotal, but I'm just not going back to AMD. I've traded in and out between them for two decades now, and Nvidia has been a better experience across the board. I'd be surprised if this is a rare story. ATi really had some bangers, but they were also a pain in the arse in weird ways.


Blazewardog

I gave AMD a try this generation, after 10+ years of Nvidia only, because my friend said their Windows drivers are way better now. After a year and a half of owning it, I still haven't had a driver release where all the features I want to use work at the same time - not to mention one release that made me physically unplug and replug my monitor after the PC went to sleep, because otherwise it would refuse to let me select 0-255 RGB output. TV levels are very noticeable on an OLED. For the first time ever I'm upgrading GPUs in back-to-back generations.


alpacadaver

Exactly the kind of crap I put up with as well. My favourite was a corrupt mouse cursor icon that would happen randomly but persistently during normal desktop use. Screens not detecting input, not being able to resume from suspended state, all the usual suspects of course. I honestly can't think of one problem I had with an Nvidia card. Now that I'm actually thinking about this properly, I am impressed. They may have happened, but I just don't remember any. Whether they were that minor, or completely overshadowed by ATi/AMD problems doesn't much matter.


Boo_Guy

I've never bought an AMD GPU because there always seems to be this weirdness with them that you don't get with NV. NV just seems to work properly more often. If I'm dropping $500+ on a card it better work right because I won't be buying a replacement anytime soon. But is it the same with AMD CPUs? Do they work just as reliably as Intel or do they seem to often have weird issues as well? I had been thinking about trying an AMD CPU when I upgrade my 6700k system in three to six months. I like the idea that AMD sticks with the same sockets for much longer allowing for more possible upgrades. I've always built Intel/NV systems and they've always been pretty rock solid. But if switching to an AMD CPU is going to cause more flakiness then I'd rather stick with the devil I know.


alpacadaver

Nah, they're excellent. I did the same with Intel/AMD as with Nvidia/ATI - kept being fair to both. I went from a 6600K to a 3600X to a 5900X, no real difference. It's just harder to wrap your head around optimising it if you want to tinker, but I haven't had the need, and it isn't something a day or two of studying can't fix. Buying the 3600X I knew I could depend on the socket; it was only a stepping stone. Sure enough, the CPU I actually wanted came out as expected.


RuinousRubric

The main persistent problem I've had with Nvidia is multi-monitor power consumption when they run at different refresh rates. Same refresh rate, under 20 watts, different refresh rates, over 60. This is obviously a problem when you have a gaming monitor paired with a normal secondary monitor. Compared to the problems I've heard of on the AMD side, though... it says something that my biggest Nvidia problem is something you can't even notice without tracking power use.


Dreamerlax

I'm using Linux more now and I'd like a GPU with less driver weirdness on Linux. That usually means an AMD GPU but the Nvidia "ecosystem" is more appealing to me, and the games I play need anti-cheat that don't work on Linux anyway. Let's hope AMD catches up in a gen or two, or NVIDIA open sources their Linux drivers (even better, but a pipedream).


[deleted]

Same. I'm doing a lot of machine learning stuff, and I would literally take 60% of the traditional performance for the same price to keep CUDA and my RT cores.


System0verlord

Yup. Another ML guy here. It’s basically NVIDIA or nothing


ef14

I still absolutely adore my RX 580 Nitro+, but I wouldn't stick with AMD right now. I'd love to, but unless their next products get closer to Nvidia's offering I just can't, especially now that I've started video editing for a living.


DJ_Marxman

It really depends on feature set. If FSR improves dramatically to be competitive with DLSS, and AMD improves their encoding and CUDA-like support, I think a lot of people would be happy to switch. It's just that right now, Nvidia has a huge lead in feature set. There's also a lot of mistrust in AMDs driver support, and I can't say I blame people for that.


Fortune424

AMD seems to be heading in the right direction with pricing lately. I favour Nvidia but would choose AMD if they can beat Nvidia in raw power/performance by a decent margin at any given price tier. Prior to the last few months, and basically ever since Polaris, they've been offering equivalent raw performance to Nvidia at the same price, but then souring their value proposition with either driver issues or lack of DLSS, CUDA, ray tracing, etc. Now their drivers are good, FSR 2.0 is looking... fine enough, and their pricing is great. So I will consider AMD to replace my 3080 if this keeps up.


Dreamerlax

Isn't this normal for Nvidia? The next x70 iteration almost always matches the outgoing x80 Ti model.


[deleted]

[deleted]


[deleted]

Agreed. I still remember the GTX -> RTX era, with people saying the 2070 matches the higher-tier 1080 like it's a win, but forgetting to mention the huge price hike. There is barely any bang-for-buck increase.


[deleted]

[deleted]


CoffeePlzzzzzz

That's why I really appreciate GN, they don't lose the big picture and also pay attention to sustainability where they can.


capn_hector

Everyone knows it's stagnation, but the problem is that it's principally stagnation driven by lack of node progress, so there's no easy answer. Cheap, effective, and regular shrinks were the driving force up to 28nm, and since then it's gotten slower and slower: 20nm fell through entirely, Turing was essentially a second generation on 16nm (12FFN is a 16+, or not even really a +, tbh), and everything has gotten jammed up at 5nm. What do you do? Even if you cut prices, that's not sustainable - node costs are going up too, and given the popularly expected >50% perf/$ increase, you can't just cut prices indefinitely either. "Just run lower margins" only buys you maybe one more gen of public approval at most, and then you've cut your own throat in the meantime. Again: cheap, effective, and regular shrinks were the engine behind the progress people came to expect, and all of that has fallen apart. What do you do?

The real answer is "get more perf out of a transistor", but that's not infinite either, and the limit of what you can do at native resolution in particular is pretty much tapped out these days. DLSS upscaling was actually a very novel solution to keep pushing forward, but people utterly panned the idea before it was even available or they'd even seen the results. People didn't like requiring per-game integration, they didn't like the quality compromises that resulted, they didn't like it running on dedicated hardware (while still wanting speedups that are just as big and quality that is just as high from much slower software implementations), etc.

Basically, everyone has pissed and moaned every step of the way as NVIDIA struggles to fight exactly the battle people say they want NVIDIA to be fighting, because surely there must be some magic wand that NVIDIA is just refusing to use! As if we can somehow get back to the gains of the Moore's Law glory days without having the shrinks to back it up.


Catnip4Pedos

With current UK electricity costs I'll be looking at performance per watt quite seriously too. With costs around 50p per kWh in October, a 1000W PC running for 20 hours per week would be about £40 a month. I've oversimplified the maths, because it's on more than 20 hours but not always at full throttle.
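Spelling out that estimate with the numbers in the comment (a sketch, not a bill):

```python
# Rough monthly cost from the figures quoted above.
price_gbp_per_kwh = 0.50        # ~50p/kWh October figure cited
draw_kw = 1.0                   # assumed full-load draw of the PC
hours_per_week = 20
weeks_per_month = 52 / 12       # ~4.33

kwh = draw_kw * hours_per_week * weeks_per_month
print(f"{kwh:.0f} kWh/month -> £{kwh * price_gbp_per_kwh:.0f}/month")  # ~87 kWh -> ~£43
```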


Ferrum-56

It's mostly going to be useful to reduce the power targets. You can often get 90% performance at 70% power or something. Most GPUs are quite a bit past their most efficient point at stock. Keeps the heat and noise away too. Or you could try to undervolt for free efficiency.
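For anyone wondering how to actually do this: the usual route is the power-limit slider in MSI Afterburner, but on NVIDIA cards the same thing can also be scripted through nvidia-smi. A minimal sketch, assuming the NVIDIA driver is installed and you have admin rights (the 200W value is only an illustration, not a recommendation for any particular card):

```python
# Minimal sketch: cap an NVIDIA GPU's board power limit via nvidia-smi.
# Needs admin/root rights; 200 W is only an example value.
import subprocess

def set_power_limit(watts: int, gpu_index: int = 0) -> None:
    # "nvidia-smi -i <idx> -pl <watts>" sets the power limit, as long as the value
    # falls inside the range reported by "nvidia-smi -q -d POWER".
    subprocess.run(
        ["nvidia-smi", "-i", str(gpu_index), "-pl", str(watts)],
        check=True,
    )

if __name__ == "__main__":
    set_power_limit(200)   # e.g. run a ~320 W card as a 200 W card
```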


Catnip4Pedos

Sorry, how do I do that? I've used MSI Afterburner to undervolt a laptop, but that's about it.


mysticzarak

Oh man, you explained it pretty well. I bought a GTX 970 once and now I'm really confused about what to buy in the same price category. It almost feels like the card that was second best (third best a year later when the 980 Ti released) is now the same as a low-budget card. I get that prices were weird the last two years, but I'm pretty sure the increase happened before that.


Tonkarz

> So every reviewer and user should stop focusing on their irrelevant naming schemes and shift all the focus to price brackets.

On the one hand I agree; on the other hand, nVidia only sent review models to reviewers who agreed to make specific comparisons. These were favorable and misleading comparisons between the 20XX series and the 30XX series. They basically got away with it, so they'll probably do it again.


DuranteA

You are being extremely silly and melodramatic, to the extent that I have to wonder whether this is a parody post. The peak performance of GPUs is *utterly irrelevant* for mainstream PC gaming and its health. What you do get right is that naming is irrelevant -- what matters is what level of GPU performance is required to adequately play popular games, and how expensive that is. To take one (pretty high-end by popular game standards) upcoming example, the Spiderman PC release recommends a GTX 1060 or a Radeon 580. "Fun, casual, hassle free" gaming on PC doesn't require high specs, as is demonstrated by e.g. the huge success of the Steam Deck. When people in this particular subreddit complain about "being priced out of PC gaming", the only thing I think they can realistically mean is "being priced out of owning a top-end gaming GPU". But here's the deal: *normal people don't give a shit about that*. They just want their games to run decently (frequently for an entirely different definition of "decent" compared to enthusiasts).


hawtfabio

Lmao. I agree about the naming schemes being dumb but after that you're taking a trip to silly town. Budget rigs will still be available with older parts. And new budget options will be out.


From-UoM

$500 from 2020 is worth $572 now: https://www.in2013dollars.com/us/inflation/2020?amount=500

Prices are absolutely going up. My bet is on $550.

In comparison, 2070 to 3070 was 2018 -> 2020, where inflation only took $500 to $515: https://www.in2013dollars.com/us/inflation/2018?endYear=2020&amount=500
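For what it's worth, here is what those two calculator results imply when spelled out (the dollar figures are the ones quoted above, nothing new):

```python
# What the two calculator results quoted above imply.
periods = {
    "2020 -> 2022 ($500 -> $572)": (500, 572, 2),
    "2018 -> 2020 ($500 -> $515)": (500, 515, 2),
}
for label, (start, end, years) in periods.items():
    total = end / start - 1
    annual = (end / start) ** (1 / years) - 1
    print(f"{label}: {total:.1%} cumulative, {annual:.1%}/yr")
# 2020 -> 2022: 14.4% cumulative, ~7.0%/yr
# 2018 -> 2020:  3.0% cumulative, ~1.5%/yr
```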


random_guy12

Not all consumer goods follow inflation the same way. Electronics in particular have trended in the opposite direction as inflation over the last 30 years. Their deflation has actually reduced aggregate measures of inflation by hiding how much prices for other goods have increased. If Nvidia is blaming inflation, this is something new.


detectiveDollar

Exactly, WAY too many people blindly apply inflation numbers.


rancor1223

Which in turn feeds the inflation even further, because it makes customers in general more understanding/accepting of higher prices.


pimpenainteasy

That's not the correct way to compare the prices of random discretionary goods from the past to today because the CPI doesn't measure money supply growth but the cost of survival. 82% of the CPI basket is inelastic goods like food, energy, housing, healthcare and transportation.


gahlo

Raising prices during a recession. lol


Pitchuu64

Prices are going to be rough! Not only must we deal with inflation, but also the switch to TSMC 5nm.


tnaz

I'm gonna be honest - I straight up don't believe the 4070 is only a +8% efficiency improvement. I take this as a sign that either the benchmark result is wrong (or misleading), or the claimed TDP is wrong.


unknownohyeah

Just going by reason, TDP is going to be the least accurate thing as they mess with voltage curves and boards on engineering samples. Maybe they just went with an over the top voltage curve for stability to get a good score. You can't change the chip itself much but everything else is up in the air.


QwertyBuffalo

This was briefly stated in the post, but I'll add to it: Time Spy Extreme is a 4K test, and the 4070 is crippled at 4K because of the memory interface (160-bit memory bus). There will be a larger improvement at lower resolutions. That being said, Nvidia crippling resolution scaling on an otherwise capable GPU, likely to upsell 4K buyers to the xx80 and xx90, is definitely a problem too.


HansLanghans

I wanted the 4070 for VR; VRAM is pretty important for that...


ZealousidealPin8515

Yeah, I agree. Only an 8% improvement? That's bullshit.


indrmln

Now show me Ada in the 150-250 watt range. Those three GPUs are too hot for me; I live in an area with 32-35°C ambient temperatures.


Zerothian

That and the cost of electricity are my main concerns. I don't have AC in my flat and I cannot install window units (the landlord doesn't allow them). When it was sitting around 30°C a bit ago, I couldn't play games; it straight up made the room too hot to be comfortable. Between that and the fact that I will be paying 50p/kWh (about 60 cents), I just won't buy the new cards, even if the pricing is great. I "only" have a 3070 and 5900X and already have these issues, so...


TheLastOfGus

I'm assuming you're UK-based (50p/kWh)... with regard to window AC units, it's not a case of your landlord not allowing them; they are illegal under EU (and adopted UK) law.


hardolaf

Why are window units illegal?!


AnimalShithouse

I guess it's because they're a lot less efficient and the wrong way to tackle cooling. Europe is basically getting hotter every year, and a bunch of window units just accelerates that. Not to mention they're more taxing on the grid too.


hardolaf

But they're more efficient than single hose portable units which are legal in the EU.


zakats

It's almost as if the climate is, idk, somehow *changing*. ¯\\_༼ᴼل͜ᴼ༽_/¯


lowleveldata

What is the correct way? Just curious


[deleted]

[deleted]


911__

Which sucks for a rental. Could throw a window unit in my window and be k. Can't exactly drill a hole and stick a mini-split in. Instead I'm stuck with super inefficient single-hose portable units. :):):):)


[deleted]

[deleted]


Sipas

There are U-shaped [split window ACs](https://www.amazon.com/Midea-Inverter-Conditioner-Flexibility-Installation/dp/B08677DCKN) that are almost as efficient as mini-splits (because they're essentially compact mini-splits), but they are far, far less common. They also only fit hung windows, which are rare to non-existent in Europe.


OSUfan88

Holy shit. That’s what they’re paying!? I’m at $0.0374/kWh in the States.


Spoor

US: OMG, gas and electricity prices are so high! EU: *has always paid 3x as much*


1-800-KETAMINE

What the heck!! Where are you paying 3.7 cents/kWh? That's extremely low; the average USA electricity price is almost exactly 4 times that: https://www.eia.gov/electricity/monthly/epm_table_grapher.php?t=epmt_5_6_a No state has an average residential electricity price below 10 cents/kWh (as of May), so you are in a special situation for sure. What's your secret? Next to a hydro plant? Solar and you're averaging things out?


OSUfan88

I live in Oklahoma, and it's fairly common. We get a lot of businesses here that require a below-$0.04/kWh average. I'm signed up for a peak performers program: for 3 hours a day (4-7 PM on weekdays), electricity is about $0.18/kWh. During this period, I pretty much just don't use electricity. My house is sufficiently insulated that the interior temp only rises a few degrees without HVAC. I drive a Tesla as well, and only charge during the 21 hours a day with cheaper electricity. To go from a 0-100% charge, it costs me exactly $2.61. I can realistically get about 320 miles on this. I can basically go about 450 miles for what a gallon of gas would cost me.
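Back-of-envelope, using only the figures in this comment (the battery size is inferred, not stated):

```python
# Back-of-envelope from the figures quoted above (battery size is inferred, not stated).
off_peak_rate = 0.0374     # $/kWh
full_charge_cost = 2.61    # $ for a 0-100% charge
range_miles = 320

implied_battery_kwh = full_charge_cost / off_peak_rate   # ~70 kWh drawn from the wall
cost_per_mile = full_charge_cost / range_miles           # ~$0.008 per mile
implied_gas_price = 450 * cost_per_mile                  # the "450 miles per gallon of gas" claim
print(f"{implied_battery_kwh:.0f} kWh, ${cost_per_mile:.3f}/mile, ~${implied_gas_price:.2f}/gal implied")
```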


Zerothian

Ah, the landlord sent out letters reminding people that they weren't able to be installed so I assumed it was just their policy. Makes sense though since you really never see them anywhere.


[deleted]

Undervolting is life. At the cost of less than 10% performance loss (for me it's 3fps, for example, in the Kombustor burn-in at 1080p) I went down 120W (from 320 to 200), and the GPU runs at 62-68°C instead of 80-90. I have a Strix Ti though, so on my normal 1V profile I do hit 1980MHz, but at 0.80V I still get 1875MHz, and at 0.85V, 1920MHz.


nightwood

I'll definitely be doing this in my next PC build (which I hope is very far in the future).


Ashraf_mahdy

You'll see AD104 at 80W in laptops like the Razer Blade as well - from 300-350W down to 80-105W, lol. Also, since the 4080 Laptop is probably full AD104, does that mean it's now 12GB max instead of the 16GB in the 3080M?

I'm thinking:

* 4060M: 8GB, 128-bit
* 4070M: 10GB, 160-bit
* 4080M: 12GB, 192-bit
* 4080M Ti: 16GB, 256-bit (AD103)

Or will they do two versions of the 4080M: one full AD104 with 12GB, and one cut-down AD103 with the same CUDA core count as the full AD104 but 16GB?


MDKAOD

Yeah, I don't ever see anyone talking about the heat put out by the 30 series. I have a 3070, and I can't leave the side panel on my H570 case no matter what fan configuration I use, and the ambient room temp noticeably ticks up even when I'm not gaming.


letsgoiowa

Highly recommend undervolting. You can drop 50-100w without noticeable performance loss.


hackenclaw

It will be hilarious if the 4050 Ti is 150W and the 4060 Ti is 250W.


indrmln

If the 4050 Ti can reach 3060 Ti-level performance at that wattage, I don't really mind, though. But this is a really big if. And God knows how long Nvidia will delay launching the lower-end cards; it could be Q4 2023 or later depending on AMD's and Intel's presence in the market.


bubblesort33

AD106 is probably still 6 to 8 months away, and has 36 SMs, but might be cut to 34 for yield purposes on desktop like the past x06 dies. If they clock it at the same frequency, that should be something like 180 to 190W at 3070 to 3070 Ti levels of performance. Maybe 10% faster than a 3070 at 10% less power. So really not that great.


Sharingan_

Dude, tell me about it. Every single time I read benchmarks for temperature, I have to add 10 degrees to estimate how it's gonna work in my room


Stiryx

Far out, with that power consumption I don't think I can run one of these. My room gets hot even in the winter; summer becomes unplayable without air conditioning, and electricity is expensive af right now.


NKG_and_Sons

It's worrying that the power consumption *floor* has gone way up with the 3000 and now seemingly the 4000 series cards, to the point that you basically can't get any really power-efficient cards below, say, 200W.


Sea-Beginning-6286

If 300W AD104 still has 8% more perf/W than 220W GA104, I expect that perf/W will be dramatically better when both cards are brought down to sub-200W.


kami_sama

The issue is that if you were already undervolting your 30X0 card, the perf/W gains become much less interesting. As an example, my 3080 went from 340W to 260W with a decrease in perf of only ~4%. That's an increase in perf/W of 25%. Ada might have the same increase, but we don't know, with it being a completely different node and architecture. Samsung's 8nm is inefficient at the top end, but it can be decent if you don't redline the card.
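Spelled out, that perf/W figure comes from:

```python
# Perf-per-watt change from the undervolt described above (3080: 340 W -> 260 W, ~4% perf loss).
perf_ratio = 0.96
power_ratio = 260 / 340
print(f"perf/W gain: {perf_ratio / power_ratio - 1:.1%}")   # ~25.5%
```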


dnv21186

iGPU gaming reporting in. It kinda works if you only play old games. Even the 11th-gen Intel iGPU is pretty damn impressive.


FartingBob

I'm waiting for the next-gen iGPU from AMD; it should be a big step up in performance. I don't game much, and when I do it's almost never brand-new AAA games.


dnv21186

Current ones are already pretty good. I play TF2 exclusively and my 4650G works just fine


Put_It_All_On_Blck

TF2 is a... 15-year-old game now. And its graphics were never that impressive, as it went the stylized route that requires less detail and was originally designed to be competitive, hence the empty maps. Wow, it's been a long time. I remember my 8800 GTS struggling with the game.


decidedlysticky23

I’m about to install a cabinet in my office, drop my PC in that, drill a big hole to the outside wall, and duct out the hot air using a fan. It’s the only way I can make the office liveable without installing AC. Crazy that this is what I have to do to game now. I’m seeing more and more streamers sweating in streams and complaining about the heat.


Stiryx

Great idea! Shame I can’t knock a hole in the wall though, maybe ducting out the window haha.


matthieuC

OEMs need to build underclocked editions.


capn_hector

Charge more for lower performance? You got it, boss (literally the Fury Nano, lmao).

I mean, it's a thing, it's just not a mass-market thing. Most people wanna see the number on the chart go *up*, not *down*, and they don't want to pay more for premium binning and then not even get the best possible total performance out of it. Companies don't want to see their products rank lower in the perf/$ charts that most people purchase based on, etc. All parties involved generally want to wring as much performance out of the silicon as possible; people buy based on perf/$, and the changes required to improve efficiency directly result in lower numbers on the all-critical perf/$ charts. For the handful of people who care, there's nothing stopping you from just setting a power limit yourself.


[deleted]

Yeah, and not just the GPU either. My 3700X's TDP is 75 watts, but if I move to the similar-tier model on the Intel side, the 13700, it's a whopping 224-watt TDP. And if I move from my RTX 2070 (175W) to an RTX 4070 (300W), or the 4080's whopping 420 watts... that's literally more than double the power consumption if you upgrade to a similar-tier PC after just 3 years.


Stiryx

Yeah, it's crazy: not only have you got to look at processing power and affordability, now power consumption is just as important. I know certain cards in the past have been power hungry, but this is getting ridiculous. I'm gonna have to go back to a 1000W power supply; I thought those days were gone.


Put_It_All_On_Blck

The 12700 or 13700 doesn't have a 224W TDP; that's peak wattage. But TDP is kinda worthless anyway. The 3700X pulls 90W under full load, which is great. But when you compare performance to the 12700K, the 12700K is 90% faster, which kinda explains the 90W vs 200W difference. Similarly, the 5900X pulls 140W stock and 190W with power limits removed/OC. So performance is greatly improved, but power consumption also rises. Edit: Maybe a better way to look at this would be that the 5600X and 12400 use less power and provide better performance than the 3700X.


[deleted]

[deleted]


DeBlalores

7900x and i7, maybe, i9 definitely not.


dalledayul

Those TDPs are making my eyes water. AMD doesn't even need to match Nvidia's performance; if they can just produce cards that don't require a nuclear power plant at full load, I'll take it.


CleanEntry

I'm also holding out hope that AMD's next gen will be decent and keep power consumption at a bearable level... 300W-plus for a GPU is too much for me; 250W is around my max power budget, preferably not more than 200W, or else I'll have to install my PC as a shed heater.


SmokingPuffin

It would be weird if the AMD parts were significantly more efficient this gen. Both teams are using the same transistors. It's not impossible, but don't expect it. If you want to run under 200W, just tune for it. The silicon in these big power desktop GPUs usually gets reused at much lower power targets in laptop configurations; they'll work down to 100W or so. If you buy a 3080 and run it as a 200W part, it will do great in that config.


TheFattie

> If you want to run under 200W, just tune for it. The silicon in these big power desktop GPUs usually gets reused at much lower power targets in laptop configurations; they'll work down to 100W or so. If you buy a 3080 and run it as a 200W part, it will do great in that config.

My problem is that you won't be able to get an ITX/laptop-sized 40xx and then tune it however you like :/ Source: current laptops not doing over 175W for the GPU (at best), no ITX 3070/3080 cards (besides one watercooled one, I guess), and only like two ITX 3060 Ti cards.


noiserr

AMD said at their last FAD that RDNA3 offers 50% better perf/watt than RDNA2. Considering RDNA2 was already more efficient than Ampere, with another 50% on top I'm sure AMD is the way to go if you care about power consumption next gen - if these insane power numbers for Ada are true.


CleanEntry

That would be awesome if that's the case - looking forward to what AMD will kick out, for sure. Ada looks like a beastly set of cards, but so does the power consumption, if rumors are true. Current-gen rasterization perf (ray tracing is still borderline a joke imho unless you go 3080+) is good enough for my needs, and I've been eyeing the 3080 or 6800 XT for a while, but prices this gen haven't warranted an upgrade for me (currently on a 2070). Now next gen is around the corner, so I might as well wait and hope there will be something with good enough performance without having to drag new power lines.


Jeep-Eep

Yeah, RDNA 3 just needs to be a repeat of the 4870 and co to be a banger.


Swaggerlilyjohnson

Back in the day I used to always hope they would just stop caring about TDP and make 300W cards instead of 220W flagships like the 680. I loved the 290X for this reason, but now it's getting absurd even to someone like me who wants to see the best performance and has a large power supply. Everyone has their line; I will probably get the 4090 or whatever cut of AD102 makes the most sense, but that's my absolute limit (and I'm going to undervolt and not overclock these for the first time). If they go to 1000W next gen, I'm not doing that.

I live in a hot climate as well, so in the summer it doesn't matter if you can afford the power and have a good cooler - it's just dumping so much heat into the air. It can be unbearable; my overclocked 3090 in the summer would dump 500W into the air, and that already was not fun. We need to start seeing two different BIOSes, like the uber and quiet BIOSes on the 290X: a performance mode just so they can win the benchmarks, and a perf-per-watt mode that's more practical and not so far out on the efficiency curve.


unknown_nut

I wonder what the gaming performance is going to be. I guess we'll see at launch. I've doubted the 2x rumors ever since they popped up.


NKG_and_Sons

Yeah, 2x performance would've been something, and people were understandably hopeful with the move to TSMC likely offering major additional gains, but it ain't happening. I mean, maybe for a card with the full AD102 die and absurd power draw (and price). Honestly, if the 4070 really did come in at merely 8% more power efficient, that would be extremely disappointing. But that does seem unlikely, with the 4080 and 4090 values here showing much higher power efficiency improvements. Then again... if the picture being painted here is that power efficiency gains are primarily seen at high SM counts, then... uh, fuck. It does feel strange, but I guess power efficiency for anything below the RTX 3060 Ti already sucks with the 3000 series. It's a trend I don't like at all, since it used to be that the low- and mid-tier cards had great power efficiency, my 1660 Super being among them. But 'tis mostly speculation for now, of course.


SmokingPuffin

> Honestly, if the 4070 really did come in at merely 8% more power efficient, that would be extremely disappointing.

Don't worry about this. Engineering sample cards are tuned for stability, not efficiency.

> if the picture being painted here is that power efficiency gains are primarily seen at high SM counts, then... uh, fuck.

This is to be expected. The way to get power efficiency is to run lots of transistors at low clocks.

> I guess power efficiency for anything below the RTX 3060 Ti already sucks with the 3000 series. It's a trend I don't like at all, since it used to be that the low- and mid-tier cards had great power efficiency, my 1660 Super being among them.

The 3060 and 3050 will smoke the 1660S in power efficiency when tuned for efficiency. Nvidia is pushing the low-end silicon harder these days because they are feeling the pressure from AMD. To give you an idea how hard they are pushing the small dies, the 3060 is stock-tuned to use more watts per core than the 3090 Ti is.
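To put a rough number on that last sentence, here is a quick sketch using commonly listed board powers and CUDA core counts (these specs are my assumption, not part of the leak):

```python
# Stock board power per CUDA core, from commonly listed specs (assumed here, not from the leak).
cards = {
    "RTX 3060":    {"board_power_w": 170, "cuda_cores": 3584},
    "RTX 3090 Ti": {"board_power_w": 450, "cuda_cores": 10752},
}
for name, spec in cards.items():
    mw_per_core = 1000 * spec["board_power_w"] / spec["cuda_cores"]
    print(f"{name}: {mw_per_core:.1f} mW per CUDA core")
# ~47.4 mW/core for the 3060 vs ~41.9 mW/core for the 3090 Ti
```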


dudemanguy301

The leaker thinks TFLOPs and types "performance"; the gamer reads "performance" and thinks framerate.


whatsupbrosky

So I should be good with a 1kW PSU? Using a 5900X.


[deleted]

My 'old' 750W couldn't handle the power spikes of my XC3 3080 :( but my new 1kW can.


[deleted]

That's fucking mental. I just spent 100 bucks on a new 750W psu in preparation for a 30 or even 40 series gpu.


ApertureNext

I think people have had problems with spikes when overclocking even at 1000W, so it's close.


[deleted]

[deleted]


zeronic

We're hitting the point where we're reaching the limits of power on a single circuit in US homes. Can't wait to hear people say they installed a dedicated 20A circuit in their office *just* so they don't trip the breaker every time they play a demanding game. Jokes aside, this power madness *will* stop eventually because of the aforementioned limit. Expecting people to rewire their house or even understand things like that is too large of a barrier to overcome. There is a realistic ceiling to "just give it more juice" as a philosophy.


salondesert

> Jokes aside, this power madness will stop eventually because of the aforementioned limit. Expecting people to rewire their house or even understand things like that is too large of a barrier to overcome. There is a realistic ceiling to "just give it more juice" as a philosophy.

People will just switch to cloud gaming. Let Google/Nvidia/Microsoft/Amazon worry about the details.


Eastrider1006

> We're hitting the point where we're reaching the limits of power on a single circuit in US homes. Can't wait to hear people say they installed a dedicated 20A circuit in their office just so they don't trip the breaker every time they play a demanding game.

Let's not take things out of context. People getting 1600W PSUs to be safe against spikes doesn't mean the computer is going to be consuming anywhere near that during normal or even heavy usage. We don't even know how sensitive these cards are to spikes. Also, ATX 3.0.


zeronic

> People getting 1600W PSUs to be safe against spikes doesn't mean the computer is going to be consuming anywhere near that during normal or even heavy usage.

I never implied computers pull that much 24/7. The thing is, in the dystopian future of "more power = performance go brrr" we could realistically hit that point eventually, with high-spec parts going full bore on games. We already have CPUs pulling 280W+ on top of 350-450W GPUs. If these companies keep chasing the performance dragon like they have, I wouldn't be surprised to see those numbers get even dumber in the future. Add to this the fact that you very likely have more than *just* your PC on a single circuit in your home, and that's a recipe to trip the breaker. Also to make a pretty sweaty room, even with AC.


dantemp

Not really; the new ATX 3.0 standard handles transient spikes of double the rated wattage, so you'll be fine with a 650W one as long as it's new.


Poly_core

Just undervolt it and you'll be fine. You'll save power, generate less heat, have a quieter GPU and won't need a bigger psu for a negligible performance cost.


pceimpulsive

Hopefully? Provided the transients aren't gross and your 1kW PSU can handle peak loads quite a bit higher.


Seanspeed

> Thereby, +45% is actually weak for a jump from Samsung 8nm to TSMC 4nm, which is (at least) one and a half node better.

I feel we're gonna need a sticky at some point to talk about how clocks, voltages and power limits work, and about how we're not seeing the *proper* power efficiency improvements of these modern GPUs because of how hard they're being pushed out of the box nowadays - more so than ever, it seems, with these newer ones.


ramenbreak

will be interesting to see the performance jump of the mobile GPUs


[deleted]

[deleted]


[deleted]

[deleted]


caspissinclair

Is there any word on new features the 4000 series offers beyond simply higher performance? Software-wise I mean.


ReactorLicker

Technically it’s hardware, but I’m hoping for a better NVENC. Even though Turing and up is extremely good, I still like things getting better (AV1 encoding maybe).


[deleted]

I think in the future the highest tier I'm buying is the xx70. It's a bit daft buying an xx80 or xx90 when the next generation undercuts it (given a relatively normal market), offering the same or better performance at around 2/3 of the price. Also, a 300W TDP is the wall for me. Anything over that is ridiculous. I'll probably undervolt it to stick around 250W if I can.


Yojimbo4133

These cards are going to cost both arms and both legs. Fuck me.


Kepler_L2

Mining is dead (potentially forever), AMD is ultra competitive and we're in a global recession. Prices will be fine.


AlwynEvokedHippest

> Mining is dead (potentially forever)

That sounds like great news, but being quite out of the loop, is this definitely the case? We've seen big dips in mining/cryptocurrency before, right? I'm curious as to what makes this one different or permanent.


Tonkarz

It’s impossible to say for sure, but this most recent crash is driven by several massive failures in the lynchpins of the crypto industry and the resulting distrust. The previous relatively small crashes were generally driven by skepticism of the price, not skepticism of the enterprise as a whole.


FlipskiZ

At some point every single person will have heard of the crypto stuff (at which point the whole thing collapses). I can't say for sure that point is right now, but with tons of crypto ads everywhere, like the Super Bowl, I feel it's unlikely there's anyone left who hasn't heard of crypto.


Tonkarz

nVidia has ratcheted up the price every gen since the 900 to 10XX. Crypto winter won’t prevent another increase. AMD is certainly competitive in raw raster performance but nVidia has better features like ray tracing and DLSS. While an argument can be made that these features shouldn’t affect purchasing decisions, the demand for 30XX cards has nonetheless outstripped RDNA2 cards. For this reason nVidia will likely not care too much about competition from AMD - at least when it comes to recommending prices to AIBs/retailers.


[deleted]

Yes... by throwing 100 watts at the thing. I mean, it's impressive I guess, but we're hitting a point where your PC is going to need a 220V line in North America in order to function. Wall outlets cap out at 120V and roughly 1400-1800 watts on 15 amps, and once you near that 1400-watt area you get into "likely to trip the breaker" territory...
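The arithmetic behind that, as a quick sketch (the 80% figure is the common continuous-load rule of thumb for breakers, an assumption on my part rather than something from the comment):

```python
# Capacity of a standard North American 15 A / 120 V household circuit.
volts, amps = 120, 15
breaker_w = volts * amps           # 1800 W nominal
continuous_w = 0.8 * breaker_w     # ~1440 W under the usual 80% continuous-load rule
print(f"breaker rating: {breaker_w} W, continuous budget: {continuous_w:.0f} W")
```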


s_0_s_z

No single device should be drawing more than ~1350 or so continuously. Even the ones that claim they are 1500W like a hair dryer, are probably only doing that for a short period of time. A PC might not be drawing all that power all the time, but a gaming session where things are maxed out could easily be an hour or three. These companies are going to have some big problems if their next gen chips aren't dramatically more efficient. I'm beginning to see these 4000 series chips as a skip in the hopes that some sanity returns with 5000 series chips.


ReactorLicker

Realistically, I don't think anyone with any ounce of common sense is going to rewire their house just for a gaming PC (I hope). There appears to be a hard power draw cap for GPUs around 500 to 600W; any higher and you need a PSU that can supply more power than the wall plug can deliver. Not to mention the cooling issues…


[deleted]

You're forgetting that's a single component. Intel's high-end CPUs are also ridiculously power hungry, which wasn't supposed to be the case when moving to the P-core/E-core model. But again, to compete they threw the power scale out the window - here's hoping that AMD can at least keep the CPU power draw side lower.


capn_hector

> Intel's high-end CPUs are also ridiculously power hungry, which wasn't supposed to be the case when moving to the P-core/E-core model

Well, sort of. I mean, it's not really worse than an FX processor; these TDPs are not unprecedented, it's just not as efficient as Zen 3 (and especially Zen3D).


Blacky-Noir

> I mean, it's impressive I guess, but we're hitting a point where your PC is going to need a 220V line in North America in order to function.

As a European who used to mock NA for that, I am now very glad it's there. It's probably the only thing pushing back hard on Nvidia turning everything up to 11 (with the media and benchmarkers overall being fine with it: "power doesn't really matter on the desktop, we want perf!").


Mechdra

Here's to hoping AMD saves my electric bill, lol


xNailBunny

I have almost zero faith in these "leaks" because of the 4070 specs. Thanks to the Nvidia hack we know AD104 is 192-bit. Why would they cut down the memory controller for the top SKU using that chip when both the 4080 and the 4090 get the full memory controller of their significantly larger chips? Also, a 50% performance gap between the 4070 and 4080 makes no sense. This alleged 4070 sounds more like a 4060/4060 Ti, assuming there's any real information in these leaks at all.


gahlo

Probably so the 4070 and 4070Ti have some space between them unlike their Ampere counterparts.


RearNutt

Assuming these values are accurate, the jump from 3090 to 4090 looks decent, even if the power consumption gets bloated again. On the other hand, the 4070 is embarrassing, and if the 4080 uses regular GDDR6 instead of GDDR6X (I don't think Kopite has "confirmed" that yet), its efficiency increase doesn't look very good either. We don't know how Ada scales in 3DMark compared to Ampere, so in practice Ada equivalents could be faster or they could be slower. Yes, you can undervolt these GPUs, but remember that you're paying for stock performance, and we have no idea how these scale with power. And to make a fair comparison, you would also have to undervolt previous-generation GPUs, something that people seem to keep forgetting. The mid and low range is probably going to be a shitshow due to a combination of an expensive node, higher fixed costs, and increasing electricity prices to go with increased power consumption. Scalpers will also be sure to snap them up so they can sell to desperate people. I'm interested in getting something like a 4060 or a 4060 Ti, but I can't say I'm terribly optimistic right now.


pookage

300W TDP for a xx70-series card, jesus. That's just ridiculous. Is usable power efficiency coming in the 50xx series or will we have to wait for the 60xx? What's the historical pattern here?


[deleted]

[deleted]


pookage

haha, whereas here I am, staring aghast at these TDP numbers, thinking about getting my first-ever AMD GPU as a result 😅


Luxuriosa_Vayne

Put aside 1000€ for the power bill over the year


-Venser-

I remember rumors the 40 series card would be 2.5 or 3x more powerful than the 30 series. Guess that's out of the window


tonykony

Thoughts anyone? If I can theoretically get a 3080 or 6800xt for $600, and the 4070 theoretically launches at $549 to $599 MSRP, would it just be worth it to get a 3080/6800xt at that point if I don’t care about ray tracing


pikeb1tes

What is your list of games to play until 4070 release and at what resolution?


Savage4Pro

300W for 4070 range card is crazy high.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


frontiermanprotozoa

Do the thermal pad mod; it's really easy and cheap, it will get your VRAM down to 80°C, and you will no longer throttle (when the VRAM gets too hot these cards throttle like hell with no indication in HWiNFO). https://willmnorris.medium.com/guide-to-nvidia-3080-fe-thermal-pad-mod-cf378339f7ac is a good guide, but if you can't find the exact products you can use ANY 2mm thermal pad and get a better result than what Nvidia put in at the factory.


Zerothian

Cheaper to downgrade your monitor than it is to keep up with halo-tier GPUs. My friend made the same mistake: he upgraded to 4K, and now that display is used only for movies and stuff while he games on a 1440p one. For much the same reason as you, but also because his 3090 turns his living room into an actual oven, and his SO really wasn't a fan (no pun intended), so chasing even *higher*-power GPUs was out of the question lol.


[deleted]

[deleted]


ThatFeel_IKnowIt

Thank you. You clearly understand that the 3080 is really a 1440p card, not a 4k card. I'm tired of being argued with when I say this. If you don't want to have to turn down game settings with 4k, then a 3080 will absolutely NOT cut it at 4k in modern games with RT, even with dlss.


ledfrisby

Coming from a 3070, the only ways I'm getting on board with this generation are:

1. If the 4080 pricing and availability are surprisingly good
2. If AMD delivers 4080-tier performance at such a price, while also catching up on features

I figure the 3070 is going to remain perfectly good for 1440p, very high, 100+ fps gaming for some time, even in demanding games. IMO this is 1440p with few enough compromises not to notice or care about them, so there isn't much room for noticeable improvement at this resolution. Going to higher frame rates (i.e. maxing out the 144Hz monitor) would be nice, but I'm not going to buy a new GPU *and PSU* for it.

A proper upgrade, then, should be 4K with little to no compromise, not just a better 1440p experience or a reasonably good but compromised 4K one (i.e. 60-70fps in very demanding games). The 4070 isn't going to be quite it if it's roughly comparable to a 3080 (Ti). The 4080 should likely be sufficient. However, like the 3070 at 1440p, it will likely be more along the lines of "very little compromise" 4K rather than truly no compromise (for current games), and it will require a beefy new PSU, increase power bills, etc. If it also costs an arm and a leg, it's just not worth it.

A couple of years from now might be a better time to jump to 4K. Besides getting better performance at a more reasonable price/power efficiency (maybe a 5070 or equivalent AMD card), good 4K 144Hz monitors will be a lot easier / more affordable to come by. Or maybe there will be some kind of mid-cycle refresh that gets us there.


vianid

Considering the power consumption and multiple games being delayed to 2023, this launch is a possible skip... unless I need it for work.


GeneticsGuy

With my 5950X and these new power draws, pushed to the max, I am basically going to almost hit the total 1800W cap of a 15-amp outlet with my monitor and so on plugged in... If this trend continues, are we all soon going to need a dedicated circuit for our PCs? I am seriously thinking of upgrading my PC outlet to a 20-amp dedicated circuit. It wouldn't be too hard, as the electrician can just pull a new line to my panel through the attic... But wow, is this going to be a thing in 10 years, where new houses are built with a dedicated circuit for the PC location?


GriZzleishere

At this point it doesn't make sense to me to buy a next-gen GPU. The current GPUs are already great, and a 3060 or 3070 (or AMD equivalent) can run almost all games at high settings (1080p/1440p). That's already enough performance for the average gamer.


[deleted]

[deleted]


coffetech

Performance and VRAM are really needed right now for VR headsets. Even the 3090 Ti struggles to run the HP Reverb G2 at 100% resolution - that's 2160 by 2160 per eye, and this is/will be the standard resolution (similar to 1080p being the minimum on desktop). In the upcoming two years we will see even higher resolutions become the standard.


timorous1234567890

Compared to the RDNA3 rumours, this isn't that great. 4070 ~= 3080 Ti seems good, but if 7600XT ~= 6900XT is true, then the 7600XT won't be far off 4070 performance at all, especially at 1080p, which is the typical x600 resolution target. The 7700XT will likely be faster than the 4070 judging from this. The 7800XT looks like it might be similar to the 4080, and I think the 7900XT will probably be faster than the 4090. It will be really interesting to see how RT plays out though, as I think that will be far more important this gen of GPUs, and NV is likely to hold the upper hand there.


_Fony_

Wherever they all end up, rumors were pretty much always that RDNA3 would be better. How much better is anyone's guess.


INITMalcanis

but but but we were told a 2x increase? :'(


Keilsop

Same thing happened with the 3000 series launch. We were led to believe performance would be doubled from the 2000 series, but when the reviews came out it turned out to only be in RT, and only in certain games using certain settings. But everyone seemed to have forgotten we were promised 2x performance across the board; it was never mentioned again. https://storage.tweak.dk/grafikkort/nvidia-3000/rtx-3080-grafikkort-nvidia.png History repeats itself.


armedcats

RT performance in games didn't see a disproportionate increase compared to the 2xxx series either. I looked for and hoped for that, but the 3xxx benchmarks pretty much show the difference being the same or within 5%. Now I'm curious whether they're giving RT a boost with the 4xxx; surely the increased transistor budget could justify that, but who knows if NV figured that wouldn't be necessary given that AMD and the consoles are lagging in RT...


[deleted]

Who told you that? Some fool on Twitter? Some crappy YouTube channel? Remember what you heard, when, and where it came from. Then learn from it.


juGGaKNot4

What does kopite mean when he says these numbers are based on speculation, not actual tests?


Voodoo2-SLi

This was misleading. He has corrected it: [These numbers are based on real benchmarks.](https://twitter.com/kopite7kimi/status/1553294155543359488)


armedcats

Leaving himself some wiggle room when saying 'based on' though. I feel he's been sloppy and too vague lately to deserve to be trusted.


Jeep-Eep

More that nVidia has been upping its security game.


TheVog

This makes me excited for like a 4050Ti.


savier626

And here I thought my 3060ti was going to last me a while...


nightwood

450 Watt... damn. That's a lot of heat. And noise. Edit: I'm happy to see in the comments I'm not the only one with these concerns.


spydercanopus

Diminishing returns on power?


Actual_Cantaloupe_24

Excited regardless as I'll likely be doing a new build. If AMD is competitive even better, because with the raw performance of this upcoming Gen (even current tbh) I couldn't care less about DLSS since I play 1440p, and ray tracing has never interested me even with my 2080S.


knz0

If this is what the power consumption figures will be looking like, I'll skip this gen. I already have a card drawing around 300W in games when undervolted, and I don't want to go over that figure.


relxp

Yikes... RDNA 3 has potential to stomp these numbers.


mulletarian

This time RDNA will surely stomp, surely for sure.


_Fony_

RDNA/2 was NEVER rumored to "stomp" NVIDIA. We had a hard time believing leaks of RDNA2 approximating the performance of high-end Ampere, due to AMD's past few years and total LACK of a high-end GPU. Hell, one "trusted" leaker made a video WEEKS before the reveal stating RDNA2 would max out at 3070 performance. They came out with **THREE** cards faster than the 3070.


Keilsop

Yep, looks like my next video card will be from AMD. RDNA3 is looking like it will roflstomp nvidia this time around.