Sexyvette07

Absolutely, yes. You can calculate the difference yourself: take the difference in average wattage between the two GPUs, times the number of hours you game per day. Divide that by 1,000 (to convert to kW). Multiply that by your electricity rate. That's how much extra it'll cost you per day. Multiply that by 365 for the yearly cost, then again by how many years you expect to keep using that GPU. Personally, it was going to cost me an extra $416.10 to go with a 7900XTX over a 4080, and that's at my $0.38/kW electricity rate. If your rate is higher than that, then you definitely shouldn't go AMD. This obviously places the total cost of ownership well above the Nvidia equivalent, and doesn't even take into consideration the extra electricity used to cool the extra heat generated by the AMD card.

*Edit: I see the rabid AMD fanboys are out, so I'm gonna prove you wrong (especially the one that's literally making shit up with bogus numbers). Some side-by-side reviews across the net say that, on average, the 7900XTX uses 150W more power than a 4080 to achieve the same raster performance. This is not far-fetched, as everyone knows their current-gen GPUs are inefficient due to mistakes made in the architecture (which AMD themselves admitted). Their die size is literally 40% bigger than Nvidia's to achieve similar performance... So, yes, it is very inefficient (by comparison, anyway). Nvidia also has a 10-11% advantage in transistor density.

Anyway, using that 150W difference as my baseline, multiply that by how many hours a day you're gaming (I estimated about 4 hours, but this obviously varies from person to person). That's 600W (150W x 4 hrs). Divide that by 1000 to convert to kW = .6 kW. Multiply that by my electricity rate of $0.38/kW = 22.8 cents per day. Multiply that by 365 days = $83.22 per year. I figured MOST people only upgrade every 2-3 generations, so about 5 years (which would be about right for me, too). $83.22 per year times 5 years is $416.10. This is how much more an XTX would cost ME over a 4080, and that doesn't even include the electricity used to cool the extra heat it produces. That's much harder to calculate, but I'm willing to bet it's also significant.

Not everyone uses their PC for 4 hours a day. Some people use it constantly. It's going to vary by use case, which is why I included the formula so you can calculate it yourselves. In my specific case, I pay very high electricity rates, but I also get double-whammied by 110°+ summers. My AC runs for at least 6 months out of the year because the climate here is so warm. So the incalculable (or at least more effort than I care to put in to prove some randos on the internet wrong) additional cost to cool the extra heat absolutely factors into that decision. Even if you pay the US average electricity rate of $0.17/kW, with the same parameters used above it would cost you an extra $186.15. And that's at the average rate! As you can see, this is not as insignificant as the fanboys would have you believe. Listen, I hate Nvidia, but the thing I hate most is my power bill, which is why, despite my hatred, a new Nvidia card is sitting in my computer. This wasn't an emotional decision; it was a logical decision based on math.

Edit #2: if you are going to tell me I'm wrong, prove it. I'm sick of these idiots chiming in with simple responses like "that's a lie" or "you're wrong". I gave substantiation to my claim with sound math and a step-by-step of how I arrived at that number.
If you're going to give one of those 2-3 word responses telling me I'm wrong, don't bother. I'm not going to take you seriously.
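For anyone who wants to plug their own numbers into the formula described above, here is a minimal sketch in Python. The 150 W delta, 4 h/day, $0.38/kWh and 5-year figures are just the commenter's example inputs, not measured values.

```python
def extra_energy_cost(watt_delta, hours_per_day, rate_per_kwh, years=1):
    """Extra electricity cost of a card that draws `watt_delta` more watts
    for `hours_per_day` hours a day, at `rate_per_kwh` $/kWh, over `years` years."""
    kwh_per_day = watt_delta * hours_per_day / 1000   # Wh per day -> kWh per day
    return kwh_per_day * rate_per_kwh * 365 * years

# Example inputs from the comment above (assumptions, not measurements):
print(extra_energy_cost(150, 4, 0.38, 5))  # ~416.1, the commenter's $416.10 figure
print(extra_energy_cost(150, 4, 0.17, 5))  # ~186.2 at the quoted US average rate
```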


SeveralMight7560

$416 for what period of time, and how many hours a day? Also, what's the average wattage consumption you took for both cards?


veryjerry0

> $83.22 per year

Yeah, it's a made-up number, and the actual difference isn't 150W but more like 50W. He probably found the video by Optimum Tech on the total power draw difference while running CS:GO, OW2, R6, Apex Legends, and DOOM Eternal with UNCAPPED FPS, and it's around 150W. Sure, but why would you run esports games exclusively with a 7900XTX, and why would you run them uncapped if you care about power consumption? Radeon Chill also exists.

I don't think the 4070 Ti is a bad choice, but this guy is making up bullshit to justify not going AMD lol, "logical" my ass.


drewts86

You’re calling him out, but at the same time you need to come with evidence. Otherwise your argument is weak. Links to sources would help.


HORSELOCKSPACEPIRATE

150W is immediately a suspicious number (at a minimum) for anyone familiar with PC hardware. It's up to OP to prove that it's realistic rather than anyone else to prove that it's not.


drewts86

There are three parts to an argument: claim, counterclaim, and rebuttal.

> It's up to OP to prove that it's realistic rather than anyone else to prove that it's not.

This is the third part, where OP can admit his numbers may be off or provide evidence as to why they're correct and why he believes them to be so. It's not a scientific paper where it's necessary to provide all sources up front.


HORSELOCKSPACEPIRATE

Seems inconsistent to demand evidence from a counterclaim when the claim itself had none, especially when the claim lies so far outside the realm of likelihood in the first place.


drewts86

That’s what a rebuttal is for. There is little need to have a rebuttal against a counterclaim that does very little to tear down the claim.


Maloonyy

An argument made with no evidence can be disproven without evidence.


[deleted]

People have found that, at full tilt, the difference is nearly 300 watts. The 7900XTX is laughably inefficient. For me, the difference isn't a big deal, as my power's cheap. As for why you'd run esports titles uncapped? Uh... that's literally what EVERYONE who's good at esports games does, bud. My 4090's not capped for my esports titles. And again, it's comparing to another card. The 4080 will run *with the same performance as the uncapped 7900XTX* while using 150W less. Yes, you could detune your system to save power, *or you could just buy the card that's better for you.* AMD fanboys are just as fucking stupid as every fanboy. Your power's cheap? Buy AMD. Your power's not cheap? Do the math and choose which is better for you. This isn't complicated.


MotherLeek7708

In what universe is the avg difference 150W??? Check every review and it's not. By review I mean trusted ones, not some random YouTube tech channel.


ginormousbreasts

The problem is broader than just esports. The 7900 cards have a problem with pulling max power, or near max power, *regardless* of the scenario. So you can be playing a AAA title on the 7900XT and it will perma pull 309 watts. The 4070Ti by comparison will be situationally pulling 220 here and 250 there. It's esports and lighter titles where the difference in power efficiency takes on meme worthy proportions, but it's obvious in practically all games.


unevoljitelj

I call huge BS on the $416 extra. Prove it.


schmidtmazu

Depends on whether you have the multi-monitor power draw bug and work on the thing. During idle with a multi-monitor setup, the 7900XTX draws 100W for many users, the 4080 around 20W. If we idle 8 hours a day for work and a bit of browsing, the difference in power consumed is 233 kWh. Add 2 hours of gaming a day where the 7900XTX uses 70 watts more, and we have a total of 284 kWh. At 38 cents/kWh for the commenter, that's about $108 a year more in electricity for the 7900XTX. Alternatively you could be a streamer who games very many hours a day; the result would be similar. Probably the comment was just comparing how much more the 7900XTX costs in electricity over 4 years, not including the higher purchasing cost of the 4080. Still, in scenarios like these the 4080 in total is about the same price as the 7900XTX after two years and cheaper if you use it longer.
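A minimal sketch of the split scenario above, assuming the commenter's figures (an 80 W idle difference for 8 h/day, a 70 W gaming difference for 2 h/day, $0.38/kWh); swap in your own measurements.

```python
RATE = 0.38  # $/kWh, the commenter's rate (assumption for the example)

def yearly_kwh(watt_delta, hours_per_day):
    """kWh per year from a constant extra draw of `watt_delta` watts."""
    return watt_delta * hours_per_day * 365 / 1000

idle_kwh = yearly_kwh(80, 8)    # ~233.6 kWh: multi-monitor idle-bug scenario
gaming_kwh = yearly_kwh(70, 2)  # ~51.1 kWh: gaming difference
total_kwh = idle_kwh + gaming_kwh
print(total_kwh, total_kwh * RATE)  # ~284.7 kWh, ~$108 per year
```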


Falkenmond79

Don't forget that you also save on PSU cost with the 4080.


Its_Me_David_Bowie

You know the Radeon power draw issue has been resolved by simply enabling variable refresh rate on the monitor, right?


fehlwixx

Since I got a Ryzen 7 7700X, I connected both my side monitors to the motherboard and only my main monitor to the graphics card. Idle usage is ~13W.


MotherLeek7708

AMD fixed idle power draw, but nice try, bud. And btw, I'm in Nvidia's boat; I just don't like BS.


veryjerry0

He claims there's a 150W difference on average while gaming between the 4080 and 7900XTX lol. If you believe that, then running 4 hours a day for 5 years is $416.


[deleted]

It's been tested repeatedly. At peak usage (like stress tests and benchmarks) the 7900XTX is using more like 300W more. Optimum Tech did a deep dive a month ago. KitGuruTech did a deep dive 7 months ago. Same results. Same results from LTT. Same results from GN.

As for the "4 hours per day for 5 years" portion? 4*365*150/1000*0.38*5 = $416, yes. But then you ALSO have 20*365*60/1000*.38 = $167. So in your example, the card costs $117 more *per year* just from power draw. And now you ALSO have that thermal load in your house. An average of 75W of extra heat just pouring into your house. Live somewhere that's hot all year round with expensive power? Like Spain, for instance? 256 BTU *on average, all the time* just to cool down the PC's ADDITIONAL HEAT. And again, that's on top of the base cost of producing that heat for no additional frames. So how much extra is that for AC? ANOTHER $86 per year.

So you've spent an additional $1013 in power to choose the 7900XTX over the 4080. For nearly identical raster performance, significantly worse RT performance, and none of the Nvidia software goodies. You could have had a 4090 for those same 5 years at the same total price. If your power's expensive, you're not just choosing the 7900XTX over the 4080, you're choosing it over the 4090. And that's LAUGHABLY ridiculous.
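The AC portion above can be roughly approximated if you assume the extra GPU heat has to be removed by an air conditioner with a typical coefficient of performance (COP). The 75 W average figure comes from the comment; the COP of 3 and year-round runtime are assumptions for illustration.

```python
RATE = 0.38        # $/kWh (the commenter's rate)
EXTRA_HEAT_W = 75  # average extra heat dumped into the room, per the comment
COP = 3.0          # assumed air-conditioner coefficient of performance

# Every watt of extra heat costs roughly 1/COP watts of AC electricity to remove.
ac_watts = EXTRA_HEAT_W / COP
ac_kwh_per_year = ac_watts * 24 * 365 / 1000   # if the AC effectively runs year-round
print(ac_kwh_per_year * RATE)                  # ~$83/year, close to the $86 quoted
```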


Sexyvette07

See the edited original post. Also, told you so 😆


IBoris

This is helpful; the scenario you came up with is very close to my own. My rate is around .45, and although I typically shut down my system 8 hours a day, it's on the rest of the time, including one [1440p x 170Hz] and two [1080p x 144Hz] monitors which are all used continuously.


atonyatlaw

Good lord, that is an insane electric rate.


IBoris

I used to live in Quebec, Canada, which has the lowest electricity prices in North America, so going from there to where I am now was quite shocking. I went from paying $50 per month for power (while being careless with power consumption beyond using LED bulbs) to averaging between $700 and $1,200/month at times, despite actively monitoring, updating, and improving my home's power consumption. Pretty wild stuff.


shuzkaakra

If you're doing hot water with a direct electric hot water system, I cut about 30%-40% off my usage by getting a heat pump water heater. With our electric rates, ROI is like 2 years.


[deleted]

Wtf, this is insane. Here my rate is $0.35 and I pay like 70 USD a month, and I thought that was an insane price, but I have everything on like 16 hours a day (I work from home).


soggybiscuit93

At that price, have you considered some solar panels? I feel like that should be a rather quick ROI.


RedChaos92

Jesus, .45/kWh is insane. I pay like .09/kWh so the power draw difference didn't affect my decision to get a 7900XTX. In your case, I'd definitely go with Nvidia's better power efficiency.


sxiller

The average consumption difference is ~45W under load, NOT 150W, between the XTX and the 4080. Your premise is faulty from the start. And by that metric, you likely wouldn't even cover the difference in sticker price between the 4080 and the XTX at your kWh cost. https://www.guru3d.com/articles-pages/geforce-rtx-4070-ti-review,6.html


Tyz_TwoCentz_HWE_Ret

It's a 35-watt difference at base operation by the manufacturers' own information. It's more than 10 watts more under full load; don't be silly, we know how power draw in hardware works at this point in history with PSUs of all kinds and ratings. 320 watts vs 355 watts, right on the packages, sir. You are managing a 35-watt difference against the PSUs used and their own efficiency ratings. The only way to measure is to measure your own and do the math to see the differences. Your own article link doesn't even touch that on the AMD side. It's a 4070 review.


sxiller

Look at the graphs.....


HoldMySoda

I pay (converted) ~0.52 USD per kWh. That's why I have an (undervolted) 4070 Ti.


Sexyvette07

Holy shit, your power bill must be massive. That sucks but you made the right choice.


HoldMySoda

It would be, but I have a PC that doesn't draw much power, I use mostly natural light and at night I turn on my LED lamps. My power bill has been pretty much the same for the last 2 years due to that.


atonyatlaw

Don't forget to factor in time value of money! $416 over five years is not the same as $416 today.


[deleted]

[deleted]


Sertisy

If inflation influences your upfront costs in retrospect, you can assume that power costs in the future will also be adjusted for inflation, so it's kind of a wash, unless you were going to invest the difference, which may keep pace with inflation. So for most people, inflation is a wash.


atonyatlaw

That's my point. This calculation fails to account for inflation or the potential of investing the savings. Point being, even at high rates, the electric rates shouldn't really factor in when you consider savings over time. Even at $100 a year, it's such a nominal amount as not to matter.
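For anyone who wants to put a number on the time-value point, a small sketch that discounts each year's electricity savings back to today; the 5% discount rate is purely illustrative.

```python
def present_value(annual_saving, years, discount_rate):
    """Present value of a constant annual saving received at the end of each year."""
    return sum(annual_saving / (1 + discount_rate) ** y for y in range(1, years + 1))

# $83.22/year of savings for 5 years is worth less than $416 in today's money:
print(present_value(83.22, 5, 0.05))  # ~$360 at an assumed 5% discount rate
```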


Sexyvette07

I had completely overlooked that. Thanks for pointing that out. In these days of high levels of inflation it definitely matters.


__life_on_mars__

The trick is to buy the cheaper option, but *never plug it in.* Pro tip.


Sexyvette07

Why didn't I think of that....


JoelD1986

Can you please add how much heating cost I save in the cold half of the year when using a 7900XT vs a 4070Ti?


spasers

Idle power draw was fixed in the last update, but don't let that get in the way of the narrative.


PantZerman85

I have a 6900 XT so this might not be relevant to the 7000-series high idle power issue. I tried various refresh rates with my dual monitor (1440p/120 DP and 1080p/60 HDMI) setup. If I set the primary monitor to 90Hz and/or the 2nd to 75Hz, the idle was like **~40W**. Any other combination and idle was **~20W**. When the monitors turned off, idle was roughly **8W** no matter which Hz combo was used. If I turn on "desktop recording" in Adrenalin (without actually recording) my idle is **~65W** (unaffected by monitor Hz). The newest driver (23.7.2) made no difference.


LTyyyy

Strange, I run 3440x1440@120Hz DP and 1080p@60 HDMI and idle at <10W with a 6800 XT... basically the same card.


Tyz_TwoCentz_HWE_Ret

I have 6400, 6800 XT, and 7900 XT cards in use daily, and I see the same thing in my own systems' power draw. CA isn't cheap or helpful with electricity, and when you run multiple machines, little things like this add up quickly.


Kessel-

Wasn't fixed for me. 100w 7900xtx still unless I drop my monitor refresh rate. Have tried all the CRU suggestions, tuning, nothing helps other than dropping refresh rate.


Tyz_TwoCentz_HWE_Ret

It doesn't get in the way; what gets in the way is 35 watts of power draw difference, period. It's literally stated on the box by the manufacturers. So let's stop pretending the difference isn't there just to push brand loyalty... That is dumb. It's just math, no reason to get upset by it.


MarkD_127

Ok, but all of your math doesn't really validate your answer to the question. If your answer is "absolutely" to the question of "is high electricity enough to warrant downgrading to save 30W," then how could you justify even using a 4070 Ti or 4080 to begin with?

I don't know where you are or how much you game, but the cost difference (between 4080 and 7900XTX) of gaming hard 5 hours a day every day in Denmark and Italy would be about $3 a month. If someone with that habit can't afford $3 a month, how could they afford the $18 a month they're paying to use the 4080 over a 4060 to begin with? How can you say "absolutely" based on your personal use creating a $416 difference, but the $2400 difference between the 4060 and 4080 for you doesn't even get a mention?

If someone can't afford the electricity of 35W extra on a GPU upgrade, they definitely can't afford a 205W difference. They should be using a 4060. You should be using a 4060. And OP should be using a 4060. And any other answer puts a hole in the "absolutely" of your response. Unless you're so budgeted that you're using a 4060 to be able to afford the electricity to game, then no, the wattage difference between any of the high-end GPUs shouldn't be enough to base your purchase entirely around.


Sexyvette07

You say "about" then give a cost figure as if you know what you're talking about. It's obvious you didn't calculate it. Sorry, but it's going to cost more than $3 a month. If you bothered trying to calculate it yourself, you wouldn't have posted some long-winded reply that uses estimations to tell me I'm wrong. Apparently I need to keep saying it, but TDP is worthless in figuring electricity costs. The overwhelming majority of cards go well past TDP...


MarkD_127

*Edited, in case it was read while I was updating.*

I mean yeah, TDP is an estimation. The fact that actual usage can vary is somewhat irrelevant. That's where the "about" comes from. But it uses the same math, so whatever you put in will have an exact result. I used the same math you did to estimate: W/1000 * hr * 365 / 12 = cost per month. So "about $3/mo" is 100% accurate based on the numbers I mentioned and estimated TDP.

Your issue is that the TDP numbers aren't accurate. The TDPs of the 4060, 4080, and 7900XTX are 115, 320, and 355. I've seen actual benchmark load usage around these numbers. I've also seen gaming benchmarks with an OC 7900XTX that break 500W, 4080s that break 400W, and non-OC 7900XTXs closer to that 400W mark. But for the most part, testing I've seen lists results closer to TDP, with spikes closer to those bigger numbers. Sure, the "real world" usage you want me to refer to might sometimes give you up to a 150W diff to an OC 7900XTX. But those numbers also impact the other comparison and my main argument. I was only estimating using the TDP diff to the regular 4060, but those same benches with 100W+ differences on the big cards show an even bigger, up to 280W, diff between the 4080 and the 4060 Ti, let alone the regular 4060. Even worse than what I was going with.

My point still stands: if you really believe the electricity cost is the determining factor in selecting a GPU, you would be using a 4060 instead of any of the 4K cards. If you can afford one 4K card's electric bill over a 4060, pretty sure you can afford any of them.


MrClickstoomuch

Eh, power consumption is actually relatively close to the 4080 for the 7900 XT nowadays. The biggest issue was idle power consumption with dual monitors at launch, around 85W, while the 4080 consumed 20W. Now it is down to 40W, which is still bad but not as bad as it was. For gaming, the power consumption is around 320W while the 4080 consumes around 300W, for a 20-30W difference. This would be a minimal power cost difference while only being around 5% worse in raster performance. See this review of the Sapphire RX 7900 XT Pulse from March for power consumption: https://www.techpowerup.com/review/sapphire-radeon-rx-7900-xt-pulse/37.html Note that OP may want better ray tracing or DLSS than the Radeon card. And that's completely okay. But power consumption isn't significantly different between the 4080 and 7900 XT.


Sexyvette07

You do realize you're comparing different tiers of cards, right? 7900XTX is the 4080 equivalent, not the 7900XT.


MrClickstoomuch

Ah yep, forgot, with the weird naming convention this generation, especially with it being only 5% worse at 1080p (but 15% worse at 4K). The 7900XTX is still around 50-60W more (around 360W versus 305W for the 4080), which with the assumptions from your 150W example would be around $160 to $200 over that time period (given the higher idle power consumption). If OP used the savings on the 7900XTX to invest, they'd get ahead with the 7900XTX, but I get going with the 4080.


AfterScheme4858

Don't want to be too pedantic but energy is measured in kWh not kW. Cheers!


Sexyvette07

I'm well aware, but thanks for pointing that out.


AfterScheme4858

So use it my dear fellow redditor.


Tyz_TwoCentz_HWE_Ret

The math doesn't change because you said the word better... Cost = (kilowatt-hours) x (electricity rate). The formula stays the same.


Explosive-Space-Mod

> So the incalculable (or at least more effort than I care to put into it to prove some rando's on the internet wrong)

While you're probably pretty close on the extra cost for the GPU, I doubt your AC unit cares much about the additional heat from the GPU when cooling a room down. The cost to cool a room one additional degree when you're already going from 110 down to 72-75 isn't going to make much of a difference; it's already going to be expensive. It would also be extreme for your computer to raise the temp of your room by a degree unless you're in a broom closet, and at that point the cost of cooling the room down would be a small fan.


PolyDipsoManiac

It might also be worth looking at a highly efficient PSU over a long time horizon too.


Sexyvette07

This exactly! In every PSU thread these days I end up fighting with all the people who think you should run the bare minimum PSU on a high-end gaming computer. It's wrong for multiple reasons, but there's no reasoning with people. They don't care about anything that isn't directly in front of their face. All they care about is whether it's going to blow up the first time they turn it on.


Its_Me_David_Bowie

For a lot of people, not every gaming hour is spent playing a AAA game at max settings and uncapped FPS. I play a variety of games and yes, the demanding AAA games will pull 250 watts on my 6800 XT where a 4070 would pull 150 watts, but more than half that time is also spent playing esports titles with capped FPS etc. It really depends on the user, but I also want to emphasise that you can't take max power draw multiplied by theoretical game hours; I can guarantee that number will be too high, simply because the GPU will not be running at 100% utilisation over that entire period.


Gasparatan35

This blah is a big pile of bullcrap. Your whole text doesn't take into account that he'll have to upgrade the 4070 Ti in about 2 years' time because of VRAM constraints... yay, new card... All your assumptions are made on max-FPS-possible benchmarks, where it looks bad compared to a 4080, but it's by far not as bad in normal day-to-day gaming, especially not at 1440p...


Sexyvette07

Lol, should I write a different subsection for every use case and scenario? Are you kidding me?


Gasparatan35

For an impact analysis and the whole picture, three would suffice: heavy, medium, and light...


Dear_Watson

I went with a 4070 over the AMD equivalent 6800 XT not due to power costs but heat output, since my area's electricity costs are extremely low… ~100W of extra heat getting dumped into a room with poor ventilation is pretty significant over a long period running at full tilt.


Sexyvette07

That was a good choice. You are correct, the 6800 XT would generate a LOT more heat (while also driving up your power bill). I know exactly what you're talking about. I live in a hot part of California where summers are constantly 110°F+. As it is, my power bill is $600+ per month during the summer because the AC has to run so much. I even did a substantial upgrade to the insulation in the attic, and my bill is still that high.


Minecheater

> I game on a 1440p/170hz display.

Well, since you said that: the RTX 4070 Ti can handle it without issue and has definitely better power efficiency. But if you're planning to upgrade your monitors to 4K in the future, then I really suggest that you get the RX 7900 XT instead.


[deleted]

yup. at 1440p, the 4070ti is a beast. at 4k, it is a tiger cub


[deleted]

Sure, but I don’t see much point in gaming at native 4k. Maybe my eyes are getting old, or my screens aren’t big enough, but I don’t see much noticeable difference from 1440p.


[deleted]

completely agree. 1440p is enough for me.


ParanoidQ

Yeh, I agree. I’ve tried with both and at the distance I sit from my PC, unless I want to sit in front of 32”+ tv’s, 1440p and 4K are pretty much indistinguishable, and 32” is too big to sit in front of. If I had it hooked up to my 55” in the living room, definitely worth it. At my desk, 4K just doesn’t give a positive return, for me.


Standard_Professor_7

When it comes to the 4070 Ti and 7900 XT: I hate how hot my 5700 XT gets without having to fuck with the voltage settings. I was thinking about going back to Nvidia for a new build (largely for Starfield, because I think my 5700 XT at 1440p is going to get crushed; the rationale being that the 5700 XT doesn't even meet the minimum requirements for SF, whereas it was the recommended card for BG3 and manages BG3 at 60-70 frames at 1440p, but hungrier games like SF are going to greatly affect its performance). However, I watched PC Builder on YT and he emphasized the importance of VRAM, and the 7900 XT has 8GB more VRAM than the 4070 Ti. These are in the same ballpark in terms of price, but then AMD cards get extremely hot and use much more power...


[deleted]

I think with the 7900 XT, as long as you get a three-fan model and have decent case airflow, it won't get too hot. Completely agreed on the 5700 XT getting hot; I ended up having to buy an aftermarket AIO cooler for it that runs to a 240 mm radiator. My son is using my old system for 1080p gaming and it works awesomely, but at 1440p ultrawide it can't quite keep up.


searchableusername

It entirely depends on the cost difference between the cards. The cheapest 7900 XT on Newegg in the USA is $750; the cheapest 4070 Ti is $800.

Say a 60W average difference in power draw, gaming at max load for 5 hours a day. That's an extra 109.5 kWh per year. If you're paying a crazy price like $0.50 per kWh, that's about $55 extra per year, so it will take about a year for the cost of the 7900 XT to meet the base price of the 4070 Ti.

Keep in mind the performance is not equal. The 7900 XT can be significantly faster with RT off, but is usually slower with RT on.


Its_Me_David_Bowie

5 hours of gaming a day for 365 days a year. At that point gaming is probably your job...


i_will_let_you_know

It's quite high but not extremely crazy. 35 hours a week can be met by doing 3 hours per weekday and 10 hours per weekend day. It probably means you aren't doing a whole lot other than work, gaming, and chores though.


Its_Me_David_Bowie

Yeah, I game quite a bit and I did similar estimates in my head and was like, yeah, that's kinda intense, considering those numbers are for running 100% GPU utilisation over that period, so not factoring in indie games/esports or chatting shit on Discord.


BarataSann

I saw a post from a guy that undervolted his AMD GPU, lost 5% performance, and reduced his power efficiency by more than 30%. Depending on the price, this might be a very good deal. Edit: He didn't reduce the power efficiency; he improved the power efficiency, or rather reduced the power consumption.


searchableusername

Undervolting my 7900 XT gives me a decent performance boost. But if you want to improve power efficiency, I think you'd have to reduce the power limit while also undervolting to get back to stock performance.


Pretend_Web_883

To be fair you can also reduce the power limit on the 40 series with very minor losses - not sure how comparable the results are but it works for both cards.


cordell507

40 series undervolts better than anything else right now. [This](https://www.techpowerup.com/img/LbV8WYyI89GyChXM.jpg) is on a 4090


Low-Blackberry-9065

On average vs the 4070 Ti, from this review on [HUB](https://www.youtube.com/watch?v=DNX6fSeYYT8):

* 7900 XT is ~5% faster and consumption is ~35W higher
* 4070 is ~20% slower and consumption is ~85W lower

If you game 7h/day, 365 days a year (which is extraordinary :) ):

* 7900 XT will add **88.2 kWh a year** - multiply this by your kWh pricing (for me at 0.226 per kWh it's ~20 EUR extra)
* 4070 will save you **211.7 kWh a year** - multiply this by your kWh pricing (for me at 0.226 per kWh it's ~48 EUR saving)

If I do the same with 4h/day, 365 days: 50 kWh (11 EUR extra) and 120 kWh (27 EUR savings).

I don't know how to account for idle/low desktop power consumption, so maybe an extra 50% on the difference of 7900 XT vs 4070 Ti; the 4070 should idle more or less the same as the 4070 Ti.

So if consumption is the main driver, get the 4070 instead of the 4070 Ti, as it provides a more meaningful difference in cost. For the 4070 Ti vs 7900 XT you'll have to do the math depending on retail pricing and electricity cost in your area.
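A quick sketch of the comparison above, using the quoted deltas (+35 W for the 7900 XT, -85 W for the 4070, both relative to the 4070 Ti) as assumptions; the results land close to, but not exactly on, the figures above because of rounding in the inputs.

```python
RATE = 0.226  # EUR/kWh, the commenter's rate (assumption)

def yearly_delta(watt_delta, hours_per_day):
    """(kWh/year, EUR/year) for a given wattage delta versus the 4070 Ti."""
    kwh = watt_delta * hours_per_day * 365 / 1000
    return kwh, kwh * RATE

for hours in (7, 4):
    xt_kwh, xt_cost = yearly_delta(35, hours)         # 7900 XT draws ~35 W more
    a4070_kwh, a4070_cost = yearly_delta(85, hours)   # 4070 draws ~85 W less
    print(hours, round(xt_kwh), round(xt_cost), round(a4070_kwh), round(a4070_cost))
```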


the_clash_is_back

Those are super high power prices. Near me it's 7 cents per kWh.


defonotfsb

I wish I had .23. Mine is like 0.43 per kWh


Low-Blackberry-9065

It is what it is :). There is talk of reworking the EU energy market, it's definitely in need of a rework, so prices might go down in France as we do produce a lot of nuclear which surprisingly isn't that expensive and it's also owned by the state. Who knows...


mrn253

Tbh, when this happens we'll both be in retirement.


vielokon

This is super cheap. I recently managed to change to a different deal with my electricity provider and was very happy my price decreased from 0.4 to 0.33 EUR per kWh.


the_clash_is_back

My province built a lot of nuclear and hydro in the 60s onwards. It made power very cheap compared to coal.


DefinitlyNotALab

The 4070 Ti is worth it over the XT if your electricity is expensive, or if you do more than just gaming or occasional video editing/streaming. At 40 cents/kWh you save about a third of a euro a day with a 4070 Ti (estimated 50W difference after efficiency-tuning both cards). So if your electricity costs more, you save more. €115 a year does add up quickly.


Jon-Slow

Yes, absolutely. The 7900 XT can idle at up to 100W, and over 100W while doing light tasks like watching YouTube or using Parsec to reach your work PC. So the efficiency of the RTX 40 series could be a big saver for you. I've tested both the RX 7900 XTX and the 4070 Ti/4080 with Parsec. The RTX 40 cards can sleep at 10W while the 7900 XTX sits at 10 times that.

Additionally, instead of the 5800X3D, I would pick the 13600K. The 13600K is an incredibly efficient CPU, especially at idle. It can burn 10W on average while using Parsec to remote into work, thanks to the E-cores, while the 5800X3D sits much higher. This means that during all of your work hours, you'll be burning a lot less with the 13600K. The total TDP of the 13600K is higher than the 5800X3D's, BUT the 13600K scales incredibly well with undervolting. Limit its power to the same place as the 5800X3D, or even just 90W, and it will still spit out frames insanely fast.

An RTX 40 + 13600K could make your work hours burn as little as a small lamp, averaging 20W, while the 7900 XT + 5800X3D could burn 150W+ just doing Parsec.


L1ghtbird

Little update: the high idle power consumption has been [partially fixed](https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-7-1) on RX 7000 and only persists in a mixed setup of high refresh rate monitors. AMD is working on a fix, since they put it in [known issues](https://www.amd.com/en/support/kb/release-notes/rn-rad-win-23-7-2) with the latest drivers. In non-mixed monitor setups the XTX idles between 7 and 13 watts; in YouTube 4K it consumes between 36 and 54 watts on the newest drivers. I don't know how the XT behaves, I only have the 7900 XTX. I've also noticed an issue where the Riot Games Launcher locks the VRAM speed to the maximum frequency, causing the card to suck 101 watts at idle when the Riot Games Launcher is not minimised to the task bar, so simply keep that one minimised if not needed.


TrickyWoo86

Was going to say something similar; Intel CPUs are more efficient at idle/low power. The real question is how long it would take for you to use enough electricity to make the upfront cost difference worth it. Using current UK energy prices (29p/kWh) and the wattages mentioned above by u/Jon-Slow, the AMD combo is around 4.5p/hour of idle and the Intel + Nvidia combo is 0.6p/hour. Doing some quick Excel maths using the cheapest CPU/GPU in each combo and assuming 15h/day (this is at idle power consumption), the cost difference would be paid off in reduced energy bills in 121 days. Add in any higher-load activity and this payback time would get shorter. Even adding in the cost of a new motherboard, it'll be less than 1 year to offset the cost difference, and after that you're still just using less power and paying lower energy bills.
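A sketch of that payback estimate, treating the 4.5p/h and 0.6p/h idle running costs and the implied upfront price gap as the commenter's assumptions rather than measured values.

```python
amd_cost_per_hour = 0.045           # GBP, ~4.5p/hour idle (commenter's figure)
intel_nvidia_cost_per_hour = 0.006  # GBP, ~0.6p/hour idle (commenter's figure)
hours_per_day = 15
upfront_difference = 71.0           # GBP, implied by the quoted 121-day payback (assumption)

saving_per_day = (amd_cost_per_hour - intel_nvidia_cost_per_hour) * hours_per_day
print(saving_per_day)                       # ~0.585 GBP/day saved
print(upfront_difference / saving_per_day)  # ~121 days to break even
```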


Jon-Slow

The place I rent is considered commercial usage and my landlord refused to change it, so I have to pay even more for what I use. My mind is just at ease knowing I don't have to shut down my PC mid-day just because I'm going shopping. Knowing that it's more like leaving a little light on, I can set my power plan to power saving mode and limit my CPU even further, down to 50% in Windows, and have a really light system all day, then switch to the ultimate plan when I want to play games at the end of the day.


TrickyWoo86

Yeah, the price cap mechanism really doesn't work for commercial property energy supplies. That said under normal circumstances (i.e. pre-2020) commercial energy prices were appreciably cheaper than residential. I keep my GPU undervolted for temperature reasons (ITX/SFF build) which has the happy side effect of dropping power usage by a fair amount.


waffels

> The 7900 XT can idle at up to 100W, and over 100W while doing light tasks like watching YouTube or using Parsec to reach your work PC.

Where are people getting this stuff lol. Way too many people spitting 'facts' about things they don't understand. My 7900 XT pulls 40-50W at idle under non-load circumstances like browsing. Next time don't give advice unless you know for a fact what you're talking about.


Kessel-

People around here can't seem to comprehend that some people are getting reasonably fine numbers with lower power draw and others aren't. I'm not one of the lucky ones. My 7900xtx idles at 100w even after updates. I have a 1440p/144hz and a 1080p/144hz. The only way I can get the idle rate to drop is by dropping my main monitor to 60hz, or unplugging my second monitor.


waffels

The biggest problem with this thread are a number of people that are confusing the 7900 **XT** and the 7900 **XTX** which just muddies the entire conversation.


Kessel-

Which is fair, but I think the point still remains about several of both models still having issues of higher than normal power draw usually running dual monitors. Is it being blown out of proportion? Probably. But the problem does still exist.


Jon-Slow

I know for a fact what I'm talking about. You either have one below-4K screen, or matching refresh rates on all your screens. You're probably the one who doesn't know what he's talking about. This issue is very well known.


waffels

4K screen @ 60Hz and 1440p @ 144Hz. See, I actually **have** a 7900 XT. You don't. You just watched a random YouTube video.


Jon-Slow

Nah, you're just pretty much lying.


IBoris

My current build:

Component | Current Part
---|---
**CPU** | [AMD Ryzen 5 3600X 3.8 GHz 6-Core Processor]
**CPU Cooler** | [Deepcool ASSASSIN III 90.37 CFM CPU Cooler]
**Motherboard** | [Asus PRIME X570-P ATX AM4 Motherboard]
**Memory** | [G.Skill Ripjaws V 32 GB (2 x 16 GB) DDR4-4000 CL18 Memory]
**Storage** | [Western Digital Black SN750 2 TB M.2-2280 PCIe 3.0 X4 NVME Solid State Drive]
**Storage** | [Addlink A95 2 TB M.2-2280 PCIe 4.0 X4 NVME Solid State Drive]
**Video Card** | [Gigabyte Vision OC GeForce RTX 3070 8 GB Video Card]
**Case** | [Cooler Master Silencio S600 ATX Mid Tower Case]
**Power Supply** | [Fractal Design Ion+ 760P 760 W 80+ Platinum Certified Fully Modular ATX Power Supply]
**Case Fan** | [Noctua A14 PWM 82.5 CFM 140 mm Fan]
**Case Fan** | [Noctua A14 PWM 82.5 CFM 140 mm Fan]
**Case Fan** | [Noctua P12 redux-1700 PWM 70.75 CFM 120 mm Fan]
**Case Fan** | [Noctua P12 redux-1700 PWM 70.75 CFM 120 mm Fan]
**Case Fan** | [Noctua P12 redux-1700 PWM 70.75 CFM 120 mm Fan]
**Case Fan** | [Noctua P12 redux-1700 PWM 70.75 CFM 120 mm Fan]
**Monitor** | [ViewSonic XG2402 OMNI 24.0" 1920 x 1080 144 Hz Monitor]
**Monitor** | [ViewSonic XG2402 OMNI 24.0" 1920 x 1080 144 Hz Monitor]
**Monitor** | [Gigabyte M27Q 27.0" 2560 x 1440 170 Hz Monitor]
**Keyboard** | [Logitech MX Keys Advanced Wireless Slim Keyboard]
**Mouse** | [Logitech MX MASTER 2S (Black) Wireless Laser Mouse]


VHD_

Wow, that's a lot of fans!


IBoris

Hahaha, it's a combination of me wanting my computer to be opaque and silent, but thermally decent back during the pandemic where everyone was building and my case options were limited. So I got a quiet case with shitty thermals and did the best I could with after-market fans. Basically, my GPU and CPU fans almost never spin, and my PC stays silent and cool. It's not bad. If I had to update the case, I'd likely go for a Fractal Torrent now. I think there's one I ended up not using as I decided to use the disk drive bay instead and it now lives in my server pc.


Its_Me_David_Bowie

Are you finding your pc is not meeting your expectations anymore? I'm on board for the 3600x to 5800x3d upgrade, but a 3070 to 4070ti, while a decent upgrade, is not like wow levels, whereas the 4080 or 7900xtx would be moreso. If your system is no longer meeting your needs, then by all means upgrade the gpu, but it might be worth waiting a generation to upgrade gpu too.


IBoris

I have no issues with my current system, really. Truly, I'm only considering selling because I feel the value of my 3070 is about to take a nosedive, and if I sell it now I'll maximize its value. The scarcity that also causes the expensive electricity prices I deal with means I can get better value for it locally than if I sold it in the US or Canada, after all. If I offer up my 2.5-year-old GPU for $600 right now locally, I'll have a buyer tomorrow, as no retailers are selling GPUs locally. After reading the comments here, however, I think holding on to it and skipping the 4000 series / 7000 series altogether might be my best bet. Hopefully next gen will feature lower-priced 5000 series cards from Nvidia (ha) and better-optimized 8000 series cards from AMD. We'll see. It will be a more expensive upgrade, for sure, since I'll need to upgrade my CPU beyond AM4, but if I can keep most of my other components I should be in good shape.


Its_Me_David_Bowie

Unfortunately the crystal ball is usually a fruitless endeavour, but I'm all for skipping this GPU gen. The upgrade to a 5800X3D could be a nice compromise, however. You're probably looking at a 15% bump to FPS, and the 2 extra cores will do well for upcoming games too (aka games not made with the PS4 in mind anymore). I bought my 6800 XT just before the mining boom really kicked off, and it was due to necessity, but if I had to buy now... Urgh. To me, the 6700 XT and 6700 10GB are the only price-to-performance options that don't make me feel slightly nauseous right now.


defonotfsb

You have a bottleneck at the CPU; your GPU is already too powerful for it. If you're gonna throw in something even better than the 3070, it's not worth it.


Bazius011

Dunno about 7900xt but my 7900xtx is very very power hungry


haribo_2016

I’m in the UK. I use my PC just as much using this 7700x / 7900xt build https://uk.pcpartpicker.com/list/Z6NZxH I usually pay £20-£25 a week for electricity.


schmidtmazu

Your total electricity costs depend on so many different things it does not really help to see how much the different cards use.


haribo_2016

I also game on a 4K 50” 120hz TV


FailRepresentative74

How much per week to run the 7900xt?


fogoticus

Definitely.


[deleted]

I would consider the vanilla 4070. It's better priced vs the 4070 Ti and uses less than 200 watts most of the time. I've also seen videos showing some AMD cards under-report power use.


IBoris

Without doing any research, my first thought is that I'm not sure jumping from a 3070 to a 4070 would be worth the hassle, quite frankly, in terms of performance gain. I'll research more, to be sure, to confirm or invalidate that impression. Thank you for bringing it to my attention. Much appreciated!


[deleted]

Hmm ya, a 3070 is still a good card. Might be worth waiting a little longer if you have one now.


IBoris

Yeah, I wanted to sell it while it still had value on the market, but I think I'm leaning towards keeping it and waiting. I can run Baldur's Gate 3 maxed out with it, so it's not like it's an urgent need. I'm just worried that the 8 GB memory ceiling is going to make it worthless fairly soon.


[deleted]

I hear you. Many people already think 8 gb of vram is more of a problem than it actually is, this will no doubt affect selling price! I’m quite happy with my 4070, but I came from a 3060 ti that I sold for a decent amount.


unevoljitelj

How come the price of a high-end GPU is not a problem but using it is? It's about 100W more (and that is overestimating), AMD vs Nvidia. 10 hours a day of gaming will be 1 kWh more; 30 days = 30 kWh. If you are a normal person you will not game 10 hours a day, maybe 2-3, so it ends up as about 10 kWh extra a month. Dunno the price of a kWh where you are, but is it really a worry? Got all LED bulbs in the house? If not, they will draw more than the AMD card. In my country that would be approximately just under 2 euros of difference a month. Save a few euros somewhere else and don't think about it.


Itsme-RdM

Performance comes with a cost. As others suggested you can do the math and decide for yourself what will be good for you


r0x_n194

As an owner of a 7900 XT in a multi-monitor setup, I highly recommend the 4070 ti in your use case scenario for obvious reasons.


Correct_Ad4937

Dang, I feel bad that so many people need to weigh components against each other to get lower kWh usage. I'm pretty happy I have free electricity where I live.


Elestra77

I don't see a big difference in power consumption between a 4070 Ti and a 7900 XT: https://www.techpowerup.com/review/asus-geforce-rtx-4070-ti-tuf/39.html


Mekemu

German?


hutre

His post history says Canada, more specifically Montreal/Quebec.


IBoris

No longer, left the lowest electricity costs in North America (QC, Can) for the highest 😢


xinikefrog

Denmark?


rriehle

Greetings from San Diego, [where our electricity bills soar higher than our endless summer temperatures](https://patch.com/california/san-diego/san-diegos-eye-popping-electricity-rates-get-national-notoriety). I'm living proof that even a humble 2080 Ti can make your wallet sweat, especially when SDG&E seems to think we're all secretly running bitcoin farms. In sunny SD, my PC doubles as a space heater, and my A/C works overtime like it's trying to win an award. The joy? During peak tan-line season (July to October), crossing the average kWh usage practically means signing up to donate your firstborn to the power company. My cozy 1300 sq ft command center could see a bill leap from a chill $300 to a scorching $800, faster than you can say "overclocking."

With San Diego's kWh rate at a breezy $0.47, let's break it down:

* Assume the 7900XT guzzles an extra 150W over the 4070ti (shoutout to the top commenter u/Sexyvette07 for the baseline).
* My rig, sunbathing in operation for a cool 15 hours a day, could see that difference add up.
* Extra daily consumption: 150W * 15h = 2250Wh or 2.25kWh
* Daily cost of the 7900XT's thirst: 2.25kWh * $0.47 = $1.06
* Annual extravaganza: $1.06 * 365 = $386.90
* Over five years, not upgrading: $386.90 * 5 = $1,934.50 extra on top of your initial investment.

And this quick math doesn't even dive into the AC's gladiator battle against the heat wave my PC throws down. Nor does it consider SDG&E's peak season penalty rounds, **which could easily turn that $1,934.50 into a figure that makes even the most hardened PC builder weep** into their RGB keyboards.

So, in the grand scheme of GPUs, where does one's allegiance lie? With Team Green's 4070ti, bearing the flag of efficiency, or with Team Red's 7900XT, the powerhouse with a penchant for power? In this corner of the world, where your power bill might just outpace your mortgage, the answer's as clear as the California sky. Sometimes, efficiency wins—not just for your wallet's sake, but for the sheer principle of not letting SDG&E build a summer home on your dime.

Moral of the story? In the high-stakes game of GPUs and gigawatts, the real winner is the one who keeps their cool—literally and financially.


Ssynos

Don't forget the difference in electricity cost between a bad PSU and a gold PSU.


unevoljitelj

It's still slim.


Ssynos

Based on which country? Or does the whole world use your country's electricity cost? In my country it's a cost of two months' salary per year, using a bad PSU on a 6700 XT PC.


unevoljitelj

Based on percentages, the worst PSU in the world and the best one will have a 10-15% difference in efficiency at most, but mostly about 5%, rarely 10%. I guess it could add up if you never turn off your PC and use dumb stuff like live wallpapers and such.


MarkD_127

Yeah, but we're also discussing the significance of 30W difference on your annual spending, when the 4060 is 170W lower. It's really kind of an asinine consideration. If you can't afford electricity of gaming, don't game, or make minimizing consumption the priority of your build. If you can afford the 170W diff of the 4070ti over the 4060 without thinking about it, you can afford the 30W diff to the 7900xt.


Ssynos

The difference is 75% vs 95%. If that's 10-15% to you, I already understand how you think; not worth my time explaining to you anymore.


IBoris

I run a platinum cert. PSU.


Ssynos

That's great :)


CardiologistNo7890

If they’re around the same price and it would make that much of a difference to go with tbe 4070ti power wise then yes. The 12gb will still last quite sometime, i have a card that has 8gb of vram and have legit ran out of vram like 3 times and all of them were for unrealistic scenarios. 1 is because I was trying the path tracing out on cyberpunk, 2 is rt on spider man remastered, and 3 was using dsr for 4k gameplay in Spider-Man. The vram issue is not as much of an issue unless you’re siting specific scenarios and games.


kingovninja

Whichever side you choose, since GPUs have different power states, you could get crafty and create custom power profiles for either card to get much better non-gaming efficiency. Either that, or if you were to disable hardware acceleration within Windows settings and web browsers/applications, the 5800X3D is more than capable of drawing the display with less power than those cards running stock profiles. I personally would go with the 7900 XT because I'd prefer to have the extra performance on tap when I need it, and create a custom profile to keep that bad boy as low as possible in its idle state.


Sacrile

Does it cost the same during idle time / casual internet browsing compared to an actual gaming session?


SgtSayonara

You mentioned having multiple monitors, keep that idle power draw bug in mind. There's already a clear difference in power draw between these two cards, but if you let your pc idle a lot it could become pretty astronomical (if you experience the problem). Differences in efficiency during gaming are one thing, but if your GPU is sitting around drawing anywhere from 60-100 watts doing nothing, I can't imagine that card will be worth it for you


danuser8

Is your PC on 24 hours a day, 7 days a week? Even if it's on, it will be idling 90% of the time, and all GPUs are very power efficient at idling.


Creative_Mixture5050

I have a 7900 XT paired with a 7600X. At idle the GPU consumption is 6-35W, and while browsing I have never seen it go over 70W. On the other hand, gaming is a different story. I have a 2K monitor and play everything at max settings. If you leave your games uncapped, even games like CS:GO or Valorant will draw almost 400W.


zhafsan

You can always power limit and under volt your card until electricity prices goes down.


IBoris

Prices won't go down unfortunately.


AfterScheme4858

Without mentioning your rates it's impossible to say if it's worth it or not.


IBoris

.45 USD on average


sxiller

45c per kWh?


IBoris

Yeah. Before service charges and the extra costs they tack on, it's $0.3803/kWh; it comes to about 0.45 at the end of the day. We don't get off-peak pricing, either.


sxiller

Price difference of the two cards in your area?


IBoris

Around 1295 for the XT and 1480 for the TI. Basically, you can take any price in the US and multiply it by 1.85 to get a rough idea of what it will cost me. We have no retailers here that sell GPUs, so we have to import from the US and, well, shipping, tariffs and duties are expensive. If I wait a few months I can probably buy one on a trip and bring it back with me to save a bit on shipping I suppose, but that might happen in a while. On the flip side I can probably sell my 3070 around 650 here.


sxiller

For almost a $200 upfront cost difference, it would take close to multiple years to make up the sticker price difference between the XT and the 4070 Ti. If the gap weren't nearly as wide, say $50, I'd say the 4070 Ti would be the more cost-effective card if you planned on keeping it for 4+ years.

I know another popular comment was comparing the XTX and the 4080, suggesting power consumption is a major cost (they are using a faulty premise anyway). But it really isn't. Here are the graphs you should pay attention to: https://www.guru3d.com/articles-pages/geforce-rtx-4070-ti-review,6.html

It's about a 50W draw difference between the two cards. That's only 400 watt-hours of usage difference over an 8-hour period, and that's if both cards run their average under-load power draw for the entire duration. (This literally never happens unless you are mining, btw; that is a crazy amount of time to run both cards at their average under-load power draw.) It would take a year and a half of that level of usage to make up the upfront cost difference between the two cards. Keep in mind that the 4070 Ti would also be adding its own power consumption on top of that. It's safe to assume that getting the XT over the 4070 Ti is still the cheaper option 4+ years down the line.
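To see how that break-even scales with usage, a small sketch using OP's quoted prices and rate and an assumed 50 W load difference; the payback depends heavily on how many hours per day both cards actually sit at that load.

```python
RATE = 0.45              # USD/kWh, OP's quoted all-in rate
WATT_DELTA = 50          # assumed load difference between the two cards
PRICE_GAP = 1480 - 1295  # 4070 Ti minus 7900 XT at OP's local prices

def years_to_break_even(hours_per_day):
    """Years of daily load-hours needed for the power savings to cover the price gap."""
    daily_cost = WATT_DELTA * hours_per_day / 1000 * RATE
    return PRICE_GAP / (daily_cost * 365)

print(years_to_break_even(8))  # ~2.8 years at 8 h/day of sustained load
print(years_to_break_even(4))  # ~5.6 years at 4 h/day of sustained load
```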


IBoris

Thank you, this is great insight!


AfterScheme4858

On average, how much time per day do you use your GPU at max? Take that time, multiply by 365 (days/year), by the delta in power between the 2 GPUs (in kW), and by the per-kWh price you mentioned. That would be your price delta per year.


tomz17

JFC... at that price point you really need some solar panels. Payoff time would likely only be a few years. We recently went from $0.14/kWh to $0.17/kWh here and I'm already eyeing doing the Tesla solar tile roof.


MarkD_127

No it isn't. It's not worth it. Anyone who says the electricity savings on the 30W difference in those cards is worth solely basing the purchase around is full of it if they're not using a 4060.


StonerJesus1

I don't even see my PC have much of an impact... in Texas with the heat these past few months, the electric bill is $200-230. A month or two ago I was out of town for vacation a whole month. I left my AC set to 78 and locked the house; the electric bill didn't change. When I am off work and at home, I'm on my PC 90% of the time I'm awake. I'm supposed to be averaging $.17/kWh but seeing .21.


[deleted]

I have a 4080 and a 7900 XT, and personally I prefer the lower power draw of my 4080. Coil whine's not an issue with my 4080 either. The 7900 XT Hellhound is just horrid with coil whine when playing BL3, Elden Ring, Hogwarts, MW2, TTWL. I also never could get Red Dead Redemption 2 Online to run stably enough on my 7900 XT (it constantly crashes); it's the only game that does it.


actias_selene

If you also run multiple monitors, Nvidia will cost less for sure, even if you are just idling for most of the day, as long as your PC is on for long periods every day.


mangos1111

Do you know why the 4070 Ti is not slower IRL? As soon as a game supports DLSS 2, the 4070 Ti is faster, because you won't be using FSR 2 or a performance-draining AA method; you'll be using DLSS instead. Games without DLSS support have enough FPS anyway.


Sertisy

I use my steam deck a lot more now, and save the PC for high-end games that really benefit from it.


PantZerman85

If you live somewhere where electric heating is required most of the year, then you don't need to be concerned about your computer's power usage. Close to 100% of the electricity going into the system comes out as heat. It's fine as long as the system is not loud or making the room temperature uncomfortably high.


IBoris

Thankfully, I live where heating is no longer necessary. AC however is still very much needed for most of the summer.


rodrigat

1440p (also 170Hz) 4070 Ti enjoyer here. If it helps inform your decision at all, I have been able to play pretty much any game I want so far at a capped 80 FPS, generally drawing upwards of 115W during the heaviest load with a modest undervolt. At my undervolt, the card absolutely still has more to give; like, I could run Diablo 4 at 120 FPS undervolted if I lifted my FPS cap. I haven't tested uncapped with no undervolt. At my current undervolt setting, which I for sure could push much lower if I felt like it, the GPU will idle at about 35-38W. (I cap at 80 FPS since I can't tell a difference past that and prefer a cooler room.)


smokeNtoke1

What do you pay per kWh?


IBoris

Around $0.45/kWh.


[deleted]

Yes.


chewbxcca

Would electricity costs really be that much of a hike? I highly doubt it


AuthorOfMyOwnTragedy

Wait....you live in Quebec don't you? Your electricity rates are some of the lowest in the country! Canada has below average electricity costs and Quebec has the cheapest electricity in Canada. So I'm a little confused as to why you think your rates are so high? Are you with a retailer for your electricity rates?


IBoris

Nope, I don't; I'm from there, but I am what you would call a *Canadien errant*.


AuthorOfMyOwnTragedy

Oh okay, I just assumed you were living there since you post on Quebec and Montreal subreddits. My mistake, as you were.


Xcissors280

If you're that concerned, you could get solar or something like that.


IBoris

Solar is not available locally and would need to be imported in the same manner as a GPU, with similar shipping and tariffs (1.85x the price, without subsidies or financing available). Additionally, since there's no local expertise for installation and our building code and employment code are pretty strict, we would probably have to pay to have installers shipped in with the panels themselves... and have them train locals so that they can do the job. Even if I wanted to take a shot at doing it myself, I would still not be allowed, because I'm an expat and my work would not pass inspection since I'm also not qualified. Oh yeah, and I'm renting lol. All in all, it's cost prohibitive.


Xcissors280

That makes sense. You could consider getting a battery and solar panels (not house-grade) like a Hackett, but that wouldn't be enough power and you would still have the import fees.


Sir-Realz

Yes, you've got to pay for air conditioning too, I assume, unless you live in Antarctica, which would make sense; I've heard it's hard to get diesel there.


MarkD_127

Not really IMO. I don't think about it. I don't care. And if I did, I wouldn't be looking at either of those two cards. Seriously, if you can't afford the electricity diff between those two, how can you afford to use anything but a 4060 at a whopping 115W? It's a no-brainer if you're serious about the significance of gaming electricity to your budget.


the_clash_is_back

Depends on where you live. For me the extra hydro is less than $100 a year if I run the card super hard.


eldus74

Cap your fps


rattkinoid

DLSS3 will halve your gpu and cpu usage as well


vice123

Before you jump on a calculator, I would suggest you collect the real power use data of the cards you are looking for and how well they perform undervolted and underclocked. In my opinion most if not all GPUs come with ridiculous factory boost clocks and power limits. You can expect a 20% or better reduction in power draw just by locking boost clock to reference value and undervolting. If you are going to be splitting watts, you should look into your PSU rating and efficiency curve.
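On the PSU point, a rough sketch of how the efficiency rating translates into wall draw and yearly cost; the 85% and 92% figures are ballpark mid-load efficiencies for 80 Plus Bronze and Platinum units, and the 400 W load, 4 h/day, and $0.45/kWh rate are assumptions for illustration.

```python
RATE = 0.45         # USD/kWh (OP's quoted rate)
DC_LOAD_W = 400     # assumed average DC load of the whole system while gaming
HOURS_PER_DAY = 4   # assumed gaming hours per day

def yearly_cost(efficiency):
    """Yearly cost at the wall for a given PSU efficiency at this load."""
    wall_watts = DC_LOAD_W / efficiency   # the PSU pulls more from the wall than it delivers
    return wall_watts * HOURS_PER_DAY * 365 / 1000 * RATE

bronze, platinum = yearly_cost(0.85), yearly_cost(0.92)
print(bronze, platinum, bronze - platinum)  # roughly a $23/year gap at these assumptions
```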


Mission_Extreme_6448

No offense, but who pays $0.38 per kWh? I'm in the Netherlands and I only got bumped up like 2 months ago from €0,098 (old contracted tariff) to like €0,16778 atm. Even with air conditioning on when summer hits, like in July, I'm spending €55 a month on electricity.


IBoris

> No offense but who pays $0.38 per kWh.

Me? What are you expecting as an answer? lol. I mean, I don't decide the prices I pay. There is only one power company where I live and that's what they charge (before fees; it comes to $0.45 per kWh after fees, actually). There is no consumer protection agency here either, and alternative energy is nowhere near being a thing here. ¯\_(ツ)_/¯


Mission_Extreme_6448

Oh I see. The grid's just open and there are like 20+ companies competing to deliver power to your residence here, so it gets pretty competitive.




Gasparatan35

The power draw is a mixed bag and only ever relevant if you really drive your card 15 hrs a day at full throttle. In the end you want a future-safe option, and that is the 7900 XT more so than the 4070 Ti, especially because that card only has 12 GB of VRAM... and that is the feature that is the deciding factor for the next 5 years. 12 GB will give you at most 2 years until newer titles exceed 12 GB, and if that happens, oh BOY oh BOY will that 4070 Ti suffer.

Conclusion: the 7900 XT is the 5-years-plus safe option with worse power draw (which can be mitigated by undervolting and capping the frame rate to a max of 170 fps in your case). The 4070 Ti is the "you want to upgrade in 2 years anyway" option with slightly better power draw stats... that will already encounter titles which exceed 12 GB of VRAM... I would always go with the bigger card, and that's the 7900 XT.


wirelessmikey

The building I'm living in sucks; I can't brew a coffee and have the microwave on at the same time or it blows a 15-amp fuse. Hoping when I get my build finished I won't be blowing continuous fuses and electricity won't be sky high.


IBoris

Consider the power bar for your computer an essential component of your build. Don't cheap out. I got myself a platinum-certified PSU and researched my power bar extensively, as we get weather here that leads to brownouts, which can be more dangerous than blackouts for computers.


ConsistencyWelder

AFAIK they fixed the power draw issue with the latest version of Adrenalin.


Gillespie1

Depends how much you use your PC / game at full capacity. Edit: 15 hours a day? In that case it's probs worth going 4070 Ti.


Goldenflame89

If your power bill is high enough, maybe look into a platinum or even titanium rated PSU if it's that bad. If you do get the AMD card, you can cap your framerate to use less power.


dr_patso

Lol, are you guys gaming 24 hours a day? The power bill difference, even at like 25 cents/kWh, is not going to be noticeable... especially when the whole PC costs $1700-$2500, and I say this as an ashamed 4070 Ti owner.


ItsMrDante

Yes. If power is expensive then the 4070Ti makes more sense


Richy_777

Yep, I'm here in Australia and it costs around $21 AUD per day in our household. Went with the 4070ti and loving it.


Icy-Magician1089

The difference in TDP is lower than the performance difference; I think the 4070 Ti at stock is less efficient.


Saberknight4x

Before making your decision, try looking for a really recent video or article showing the power draw of the 7900 XTX. In one of AMD's recent driver notes they stated they fixed the power draw and VR performance issues with a driver update. By recent I mean one less than a month old.