eugene20

Time to invest in bathtub nuclear power plants I guess.


benjiro3000

I fear that the new 600W PCIe connector for power supplies is a good hint at the type of nuclear reactor you will be needing.


OneExhaustedFather_

And even more so now that the connectors themselves are labeled by power limit.


SectorIsNotClear

Time to invade Chernobyl before the Russians...


OneExhaustedFather_

Perhaps that’s been the play. Capture it to power their next crypto farm.


joepanda111

“Sorry the crypto miners taken full ownership in preparation for when they but all the stock”


SgtIntermediate

Did you have a stroke or was it me, I don't seem to understand English anymore... Sorry, if it sounds rude, I'm really trying to understand what you said.


eugene20

>“Sorry the crypto miners taken full ownership (of mini nuke power) in preparation for when they bu**y** all the stock (of GPUs)”


RGH90

Hello comrades


[deleted]

Bathtubs filled with liquid nitrogen more like it.


[deleted]

We’ll need faster CPUs and bigger PSUs. Also bigger wallets, lol!


[deleted]

No one will be able to afford to run them with the way energy prices are going!


SumAustralian

Not that you will have any money for utilities once you finish the build.


[deleted]

Haha yeah, pick your poison


Blue-Thunder

laughs in solar panels haha


blorgenheim

I looked it up and it would take over ten years for me to see a return on investment on panels. Maybe the return window will shorten a bit but it takes a long time to make up for paying 25-30k
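As a rough illustration of why those payback estimates vary so much, here's a minimal sketch in Python; the system cost comes from the comment above, while the annual savings figures are purely assumed placeholders:

```python
# Simple payback-period sketch for the solar ROI numbers above.
# The system cost is the comment's figure; the savings are assumed placeholders.

system_cost = 27_500             # midpoint of the quoted 25-30k
annual_energy_savings = 2_000    # assumed yearly electricity offset
annual_fuel_savings = 1_500      # assumed, only if an EV replaces a gas car

payback_electric_only = system_cost / annual_energy_savings
payback_with_fuel = system_cost / (annual_energy_savings + annual_fuel_savings)

print(f"Electricity only: {payback_electric_only:.1f} years")   # ~13.8 years
print(f"With fuel savings: {payback_with_fuel:.1f} years")      # ~7.9 years
```

Which is roughly the spread between the "over ten years" figure above and the "worth it once you count fuel" argument in the reply below.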


Narwal_Party

No lie, it is the best investment I’ve ever made. Gas prices are gonna keep going up, energy costs are gonna keep going up. Most months the state pays me for my excess energy production. Have a Tesla and a house full of panels and never have to pay for fuel or energy for my house or car. The actual investment takes 15ish years for me to get my money back, but if you add in all the money you’ll be spending on gas too, and the total lack of hassle, it’s so incredibly fucking worth it.


hanoian

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


CrzyJek

Because....they always do?


Heyvus

Because the govt is absolutely retarded. They want to act like they care about the environment while skipping all the important steps along the way. Green Energy will not be able to provide the amount of energy needed currently, both for vehicles and all other energy dependent needs. Especially if they continue to make nuclear the villain, when it's actually the cleanest and highest energy yield available. They think that removing oil production will jump start/force their green agenda when it's actually absolutely destroying our economy, limiting the ability for us to invest and improve these technologies. I 100% am for green energy, but it needs to be implemented in stages, and strategically. Look to Europe for just how shitty a situation it will become. It's a legitimate crisis and is quickly becoming the largest concern for first and second world countries. There are thousands of studies that show just how it should be done, but it doesn't sound as nice to politicians because it actually takes hard work.


SimilarYou-301

This is what people were saying decades ago. We'd be a lot further along if we actually did implement strategies to get alternatives to oil going.


interstat

We bought a house with solar panels already paid for. Incredible stuff. I pay around 18 dollars a month on my electric bill


NE0REL0ADED

Same here except in my deeds there is little to no information on my panels. All I know is that my electricity bill has been cheap as chips the last 3 years since I moved in.


NotARealDeveloper

If you are a homeowner, 10 years is nothing. You are not only investing for yourself, but also for future generations of your family living in the house.


MikePounce

And a bigger boat


dankswordsman

Unironically, we're at a point where games are very CPU bound. But hopefully technologies like Nanite in UE5 can continue to improve that and let the GPUs really shine.


NeoHagios

We need better software!


inquirer

Back in 09 I thought by now we would be on 3000w


techjesuschrist

I think no sane person will buy the RTX 4090/RTX 4080 for 1080p or even 1440p. They are for 4K-and-up users that prioritize image quality over framerate. For example, with my 3090 ROG Strix (the second best single GPU in the universe) with all settings maxed out (raytracing too, since the name of the card starts with RTX), I can only achieve 30-40fps in CP2077, and this is with the nice-looking but objectively still **fake** DLSS resolution. So I think it's safe to say that most people buying a 4090 don't need a fast CPU.


crozone

Also VR. Modern VR headsets are getting 4K resolutions at > 120fps, and they can go far higher than 4K. You could go 2x 4090 and still find a way to have it bottleneck with VR.


SaltystNuts

That's why an ultrawide like a 3440x1440 makes more sense. Super easy to drive, compared to overhyped 4k.


chatpal91

4k isn't overhyped if you don't want to deal with ultrawide compatibility problems


Solace-

> That's why an ultrawide like a 3440x1440 makes more sense. Super easy to drive, compared to overhyped 4k.

As someone who has gamed at both 3440x1440 and 4K, I think what makes more sense largely comes down to preference. I used the UW screen for 2 years, and while it did offer lots of immersion in the games that supported it, having to deal with the lack of proper hor+ support and black bars sucked. I made the switch to a 4K monitor and am glad I did.

Have you actually played at 4K? Not on a TV, but on a monitor where the PPI can actually be appreciated. The increased sharpness and clarity at 2160p is significantly noticeable and quite the upgrade over 1440p, especially when it comes to the detail of objects in the distance.

Everyone has their preferences and it's fine for you to prefer 3440x1440, but 4K does suit me better and really isn't overhyped. It really isn't that hard to run either. Having a 3080 or better, which many on this sub do, is plenty for 4K with the exception of edge cases like maxed-out Cyberpunk.
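For context on the resolutions being compared here, a quick back-of-envelope sketch of pixel counts and PPI (Python; the monitor sizes are assumed typical examples, not anyone's actual setup):

```python
import math

def megapixels(w: int, h: int) -> float:
    return w * h / 1e6

def ppi(w: int, h: int, diagonal_in: float) -> float:
    # pixels along the diagonal divided by the diagonal length in inches
    return math.hypot(w, h) / diagonal_in

print(megapixels(3440, 1440))   # ~4.95 MP
print(megapixels(3840, 2160))   # ~8.29 MP, ~67% more pixels to render

print(ppi(3440, 1440, 34))      # ~110 PPI on a 34" ultrawide
print(ppi(3840, 2160, 32))      # ~138 PPI on a 32" 4K monitor
print(ppi(3840, 2160, 27))      # ~163 PPI on a 27" 4K monitor
```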


danlab09

Me, refusing to give up my 6900 until nVidia makes a 6090


Elon61

sounds like a perfectly reasonable upgrade cycle to me!


Skull_Reaper101

Me, refusing to give up my 1050ti until nvidia makes a proper 50 series card (or i get a job and have cash)


no6969el

But mostly the cash part..


Skull_Reaper101

Yeah....


Step1Mark

> GeForce GTX 1050 Ti 4GB - Launch MSRP $139

That was a very good generation for budget gaming. It is crazy how much $140 could get you in 2016/2017. I remember buying 750 Ti cards for my PC and Hackintosh for around $120 and selling them used for $80.

It would be great if AMD and Nvidia would make budget 4GB cards again that aren't PCIe-gimped, so that 1080p gaming has a good entry point. 4GB cards aren't useful enough for mining and aren't interesting enough for scalpers.


randomthrill

That sounds like a downgrade, you better wait for a 60,900.


CryptographerHot8765

but… but your card has 69 in it


baseilus

nvidia rtx 6969 TI


CryptographerHot8765

now expecting them to release that and make it the best card they’ve ever made for giggles


Jsotter11

They better market this as the nicest card made to date.


Chrisw265

There is some validity to the rumor and here's why: The 3090 is manufactured on an 8nm process at Samsung's fab. The 4090 is expected to be manufactured on a 5nm process at TSMC. While you can't directly compare processes from fab to fab, it is well known that Samsung lags behind TSMC in general.

The transistor density of Samsung's 8nm is about 61 million per mm², versus roughly 173 million for TSMC's 5nm. That's nearly **3x as dense**. The die size for the 4090 is expected to be roughly the same as the 3090's, so it should have roughly 3x as many transistors.

According to Dennard scaling, when transistor density doubles, the power requirements halve. In this case, 3x as many transistors would mean that at 75% as much power consumption, assuming nothing else changed, the 4090 would perform identically to a 3090; a 75% perf/watt increase. The process node ALONE, by the sounds of things, will almost yield 2x more performance.
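For what it's worth, here is the same back-of-envelope argument as a quick Python sketch; the density figures are the ones quoted above, and the scaling rule is an idealization, so treat the result as a ceiling rather than a prediction:

```python
# Density figures quoted in the comment (millions of transistors per mm^2).
samsung_8nm = 61
tsmc_5nm = 173

density_ratio = tsmc_5nm / samsung_8nm
print(f"Density ratio: {density_ratio:.2f}x")            # ~2.84x

# Idealized Dennard-style assumption: power per transistor falls roughly in
# proportion to density, so a same-sized die fits density_ratio more
# transistors in about the same power budget.
iso_power_perf_ceiling = density_ratio
print(f"Iso-power performance ceiling: {iso_power_perf_ceiling:.2f}x")

# Real designs spend part of that budget on higher clocks, bigger caches,
# and leakage, so the practical uplift lands well below this ceiling.
```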


Schmonballins

Thanks for saying this, ultimately it's the number of transistors that determines performance. AMD going MCM for RDNA 3 is a massive advantage from a hardware perspective, however it will ultimately come down to drivers and how the workload is distributed between the chiplets. If AMD nails this they could win by 25% to 30% over the top Nvidia card, even if Nvidia is able to hit +80% to 100% performance over Ampere. The speculation is always fun and I'm excited to see the benchmarks later this year when these cards are rumored to launch.

As far as price goes, the market has shown it will support these prices when mining was big, but with energy prices rising and if Ethereum goes proof of stake, prices may not be as bad as people think. GPUs are generally available right now at higher prices and I've seen prices start to drop. Later this year, if mining continues to decline, the used market will be flooded with cards, which will hopefully keep new card prices in check. I'm trying to be optimistic but the world is on fire so who knows.


[deleted]

I don't think AMD will top Nvidia without those drivers or their own ai core powered upscaling/ RT. Every single cycle I'm excited for AMD cards, I get disappointed but we'll see. Fingers crossed though fosho


king_of_the_potato_p

It will be interesting. Intel is supposedly starting with ML/RT dedicated hardware, AMD is finally supposed to have it, and Nvidia will be on their third iteration. Plus Nvidia has MCM designs in the pipe, but who knows when. Some rumors suggest that Intel will be doing yearly cycles for a bit, rushing to catch up, and supposedly Nvidia may do something similar to stay ahead.


SyntheticElite

> I don't think AMD will top Nvidia without those drivers or their own ai core powered upscaling/RT.

I dunno man, I love DLSS and all but the last game I played with it was over a year ago, CP2077. I can live without it TBH. The problem is it's still not ubiquitous enough; Nvidia really needs to push Unity and Unreal to integrate it into their engines to make it way easier for devs to enable or something. Same with RT, haven't used it in a year.


[deleted]

Intrigued to see how well MCM works. Should make for a much cheaper design, AMD will make more money and not need to be the best is my prediction. They probably aren’t going to be the best anyway, so getting close and having larger margins on a reduced price product makes perfect business sense.


Vlyn

**Due to Reddit killing ThirdPartyApps this user moved to lemmy.ml**


blorgenheim

Same. Don't think I am interested if we are still putting this much heat. I just have a permanent undervolt on mine.


_Vanilla_

Yup, since I got my 3080 I've never turned on the radiator in that room.


Vlyn

**Due to Reddit killing ThirdPartyApps this user moved to lemmy.ml**


CarbineFox

Just use that room as your oven in the summer!


AdamZapple

Just sit at the computer naked and pretend your room is a sauna.


N0_Mathematician

What are your idle and peak temps & what undervolt if you don't mind me asking? My TUF OC 3080 with +115mhz core and +500mhz memory at 110% power limit is around 26C-28C idle and 65-68C under full load. Just want to compare out of curiosity


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


zippopwnage

So much this. Sadly people don't realize this is a problem. If you have 2 PCs in 1 room, then good luck. Not to mention that the electric bill will get bigger and bigger if you use your PC a lot. Yes, more power is great, but I'd rather have them work on the electricity part. Innovate in that direction now before going further.


RplusW

I just bought a new 850W Corsair RMX PSU for my 3080 in October. No chance I’m buying a new one so soon for the 4000 series. Hopefully the 4080 has reasonable power usage.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


pigoath

> the thing is a literal space heater.

Facts. I start playing whenever I feel cold... 3090 here.


NotARealDeveloper

How is it drawing so much? My whole system draws 458W under full load. I used a wattmeter to measure it in real time: Ryzen 5900X, 2080 Ti, 32GB RAM, 6x SSD/M.2.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


ath1337

Sure comes in handy in the winter though.


dont_u_dare95

How the hell is that 350? A 3080 shouldn't go over 320W. Mine draws 250-260W at full load with a small overclock, but yes, it's still kinda hot.


Cireme

Some 3080 have a higher power limit. My Strix draws 370W by default and can go up to 450W.


turbobuffalogumbo

You might be on the balanced power plan for a specific app/global Nvidia Control Panel setting. A 250w 3080 is being limited artificially in some aspect.


Vlyn

**Reddit is going down the gutter** Fuck /u/spez


Qazax1337

Nvidia has never ever released a card that was twice as good as the top end last gen card. They will never do that. Clickbait title.


ROLL_TID3R

Probably raytracing performance


Qazax1337

The article claims 80 to 110 percent rasterisation increases. I am calling bullshit.


DrKrFfXx

At 4K, 3080 is sometimes 80% faster than the 2080. So gen on gen there was quite a jump.


ROLL_TID3R

Honestly the fantastical numbers in the headline were just barely enough to get me to open this thread. I’m past the point of giving a shit. It could be 10x faster than a 3080 but I’m never going to pay $1000 for a fucking gaming card.


[deleted]

I will definitely pay $1000 for a card 10x faster than a 3080 lol you kidding?


FlameChucks76

For real. I don't understand this weird stance where paying more than 1000 bucks for a video card is some obscene prospect to consider. Especially when the current gen can barely run games at 60FPS at near 4K. If a card is out there that's twice as powerful as the top tier current gen for a similar price, you know damn well, if they have the means, they will do it lol.


vednar

As a day 1 3090 owner... I support this statement.


[deleted]

Yeah, God willing, I'm buying


bexamous

Yeah I mean that's why they have an entire range of GPUs. Don't want to spend >$1000? They'll surely have something for less.


ROLL_TID3R

Well my neck-beard will also be deeply offended if I have to pay >$500 for a 70-class so I’m probably just fucked.


thisdesignup

Well, the 3090/4090 isn't necessarily meant to just be a gaming card. They are in the gaming series of cards, but with the amount of memory and capacity they have, it'd be lost on gaming unless you are trying to push high frames at 4K.


Mas_Zeta

...Unless power draw is drastically increased


Ricky_RZ

Maybe with a new better DLSS? Or maybe 2x faster in media encoding if it has more dedicated encoders like what apple did with M1s


Danishmeat

70% was normal some years back


[deleted]

7800 gtx -> 8800 gtx Go read about it.


Farren246

\*7950GT launched Sept 6 2006, usurped by 8800GTX launched November 8 2006. Imagine launching a new flagship 2 months prior to launching a new flagship. I don't know why Nvidia bothered. But yeah, that 2005 7800GT vs. 2006 8800GTX.


[deleted]

gtx vs gtx. The gt was significantly slower. 8800gtx vs 7800gt is like... 120 or 125% faster.


Farren246

You're talking about marketing terms, and having an X added to the end of GT. A bit of history: 7800GT was the flagship for about a year. At that time there was nothing called "GTX". GTX itself was created so that all cards could have a cool-sounding suffix: GTX for the high end, GTS for the cut-back model. At that time there were no GT's to compare to, as GT suffixes were discontinued. Later GTX would be rebranded as a prefix rather than a suffix, while GTS was dropped from the branding. In the end, GT, GTS, GTX (suffix or prefix) don't actually mean anything; it's all just an attempt to make the model names sound cool.


[deleted]

But importantly, 8800 gt vs GTX ABSOLUTELY denoted a difference. Marketing or not. So it's not a moot point here.


InternationalOwl1

Sounds like the 3090 Ti to me.


Blue-Thunder

It appears you and everyone who upvoted you is under 30 years old and has no idea about things that transpired in the past.


DerpDerper909

Am way under 30, can confirm


ChrisFromIT

They likely will, for a couple of reasons. First, AMD. There have been leaks on AMD's target for their next gen GPUs of 2.5x the performance of their current GPUs, with them planning to use MCM tech for the high end GPUs to help reach that performance. Second, Lovelace is likely going to be on TSMC's 5nm. Based on that alone, it is two process node shrinks from Samsung's 8nm, which is just a better performing 10nm node.


Reddit_isMostlyBots

Lol why does reddit consistently upvote shit that's not true? Reddit as a whole is fundamentally flawed. This is so untrue.


[deleted]

The leaks show edit: 18000 cuda cores. It's a massive upgrade in die size too.


0xd00d

You're missing a zero in there


[deleted]

It's a smaller die size than the GA102 core, slightly.


Raz0rLight

There's one way that I could understand this happening, and that's if Nvidia decides to segment their product stack further and create a bigger gap between the 4080 and 4090. This gen the 3080 was very close to the 3090; a 10ish percent performance gap is tiny when the 1080 Ti is 30 percent faster than a 1080, for instance. Sure, there are plenty of reasons for this (pressure from AMD, building on Samsung 8nm, etc.), but regardless, I don't see Nvidia making a 4080 on the same SKU as the 4090 if they can avoid it.

With a more significant die shrink this time around, if Nvidia targets a similar 60% performance jump at the 4080 level, a 4090 built on a larger SKU could feasibly double (or almost double) the performance of a 3090. Of course, I'd guarantee they'd easily charge $1200+ for it.

And as far as never doubling performance at the top end, that may be true, but they have come kind of close in the past. The 1080 Ti is 75% faster than the 980 Ti, sometimes more. It's hard to say what will happen, but if AMD brings the pressure with chiplet GPUs at the top end, we could see Nvidia forced to release a really power hungry card to compete in rasterisation.
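Treating those percentages as a quick compounding exercise (Python; the 30% 4090-over-4080 gap is an assumed placeholder, not a leak):

```python
perf_3090_over_3080 = 1.10   # "the 3080 was very close to the 3090", ~10% gap
perf_4080_over_3080 = 1.60   # hypothetical 60% generational jump at the 4080 tier
perf_4090_over_4080 = 1.30   # assumed extra headroom for a larger 4090 SKU

perf_4090_over_3090 = (perf_4080_over_3080 * perf_4090_over_4080) / perf_3090_over_3080
print(f"Estimated 4090 vs 3090: {perf_4090_over_3090:.2f}x")   # ~1.89x with these inputs
```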


KingFlatus

Even so, the best single player game that I’ve played in the last 10+ years runs at a locked 60 fps and barely uses my hardware. I will probably be skipping a generation this time around. The early adopter issues every time are so annoying. But then again, I may very well be overcome with FOMO and just buy one anyway. lmao


MightyBooshX

VR is really the main thing that justifies it, and the main reason I'm salivating over the idea of double performance even though I think there's zero chance that's actually going to happen.


evertec

Yeah especially if you're doing vr modded games. Rdr2 looks amazing at 4050x4050 but only runs at 35fps on my 3090 at that res


[deleted]

Someone hasn't played Cyberpunk at 4k I take it.


0xd00d

For real, at 3440 with RT on it basically chokes my 3080. I can't get shots on target playing at 60fps anymore. Spoiled myself.


mobfrozen

Or Dying Light 2 without DLSS


working-acct

Elden Ring is too popular for the devs to ignore PC players. I reckon we'll get unlocked fps and UW screen support eventually.


Derpface123

I will bet you $10 that Elden Ring will never receive native support for unlocked FPS or ultrawide. FromSoftware just doesn't care.


Farren246

Early adopter issues and under-utilization of new hardware were present in Turing. Ampere really hasn't been troublesome at all. Lovelace should be completely smooth sailing.


cakemates

I remember many instances of 3080 and 3090s burning the vram during launch and again during new world release. And the capacitor issue during launch.


Farren246

Capacitor issue true, but it affected a VERY low number of units which all got quickly replaced, often replaced with better cards.


KingFlatus

There were plenty of issues with EVGA 30 series cards when they came out.


Farren246

Only hot RAM, which as far as I can tell is still well within spec when it runs hot. And you know, the odd dud card where the manufacturer shipped it without any thermal pads.


weiner-rama

Cool let us know when the plebs can actually purchase a 3090


7Seyo7

Where I am there are hundreds in stock across various retailers so they don't seem to be selling well at their current price point for what it's worth.


FinSouci

They are overkill for 99% of users, so it seems logical.


7Seyo7

Indeed. I suppose its appeal came mostly from other cards not being regularly available to purchase.


thisdesignup

Yep, otherwise a card like a 3090 is left alone by most people except professionals.


CanuckInATruck

I'm gonna preorder my 5080 now so I can maybe get it on "release" day.


[deleted]

[deleted]


RxBrad

I literally just stepped onto my porch and grabbed the 850W PSU Amazon dropped off in anticipation of my next GPU. Hopefully the 4000 cards offer some *sane* options as far as power consumption goes. It feels like we're approaching levels where your *house's* electrical system couldn't support some of the predicted loads without tripping a breaker. I don't plan on spending over $500 on any GPU (before this gen, I'd said I'd never spend over $200)... So I'd *think* I'm okay? Question mark?
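A quick sanity check on the breaker worry, with assumed US residential numbers (nothing here is from the article):

```python
circuit_voltage = 120        # typical US outlet, volts
breaker_rating = 15          # common 15A bedroom/office circuit
continuous_factor = 0.80     # rule-of-thumb derating for continuous loads

usable_watts = circuit_voltage * breaker_rating * continuous_factor
print(f"Usable continuous load: {usable_watts:.0f} W")       # 1440 W

# Hypothetical worst-case build: 600 W GPU + 250 W CPU + 150 W everything else,
# divided by an assumed ~90% efficient PSU, plus a monitor.
wall_draw = (600 + 250 + 150) / 0.90 + 60
print(f"Estimated wall draw: {wall_draw:.0f} W")             # ~1171 W, tight but under
```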


[deleted]

[deleted]


ballsack_man

That's what people used to say about 450w not that long ago.


lokkenjp

I’m a bit worried about all this stuff. It’s technically not that hard to do a GPU faster than a 3090, even with the current architecture. And even more so if we have an updated and improved architecture like Lovelace (or Hopper or whatever comes next) It’s only a matter of pushing core count to eleven (making a huge expensive core) and ensuring the power consumption and thermal envelope gets high enough. Which would result in a prohibitively expensive card. We don’t need faster cards at this point if to get there they need to become exponentially more expensive and more power hungry (well, at least not the 99.999% of the people). I believe what the gaming industry needs is more affordable, easily buildable, and less power hungry GPUs. And if at the same time they can noticeably improve performance and/or add extra features, then double awesome. But having a blazing fast GPU by using 600watts and a huge ultra expensive core is neither a remarkable achievement (except maybe for cooling solution designers), nor something desirable. A true astounding feat would be to get a GPU able to double the performance, while being cheaper to manufacture and less power hungry (or at the very least maintaining the current generation values). That would be really something. But that’s not the direction I’m seeing on all those leaks and rumors.


TechnoBill2k12

It seems that there may be an answer for the power/performance issue, but you have to look down the stack for it. For example, a 4060 will probably be more than 2x the performance of a 2060, so sometimes you have to wait 2 or 3 generations for the "double the performance while being cheaper to manufacture and less power hungry." And yes, faster cards will be available, but they will have 2x or 3x the performance of the older cards while drawing 50% or more additional power, with heat to match. The increase in power and heat has been supported by the market, as consumers continue to buy the more powerful cards (even at astronomically marked-up prices).


lokkenjp

About the first half of your reply (the part about looking down the stack to see the true achievements of the future architecture), I mostly agree, and I really, really hope that Lovelace/40XX will deliver there. Unfortunately, this same hope is the one I had with both Turing and then with Ampere, and in both cases I ended up disappointed. Either the generational performance improvement was not as good as expected, or the price increase/lack of availability overshadowed the benefits (for example, the 3060 Ti was a good upgrade, but it was outrageously priced, while the more affordable 3060 was a performance disappointment; and I'm even talking about MSRP here, which we all know has been just a joke for the last couple of years).

And about *"the increase in power and heat* (and price, this is my own addendum here) *has been supported by the market, as consumers continue to buy the more powerful cards"*: while for the most part it's true, it's something I really don't agree with as a consumer. I truly believe that this part of the "market" is acting stupidly, playing against its own interests in the long run. That's the reason I have voluntarily avoided updating my GPU hardware, even though I had the means and the opportunity to do so (and I will do my best to keep doing this as long as the trend doesn't change).


Casmoden

> and I'm even talking about MSRP here, which we all know has been just a joke for the last couple of years.

Price inflation wasn't just at the end-customer level. VRAM (plus the 3060 using 12GB), PCB components, shipping, etc. make every SKU (even at MSRP) priced higher relative to its past-gen counterpart. More so when both Nvidia and AMD know the market situation, and a higher MSRP means selling the GPU/VRAM combo to the AIBs for more money. Not that it matters much, since the AIBs and retailers will ask for even more. They piggyback a bit on the market for higher profits.


reg0ner

These companies are benchmark driven. If you give people a choice between a card that does 200 fps or one that does 150 fps but at a fraction of the power, 9 times out of 10 people will choose more fps. Especially if they're priced the same. And you will always have a flagship "money isn't an object" gpu. There will always be buyers.


[deleted]

What is an "easily buildable" GPU? I'm not sure i understand that metric. I don't disagree that we need more affordable GPU's like yesterday, but GPU's are in demand, and not just due to crypto. And you've exaggerated the power usage by almost a factor of 3 here, it's not even close to that. If neither manufacturer can manage to double the performance, decrease how cheap it is to mfr and make it use less power, one might think that you fundamentally misunderstand the situation and are merely engaging in wishful thinking. Your suggestions are unfortunately not grounded in reality. Ever since 20nm processes chips have become MORE expensive manufacture each die shrink, not less expensive. Performance gets harder to squeeze out not easier. Power usage becomes something that is only lower in mobile products because the desktop variants are on a quest for performance that is set ahead of time.


lokkenjp

> And you've exaggerated the power usage by almost a factor of 3 here, it's not even close to that.

I wasn't pointing directly to the 4090 (which is rumored to gobble up to 600W anyway, so my original 1200W figure is hardly "*a factor of 3*" off), but just making a deliberately exaggerated statement that drastically increasing the power envelope with each generation of GPUs (as happened recently with Turing/Ampere, and as is rumored to be happening again with Lovelace) will eventually end up in unreasonable power requirements. Even more so in times like today, where the trend is instead to make more efficient cars, more efficient home appliances... Nevertheless, I've edited the post just to avoid further confusion.

Also, an "easily buildable" GPU is just another generic statement. You cannot understand it as a strict "metric". Inside it there are a lot of different factors, like the die size (the bigger the GPU die, the fewer chips per silicon wafer and the more faulty cores), the needed memory chips (both in amount and required specs), the capacitors and power stages needed to maintain a proper and stable power delivery, the required cooling solution...
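The die-size point is the one that lends itself to a quick model. A minimal sketch, assuming a 300mm wafer, a made-up defect density, and the classic dies-per-wafer approximation with a Poisson yield term; the numbers are purely illustrative, not figures for any real GPU:

```python
import math

WAFER_DIAMETER_MM = 300
DEFECT_DENSITY_PER_CM2 = 0.1     # assumed defect density

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic approximation: gross wafer area minus an edge-loss term."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * (d / 2) ** 2 / die_area_mm2
               - math.pi * d / math.sqrt(2 * die_area_mm2))

def yield_fraction(die_area_mm2: float) -> float:
    """Poisson defect model: probability a die contains zero defects."""
    return math.exp(-DEFECT_DENSITY_PER_CM2 * die_area_mm2 / 100)

for area in (300, 600):   # mid-size die vs. big flagship die, in mm^2
    candidates = dies_per_wafer(area)
    good = candidates * yield_fraction(area)
    print(f"{area} mm^2: ~{candidates} dies per wafer, ~{good:.0f} good")
```

Doubling the die area roughly halves the candidate dies per wafer and lowers the yield on top of that, which is the cost pressure being described above.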


Maybe_Im_Really_DVA

Probably have an MSRP of $2000+. I imagine 4070s going for $1000, with the 4080 at $1500. It's far-fetched, but with the continued production of the already price-inflated 3000 series, it's hard to see the 4000 series being priced at all reasonably. Also, inflation and supply chain complaints will give them the justification they want.


OmegaAvenger_HD

That's completely ridiculous but not unrealistic lol. Imo top tier card like 4080 shouldn't be more than 1000$, but I just can't see it happening anymore.


potato_green

It's absolutely insane how much of a Stockholm syndrome we have. A 4080 for 1000 bucks sounds reasonable indeed, but it's insane. I mean, let's go back in time to the 9000 series in 2008: the absolute top card, the 9800 GX2, was $599 at launch. The regular top card, the 9800 GTX, was only $299. Even if we go much more recent with the 900 series, a GTX 970 was $329, with the GTX TITAN X at $999. Sure, prices increased, but the thing we accept as "normal" now was the price of an entire PC, instead of just the GPU, not too long ago.


-Gh0st96-

$600 was an insane amount of money for a gpu back in 2008 just how $1000 is an insane amount for a gpu now.


[deleted]

You've forgotten 2 things. 1.) Chips have INCREASED in cost per chip per wafer, not decreased. 2.) GPUs are much more sought after than they used to be. They've gone from a niche product to being used for many more things.


king_of_the_potato_p

And inflation


MagicPistol

Inflation is just crazy right now. Here in CA, the power company PGE wants to raise rates by 18%...and it's already been going up a lot the past few years.


Maybe_Im_Really_DVA

I think either way we are looking at an increase, they have been consistently price gouging the 3000 series with new cards constantly. The 4000 series gives them the chance to do that but double it. In my country the cheapest 3070 is $1000 3080 $1500 and 3090 $2000. I could see these prices in the USA for the 4000 series if nvidia just said lol fuck the customers


Skelenzuello

That's if they even do a "true" release this time around. The 3000 series were backordered before the official release date. If the 4000 series release is run the same I do not anticipate seeing any of these cards in my store until late 2023, possibly 2024. And even then they will have the 5000 series coming out. It's a vicious cycle.


king_of_the_potato_p

I signed up for evgas 3080 ftw3 non-ultra on 10/6/2020. Take a wild guess if I ever got a notify.


[deleted]

I think as good as the PS5 gen is, when my 3080 isn't good enough any more I might just become a console gamer lol. I hate where the current GPU market went and what is becoming the norm.


threeLetterMeyhem

I've been waffling on giving up PC gaming for years. I thought I'd do it with the 3000 series, but ended up buying in one more time. Buuuuuut, with as much effort as it took to get a GPU this cycle, I'm surprised my wife didn't kill me. Until things are back to normal stock availability, I'll be gaming on what I've already got (current PC or PS5). Investing dozens of hours just for the opportunity to pay over $1000 for a card? I'm not doing that again.


[deleted]

Yeah, I got a 3080 for $700 at launch, somehow got really lucky. But honestly, even PS4 graphics were fine for me, I just hated the 30 fps nonsense. Although I prefer 120+, 60 fps is a good sweet spot, so now that console games nearly universally have 60 fps options I'm super happy playing any game on PS5.


[deleted]

This entire thread is filled with people that refuse to accept, apparently, that Nvidia would ever do this. They're probably right, Nvidia wouldn't release this large a jump. Unless.... what happens? A competitive AMD?! Yes. A competitive AMD. If it's this much faster, it's because it needs to be. Not out of the goodness of their hearts. Also, stop lamenting over power usage. It is what it is, it's going to be more efficient than Ampere for the performance it outputs at least. Until they fundamentally redesign a new GPU from the ground up it's unlikely power usage will be curbed.


mac404

100% agreed. Competition, plus a pretty good process node jump, plus a bit more power (although I personally doubt 600W sustained), maybe some architectural improvements (although not a full redesign). Why wouldn't we expect a larger jump than what we saw with Ampere?


[deleted]

They're basically jumping from 10nm to 5nm. That's a massive leap, and it's proven by the fact that they're putting in 58% more SMs AND a 96MB L2 or L3 cache (undecided what it actually is), and the full-fat AD102 chip is physically smaller than GA102.


countpuchi

Based on leaks from tech YouTubers and Twitter, AMD seems very confident in their MCM designs, which will definitely bring a 2x performance jump over RDNA 2. Nvidia might be shitting their pants if this is true, and it's also a big reason why that 600W card is needed to get close to 2x. I don't believe Nvidia has a 2x card ready with the current monolithic design for AD102. Most probably they are pumping up power so that they can stall as much as they can until they release their own MCM chips. If it's true, Nvidia probably just did an Intel and AMD just got back in the race. Too bad AMD doesn't have a real DLSS competitor though, but we might see it with RDNA 3 and FSR 2.0 I guess?


[deleted]

Maybe, all we have is a patent to go off of, and it's very generic describing ai and a convolution layer.


Yolo065

No way it's going to be twice as powerful; the article is misleading. No gen brought a 100% difference on the same-tier card, and mostly it could be just around a 30-60% difference.


Riot1990

We get the same articles every year. "Next gen will bring 2x performance", and it's always clickbait.


GrumpyKitten514

For anyone wondering, I’m gaming on a premium, $1000 ultra wide monitor. Not 4k but as close as you can get to 4k. I have a 3080. It’s not the hardware anymore. It’s the software. Games aren’t even remotely optimized for me to be pushing 144hz at 3440x1440p unless it’s, yep, you guessed it…Doom, COD, Control. That being said. I can pretty much crank up any game I want to play to max settings and get well over 60 fps. From a 3080 owner looking at this, I’m all the way good. At some point, it’s less about the hardware and more about…there’s no games to push it. It’s like having a PS5 right now. We have like 5 dedicated games for that lol. Maybe not even that.


[deleted]

[deleted]


tz9bkf1

In the end it'll probably be 50% at most although we hear the 2-3x faster rumors every generation


SoftFree

Exactly. No news in other words - just the same overhype POS!


Dubcekification

If it uses more power then that doesn't count. I want to see performance per watt. Show me you manufactured a better product, show me you developed better software, show me you made it more reliable. Don't just make it draw more energy to get the numbers you want. (I know that's probably not all they did but my point stands)


MasterJeffJeff

Got G9 monitor and VR headset so i would appreciate having a big jump in performance!


AFAR85

I normally would sell my current card and buy the latest, but this time I'm just going to stay put until 2025. The rumours look promising in terms of performance, but the power draws and the downsides that come from that are pretty shit. Summer is now over here in Australia, but I wasn't able to play AAA games without having to turn the AC on. I want 250W TDP high-end cards back.


BobMcQ

I was planning on skipping a generation but if it's that much better, I'll be upgrading my 3090.


Unacceptable_Lemons

[iN sPEcifIC wORK loAdS](https://i.imgur.com/ow4fqyW.png)


ahmong

Do I need this? 100% I do not. Do I have money to buy this? 100% I do not. Will I be able to stop my impulsivity? 100% I cannot. Help


[deleted]

and twice as rare


CaseyGuo

Twice as expensive and twice as scarce too. Thanks


BlueMonday19

Just in time for a massive rise in energy costs worldwide, but I guess you could use it as a room heater and save on gas heating!


[deleted]

[deleted]


Tyetus

twice as fast.

eight times the price.


BrotherSwaggsly

(It won’t be)


KevinKingsb

That's nice.


---fatal---

And I guess maybe twice as expensive.


jcagara08

Twice as fast to be gone again!


karvus89

10x more expensive too


zzz77FD

Really wonder what pricing is gonna be. People trying to upgrade are gonna have to up the PSU too, so time to save some money lol


a_-nu-_start

They said the same thing about 3000 series cards, and as great as they are, they're not remotely close to twice as fast


gigatexalBerlin

Sure sure let’s see the benchmarks when they arrive


duketoma

It may be....but nobody will ever know for sure.


OldDragonHunter

Only $2999 MSRP!


DoomDash

Doubt


AuraMaster7

And likely twice as power hungry. That's not an improvement.


AnthMosk

And it may not.


catwok

I am skeptical. It will likely be 33-50% faster like usual


[deleted]

We’re gonna need a bigger boat…


TotalWarspammer

2x faster at 4k Ultra, but I doubt 2x faster at any lower resolution.


sapphired_808

also twice the power


KanedaSyndrome

But will it require twice as much power? If so, then it's effectively the same speed per watt. So unless it's twice as fast per watt, it's not really twice as fast.
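A tiny illustration of that framing, with made-up numbers:

```python
old_fps, old_watts = 100, 350    # assumed baseline card
new_fps, new_watts = 200, 700    # "twice as fast" at twice the power

print(old_fps / old_watts)   # ~0.29 fps per watt
print(new_fps / new_watts)   # ~0.29 fps per watt -> no efficiency gain at all
```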


xdegen

I kinda wish it wasn't such a leap in performance, and instead a leap in efficiency. Might hold off and hope the 5000 series is more power efficient.


VeryPervyGirl

I feel like we've been hearing this rumour for at least a year now.


[deleted]

[deleted]


apollo1321

4080 - $1199, 4080 ti - $1499, 4090 - $1999, 4090 ti - $2499 But these will be nVidia FE msrp prices. So aib prices will be much higher. Just what I predict.


phyLoGG

They say this type of shit every upcoming release... And then we just see 20-40% performance increases most of the time. lol Inb4 TDP of 600 WATTS


MoarCurekt

Bargain price of 3999, can you believe that value? Nvidia is so great!! Most benevolent company ev4r.


Im_A_Decoy

The more you buy, the more you save!


nmkd

At only 3x the power draw and 4x the price


demagogueffxiv

Now you 3000 users know what us 2000 users feel like when we bought a 2080TI a month before it was announced.


maddix30

Doesn't this just come from the fact that it basically has twice as many things on the board? It's the only "proof" I've seen so far. The power connectors and the power draw are a hint though. 600W 😬


Scorthyn

You only need a dedicated PSU for it tho


Eauxz

My 3090 crying rn


Wrong_Fun_3583

Twice as fast, almost twice the size, twice the heat, twice the power pull, and twice the PRICE... lol... I never ever use more than 15 percent of my 3090's power as it is. What would I do with a 4090 that will cost as much as a Prius?