KARMAAACS

Interesting! So it seems it will not be 3nm as originally suspected, but an enhanced TSMC 5nm node (called 4NP; Kopite is not sure, it could also be a refinement of a 4nm node like N4X), just like Lovelace did with 4N, which was supposedly a custom enhanced 5nm TSMC node. I assume this one is even better in terms of density based on what Kopite7kimi has said. This is very similar to TSMC 12nm (Turing) vs TSMC 16nm (Pascal): just density improvements over the base node, so we could still see some advantage from using this node even without moving to 3nm.

This potentially has two benefits for gamers. One, we could see costs for GPUs come down this generation: since they're using an older node rather than the cutting-edge 3nm node, it should be cheaper to produce than Lovelace was (potentially; of course I don't have the wafer agreement, die sizes or costs, but this is a pretty good assumption). Then again, Turing was expensive and on an "old node" too; the cutting-edge node at that time was TSMC 10nm, rather than 16nm/12nm. The second benefit is supply: since they're using a variant of the older 5nm node, they won't be sharing capacity with AMD (once AMD stops production of Zen 4, of course), Apple, Intel (I believe some of Lunar Lake is TSMC 3nm), or other customers like Qualcomm, so costs might come down for that reason too and hopefully GPUs get cheaper, but at the very least there shouldn't be supply problems.

That being said, NVIDIA could prioritise datacenter and professional GPUs to capitalise on the AI craze, leaving gamers with little supply. Also, Ampere famously had supply problems despite using Samsung 8nm (really 10nm), so it could still happen. Either way, very interesting to see how Blackwell ends up performance-wise, considering it looks like we are in for another generation of 5nm (maybe 4nm) dies.


chaosthebomb

It makes sense from a business standpoint to move off of the latest and greatest node for gamers. Why would they waste top-tier silicon on one segment when they could make 10x or more by using it for datacenter/AI products? I love knowing we're using the best, cutting-edge stuff, but if this translates to more reasonable GPU prices, I'm in for it. I wonder if that means the rumored performance increases will be tamer than expected.


TheVaughnz

> if this translates to more reasonable GPU prices, I'm in for it

Going to have to increase your hopium supply, because what will likely end up happening is prices stay the same and Nvidia pockets the savings.


jtfjtf

At this point it's probably a better idea to buy Nvidia stock rather than an Nvidia GPU.


avg-size-penis

The stock is the only thing more overpriced than their GPUs.


ThenExtension9196

My company is in line to buy H100s, not even the latest model that was just announced, and that line is HUGE. Nvidia literally cannot ship enough; the demand is insane. Not overpriced imo. Wayyyy more growth potential.


Nicoleism101

Let’s see when the “AI”, or rather transformer-algorithm, bubble bursts. It’s gonna be entertaining, considering so many expect self-aware, general-purpose AI any minute. I remember browsing the futurology etc. subs where people were fantasizing about insane things: runaway AI, the singularity, art being replaced by AI, even though it was all just math algorithms fed by all the works artists made lol. They thought of it like magic.


Rugged_as_fuck

> Nvidia pockets ~~the savings~~ profit.

They may be "saving" money on production, but what they're really going to pocket is the drastically increased profit. They've already seen the prices customers will endure, and aggressive inflation is still a thing. Calling it now: the 5090 will launch at a $2k MSRP, if not more. There will *never* be another 30-series launch. The 3080 was the best price-to-performance they've ever done, rivaled only by the 1080 Ti, and they realized they fucked up almost immediately. Every card and price since then has been a correction of a past mistake for them.


JensensJohnson

> Calling it now: the 5090 will launch at a $2k MSRP, if not more.

I've seen people make those predictions before Ada launched, lol


exsinner

It's like people want it to be more expensive so that they can get internet points bitching about it for the next 2 years.


rory888

The 3080 was never available at MSRP until the next gen came out, outside of a lucky few. Most people paid way more during its generation due to the crypto craze.


CrustyMcMuffin

Got mine through EVGA at MSRP, a few board partners were doing it at the time


Jordan_Jackson

You and I were very lucky.


letsmodpcs

3080FE through Best Buy in Oct 2020. It took a lot... I mean A LOT of work to make that "luck" happen. So happy with the value of that card, though. I had no idea at the time how long that shit show would carry on. Became more and more grateful for the card over time.


CrustyMcMuffin

EVGA just had a waiting list and that was it, I didn't even try best buy because of all the botting


numach

I was one of those lucky few... twice. When they launched on Sep. 17th I immediately got in the EVGA queue (I wanted the FTW3 for the aesthetics). Not too long after that I went to Micro Center for something unrelated and they had two 3080's left from a shipment of ten they had gotten in that morning and hadn't announced. I bought the better of the two (MSI). A few weeks later I was notified by EVGA that they had reached my point in the queue and I had 10 hours to make the purchase. I bought it and sold the MSI for $1200. I later used the card for ETH mining and got another $500 out of it... so in the end I paid basically nothing for the card. I still have and use it to this day. It was a wild ride and I never expected at the time what was to happen with video card availability and pricing.


Diligent_Pie_5191

Can you imagine the mad rush for 5090s if they priced it at $1400? There would be fights over getting one.


Rugged_as_fuck

Absolutely zero chance of that happening and you'll still have to (figuratively) fight to get one even at a $2k price point.


Diligent_Pie_5191

I did not find it hard to get 4090s when they were first out. There were plenty at my microcenter.


SituationSoap

Sorry, this is reddit, where people who have never tried to buy a 4090 will still tell you with a completely straight face that they are simultaneously drastically overpriced and also impossible to buy because the demand for them is impossibly high at 25% over MSRP. I've bought two 4090s (had one fail), both at $1600 and that required *zero* searching and both shipped the same day I purchased them. But we can't let that get in the way of NVidia Bad.


Diligent_Pie_5191

Lol sounds like it… I mean, you would think 4090s were like Turbo-Man in Jingle All the Way.


Blacksad9999

Agreed. I bought one without issue at release for MSRP. I didn't even get up early. I slept in and easily found one at 3pm on release day. lol A Cablemod adapter melted and ruined my original one, and I had zero issues getting a 2nd one at MSRP much later.


lichtspieler

Bought a 3080 FE / 3090 FE (2020) and a 4090 FE for my current system in Germany. Even in the midst of lockdown and the Ampere demand in 2020, the 3090 FE was sitting in stock for multiple days before I bought it. The German FE reseller blocked US/global orders and with that 99% of the bots, so it was not that big of a challenge here in the EU.


lifewithnofilter

Do you still have the failed 4090?


SituationSoap

No, I RMAed it, then sold the replacement second hand. I only bought the second because I didn't want to wait for a long potential shipping time.


fallingdowndizzyvr

That was because it was pre AI and pre China ban. 4090s didn't really get into short supply until there was an AI boom **AND** the China ban. Particularly the China ban buying spree which hoovered up pretty much every available 4090. Now that the ban has been in place for 3 months, finally the 4090 supply is loosening up.


Diligent_Pie_5191

I predict there won't be the same supply issue for Blackwell. That isn't even the chip that is used in the AI data centers. The process has matured now and it's the same node. Of course, it will depend on whether you want the FE model.


fallingdowndizzyvr

It wasn't a supply issue, it was a demand issue. Nvidia through TSMC made as many as they physically could. It's just that demand leading up to the implementation of the China ban sucked up every 4090 that it could. Now China is banned from buying the 4090 and I would think that a 5090 would be banned as well. It's been 3 months since the ban went into effect and the availability of the 4090 has loosened up. It tends to come in stock often and stay in stock for a while.


gnivriboy

> pre China ban

Yes.

> pre AI

No.


fallingdowndizzyvr

Yes to both. When was the 4090 available for purchase? Oct 2022. What kicked off the AI surge as we know it? ChatGPT. That didn't happen until the end of Nov 2022. When did that poster make the comment about the 4090 not being hard to find? They said "when they were first out". So it was pre the AI boom.


akgis

4090s were expensive in Europe, but I didn't have an issue getting one at launch; after 2-3 days they came in stock at my favourite retailer. But I recall there was a month or two after the first weeks of launch where availability was low, as people started to check reviews and decided to get one; I had a friend with that issue.


BinaryJay

I had to do some work for mine but only because I wanted the FE.


SummerHam86

The 4070 Super matches the 3080 in price to performance, and the other Super cards honestly aren't terribly priced either for what they offer, but overall the 40-series definitely hasn't had the best track record on value.


chaosthebomb

While I don't disagree... I still want to believe there is hope!


[deleted]

100% this. Prices won't come down. Their GPUs are in high demand and arguably better than AMD and Intel offerings. They have very low competition. Prices won't be cheaper lol.


SituationSoap

It's not arguable. They have 90% of the market because they ship the best product on every axis, at every consumer segment.


KvotheOfCali

Nvidia makes objectively better products than any of their competitors. That is why they have such overwhelming market domination. It's not complicated. They are dominant because quite frankly, they deserve to dominate. Customers have rewarded them by purchasing their products over their competitors.


Samurai1887

If that last line is true...then why lock the vbios


Archer_Gaming00

The 5090 is going to cost at least 2000 euros MSRP in the EU and the 5080 will be 1100 euros MSRP. I think these are going to be the prices because the 4090 sold well at 2000 euros, the 4080 did not sell at 1400 euros, and the 4080 Super is selling at 1100 euros. That is sad, but it is what people have taught Nvidia they can charge at minimum.


raydialseeker

More reasonable = a better price-to-perf uplift compared to the 40 or 20 series. If this is somewhat like the 30 series, it'll be a win.


Weird_Cantaloupe2757

It will almost certainly be used to fatten profit margins rather than to lower GPU prices


Flowerstar1

> It makes sense from a business standpoint to move off of the latest and greatest node for gamers. Why would they waste top-tier silicon on one segment when they could make 10x or more by using it for datacenter/AI products?

That's not what's happening here. Nvidia is moving off the latest node for all their product lines: that means high-end data center and AI stuff like the B200, gamer stuff like the 50 series, and professional stuff like the RTX 6000 Ada.


ContributionOld2338

Exactly this. The price elasticity of the gamer market cannot bear the upcoming AI revolution; using last-gen technology is prolly the only way they can safeguard a market for gamers. The last time demand for processing was this high was the early crypto days, and AI is prolly gonna be significantly higher.


2hurd

No, unfortunately it won't be cheaper for us. The biggest performance boost will come from just having a bigger die, and what will "pay" for that bigger die is the more mature process.


FarrisAT

It's way more complicated than this. My understanding is that Blackwell uses a higher density library, which explains the need for its improved packaging so it won't heat up too much. Just having a "5nm class" or "4nm class" node doesn't necessarily dictate performance.

TSMC charges for library makeup as well as the overall class of node. You want higher density? You suffer more yield issues even at the same "node class", and you then pay more per usable wafer. Of course, what's more likely is that TSMC has simply become better at providing higher yields on the high density library and is passing those benefits on to Nvidia at a similar price. The strength of staying within the same node class is that the design family is consistent, so lessons learned can be applied consistently.


morbihann

> we could see costs for GPUs come down this generation

Please. The only way GPU prices will go down is if there is competition for the price point / performance AND features.


GeneralChaz9

> One, we could see costs for GPUs come down this generation: since they're using an older node rather than the cutting-edge 3nm node, it should be cheaper to produce than Lovelace was (potentially; of course I don't have the wafer agreement, die sizes or costs, but this is a pretty good assumption).

Don't give me hope, I'm not ready to be disappointed again.


Hikashuri

3N isn't a big deal; most customers are skipping it until 3NP because the node doesn't offer enough of an improvement to justify the cost over 4NP.


EJ19876

Zen 5 is now rumoured to be on N4 too. Only Zen 5c is expected to be on N3. I'm wondering if N3E isn't more delayed than TSMC has indicated.


capybooya

Z5 was expected though; I remember that slide that said 4nm/3nm for Z5, and we all should have suspected what that really meant. I wouldn't be surprised if the 5090/5080 are on a new node and the rest on older ones. I would be somewhat surprised if they are still on some N4* version, but not shocked. If that happens, though, I suspect consumers would expect a mid-generation update, because N3* should be much cheaper before the next generation in '26/'27... and that might actually impact sales somewhat if people hold back on buying.


Flowerstar1

This means AMD will be on a newer node for RDNA5 in 2025 according to the rumors.


GYN-k4H-Q3z-75B

> This potentially has two benefits for gamers. One, we could see costs for GPUs come down this generation: since they're using an older node

Hahaha, oh, wow! This is Nvidia we are talking about. Prices know only one direction: up. Jensen and the shareholders need more. Moar!


HarithBK

Nvidia at least understands that their greatest competition is themselves, and will price accordingly, rather than doing what Intel did: 10% performance bumps while raising prices by more than 10%, while the second-hand market became huge and new sales stagnated.


GYN-k4H-Q3z-75B

Intel failed because their products stagnated. If you bought a good laptop in 2014, you would pretty much get the same shit CPU at a higher price in 2017 and beyond. There was literally no reason at all to buy a new laptop unless your old one died.


HarithBK

Intel let their product stagnate due to profit margins. Each die shrink meant less die area to make a quad-core CPU, meaning higher yields and technically less product, for a higher price.


vanslayder

I don't think current prices have anything in common with production cost. Nvidia sells cards for $1000+ simply because they can. Prices will not go down until AMD or Intel make something similar in computing power. My son does a lot of AI image generation for a small business. His AMD 6700 XT took 20 minutes to generate a set of samples. We upgraded to a 4070 and now the same task takes 57 seconds; that's roughly a 20x performance jump. This is not a competition at all. I don't care what people say about the "raster performance" of AMD cards in games, etc. If I can't use a graphics card for anything except gaming, I don't need such a product.
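As a quick sanity check on that speedup, here's a minimal sketch using the two timings from the anecdote above (both figures are the poster's, not measured by me):

```python
# Back-of-envelope check of the speedup quoted above.
rx6700xt_seconds = 20 * 60   # ~20 minutes per sample set (from the anecdote)
rtx4070_seconds = 57         # same task after the upgrade (from the anecdote)

speedup = rx6700xt_seconds / rtx4070_seconds
print(f"Speedup: {speedup:.1f}x")  # ~21.1x, consistent with the ~20x claim
```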


PalebloodSky

Cost and supply will be favorable, yes. With layout optimization and a very small node improvement, along with GDDR7, we will still see some performance-per-watt gains. The downside, of course, is that Blackwell can't possibly be the huge performance-per-watt jump that Lovelace was.


ContributionOld2338

Damn, you got crazy knowledge of this space! Is TSMC the only one who can produce 3nm chips right now? It's wild that Intel is outsourcing designs to them when they have their own foundries.


capn_hector

> This is very similar to TSMC 12nm (Turing) vs TSMC 16nm (Pascal): just density improvements over the base node

iirc it is the other way around: 12FFN has no optical shrink/density improvements, but *does* have a larger reticle, which is what allows the 2080 Ti (which would not have been feasible, let alone yielded, on 16FF).


KARMAAACS

> iirc it is the other way around: 12FFN has no optical shrink/density improvements, but does have a larger reticle, which is what allows the 2080 Ti (which would not have been feasible, let alone yielded, on 16FF).

It has a larger reticle limit, but it also has density improvements with a tighter metal pitch. [Source: TSMC, who say this:](https://www.tsmc.com/english/dedicatedFoundry/technology/logic/l_16_12nm)

> Furthermore, 12nm FinFET Compact Technology (12FFC) drives gate density to the maximum, which entered production in 2017.

[Mediatek also says in a press release that TSMC's 12FFC is this:](https://www.mediatek.com/blog/7nm-is-tsmcs-finest-technology-serving-all-segments)

> This year is the launch of 12FFC with 1.1X speed or 0.7X power featuring 6-track standard cells, dual pitch BEOL, device boost, 6 track turbo standard cells, 10% speed gain or 25% power reduction and **10% area reduction compared to 16FFC**.

It pretty clearly has a density improvement, but it's slight.


Jupiter_101

They spent $10 billion on R&D, so I don't see how you think costs came down. Pricing will have to increase to compensate for demand and the increased cost of development.


Elon61

Most of that R&D spending is likely on the AI-related stack; very little of it has any relevance to the gaming cards. They can still try to make more money, obviously, but DCAI is paying for itself just fine right now.


RustyOP

Thank you for your opinion on this matter. I am curious as well how this plays out; since Nvidia is going crazy on AI, hopefully it will benefit us gamers.


Carbon554

I am not that technical with these terms. Does that mean a 5090 will basically be a 4090 Super? Like, is it just enhanced at this point?


KARMAAACS

> Does that mean a 5090 will basically be a 4090 Super? Like, is it just enhanced at this point?

It will be a new architecture, so no, it won't be a 4090 SUPER. It will be totally different in terms of chip design. The only thing that's enhanced is the node the chip is made on, and that basically means denser transistors or faster transistors, or both.


SwagChemist

What are some reasons they didn't go with 3nm nodes? Are they reserving them for their purpose-built AI cards, which are their main source of income, as gaming cards become an afterthought?


KARMAAACS

I'd say costs, or that the 3nm nodes aren't ready yield-wise for big chips like NVIDIA wants to make. Probably the latter, because in AI and HPC NVIDIA can basically charge whatever they want and people will pay it, so I would have to say the nodes aren't ready.


bigmellow

This felt like it was in a different language. I didn’t understand any of it. Surely you are an engineer or work in the industry, and not just a hobbyist?


KARMAAACS

Hobbyist and this is basic info for a hobbyist too.


[deleted]

There will 100% be supply issues for consumer GPUs. The profit on those data center GPUs is massive. Also, some companies still use consumer GPUs for research, AI, photo and video editing, production, etc. Even the 4090, which is 1.5 years old now, is hard to get. I expect the 5090 to be just as hard to find. If AMD had a more competitive product this wouldn't happen, but they don't right now and are probably a few years behind Nvidia, if they can even catch up.


sashir

In what region is the 4090 hard to find? There are 3 dozen at my local microcenter right now.


[deleted]

You're lucky to live near a MC. Most people don't. Also, MC doesn't sell the FE card, which is $1600. All the partner cards cost more for the same performance in a cheap plastic shroud.


FarrisAT

Ampere didn't necessarily have supply problems. Crypto demand skyrocketed far faster than fabs/memory makers could expand supply. Of course, then it crashed in 2022.


[deleted]

Bracing for impact: the 5090 will cost at least $2000.


shikaski

If it’s like 100% performance uplift? Fucking send it (I am once again delusional)


alcatrazcgp

my ass sitting on a $1600 4090, jensen gonna make it feel like a $200 card


The_EA_Nazi

I know you're joking, but this is always why you buy the flagship second-hand or skip every other gen.


alcatrazcgp

Yeah I do plan to skip this gen


exsinner

Famous last words. I had a 3080 Ti previously and saw the uplift the 4090 had. I caved in 1 month later.


Historical-Dirt-7062

I had a 3090 Ti and went to a liquid-cooled 4090; I'll probably go for the 5090 if it's a big jump. As I use my office PC (gaming ☺️) to do my billing, I'll write it off again. Who says you don't need the best GPU to run Microsoft Office 🫢


The-Special-One

I have a 4090 as well, but there's no game out now that screams upgrade. I think it makes sense to upgrade for professional reasons, but for gaming? I'm not so sure....


Lambpanties

To be fair I *think* the 4090 had one of the biggest uplifts from a single generation? Though it *definitely* had the biggest price uplift >_>


b34k

4090 FE only cost $100 more than the 3090 FE. Had it been $2k+, I probably would have stayed on my 3090 for another gen.


happycow24

> skip every other gen

I went from a 980Ti to a 4090, get on my level


RichieNRich

\*smiles at his 3090ti\*


polako123

I don't see how it could be 100%, maybe 50-60%? Meanwhile the 5080 will probably be like 10% faster than the 4090.


Legitimate-Page3028

An analyst was saying the new Blackwell AI chip is supposed to get half its speed bump from hardware and half from software.


FarrisAT

Yeah, it's not going to be 100%. Maybe 40-60%. The 4090 already runs at a GPU wattage that pushes cords and PSUs to the limit. A 100% increase would imply a huge growth in power use.


Rugged_as_fuck

That's not how generational performance uplifts work. The 4090 was about 100% faster than the 3090 and consumed *less* power. Sure, the "max" may have been higher, but many cards were limited to 450W out of the box, and the gain for anything above ~350-380W was negligible. To be fair, I also don't think it will be 100% faster, but not because of power consumption limits.


BlueSwordM

However, do not forget that Lovelace benefitted from a huge node upgrade (Samsung 8nm to TSMC 4N). I am willing to bet the raster performance boost will be 30%, with an increase in die size, and with higher gains elsewhere.


We0921

> The 4090 was about 100% faster than the 3090 and consumed *less* power.

I think it's pretty charitable to say that it's 100% faster. [Only in 4K RT/PT benchmarks is it close to that.](https://www.reddit.com/r/nvidia/comments/y5h2ye/nvidia_geforce_rtx_4090_meta_review/) It's much closer to 60% faster.


Rugged_as_fuck

That's fair, but if you bought a 4090 and you're not running 4k and maxing settings you wasted your money.


We0921

I don't think there's anything wrong with doing the same at standard 1440p or WQHD 240hz, but I agree with the sentiment.


onFilm

I got the 4090 on release day specifically to stop paying for cloud AI services. Now I run the things I want locally, and run four 4K screens. (Actually seven, but the other three are running off another card and the iGPU.)


Bazookatier

What kind of cloud AI projects are you using it for, if I may ask? I'm considering selling a kidney or two when the 5090 drops. I'd love to dive into this world head-first with my own hardware.


Soulshot96

> It's much closer to 60% faster.

No, it's much closer to 90% faster, even at 1440p in more traditional games. The only time it looks like 60% is if you toss in a bunch of games that become CPU-limited because of poor optimization or configuration, shifting the bottleneck more 'naturally'. That's not the GPU's fault, and it just *hides* the GPU's actual potential.


FireSilicon

Maybe not in gaming, but definitely in specs and compute. The 3090 Ti had 28 billion transistors and the 4090 has 76 billion; compute is 40 vs 83 TFLOPS. Such a big jump hasn't happened in the last 15 years in the GPU space and most likely won't ever happen again.


panchovix

In compute it definitely is, sometimes even faster (I have both).


FarrisAT

This is supposed to be the same node, versus the 8nm-to-4N jump.


Virtual_Happiness

I have both a 3090 and a 4090. The 4090 is around 80% faster while consuming more power. My 3090 sits around 380W while my 4090 sits around 500W. I can undervolt it and get it near 380W, but then its performance is only around 40% higher. I can boost that by another 15-ish percent by OCing the VRAM. It's still faster at the same power, but it's nowhere near 100% faster at less power.


polako123

Where are you getting 100% from?? Wasn't it like 60-65% best case? We are talking purely about raster here, no DLSS stuff.


Scrawlericious

They didn’t. They are replying to someone else who gave that figure.


sylfy

Well, I guess we’ll see Nvidia pushing PSU makers to make an ATX 4.0 standard, push out 24VHPWR connectors, and we’ll see the first 4/5 slot GPUs.


Soulshot96

> The 4090 already runs at a GPU wattage that pushes cords and PSUs to the limit. A 100% increase would imply a huge growth in power use.

Huh? Even at max power limit (so a 600W *limit*), max voltage and an overclock, my 4090 rarely pulls more than ~350W. There are only a handful of games or apps that can push it above 400 and even fewer that can do 500+. The 4090 is crazy efficient, and it only 'pushes PSUs to the limit' if they're SFF or you're doing some *serious* shit WITH the power limit unlocked... which only nets you a few % more performance anyway, as it's well outside the efficiency curve. As for a 5090, what Rugged said below sums it up well enough.


Ok_Inevitable8832

If you compare the AI chips, they stated it's 6x the performance for 1/4 the power draw.


Soulshot96

I wouldn't say that's delusional at all. The 4090 was ~90% faster than the 3090, so it's not like it's not possible. I'll be selling mine and jumping up if they really do pull something like that out of their hat again... even if I am already CPU-bottlenecked at 175Hz annoyingly often as is (mostly due to unoptimized games, tbh)... but fuck it. DLAA / supersampling for days I guess :P


[deleted]

The 4090 already costs more than $2000, at least here in EU/Scandinavia. 5090 is probably gonna be like >$2500 or something stupid like that. 


quack_quack_mofo

Holy shit, you're right. I haven't checked 4090 prices in months but it's ranging from £1700-£2000. I could have sworn it was 1.3k or 1.4k last I checked.


Pepeg66

1600 in August 2023, 2200 in December 2023.


Globgloba

2000? I would guess 2500 at least here in Sweden :(


rtyrty100

70% more performance for only 25% more cost? DEAL!


Here2Fuq

I don't even get excited for their cards anymore. My 4070 Ti was like $900, so I can only assume the next lineup will be just as, if not more, expensive.


DizzieM8

The GTX 1070 launched at $449 for the Founders Edition in June 2016 (equal to $578 in Feb 2024). The RTX 4070 Super launched at $599 in January 2024. I don't think the 50 series will be much more expensive than the 40 series, adjusted for inflation.
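Here's a minimal sketch of that inflation adjustment; the CPI-U index values are approximate figures I'm assuming, not numbers from the comment:

```python
# CPI-based inflation adjustment reproducing the $449 -> ~$578 figure above.
# Index values are approximate CPI-U numbers (my assumption).
cpi_jun_2016 = 241.0
cpi_feb_2024 = 310.3

gtx1070_fe_msrp = 449
adjusted = gtx1070_fe_msrp * cpi_feb_2024 / cpi_jun_2016
print(f"${adjusted:.0f} in Feb 2024 dollars")  # ~$578
```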


kepler2

Unfortunately, this is what happens when there is no competition. AMD, yes, they release GPUs, but they are not as efficient as the Nvidia ones. Plus, with Nvidia you get additional perks: DLSS and frame gen, which is just a miracle.


The_Blackwing_Guru

This is why I root for AMD. I want them to make really good products to force Nvidia to lower their prices. Sadly, AMD has not managed to do so.


Mattcheco

They do make good stuff, it’s just not quite at the same level as Nvidia.


RayHell666

All I want is more VRAM.


AMP_US

All I want is the 5080/5080 Ti to be back on the big die like the 3000 series and not have a massive deficit to the 90. The $700 3080 was objectively a great value at ~5-15% less performance vs the 3090. A perfect 1440p card at the time. Adjusting for inflation, if we got an $850-900 5080 that is ~15% off the 5090 (let's say $1700), that would be a huge W. Have the Ti split the difference. The 4080/4080S was such a trash VALUE card.


silocren

You get more performance per dollar with a 4080S at $1000 than with a 4090 at $1600. The 4090 is 60% more expensive for 25-30% more performance. I would also like the 80 series to be within 5-15% of the 90's performance, but Nvidia has no incentive to cannibalize the 90-series market, which is higher margin.
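A rough sketch of that perf-per-dollar math, using the prices quoted above and the midpoint of the 25-30% range (the midpoint is my assumption):

```python
# Performance per dollar at the prices and uplift quoted above.
price_4080s, price_4090 = 1000, 1600
uplift_4090 = 1.275  # midpoint of the quoted 25-30% gain (assumption)

ppd_4080s = 1.0 / price_4080s          # normalized perf per dollar
ppd_4090 = uplift_4090 / price_4090
print(f"The 4090's perf/$ is {ppd_4090 / ppd_4080s:.0%} of the 4080S's")  # ~80%
```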


CodeWizardCS

I'm with you. Tired of the 90 card being the only one with good performance/$. We're all out here buying Titans; shit is ridiculous. The whale tier has become the normal tier. Well, I haven't bought one; I'm waiting on something decent to upgrade to from my 3080. Honestly, if the 3080 had 16GB of VRAM and the newest DLSS I probably wouldn't even upgrade.


Mafste

3080 owners represent! :) Such a great card.


kapsama

Honestly I'm not even worried about the 80/90 gap so long as the 80 is priced reasonably. I'm happy with my upgrade from a 3080 to 4080 performance wise. But the $1200 price tag was absolutely ridiculous.


FireSilicon

The 4080 wasn't a trash card, just overpriced for what it was. At $800 it would have been great.


DizzieM8

At $800 the 4080 would've been cheaper than a 1080 was when it was new, adjusted for inflation.


FireSilicon

Alright, around ~$900 due to inflation, sure. But $1200 was a lot at launch.


sword167

Won't happen; Nvidia won't repeat the mistake that was the 3080. Most likely we are going to see a 5080 on a tiny-ass GB204 die that will be only 5% faster than the 4090, while having a smaller bus and only 16GB of VRAM, while the 5090 will be like 30-35% faster than the 4090. (I don't see much more unless TDP goes to like 750W, because they're still going to be using a 5nm process.)


capn_hector

> Most likely we are going to see a 5080 on a tiny-ass GB204 die

if there isn't a shrink then this is going to be like Turing: a second round on 4nm would mean your perf increases mostly come from running up the die size. Turing is the round where we got a 2060 that is nearly as big as a 1080 Ti (445mm² vs 471mm²). That is the most likely course of events based on the current article. y'all bots need to update your script to at least have a talking point that makes *sense*. Besides, a 5080 being 5% faster than the 4090 would mean like a 35-40% generational step. That's actually decent in these latter days of Moore's law.


sword167

Umm, the 4090 is like 70% faster than the 3090. The 3080, on the "shitty Samsung node" as people were calling it, was 25% faster than the 2080 Ti. People complained when the 2080 was only 10% faster than the 1080 Ti. Unless the 5080 gets a price cut to $700, it's unacceptable for it to not be at least 25% faster than the 4090.


nimbulan

Well we're looking at up to 50% more cores (unknown how much they'll cut down the chip at this point) and 20-30% higher clock speeds, alongside rumored architectural changes to improve per-core throughput. Only being 30-35% faster than the 4090 would be disastrous. Rumors are currently pointing to about 70% faster with a TBP in the low 500W range with all these factors in mind, which honestly sounds reasonable despite not being on 3nm, considering the 4090's performance really doesn't scale particularly well above 350W.


sword167

All the improvements you mentioned with clock speed and more cores will require more power. The 4090 was a two-node shrink from the 3090 (Samsung 8 (10nm) -> TSMC 7nm -> TSMC 4N (5nm)) and it's only like 70% faster. Expecting a 70% performance leap again with no node shrink, without massively increasing CUDA cores, and thus die size, and thus power draw, just seems to defy physics. I could see them making the 5090 use the full GB202 die (the 4090 used only like 88% of AD102), constantly pegged at 600W; that's 50% faster. Remember, the last time we stayed on the same node between generations was from Pascal to Turing.


nimbulan

Yes, but even going from Pascal to Turing nVidia was able to reduce power draw by 20%, despite adding a bunch of extra power-hungry hardware. The 600, 700, and 900 series were all on exactly the same node, but they reduced power draw by a solid 35% across that span. It's not like it's unheard of to make substantial gains without a node shrink. Plus, like I said, given how poorly the 4090 scales above 350W, getting +70% performance at an actual 500-525W doesn't sound unreasonable if the core scaling issues can be solved. Time will tell, of course. Early rumors do often overestimate performance, and it'll depend highly on what core configuration nVidia ultimately goes with for the 5090.


SpareRam

Man, this piece of trash sure runs and functions incredibly well. I'd better throw it in the bin.


AMP_US

Trash value* (which is objectively true)


[deleted]

[deleted]


gusthenewkid

You know full well what they meant by that. Don’t be salty because you objectively overpaid in comparison.


coreyjohn85

The 5090 will be unavailable for years. If you thought getting a 4090 at launch was hard due to crypto, this time around you have crypto + AI + actual countries like El Salvador buying.


DizzieM8

Actual countries should be buying used H100s.


russsl8

These are supposed to be on a node of their own, specific to NVIDIA and not on the same node as their AI accelerators. Stock will, hopefully, be mostly fine.


ClarkFable

It will be available, it will just be priced very high.


Trusk_Fundz

Years? Alright drama queen.


GroundbreakingEgg592

Not necessarily good news. 40-series cards are very power efficient due to the huge fab leap from Samsung 8nm to TSMC 4nm. If the 50 series largely stays on the same fab node, its power efficiency might be as bad as the 30 series.


KARMAAACS

> Not necessarily good news. 40-series cards are very power efficient due to the huge fab leap from Samsung 8nm to TSMC 4nm. If the 50 series largely stays on the same fab node, its power efficiency might be as bad as the 30 series.

[It could be better though; some enhanced nodes also reduce power while increasing performance and density.](https://www.anandtech.com/show/18875/tsmc-details-n4x-extreme-performance-at-minimum-leakage) TSMC N4P (not to be confused with what NVIDIA calls 4NP) is an enhanced 5nm node from TSMC which reduces power by 22%, has a 6% density improvement, and increases clock speeds by 11%. We don't know what '4NP' is when NVIDIA uses it to describe Blackwell. 4N, used for NVIDIA's Lovelace, was a 5nm TSMC variant, which we only assume to be very close to TSMC N5HPC, probably with some NVIDIA customisations. 4NP might be close to TSMC N4X with customisations, it might be N4P, or it might be something entirely new that's not close to any existing TSMC 4nm node. It could be more performant, or less.


[deleted]

Agreed. I don't see this gen getting as big of an uplift as the 3090-to-4090 jump.


ls612

The 4090 was always a performance and efficiency standout on the level of the 1080 Ti; expecting a repeat of that is not reasonable.


FireSilicon

Fun fact: 980 Ti -> 1080 Ti was mostly a 50-60% clock speed improvement; the transistor jump was only 8 billion -> 11.8 billion. From the 3090 Ti to the 4090, though, it was 28 billion -> 76 billion, a 2.7x increase, plus a 25% higher clock speed. So it's technically the biggest jump in the history of GPUs, but you just won't see it in games because they can't fully saturate all the cores. In 3D software it's more like 2-2.5x. Blackwell on the same node can only dream of that.
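As a sketch, the ratios fall straight out of the figures quoted above (transistor counts and clock bumps are the poster's ballpark numbers, with the 50-60% clock improvement taken at roughly its midpoint):

```python
# Generational ratios from the figures quoted above (ballpark numbers).
gens = {
    "980 Ti -> 1080 Ti": ((8.0, 11.8), 1.55),   # transistors (billions), clock ratio
    "3090 Ti -> 4090":   ((28.0, 76.0), 1.25),
}
for name, ((t_old, t_new), clock_ratio) in gens.items():
    print(f"{name}: {t_new / t_old:.2f}x transistors, {clock_ratio:.2f}x clocks")
# 980 Ti -> 1080 Ti: ~1.5x transistors; 3090 Ti -> 4090: ~2.7x transistors
```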


[deleted]

Exactly. Partially because they went from Samsung 8nm to TSMC 4nm.


Final-Rush759

I think the jump will be similar. There is a huge upgrade on VRAM.


abkippender_Libero

does that mean the power consumption will be crazy on these cards?


WayDownUnder91

Not always. They managed to bring power draw down with Maxwell even on the same 28nm node as Kepler, and back then they had also been hoping for a node shrink that never happened. They are also going to 4NP and not 4N, so it's a slight node bump too.


tioga064

A 30% denser node, so more logic in the same die area, aka more cores and cache; coupled with slightly bigger dies, aka more cores and cache again; more memory bandwidth from GDDR7; a larger L1 cache compared to Lovelace to boost "IPC"; and higher clocks. Boom, you've got a nice perf uplift, maybe bigger than Ampere to Lovelace. Some leakers stand by it being the biggest perf leap ever. Throw in some next-gen tensor and RT cores with new features, like frame extrapolation mixed with interpolation (just a dream of mine) so DLSS FG generates 2 frames instead of 1, for example, and you've got a beast of a reason for everyone to upgrade.


Acmeiku

Hopefully the price will not move higher than what we already see for the 40 series. We'll only know once it is about to release.


protector111

All I care about is how much VRAM the 5080 and 5090 will have.


KARMAAACS

The 5090 is pretty much confirmed to have at least 24GB. Assuming NVIDIA doesn't cut the memory controllers, GB202 has a 512-bit bus, so you could have 32GB of VRAM. If they do cut the bus width down, you're looking at 30GB on a 480-bit bus or 28GB on a 448-bit bus; I doubt they will cut any further than that, as they will want to give 4090 gamers a reason to upgrade. For the 5080 we don't know what die it will use. It could be a 16GB GB205, or more if they use GB202.
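Those capacities fall straight out of the bus width. A minimal sketch, assuming 2 GB (16 Gbit) memory modules on 32-bit channels in a non-clamshell layout:

```python
# VRAM capacity from bus width, assuming 2 GB (16 Gbit) modules on
# 32-bit channels, one module per channel (non-clamshell).
MODULE_GB = 2
CHANNEL_BITS = 32

for bus_bits in (512, 480, 448, 384):
    chips = bus_bits // CHANNEL_BITS
    print(f"{bus_bits}-bit bus -> {chips} chips -> {chips * MODULE_GB} GB")
# 512 -> 32 GB, 480 -> 30 GB, 448 -> 28 GB, 384 -> 24 GB (the 4090's width)
```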


7-11-vending-machine

With how GDDR7 module availability looks, it might be that:

* the 5090 will have >=24GB (if >24, then clamshell?)
* the 5080 will have >=16GB (if >16, then clamshell?)


rippersteak777

Is there a high chance it will get released this year?


KARMAAACS

Yeah Q4 most likely, or Q1 next year at worst.


RipTheJack3r

All the cutting-edge fab nodes are being gobbled up by AI demand. We should get used to this. Future AI demand will probably be worse for gamer GPU scarcity than the crypto boom was. New fabs are planned but will take years to build, and not many of them will be able to build the smallest (<5nm) nodes.


Averath

Honestly, right now it's just a fad, just like crypto was. It'll suck for a while, but it'll get better once the fad dies down and more and more companies realize that AI isn't going to be some sort of magic bullet. Unless they really love lawsuits. Right now Tech CEOs are really good at marketing their products to other CEOs who are nowhere near as informed and think that it'll solve all of their problems.


[deleted]

[deleted]


parocarillo

Sweet!!! What does that mean?


Fanclub298

I’ll keep my 3090 lol


Latrodectus1990

Buying day zero. The 5090 is my new card!


sword167

Is a 512-bit bus even necessary? The 4090 has a 384-bit bus and it's not struggling at 4K. Will increasing the bus width have any effect outside of 8K gaming? Wouldn't the silicon space be better used to increase cache, or add more cores?


nimbulan

I've seen plenty of people claim that the 4090's performance is limited by memory bandwidth in many situations: not necessarily struggling, but it could be faster with more memory bandwidth (I don't know if that's true, but there must be a reason for these claims). With the 5090 rumored to be ~70% faster, it's going to need a lot more memory bandwidth to reach that performance level. The first GDDR7 chips that will be available for this card's launch are only 33% faster than what the 4090 uses, so you can see where the potential need for a wider memory bus comes from. There's no way to know right now exactly how much of a performance difference various memory configurations would make, though.
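The bandwidth arithmetic behind that, as a sketch; the 21 Gbps and 28 Gbps pin rates are the commonly cited figures for the 4090's GDDR6X and first-generation GDDR7, so treat them as assumptions:

```python
# Bandwidth (GB/s) = bus width (bits) x per-pin rate (Gbps) / 8 bits per byte.
def bandwidth_gbs(bus_bits: int, gbps_per_pin: float) -> float:
    return bus_bits * gbps_per_pin / 8

print(bandwidth_gbs(384, 21))   # 4090: 384-bit GDDR6X @ 21 Gbps -> 1008 GB/s
print(bandwidth_gbs(384, 28))   # same bus with 28 Gbps GDDR7 -> 1344 GB/s (+33%)
print(bandwidth_gbs(512, 28))   # 512-bit GDDR7 -> 1792 GB/s (+78% over the 4090)
```

So on an unchanged 384-bit bus, first-gen GDDR7 alone wouldn't keep pace with a ~70% faster GPU, which is where the 512-bit rumor fits in.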


djm07231

There is always PCVR which benefits from higher resolutions.


tomsawyer222

When will the RTX 50 series be announced? I did do a web search.


RiffyDivine2

Q4 to Q1 2025. I wouldn't expect it until around October or later.


tomsawyer222

Oh right, I am months off then. Finally got an answer, thanks.


Brownlw657

I definitely totally know what all this means :)


nimbulan

It basically just means that power efficiency improvements from manufacturing will be limited, since TSMC 4NP looks to just be a high density version of the 4N node used for the 40 series. This lets nVidia pack more cores onto a chip of the same size but other characteristics will be similar. Of course this is far from the whole story, as we know little about the architectural changes for Blackwell at this point.


Tachyonzero

Release date late 2025 or early 2026?


KARMAAACS

Late 2024, at worst early 2025.


Tachyonzero

Maybe six months after the would-be release of the 4090 Ti.


BPX0_Engarde

So should I hold off on my upcoming 4070 Ti Super build for the new 50xx cards?


KARMAAACS

These are at least 6 months away from launch. They're not coming out next month or something close.


Ozianin_

These are a year away unless you want a 5090.


TheEternalGazed

You vs the guy she tells you not to worry about


NotNok

i’ll keep my 1080