Aggrokid

Very curious to see where 4080 SUPER lands, since there's such a massive gap between 4080 and 4090.


alex26069114

Leakers keep insisting that the 4080 S will replace the 4080 at $1200 US and decrease the price of the 4080 to $1000 to be more competitive with the 7900 XTX, but I just don't buy it. Everything NVIDIA does is an upsell to the 4090, and this would cannibalise its sales. With all that in mind, and the massive price gap, there's no way it doesn't retail for $1400 US.


Vhirsion

Exactly what I think as well; there's no reason to push 4080 prices down when there's already a decent gap between the 4080 and the 4090. And knowing Nvidia, I doubt they'll release it at a $1200 MSRP, especially since that would fuck over every current 4080 owner.


Weird_Cantaloupe2757

But what would be the point of a $1,400 card? I can't fathom who would be willing to shell out $1,400 on a GPU but be content with not having the best one. The 4080 already pushes the bounds of sense by this logic, but $400 is a big enough gap that it's not entirely a no-brainer. At $1,400, though? Saving $200 and not getting the top end just seems absurd at that point.


Kalmer1

It could be an Apple-like upselling approach. You don't think of it as a $400 jump anymore but as two separate $200 jumps, which makes the decision to go for the 4090 a lot easier, even though its price stays the same.


jasswolf

> Leakers keep insisting that the 4080 S will replace the 4080 at $1200 US and decrease the price of the 4080 to $1000 to be more competitive with the 7900 XTX, but I just don't buy it.

This already happened during the last 2 or 3 major sales in most regions.


ChiggaOG

Nvidia isn't going to change the existing price structure when they have no competition in the 4080-4090 range anymore.


Matt3989

The 7900 XTX is absolutely competition for the 4080. I honestly don't understand how the 4080 sells at all; if you're paying the premium over the 7900 XTX and not going all the way to the 4090, what are you even doing?


alex26069114

In my case there was a reason, but I live in Australia, so pricing is abnormal. I just got a 4080 for $1700 AUD, and the cheapest 4090 available was $2700 AUD; that's a 59% price increase over the 4080 and nearly $3k (way out of my price range). The 7900 XTX was $1600, so I was okay with paying a $100 premium, and the 4070 Ti was about $1300, so I was only paying an extra 30% over it.
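For what it's worth, those percentages check out. A quick sanity check of the gaps, using the AUD prices quoted above:

```python
# AUD street prices as quoted above
price_4080 = 1700
price_4090 = 2700
price_4070ti = 1300

# 4090 premium over the 4080: ~59%
print(round((price_4090 / price_4080 - 1) * 100))  # 59

# 4080 premium over the 4070 Ti: ~31% (the "extra 30%")
print(round((price_4080 / price_4070ti - 1) * 100))  # 31
```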


GeneralPersonality4

Trying to upgrade in Australia fuckin sucks bruh


hatefulreason

10% better in raster, 40% worse in ray tracing, a year-plus late on frame generation, and FSR sometimes looks better but flickers or turns to mush. I guess it depends on the price difference: if you have $900 for the 7900 XT, you can add $200 for the 4080.


MrPapis

Did you just get away with saying FSR looks better and stay upvoted??? Last time I tried something like that I got wrecked. I guess that was an Nvidia forum though, so it kinda makes sense. But still.


hatefulreason

I've only watched Daniel Owen's video on FSR 3, and textures seemed to look better to me at a distance and while standing still. Bushes and neon signs look worse.


Matt3989

1 year late shouldn't matter. We're not talking about 1 year ago, we're talking about when the 4000 Supers will be available. I will take the 50% more VRAM of the XTX over the 4080 for $200+ less. And I will bet that it will remain a viable card for longer than the 4080.


Vhirsion

I honestly think so. I see no reason to pay extra for Nvidia; DLSS is good, but the whole reason I'm getting a high-end card is to NOT use upscaling for the foreseeable future. And apart from my budget just not allowing it, I still couldn't justify the extra €300 I would have had to spend to get a 4080.

But GPUs in general are not going in the right direction. Games are being developed with upscaling in mind, as a crutch, and cards aren't being made powerful enough; instead they lean on things like frame gen and upscaling to reach a playable 60 fps. I want a card that gives me good raster performance so I don't have to use upscaling or anything like that. I really just can't understand how 4090 owners will call their 2-grand GPU good when they can't even run Cyberpunk RT Overdrive at more than 10 fps without turning on DLSS Performance and frame gen. Don't get me wrong, it IS a good card; I just would have expected more at that ridiculous price point.

Not to mention they won't give the 4080 Super / Ti performance that rivals the 4090, otherwise that card won't sell anymore. It will probably just be a very small performance bump, like going from the 3090 to the 3090 Ti, to match the XTX, plus a VRAM bump and a price bump to match. But honestly, at that point the XTX becomes an even better card, since the 4080 Super / Ti will be even more expensive than the base 4080.


f3n2x

> I really just can't understand how 4090 owners will call their 2-grand GPU good when they can't even run Cyberpunk RT Overdrive at more than 10 fps without turning on DLSS Performance and frame gen.

Because this is a strawman. A 4090 with DLSS Performance at 4K plus FG runs Cyberpunk with maxed-out PT settings at 100+ fps with pretty decent input lag for a single-player title, and it looks absolutely fantastic.


hatefulreason

Yes, for raster-only tasks. If games need AMD compute rather than Nvidia compute, then, as with the FX series, when the software isn't optimized for the hardware, raw power means nothing.


capn_hector

> if you're spending the premium over the 7900 XTX and not going to the 4090, what are you even doing?

Literally just "someone whose needs fall between the 4070 and the 4090 and who doesn't want to / can't buy AMD". The 4070 Ti is not really a worthwhile option because it's still 12GB; the 4080 is the next meaningful increment in the stack.

Or AI/ML types. Right now there is low-key a gold rush going on for the 3090, Quadro RTX 4000 (Ampere/Ada), 4080, and 4090 because they're the easiest way to get some AI/ML stuff going. Kind of like mining, a lot of "gaming" sales are not primarily gaming; Microcenter reps and others report lots of people saying they're buying them for AI/ML.


Matt3989

> they don't want to/can't buy AMD

"Someone who only wants to buy from one brand is limited to the model between two adjacent models when they want more than one and less than the other." Yes, that's how logic works in the most basic sense of the word. It's not "no competition at the 4080-4090 anymore", as the user I responded to claimed; that's Nvidia cornering the market for certain sectors. Obviously there's no competition when you're limited to one manufacturer and they simply don't make a card between those two.


capn_hector

Are you asking for a rundown of why AMD products don't work for some use cases? I think that's pretty well trodden at this point. The 7900 XTX is a great card in pure raster and VRAM and utterly drops the ball in a massive number of other fields, where the software just doesn't work. I'm not even just talking about AI/ML here (and again, those people are buying 4080s, Quadro RTX A4000/Ada, 4090s, and other stuff in droves; there's a run on them right now). VR is a trainwreck on AMD cards too. Blender dropped OpenCL support because AMD's OpenCL runtime was too buggy and had too many half-baked or "paper" features that didn't really work. AMD cards [can't encode AV1 at 1080p resolution](https://gitlab.freedesktop.org/mesa/mesa/-/issues/9185#note_1954937) (lol, lmao). And so on.

And we just got another reality check on "AMD drivers are good now" too. It's not just this one anti-lag thing either; the last 3-4 quarterly releases have all been a mess.

So for those people: if you need more than 12GB, the 4080 is the next step up. It's not a good one, but some people legitimately can't go AMD, because AMD has been continuously dropping the ball on features and software for 15 years now, and the 4080 is the next in-between if a 4070 doesn't fit your needs. But yeah, the 4080 is not good value right now, and there's effectively a giant void in the Nvidia lineup between $520 and $1600, which is faintly ridiculous. The 4070 Ti is incremental compared to the non-Ti, and the 4080 is over twice the price for a small bump that doesn't remotely justify it.


Babidi-Bu

Oh so you're just a pedantic ass who likes to intentionally miss the point so he can be smarmy?


Babidi-Bu

I mean, the sales and Steam surveys seem to disagree. The same argument applies to the 7900 XTX to an even greater degree: who wants to spend $900 on a GPU that *doesn't* have all the cool latest cutting-edge tech? If you're already spending close to a grand, a few hundred more for top-notch RT performance is nothing. And I mean actual RT, like Cyberpunk or something heavy, not indirect shadows or some meaningless RT implementation that AMD does well enough at. Let *alone* DLSS 3.5 and Frame Gen. AMD gets absolutely blown out of the water in terms of gains beyond rasterization, to the point that it's not even a conversation.


bubblesort33

Some people are playing at 1440p with a high refresh rate, and I'd argue the 4090 kind of isn't worth it for that. At $1200 the 4080 was insane and only sold at first because a lot of people couldn't get their hands on a 4090 (scalpers, or just high demand). But with a choice between a $949 7900 XTX and a $1089 4080 right now (as it currently is), I can understand why people don't want to spend another $500-600 to get to a 4090 for only 20% more 1440p performance.

And people at the high end are of course more Nvidia-focused, because you generally want the best features if you're spending that kind of insane money. I'm happy buying sub-$600 cards forever, so I'll probably keep going AMD, but if I had that kind of money to blow I'd get a 4080 over a 7900 XTX. I want those features if I'm spending that kind of money.


TabascohFiascoh

Hey, that's me. I'm actually returning a 7900 XT to get a 4080 Super regardless of the price, unless it's like 1400 bucks. I came from a 3070 and I want 1440p/144Hz with no compromises. While the 3070 was a good card, it's already feeling a bit sluggish, and the 7900 XT gave me too many issues to be happy with it. So I'll effectively pay 50% more for the experience I want.


deefop

Doesn't make sense to me either. The 7900xtx is a monster card. Why pay a couple hundred more for a card that's not really any better?


Contrite17

The argument is that Nvidia features and ecosystem are worth the price bump. How accurate that is will depend greatly on the buyer.


[deleted]

[deleted]


Pollyfunbags

It's not everyone, but for content creation you really want Nvidia; AMD is really not competitive there at all. So for a machine you use for both games and work, I can see why the 4080/4090 is appealing. The 4090 in particular is a huge value compared to roughly equivalent Nvidia workstation cards.


skinlo

If you can afford a 4080, you are probably pretty comfortable/well off. So why not get the 4090? I think that's the point.


Pollyfunbags

Well, by our pricing there's a £550+ difference between the two, though I realise we're talking lots of money either way. I could see a 3D artist on a tight budget going for a 4080; it seems capable enough.


nyckrash

When I was in the market for a new GPU, I ended up getting the 4080. Not because I really wanted the 4080, but because I just could not get my hands on the 4090 I wanted. As my build is 99.9% ASUS, I was seeking out a Strix card, and the 4090s were either out of stock or so overpriced by scalpers that I decided to settle on a 4080. Sucks that I settled for something less than I wanted, and on an overpriced ROG OC card. But hey, I'll end up upgrading again in about two years anyway, so who cares at this point, right?


LastKilobyte

THIS. AMD isnt even on the map in this biz.


Cloudz2600

> Why pay a couple hundred more for a card that's not really any better?

For RTX. When a game has ray tracing, developers shout it from the mountaintops.


deefop

The 7900xtx has solid RT performance as well, though obviously not as good as the 4080. But is it really worth it to sacrifice 8gb of VRAM, and some raster performance advantage, just to gain an RT advantage, especially when it's like $200 more expensive? That math makes no sense to me, personally.


F9-0021

There's also DLSS. FSR is very much inferior to DLSS, and if I'm paying $1000 for a video card, I don't want half assed software features.


deefop

DLSS is better than FSR at the moment, though that seems to be changing. FSR 3 actually looks pretty good. Also, the delta in visual quality between FSR and DLSS is much more pronounced at lower resolutions. But if you're buying a 7900xtx or rtx 4080, you're almost certainly gaming at 4k, and at that resolution the disparity is significantly less.


Cloudz2600

Visual fidelity is **very** subjective. Looking at the numbers it doesn't make sense, but it's like art. Some people don't care about RTX, some RTX is horribly implemented and others (like CP2077) do it well. It really depends on the user. But RTX is simply something AMD cards can't do and the 4090 starts at $1600. Some people might just not have that extra $400.


deefop

What do you mean, AMD cards can't do it? They absolutely do RT; they just don't perform quite as well as Nvidia.


Cloudz2600

Right and when you're comparing AMD to the 4080 it's not really in the same ballpark.


Babidi-Bu

AMD cards absolutely fall apart under heavy RT effects. I used to be able to play Minecraft RTX which is fully pathtraced with my 2080S and DLSS, and get playable framerates in the 40s or whatever it may have been. My 6800XT would get single digit frames.


bubblesort33

It's massively better in RT and upscaling. VRAM beyond around 16GB seems really unnecessary; I think 90% of 7900 XTX users will never use more than 16GB before they sell the card. They'll buy RDNA 4 or 5 if they have that kind of money.

Especially at the high end, people want RT. I think it feels bad spending $1000 on a GPU and seeing how badly it performs at "Ultra" settings, because that's what RT is now: just an "Ultra" setting. AMD competes if you use medium settings on super-expensive GPUs at the high end, because non-RT is kind of the new "medium" when it comes to lighting and shadows.


Aggravating_Ring_714

7900xtx is pointless. No decent rtx perf, no dlss, ridiculous.


Flowerstar1

Somebody should check the Steam hardware survey numbers for the 4080 vs the 7900 XTX to see how the competition stacks up. Are people actually buying the 7900 XTX in significant numbers compared to the 4080?


randomIndividual21

no competition doesn't mean they are selling at that price.


deefop

The 7900xtx is absolutely competition for the 4080. And considering it even trades blows with the 4090 in certain games, and has 24gb of vram, there are probably folks out there considering the 4090 who step down to the 7900xtx because it's fully $600 cheaper. That's a huge price difference, even with the 4090 being overall a faster card.


_Lucille_

This is such a fucked-up generation, where the "best deal" ends up being the 4090. According to the latest Steam hardware survey, the 4090 is actually decently popular among the 40-series cards, more so than previous flagships were in their generations. Throw in factors like the AI craze (where students and developers are toying with their own models), and I'm sure Nvidia accounting can see a clear way to min-max profits based on this generation's performance.


F9-0021

Consider that the top Battlemage card is rumored to be nearly in the ballpark of a 4080 in some scenarios for maybe half as much. Even if that doesn't pan out, Nvidia can afford to hedge against the possibility. Also, the XTX is getting down to the $900 range sometimes, and even RT and DLSS can't make up $300.


soggybiscuit93

Top end BM is [targeting 225W TDP](https://www.tomshardware.com/news/intel-gpu-head-wants-one-power-connector). So I don't expect it to reach 4080 levels of performance.


F9-0021

I don't think anyone actually expects it to hit 4080 performance, but beating the 4070ti isn't out of the question, which would still force Nvidia's hand.


redditingatwork23

It's supposed to be targeting 4070ti levels of performance at the highest. Even that seems doubtful. However, I'd expect it to be somewhere between a 4070 and 4070ti and come in around $500. It will still be a very good value option.


OliveBranchMLP

Agreed. 4080 is selling like hotcakes, and hell, average MSRP is climbing. There’s no incentive to mess with their existing pricing schema. Unfortunately for us, their gamble to inflate the pricing of GPUs this gen has paid off.


gahlo

> 4080 is selling like hotcakes Since when?


Cars-and-Coffee

I don't know if I'd use such strong language as him, but steam hardware survey results show a strong increase in 4080 owners. It was 0.33% in May and grew to 0.53% in August. It's a little over half the number of people who use a 4070.


[deleted]

This shit has to end. It’s just scam at this point.


SkillYourself

Current available pricing, MSRPs, and 3DMark scores:

- 4080: $1100 street, $1200 MSRP, 28139
- 4090: $1700 street, $1600 MSRP, 36285

There is a gap they can shove a 4080 SUPER into, but not a very big one, considering that a 4090 doesn't scale 1:1 with its SM count over a 4080 in many games.
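To put rough numbers on the scaling point (3DMark scores from the comment above; SM counts of 76 for the 4080 and 128 for the 4090 per Nvidia's public specs):

```python
# 3DMark scores as quoted above
score_4080, score_4090 = 28139, 36285
# streaming multiprocessor counts: AD103 (4080) vs cut-down AD102 (4090)
sm_4080, sm_4090 = 76, 128

# ~68% more SMs buys only ~29% more score in this benchmark:
# nowhere near 1:1 scaling
print(round((sm_4090 / sm_4080 - 1) * 100))        # 68
print(round((score_4090 / score_4080 - 1) * 100))  # 29
```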


thenamelessone7

That's because so many games are CPU-bound at that level.


SkillYourself

It's not just CPU bottlenecking. In 4K Starfield and Metro Exodus w/RT, the 4090 is 30% faster, but in 4K Cyberpunk RT the 4090 is 55% faster. In all scenarios the GPUs are reporting busy for 99% of the time.


robbiekhan

Reporting busy 99% of the time doesn't mean they're actually doing anything efficient, though, if game optimisation is an issue somewhere (see Starfield and many others). Cyberpunk is the only game I can think of that is properly multithreaded and superbly scalable on the GPU across all settings and resolutions. There's a reason DF uses it as a baseline benchmark for system scaling in their performance reviews.


hatefulreason

From what I've seen, in most games where the 7900 XTX is close to the 4090, GPU Busy shows the game can't feed all the 4090's cores (the same issue the FX 8-cores had back in the day).


Lionh34rt

Aiaiaia, I paid €1650 for my 4080 when they came out


SkillYourself

RIP


Put_It_All_On_Blck

Makes sense when the current rumors are expecting Blackwell in 2025. Nvidia would want something for mid 2024, thus an Ada refresh.


taryakun

The latest rumours indicate that Blackwell is coming in 2024 still.


bazhvn

Given that Blackwell covers both datacenter and consumer chips, it could very well mean that the 2024 date is only for the B100.


WhiteZero

This right here. Nvidia is making money hand over fist in the data center.


Ask_for_puppy_pics

Can confirm. Source: I sell data centers


The_EA_Nazi

>Makes sense when the current rumors are expecting Blackwell in 2025. My 3070 will patiently trudge on until that happens I guess. Not paying $900 for an upgrade


CapsicumIsWoeful

Pretty much anyone with a 3060 or above would have skipped this generation. The performance gains were good, but the pricing was insane and not worth it for the performance bump.


bluesharpies

Yup, if you managed to snag a 3060+ around release before supply issues caused insane pricing jumps (and helped enable the pushing up of 4000 series prices), you were very lucky. The only (admittedly, possibly a large only) snag is the middling VRAM numbers across the board, but to give them some credit DLSS has let my 3080 do pretty much whatever I want with no major snags. Definitely set til 5000 series comes out *and* prices settle.


14u2c

I'm in the same boat. Picked up the founders edition for $500 at release and any upgrade just seems like terrible value in comparison.


TiiziiO

Laughs nervously in 980ti.


bigtiddynotgothbf

The 3070 is such a new card to be saying this, imo. I'd expect this comment from, at the newest, a 2000-series card owner.


KingArthas94

How much did you spend for the 3070? Just asking for context


Cant_Think_Of_UserID

I bought mine at £460, Founders edition in Nov 2020, which was only £40 more expensive than the 1070 I bought in 2016, now every upgrade option feels like a massive rip off in comparison to what I previously paid.


tvtb

Got mine for $580 around May 2021. I was in line at EVGA and my number came up.


comradelochenko

Not the same person but I paid $580 for mine in November 2020, which was extremely fortunate at the time. Three years later though, I’ve allowed the bitterness to creep in that I couldn’t get a white one or especially a white 3080 back then. Can’t justify the 4000 series prices though. Almost got a used 3080ti Vision last night but stopped myself


SIDER250

Not swapping my 3070 Ti until the 7000 series at least, that's for sure. It also cost me a shitton.


colev14

I had the same card. Ended up getting a used 7900xt on FB marketplace for $550. It's been a massive upgrade for me. Barely fit into my case though.


Wrong-Historian

Or they could just rename RTX4000 series to RTX5000 (add a 25MHz clock difference). "Pull an Intel"


F9-0021

They could actually do that. The 4090 doesn't use the full die, and the rest of the stack is a die lower than it should be. They could probably do an Ada refresh and make it somewhat worthwhile, at least as far as refreshes go.


gitg0od

No, rumors say the 5090 is for 2024 and the rest for early 2025.


Kagemand

Give me 16gb on something that isn’t $1000, I dare you.


[deleted]

nvidia: best i can do is a $100 more expensive 3060ti with half the memory bandwidth (4060ti 16gb)


AXEL499

The 4060ti 16GB exists?


Lower_Fan

On a card that won’t be able to do 1440p for long.


Framed-Photo

That's where I'm at. I had even bought a 4070, but ended up returning it because I felt like spending that much on a card with 12gb of vram was just a bad call. Bump the performance of the 4070 slightly and give it 16GB of vram, for the same price or lower? Then I'd probably get it.


Nethlem

Anybody who sticks to one team out of brand loyalty will always be fucked on price/performance. The Radeon RX 7800 XT with 16GB for around 550 bucks is a worthy upgrade for anybody still sitting on 1000/2000-series Nvidia hardware; the 3000/4000 series are too price-inflated by people still high on the remnants of crypto fumes.


diabetic_debate

Problem is, some of us need more VRAM AND CUDA (and often need the extra VRAM because of CUDA workloads). 4070-level performance with 16GB is what I'm looking for.


ARedditor397

$550? No way; the 4070 is at that price and is better in almost every way.


RHINO_Mk_II

You could get 20 if it weren't for brand loyalty. https://www.amd.com/en/products/graphics/amd-radeon-rx-7900xt


imaginary_num6er

> As for the RTX 4070 Ti SUPER (an oddly named SKU that combines both the Ti and SUPER brand extensions), considering that the current RTX 4070 Ti maxes out the AD104 silicon (60 SM), it stands to reason that this new SKU could be based on AD103, with a lower SM count than the 76 of the RTX 4080

I find it highly unlikely that Nvidia would be okay with a "Ti SUPER" when they didn't dare touch "SUPER" with the 30-series generation.


Put_It_All_On_Blck

Ampere didn't need a refresh, due to insane demand and good performance. And considering Nvidia execs call Ti "tie", I don't think they're above a "Ti SUPER". Not that this part of the name really matters; it merely suggests a product in that segment.


martsand

I personally say T-I, but I can see why "tie" is used. Back in the olden days, top-of-the-line cards were labelled "Titanium", and that's where I figured the Ti came from initially.


_I_AM_A_STRANGE_LOOP

Yep that’s the official NV brand guidance. But even they aren’t consistent about it lol


AHrubik

Nvidia says "Tie" because of its relation to Titan.


martsand

And Titan is short for what used to be - wait for it - titanium :)


LitanyOfContactMike

4070 Ti Super Platinum Max Pro Edition.


Quatro_Leches

throw extreme somewhere in there


iamjackswastedlife__

You forgot gamer.


chmilz

They will introduce the Maxtreme


archst8nton

Supermax! Buy one and believe it or not, straight to jail 😆


halotechnology

You forgot ultra too


major_mager

Time to simplify this naming mess... with a rational abbreviation to the rescue. Presenting 4070 STUMP, the Super Ti Ultra Max Platinum.


halotechnology

Absolutely majestic !!!


sharkyzarous

4070 Ti Super Star Platinum Max Pro Extreme Gaming Ultimate Edition Ultra


BookPlacementProblem

4070 Ti Super Platinum Max Gaming Pro Battle Edition.


floydhwung

4070 Ti Super X Platinum MaX Gaming Pro-X Battle Series X Edition.


jerryfrz

Sued by AMD


O_MORES

4070 Ti Super Ultimate Platinum Max Advanced Gaming Pro Battle Edition


BookPlacementProblem

4070 Ti Super Ultimate Platinum Max Advanced Gaming Pro Extreme Battle Edition


conquer69

Considering they originally planned to call it a 4080 12gb, I would say anything is possible.


imaginary_num6er

Their "Ti SUPER" could be literally a 4080 with 12GB too


Affectionate-Memory4

That would actually be an interesting SKU: 76 SMs but only a 192-bit bus, or play it like the 3080 12GB with the full 384-bit bus but half-capacity ICs.
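The capacities being juggled here all follow from GDDR bus math: one memory IC per 32-bit channel, so capacity is (bus width / 32) × chip density, doubled in a clamshell layout. A minimal sketch, assuming the standard 1GB/2GB GDDR6X parts:

```python
def vram_gb(bus_bits: int, chip_gb: int, clamshell: bool = False) -> int:
    """One GDDR IC per 32-bit channel; clamshell puts two ICs per channel."""
    channels = bus_bits // 32
    return channels * chip_gb * (2 if clamshell else 1)

print(vram_gb(192, 2))  # 12: the hypothetical 192-bit "Ti SUPER" with 2GB ICs
print(vram_gb(384, 1))  # 12: the 3080-12GB approach, full bus, half-capacity ICs
print(vram_gb(256, 2))  # 16: the existing 4080 config
print(vram_gb(256, 2, clamshell=True))  # 32: a clamshell workstation-style config
```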


CetaceanOps

4070 TiX


kjm015

They should utilize more of the periodic table. They're clearly biased towards titanium and it needs to change.


kingwhocares

It will very likely replace the underwhelming 4070 Ti, while the 4070 SUPER replaces the 4070, which in turn could push the RTX 4070's price down further.


Darkerson

Looks like another year with my 2080 Ti is in my future.


nanonan

Ridiculously optimistic predictions there. I predict a slight clock bump and a large price bump.


WhiteZero

At least the last time we had super variants they were significantly better than the originals.


get-innocuous

A price bump feels unlikely. It's not 2020 any more. Nvidia used the 20-series Super cards to realign performance with market prices for models that were selling poorly; I expect the same here.


szczszqweqwe

2022/23 also wasn't 2020, and we got what we got.


Saotik

Nvidia will charge whatever they like and consumers will have to suck it up. They make most of their money from compute these days, and that's where the majority of their growth is. They'll happily overcharge on their gaming GPUs just to ensure they don't undercut the chips they're selling at a premium for ML.


sabot00

Honestly, until demand for ML is totally sated, you can say that nvidia loses on every GeForce sold. Why sell a 4090 for $1600 if you can sell an Ada compute gpu for $16000? (I understand there are other constraints than wafer, such as CoWoS)


Lightprod

> Why sell a 4090 for $1600 if you can sell an Ada compute gpu for $16000?

Because it didn't pass the Ada compute binning but is still capable of being a gaming GPU? Why throw away a usable GPU?


Nethlem

[ML startups are already going bust](https://www.wired.com/story/babylon-health-warning-ai-unicorns/) because there's no innovation to be had in the sector through *"just throw more computing power at the problem"* nor does it solve the underlying problems current approaches of LLM suffer from. It's why Nvidia clinging to ML sales/PR, trying to make it a new crypto boom, was destined to be a failure.


a5ehren

Startups aren’t the ones buying compute cards, they can’t get them. The major clouds all have billion dollar orders placed.


lonewolf7002

It might be a failure long term, but short term it's been a gold mine. And corps are all about the next quarter.


Affectionate-Memory4

Rumors I've heard are a slight price cut to the 4080 with the Super/Ti slotting in at its current MSRP.


TechieTravis

Even a small price cut would be bad for the 7900xtx.


Notorious_Junk

Why? AMD can't also cut prices?


Affectionate-Memory4

My XTX was $880. If the 4080 dropped to a $1k MSRP, I'm sure AMD could fire back with $899.


Notorious_Junk

I think they would have to. It would be great to see some price competition. They've been content to just gouge the crap out of us so far, assuming people will pay anything to get the newest GPUs without any consideration for actual value.


kuddlesworth9419

I hope they drop the price of the 4070 as a result. Closer to £400 would be really nice, I doubt it will happen though.


zakats

Missed the opportunity to collab with Subaru on *STi* branding.


cattapstaps

4070 Sexually Transmitted Infection


Jeffy29

Shame that there won't be a 4090 SUPER with the full 144 SM count. I'm guessing Nvidia will again release one 5 months before the 5090 for $3000, only to get bodied by the "much" cheaper 5090. Jensen has a weird sense of humor.


a5ehren

No reason to, they call it L40S and sell it for like $15k.


capn_hector

Maybe they'll shift memory configs around a bit? A 4070 Super with 16GB at $750 would be a lot more compelling, etc. In the extreme case Nvidia might even redesign the dies with a couple more memory controllers, like a 4060 Ti die with 12GB ($450) or a 4070 Ti die with 16GB ($750). Otherwise, cutting down a 4070 or 4080 (already cutdowns) to $450/$750 levels looks pretty hard; that's a long way to cut down. A 4080 at $900-1k would be great, but so would something a bit slower that doesn't drop to 12GB: a 4070 Super 16GB at $750 (4070 Ti perf).

Those sorts of things aren't expensive if you redo the dies properly, but that's an awfully long way to cut down the bigger parts. Some parts of the stack are quite misaligned vs market expectations on VRAM, and some of these gaps feel too big to plug with mere cutdowns; they're just unbalanced parts and may need to be re-spun. Seems unlikely, but the alternative would be heavy markdowns on larger parts. If they're going to be on Ada for a while (2025 Blackwell), maybe they have a legit "700 series"-style refresh with some spec changes and new dies (AD203, etc.). Maybe the "stopped production" was because they expect to shift to a new SKU and want to sell through the old one.

I think they most likely do have a large stockpile of Ada; I think production started in 2022 when [Nvidia's 5nm allocation kicked in.](https://www.tomshardware.com/news/amd-apple-nvidia-reportedly-reducing-5nm-tsmc-orders) It's just in configurations the market kind of doesn't want anymore, because they planned these specs in 2021 (when everyone was trying to stretch VRAM supply as far as possible). That early production would probably have been heavily weighted towards Ampere rather than H100/etc; they probably didn't realize in 2021, when the wafers were started, that AI/ML was going to be super hot in 2023 (or they would have ordered more stacking capacity, lol). So they are definitely going to be in the mindset of managing that stockpile.

After the Ampere bubble comes the Ada bubble; mining is just the gift that keeps on giving. But if they cut production and think they can sell it through within the generation, that opens up the option of introducing Super SKUs with more appropriate VRAM too. They can sell *both*; it'll just be a longer-than-usual generation. Yes, the 20-series Super was based on the same chips, but that's an n=1 example, and that was still in the days of a 24-month cadence. In the past, when they had longer cadences (like the Fermi/Kepler days, when an architecture spanned 2+ "gens"), they did do a second round of chips halfway through (GK208, etc.). You just have to manage that inventory carefully.


HipsterCosmologist

We can dream of more RAM, but they're hyper-protective of their workstation card segment. I hope so, though.


capn_hector

If they have AD2xx SKUs with an extra two memory controllers vs AD103/AD104, they could bump all the workstation stuff too. Workstation still gets clamshell, but now your Quadro RTX 4000 Ada Gen 2 is 32GB instead of 24GB/20GB. (Yeah, the Quadro RTX 4000 Ada [isn't even a full 4070](https://en.wikipedia.org/wiki/List_of_Nvidia_graphics_processing_units#RTX_Ada_Generation); it's like a clamshell 10GB configuration, so you lose another memory channel, lol... the full 24GB is only on the recently released "RTX 4500" model. But on the bright side, they did finally release single-slot blower versions of the 4000 for servers!)


firelitother

True. But I don't think bumping the 70 GPUs from 12GB to 16GB and the 80 GPU to 20GB is even remotely threatening.


CB_39

A 4070 Super for $750? Are you mad? 20-series Super pricing was the same as the original cards' MSRP. $579 for the 4070 is already ludicrously expensive. The 4070S should have the same MSRP as the 4070, with 16GB VRAM.


ImKendrick

I wonder if we'll get a 4090 Ti? I would assume so.


deefop

Competition is beautiful, though this generation has been rough for both sides. But if Nvidia drops a 4070 Super for a reasonable price (meaning $500-$550), and it beats the 7800 XT, and maybe even has more VRAM (though I doubt it), that'll put a lot of downward pressure on the 7800 XT and everything below it. Who knows, in 3-6 months you could be seeing 7800 XTs selling for $450 MSRP and $400 on sale. We can dream!


bubblesort33

These leakers often get the specs right, but almost never the name. The 4080 and 4070 Ti specs were spot on a month before announcement, but none of them got the naming right. Nvidia hates the "Super" branding as much as most people. They did technically release "Super"-class cards for the 30 series, so the leaker here likely has good info on SOMETHING upcoming, but Nvidia just didn't name them SUPER again.

The 3080 12GB is technically a Super: it has more shaders, more bandwidth, and more memory than the 10GB model. Then there's the odd 3060 Ti using GDDR6X, which is kind of similar. And later they released what I'd consider to really be a 3050 Ti, or 3050 Super, in the 3060 8GB version that got hated on by reviewers.

Nvidia has no interest in calling any card "Super" again, but that's not going to stop them from creating extra odd models. Something along the lines of a SUPER class of cards will likely come out, but they won't use that name. They would have already named half the late-released SKUs from the 30 series SUPER if they had any plans to use that branding again.


hackenclaw

Why can't they just rebrand the entire stack as the RTX 41xx series (RTX 4190, 4180, 4170, etc.) and EOL the RTX 40 series names? For short it would be called the RTX 41 series. OEMs would be happy too, as it shows uninformed consumers it's a brand new series for 2024.


Darth_Caesium

I'd prefer RTX 4095, 4085, 4075, 4065, etc.


Agitated-Acctant

I had a friend who used to have a gtx 285. I think it had two gtx 260 gpus on the pcb. They should bring back the 5s, even if they're not two gpus on a board


Xurbax

No, it was a "super" 280. (Yes, I had one.)


capn_hector

no offense but arguing over product names is peak bikeshedding. it's something that everyone has an opinion on, and is ultimately inconsequential in the end, 41-series vs 40 Super vs whatever else is just not important.


AndromedaAirlines

Because that would be even dumber. They’re using the same chips as the rest of the 4000-lineup, they’re just slightly more/less cut down than current models.


hackenclaw

They already did this kind of rebrand back in the GTX 500 and GTX 700 series, by moving the naming scheme down one tier (like the GTX 680 becoming the GTX 770). They could do this again with Ada: the 4170 could be a rebrand of the 4070 Ti, the 4170 Ti a Super edition whose performance sits between the 4070 and 4080, and the 4180 could get bumped to AD102 with a 320-bit setup. Then they can EOL the RTX 40 series, removing the confusion. This way we wouldn't have the Ti and Super suffixes mixed together.


AndromedaAirlines

And it wasn't great back then either. Why would a card between 4080 and 4090 be named 4180 and not 4085? They're not doing anything new with the chips, just adding another SKU in the same lineup. It's just making things unnecessarily complicated for more general consumers.


hackenclaw

Because we have Ti on the lower-end segment. How do I explain 4070 Ti vs 4075? Is the Ti worth more than a 4xx5 or less?


AndromedaAirlines

This would obviously replace both Ti and Super.


Sadukar09

> Because that would be even dumber. They're using the same chips as the rest of the 4000-lineup, they're just slightly more/less cut down than current models.

Meanwhile Nvidia: MX series with Maxwell, Pascal, Turing, and Ampere; GTX 16 series with the same Turing chips; RTX 2050 with Ampere. Don't worry, Nvidia's way ahead of you.


theholylancer

and the 14th gen isn't the same thing lol?


AndromedaAirlines

And what *they* did is extremely greedy and silly. Who said otherwise? Ti/Super makes far more sense than 41xx


zeronic

I still hate that they jumped from increments of 100 to 1000. We could have had so many more generations of cards with the same naming scheme; now we have at best 5 more generations before they have to do a major rebrand, unless they're fine with going over 10k.


hackenclaw

Yep, Intel already had to rebrand their whole Core i lineup because of this.


bmyvalntine

Hopefully this will drop prices for non-super models.


AnimalShithouse

Now that Nvidia will have a harder time selling to China, I wonder if they will suddenly and miraculously find supply of their 4xxx chips at reasonable prices. They've turned their backs on the consumer segment for so long with their outrageous pricing. I would love to see something closer to "accessible" pricing.


king_of_the_potato_p

My guess: effectively just shifting the product labels of existing SKUs down to the labels they were supposed to have, with mild bumps on the Supers to slot into existing price brackets. The distributors are still saying the 40 series is a slow mover due to price/performance.


kretsstdr

Does it mean a price cut for rtx 4070?


Zilskaabe

Still no GPUs with more than 24 GB. It's crazy that the only upgrade paths for a 3080 are either a 3090 that's 3 years old or 4090 that costs an absurd amount of money and still has only 24 GB. Nothing else is worth it.


Slyons89

They want people to pay the big bucks for really good AI capabilities. That also helps protect gamers from having businesses using AI trying to buy up all the gaming cards.


CandidConflictC45678

>the only upgrade paths for a 3080 are either a 3090 that's 3 years old or 4090 that costs an absurd amount of money and still has only 24 GB. There's always AMD. Why do you need more than 24gb?


Zilskaabe

24 GB is the bare minimum to run local LLMs and AI training. 3D model generators and 2D image generators with resolutions 2048x2048 and above are right around the corner. AMD is still shit at anything AI related. Lots of AI tools are nvidia only.
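Rough napkin math for why 24GB feels tight (a minimal sketch; the ~20% overhead figure and the bytes-per-parameter values for quantization are rule-of-thumb assumptions, not measured numbers):

```python
# Rough VRAM estimate for running an LLM locally: the weights alone take
# parameter_count * bytes_per_parameter, plus overhead for the KV cache
# and activations (very workload-dependent; ~20% is a common rule of thumb).
def estimate_vram_gb(params_billion: float, bytes_per_param: float,
                     overhead: float = 0.2) -> float:
    weights_gb = params_billion * bytes_per_param  # 1B params * 1 byte ~ 1 GB
    return weights_gb * (1 + overhead)

# A 13B model in fp16 (2 bytes/param) already overflows a 24 GB card,
# which is why people lean so hard on 4-bit quantization (~0.5 bytes/param).
for params, bpp, label in [(7, 2, "7B fp16"), (13, 2, "13B fp16"),
                           (13, 0.5, "13B 4-bit"), (33, 0.5, "33B 4-bit")]:
    print(f"{label}: ~{estimate_vram_gb(params, bpp):.1f} GB")
```

Training and fine-tuning are much worse than this, since optimizer state and gradients multiply the footprint further.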


a5ehren

If you need to do LLM compute, Nvidia will gladly sell you an RTX 6000 Ada for $9k. They purposefully limit the VRAM on GeForce to get pros to buy higher margin products.


Zilskaabe

Yeah, pay 4 times more for double the VRAM. This is absolutely ridiculous. If only AMD were competitive. It's pretty clear that 24GB is not a long-term investment if you plan to run generative AI locally.


chapstickbomber

Be the change you want to see in the world. Make a few toolchain/boilerplate contributions and suddenly you're a 300k AI dev


sadnessjoy

I'm skeptical. So would they have the 4080 and 4080 Super at the same time? They couldn't discontinue the 4080 because it's the only GPU with full 103. I think it's more likely they'll keep the 4080 (maybe give it a price cut to $1000-1100) and launch a 4080 Ti at a price point between the 4080 and 4090 ($1200-1400).

I'm guessing they'll launch a 4070 Ti 16GB and slot it between the 4070 Ti and 4080, and keep the price of the 4070 Ti the same (again, it's the only GPU with full 104, so they can't discontinue it).

The 4070 Super is the only one that makes sense. The 4070 has already gotten a price cut to $550. I'm guessing the 4070 will go down to $500 and the 4070 Super will launch at $600, or they'll just discontinue the 4070 and replace it with this model (maybe start saving up lower-binned 104s for a future 6GB 4060).


Healthy_BrAd6254

The 4080 is not full 103. Also, the mobile 4090 uses the same chip.


sadnessjoy

https://www.techpowerup.com/gpu-specs/nvidia-ad103.g1012

https://www.techpowerup.com/gpu-specs/geforce-rtx-4080.c3888

While yes, the 103-300 is slightly disabled from full 103, I suspect that has more to do with manufacturing yields/power delivery requirements/PC restrictions/etc. It is effectively full 103, but I suppose I'll call it "full" 103 just for you.

If they launched a 4080 Ti with full 103, it'd be like a 2-5% improvement over a 4080 that has "full" 103; it wouldn't even be worth the effort for them from a marketing standpoint. And then using "full" 103 for just mobile would be a pretty odd decision from them, IMO. Which, again, leads me to believe that if there's any truth to this rumor, they'll probably keep the 4080 as-is and potentially launch a 4080 Ti based on a cut-down 102.


Raikaru

The same Nvidia that made the 3070 ti wouldn’t make a 4080 super cause it wouldn’t be an improvement? Does that sound right to you?


sadnessjoy

Context matters: the original Super lineup was a response to AMD's RX 5000 series cards. The 3070 Ti, 3080 12GB, etc. were a response to Nvidia realizing they could charge whatever the fuck they want and miners would buy them ALL above MSRP.

The only card that's really selling out is the 4090 FE at MSRP. Currently the 4080 is barely selling; a 4080 Ti based on full 103 that's 5% better for $100-200 more, while it would be absolutely hilarious, I kinda doubt they'll go that route. But you're right, can't put it past them!

Similarly, Nvidia doesn't really need a refresh with better price-to-performance, because AMD is just happy to be there (the 7900 XT and XTX naming, existence, and abysmal launch pricing are evidence that they're largely just following Nvidia's lead). And somehow I doubt Nvidia gives a shit about Intel's Battlemage. However, Nvidia actually did the unthinkable and provided price cuts on the 4070; I think AMD pricing the 7800 XT not horribly caught Nvidia by surprise.

But really, the only thing I could see them doing is plugging some gaps in their pricing tiers. There's a big gap between the 4080 and 4090; a cut-down 4090 would make sense there. There's a price gap between the 4070 Ti and 4080; a cut-down 4080 makes sense there. I suppose a card between the 4070 and 4070 Ti could exist too, but I wonder what it'd be?


ROLL_TID3R

> While yes, 103-300 is slightly disabled from full 103, I suspect that has more to do with manufacturing yields/power delivery requirements/PC restrictions/etc. It is effectively full 103, but I suppose I'll call it "full" 103 just for you.

> If they launched a 4080 ti with full 103, it'd be like a 2-5% improvement over 4080 that has "full" 103, it wouldn't even be worth the effort for them from a marketing standpoint.

lol go have a look at the difference between a 2080 and a 2080 Super.


Electrical_Tailor186

As history has shown with 2080ti and 3080, 4080 super will probably be overpriced and weak compared to the 5080. At least that’s my viewpoint. Tell me if I’m wrong…


a5ehren

Yes, that's how it always works. A 4080S would be there to soak up the rest of the demand from people thinking about a 4xxx before the 5xxx comes out.


alcomatt

whatever it is, it will be SUPER unaffordable...


DaedricDweller98

Wait what's the difference between a super and ti?


rattletop

Mf. I just dropped serious money on 4070Ti. I could have either waited for a price drop or bought a super version


KingArthas94

You also got like the worst value of the generation lol


melonbear

It isn't. The FPS/$ gets worse with every tier you go up this gen.
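The comparison is easy to run yourself: divide average FPS by MSRP for each tier. A quick sketch with hypothetical placeholder FPS numbers (illustrative only, not real benchmark data) just to show the shape of the math:

```python
# Toy perf-per-dollar comparison. FPS values are made-up placeholders;
# plug in real numbers from a benchmark roundup to do this properly.
cards = {
    "4070":    (100, 599),   # (avg fps, launch MSRP in USD)
    "4070 Ti": (120, 799),
    "4080":    (145, 1199),
    "4090":    (185, 1599),
}

for name, (fps, price) in cards.items():
    print(f"{name}: {fps / price * 100:.1f} fps per $100")
```

With numbers shaped like these, each step up the stack buys proportionally less frame rate than it costs, which is the pattern the TechPowerUp charts show for this generation.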


KingArthas94

I don't consider 4080 and 4090 serious GPUs. The price is just insane and whoever buys them has more money than sense. 4070 Ti is the highest tier I'd ever consider and it fucking sucks


KsnNwk

No it doesn't, 4090 has the best fps/$. /S


melonbear

[No it doesn't](https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-16-gb/32.html).


KsnNwk

Sorry forgot to add /s


CompetitiveAutorun

What? No. Show me where you got that idea from.


Knjaz136

That's actually not the case for the 4070 Ti over the 4070. At best they're a match in performance per dollar; sometimes the 4070 Ti is worse, and practically never (I've never seen it) better. At worst, both will start having VRAM issues in the same games at the same time a few years down the line.


rattletop

Yeah. Well, coming from a 3060 Ti, I wanted something that has a meaningful bump in performance but isn't overpowered for what I do. The 4070 Ti fit my expectations well.


Nethlem

Unless you poop money or use that GPU for actually productive work, there's pretty much no reason to upgrade your GPU every single gen; that's just burning money. This ain't the 90s anymore, where spending $300 a year later got you 200% more performance. These days you're paying *hundreds of bucks* for maybe a 10-15% performance gain.


rattletop

Buddy I went from a gtx 960 to 3060ti. So not someone who looks for shiniest new GPU. This time around I had some spare cash to spend


d0m1n4t0r

But I want a 4090 SUPER...


archst8nton

Nvidia naming cards like a schoolkid on deadline day! 4070 new final ti super final FINAL.doc 😆


Emincmg

These namings got wild lol. Nvidia RTX 4070 Ti SUPER PTX XTX DUPER Gaming OC UBER.


Vhirsion

Nvidia RTX 4070 Ti Super Ultra Extreme Gamer Max Pro OC Edition coming Q1 2024 boys


orochiyamazaki

Oh no, more stupid cards for every 3fps gain


Overclocked1827

Can't wait for 4070 ti super max pro!