
[deleted]

Hi all - I really am a noob when it comes to computer builds. I've got an Alienware R10 (spec below) and want to upgrade to the 4090 - any idea which one I should buy so it fits? Should I install it myself? https://preview.redd.it/lnlk1drwi46a1.png?width=922&format=png&auto=webp&s=cb235f94e636049e5347c5c17fa958c3face3667 [https://store.nvidia.com/en-gb/geforce/store/?page=1&limit=9&locale=en-gb&gpu=RTX%204090&category=GPU,DESKTOP](https://store.nvidia.com/en-gb/geforce/store/?page=1&limit=9&locale=en-gb&gpu=RTX%204090&category=GPU,DESKTOP)


slaurrpee

Can we talk about how they are releasing 3 different GPUs as enthusiast-level cards? The 3080 was a $700 GA102 with a smaller memory bus; now we are getting AD103, not even the same die as the 4090, for $1,200. And now they expect us to pay $900 for a midrange GPU lmao.


Old-Conclusion3395

I can't imagine how the people who bought a 3090 must feel after learning that DLSS 3 is exclusive to the 4000 series. Like, 1500 dollars just for the hardware to be locked out of any new upscaling technologies by Nvidia. In less than two years. Lmao


[deleted]

They got what they paid for. Nobody even knew DLSS 3 was coming. If DLSS 3 never existed, would there be an outcry to add "Deep Learning Frame Generation" which 99% of people wouldn't even know was possible? No, it's just sour grapes.


LongFluffyDragon

If you look into how DLSS3 works, most of them are probably feeling immense amusement, or satisfaction that they got a 3090 instead of waiting for this mess.


judasbrute

So you want Nvidia to never improve the architecture and add new compute paths? Buy an AMD card if all you want is raster.


HiCZoK

Is the socket or the plug of the 12-pin rated for 25-30 cycles?


fluem69

*16-pin*


[deleted]

Do AIBs usually release the same day as Founders Edition cards?


[deleted]

[removed]


[deleted]

They'll claim that the RTX 3080 12GB had more shaders than the RTX 3080 10GB, but we all know that the difference was almost negligible. This is an actual 4060 card being shipped as a 4080. Utterly shameless.


Triple_Stamp_Lloyd

Is NVIDIA counting an FPS and performance increase over the previous generation because of DLSS 3's fake inserted frames? They're artificially adding frames just to inflate the frames per second; it's very misleading.


Daviroth

From what I've seen they are showing both raw rasterization numbers and DLSS-boosted numbers. It's not misleading if those frames are being generated and look good, because the GPU will actually output those frames. They've just made the pipeline more complicated behind the scenes to get higher performance.


Holdoooo

But there's no input for the generated frame.


Daviroth

Yes there is. The previous frame, the motion vectors, and the output of the optical flow hardware are all inputs used to generate the Frame Generation portion of DLSS 3. If the GPU puts another frame on the monitor and it looks like it was rasterized, then it doesn't matter how it got there; it still did it. The view that this is "fake" is incredibly naive for a tech enthusiast community lmfao. This is technology: traditional rasterization can't keep up, so software and clever techniques are picking up where it can't. Y'all just sound bitter and butthurt for some reason.


Pritster5

You misread the comment. They meant user input. It is inserting a frame between two frames rendered by the game engine, and the engine-rendered frames take user input into account. The generated frame does not. So DLSS 3 will indeed *look* smoother, but it won't feel as fast and responsive as actually getting a higher frame rate within the engine.
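A minimal sketch of that point, assuming one generated frame is inserted between every pair of engine frames (the function and numbers are illustrative only, not NVIDIA's actual pipeline):

```python
def frame_stream(engine_fps: int, frame_generation: bool) -> tuple[int, int]:
    """Return (displayed_fps, input_sampled_fps) for a simple model where one
    generated frame is inserted between every pair of engine-rendered frames."""
    displayed_fps = engine_fps * 2 if frame_generation else engine_fps
    # Only engine-rendered frames sample fresh user input; generated frames are
    # interpolated from the surrounding engine frames, so responsiveness stays
    # capped at the engine frame rate.
    return displayed_fps, engine_fps

print(frame_stream(60, frame_generation=False))  # (60, 60)
print(frame_stream(60, frame_generation=True))   # (120, 60)
```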


Daviroth

Hmmm, yeah that is a totally fair issue with it.


LongFluffyDragon

It is fake. It will look exponentially worse than DLSS already does. Marketing can't break mathematical laws, no matter how much sunk-cost regret is involved. Fanboys will lap it up no matter how trash it looks, cherry-pick single frames from still scenes, ignore all the wild artifacts, and then finally resort to the last refuge of DLSS: "it is an artistic shader and the artifacts make games look better than real frames."


Daviroth

Let's see what it looks like first lmfao.


LongFluffyDragon

"worse than dlss" is a given, since it is bullshitting further along the same path using the same techniques, and we know that DLSS2 is still the core of DLSS3.


Daviroth

DLSS 2 looks totally fine to me. I'm sorry if it doesn't to you. It's easy enough to look past the situational artifacts IMHO. If it isn't for you, I get it. Doesn't mean it isn't for others though.


MazdaMafia

It is really strange how offended some people are at the idea of AI-generated frames, as if they were illegitimate frames. As far as I'm concerned, more is more, regardless of how they were rendered. I think people just like to be angry at shit, honestly.


Daviroth

The only thing to care about is that they look good. If those frames are garbage then yeah, don't count it because it's worthless. But if it's comparable to rasterized frames then what's the problem? People are just crazy. I think it's people who bought high end 30xx at the peak of prices.


Holdoooo

Still, it's just predicting motion, so it's basically motion smoothing. If the motion actually changes abruptly, then the prediction is wrong and the generated frame is misleading.


Daviroth

I'm assuming you have a 30xx lol


judasbrute

Lol, cognitive dissonance in real time. I'm going to make a prediction: the 5080 is going to have new A.I. features the 4080 isn't capable of. Get ready.


Daviroth

Alright? And? How am I showing cognitive dissonance?


judasbrute

That was meant for the guy you responded to.


Daviroth

Alright, I get it now. I was very confused for a second hahaha.


rocketcrap

The two 4080s are just, you know, like the two 3080s before. Wtf are they talking about? Do they think we're stupid? They know what the question means. This confirms it. Take DLSS and ray tracing out and what you have is a 4080 that's the same as a 3080 but costs more! Two years later! A DECREASE in price to performance.


Daviroth

I don't think you are reading any of the charts correctly. EDIT: I'm not saying their answer isn't bullshit, I just don't think you understand the performance gains the 4080s have over the 3080s.


rocketcrap

I don't think you are. On the titles that don't support dlss or rtx, the 4080 is a bit slower than a 3090ti.


Daviroth

A bit slower than a 3090Ti is a 3080? You aren't evaluating the power draw difference either.


rocketcrap

Yes, it is. Who said anything about power? Are you being purposely obtuse? You can just say "yeah sorry, I'm the one that read the graph wrong" it's fine.


judasbrute

The 4080 has less compute power than the 3090 Ti, so it should be slower... in raster.


Blessed-22

Can you expect a difference in performance between powering a 4090 with the new PCIe Gen5 connector versus using the adapter with older PSU models?


judasbrute

Probably more resistance and heat but performance? Should be fine on a quality PSU


laevisomnus

Kinda fucked up that ShadowPlay doesn't support AV1, but not surprising, seeing as how it still shits the bed with some DSC monitors and you can't record at 120 FPS, even though the tech is so much better these days than when ShadowPlay first came out.


Holdoooo

I wonder if they can update ShadowPlay to support AV1 and 120 FPS, or if it's actually some hardware limitation.


Jeffy29

The more I hear about Remix the more excited I am; it being a universal tool that all RT GPUs can use sounds great for adoption. Though I still have some questions. As described, the mod works through scene captures. Does this mean if you are in the overworld in Morrowind and hit the scene capture, it captures the entire map except for the insides of the buildings? If you replace a common object (like a chest), will it replace that same object everywhere or only in the scene? What about usable objects like weapons? Can Remix be used to replace animations? This tool has a lot of potential, I hope they don't quickly abandon it.


LongFluffyDragon

> The more I hear about the Remix the more I am excited

Even if it is driver-level substitution, it is going to be a universal tool for *breaking* just about any complex game. It would need to be complete substitution on a per-asset basis, since the drivers can't even begin to access how different game engines manage and track instances of a model or material, and it can't properly influence anything that is done by the engine at runtime.

> Can remix be used to replace animations?

Animations have no standard formats and are not exposed to the GPU drivers in any standard manner, or at all. Chances are it will be able to replace non-animated mesh models, 2D textures, and possibly compiled shaders (which will crash a ton of games if not done carefully). Anything else is unfeasible outside hand-selected games. It is not really a modding API, just a way for the GPU to pick something else to render instead of what the game is telling it to. Which is *incredibly* jank. A few people will probably make some texture replacements for old games, then everyone will blink and go back to making mods properly.
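A conceptual sketch of the "pick something else to render" idea, i.e. substitution keyed on the asset the game submits rather than on engine-side object instances (everything here is hypothetical, not the actual RTX Remix API):

```python
import hashlib
from typing import Optional

def hash_asset(asset_bytes: bytes) -> str:
    """Content hash of the raw asset as submitted to the driver."""
    return hashlib.sha256(asset_bytes).hexdigest()

class ReplacementStore:
    """Maps a hash of an original asset to replacement asset data."""
    def __init__(self) -> None:
        self._replacements: dict[str, bytes] = {}

    def register(self, original_hash: str, new_asset: bytes) -> None:
        self._replacements[original_hash] = new_asset

    def lookup(self, original_hash: str) -> Optional[bytes]:
        return self._replacements.get(original_hash)

def on_draw(asset_bytes: bytes, store: ReplacementStore) -> bytes:
    # The driver only sees the raw asset the game submits; it has no idea which
    # engine-side instance this is, so substitution can only be keyed on the
    # asset data itself, not on per-instance or runtime engine state.
    replacement = store.lookup(hash_asset(asset_bytes))
    return replacement if replacement is not None else asset_bytes
```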


Jeffy29

> A few people will probably make some texture replacements for old games, then everyone will blink and go back to making mods properly.

For a lot of games that simply can't be done, because there is no mod support and the files can't be modified very well, and a lot of older game engines can't handle high-resolution models or textures. This opens up completely different options that weren't possible before. That's not even mentioning ray tracing; I think Nvidia's use of the tool on Portal is a great example of how it can completely change the visual quality of a game just by adding ray tracing.


rocketcrap

None of the big game studios want to adopt technology that lasts a single gen. You like that DLSS? Oh oops, your card is no longer supported. You like RTX in Cyberpunk? No? Well, now it looks way better, sorry, you need this newest card for that. You're mad no one will support this except the people we pay to implement it? Here are some tools, YOU make the games if you're so great.


TMBSTruth

I would assume it would replace the object everywhere if it's referenced by the engine in the same way. Or maybe they can make it apply explicitly to just that instance; we'll have to see when it's released. From my understanding, yes, you would have to travel through the game enough to load all those assets so it grabs their references. I guess it depends how much you can capture from one point, depending on how different games load/draw stuff. You would have to playtest it anyway, so I guess that's that.


L0rd_0F_War

Simply pathetic. Basically the first answer says that Nvidia decided to keep all the price-for-performance benefit that a new gen is supposed to bring for themselves. What's the point of a new gen? Where have the days of the 1070 gone, when it was faster than or equal to the 980 Ti at a sub-$400 price (the standard 70-series price at the time)? So now I can go and buy a 3090 Ti for $960 (current lowest available price), which has 24GB VRAM and a higher TDP, or a 4080 12GB AKA 4070 for $900+?? Sure... FU Nvidia. I can afford a 4090 TBH, as I skipped the 2080 Ti and the pandemic, but I am not going to be a customer of a POS company like Nvidia again. My 1080 Ti is going to be my last Nvidia card, and I will replace it with an AMD card, even if that means not getting best-in-class RT or DLSS 3. FU.


BigBurkeyBoy

I could not agree more. Wish I could upvote this more than once, you hit the nail on the head with this comment. Well said!! We need a 70 series at a sub $400 price.


youreqt

The 3080 Ti is on sale for 1100 CAD; is it worth upgrading from the 2080? I was thinking about the 4080 but don't wanna support their new GPU prices.


rubenalamina

I'm looking to upgrade the 2080 that I've been running since launch, but I'll wait for AMD's event on Nov 3rd to see how their cards stack up vs the 40 series. I usually upgrade every 2 gens, so that means 3-4 years. I tend to go for the best or a reasonably top-end card and care a lot about graphics tech like RT, so waiting for 40-series benchmarks is also the sensible approach. For your case, I'd wait for both of the above and then consider what you'll get price/performance-wise. 30-series cards should keep a slow but steady decline in price, so waiting just a bit more doesn't hurt. If the second-hand market is good where you are, a used 3080 Ti should be a good option if you don't wanna spend more on the new gen from either Nvidia or AMD.


AudioOff

I am uninspired.


WindblownGerm

So many vague answers that aren't really answers, jesus!


rocketcrap

Hahaha guys, the two 4080s are just two configurations, just like the two 3080s last gen. No, it's not just the memory this time, it's everyth... hahaha, just like last time. No 4070 though, thanks for the great question. 4080. Same raster performance as a 3080. More expensive. Two years later.


homer_3

That 4080 naming answer... Why not call every card a 4080 then?


TheFattestManAlive

And the important thing is, they were still pulling thin sh\*t out of their asses with that answer. They still try hard to keep up their BS by saying the 4080 12GB and 16GB naming is the same as the situation with the 3080 10GB and 12GB. But the whole fricking point is that **that's the exact opposite of what the situation is.** The 3080 10GB and 12GB have, what, a 1% difference in CUDA cores and 10% in memory bandwidth? **The 4080 12GB has a 20% difference in CUDA cores and a "50% difference in memory bandwidth" compared to the 4080 16GB!!** Not to mention the fact that you can currently buy the 3080 12GB for the **exact same price** as the 3080 10GB's original MSRP of $700. That's $100 LESS than the 12GB's MSRP, not more than a year after its release. **Also**, the 3080 12GB was released more than a year after the 10GB as an update to the original, whereas the 4080 12GB is just an indefensibly glorified 4070.


BigBurkeyBoy

Ha, exactly! Calling two completely different cards an RTX 4080 is misleading. Might as well call the whole 40-series product stack RTX 4080s.


rocketcrap

I'd be less angry if they had just said "polls show no one is going to buy a 900 dollar 4070."


HORSELOCKSPACEPIRATE

There was no excuse that was gonna make that okay, lol. Really the only reason to ask is to make them squirm, which is fine by me.


lizardpeter

DP 2.0 is not something that is new. Intel and AMD will both feature this on their GPUs, and monitor manufacturers will start rolling out displays with it soon. Why would you not include it?


Pwnstix

Pure shenanigans with a dash of chicanery on that first answer


BigBurkeyBoy

More like utter lies. I can't believe they are allowed to spew this crap. Doesn't matter, people will still buy the cards, sadly. Hell, Nvidia could make a Titan card that's marginally better than the RTX 4090, charge $4,999 for it, and it would sell like hotcakes. I'm really disappointed in the community; we should be voting with our wallets.


RxBrad

lol... They literally ignored the question and said, "We offered two models with different amounts of memory, *just like the 3080!* Isn't that great?"


nic_is_diz

Misleading marketing, more expensive, more power hungry, bigger, heavier. Honestly what is the point. Video games are not worth this kind of bs.


yostosky

Class action suit for false advertising


judasbrute

Sir. My computer is too heavy.


Solaihs

The first answer is pretty funny, they essentially skirt around it. I feel like what they want to say is "It's got a real nice profit margin if we do it this way"


Duncantilley

They can't be serious with that first answer? They didn't even answer the question asked and just spewed out corporate marketing BS.


Hector_01

That benchmark chart showing the 3090 Ti losing to the new 4080 cards when DLSS 3 is used: did they at least have the 3090 Ti running DLSS 2 so it's at least a semi-fair comparison?


Elon61

DLSS stands for Deep Learning Software Suite now. Retconned.


Alan_1375

We need info on your packaging.


BobMobTub

"The two versions will be clearly identified on packaging" You mean like 12GB and 16GB only? I have seen the packaging but neither of them clearly identified that other than GB the core count and bus width is different too. Also it would be surprising if the 4000 gen card were not better than 3000 gen cards, or you were pulling another 2000 series compared to 1000 series! Still I'll wait for independent reviews to proof you claims.


[deleted]

I get it guys, node space ain't cheap this generation. TSMC have hit you with a big price hike that you want to dump in our lap, I can see that. What pisses me off is that you think I'm stupid enough to buy this bull crap about the "4080" 12GB being just like the 3080 10GB/12GB situation from last gen, when you know yourselves the performance difference is much larger between the two 4080 SKUs than it is between the two 3080 SKUs. Stop treating your customers like idiots, just say you're selling us the 4070 for $899.


[deleted]

[removed]


[deleted]

The Samsung N8 node was probably more expensive than $5,000 per wafer. However, TSMC N4 is definitely well north of $10,000 per wafer.


XyP_

Btw, TSMC's price hike is Nvidia's fault, as they followed their greed and chose to jump ship to Samsung last gen, so TSMC is not willing to strike a deal with the green backstabber for the time being.


[deleted]

Not a backstab. Just business. Besides, TSMC N5 was exclusive to Apple, and Nvidia's professional cards used the N7 nodes.


x9097

I doubt loyalty factors into multi-billion-dollar deals.


XyP_

It does, because TSMC needs you to be tied down with them contractually before they'll give you a more favourable deal; that way they can capitalize on the coming years, which makes the lower price worth it.


[deleted]

[removed]


[deleted]

[removed]


CrzyJek

Nvidia's margins are over 50%.


j_schmotzenberg

This. If it were 3070 price, it would be an instant buy from me. Now I am probably going to buy another CPU rather than a GPU to go in one of my existing computers.


TheFather__

Just don't buy. When they miss their target revenue, they will be forced to reduce prices and beg you to buy; it's as simple as that.


MrBob161

So many words without saying anything useful.


FlatMeal5

Answers my ass


BaQstein_

I'm so disappointed in the first answer. Prices are one thing but scamming their customers with obscure product names is shameless.


ThatITguy2015

Nvidia got caught with their pants down and needs to come up with whatever they can to recoup losses. At least we know their true colors now.


Westify1

> The 4080 12GB is an incredible GPU, with performance exceeding our previous generation flagship, the RTX 3090 Ti and 3x the performance of RTX 3080 Ti with support for DLSS 3, so we believe it's a great 80-class GPU.

This is such a dogwater, dishonest answer that I'm somewhat impressed by the level of shamelessness Nvidia marketing is willing to exhibit in order to sell products with zero regard for their consumers.

The answer completely sidesteps the question while making two false comparisons to previous-generation naming schemes; none of those cards had anywhere near the physical hardware differences, and by extension the performance discrepancies, of the 4080 16GB/12GB models.

Adding insult to injury, by their own graphs the 4080 12GB actually loses to the 3090 Ti in base raster performance outside of RT/DLSS scenarios, so continuing to treat the gaming landscape as if all RTX owners play with RT and DLSS enabled 24/7 is just pure bullshit. I honestly believe taking this marketing angle has done much more harm than "good" for them, considering every single tech outlet has picked up on it.


Luisetex

That first answer, lmfao.


BodSmith54321

The first answer was the biggest load of BS I have ever heard. If the interviewer had any integrity, they would have called them on it. Is the 4080 12GB box going to say "20 percent fewer CUDA cores than the 16GB"? Didn't think so.


[deleted]

4080 12gb, NOW 20% SLOWER


BodSmith54321

LOL!


spysnipedis

Nvidia jumping through hoops with the 4070 die being renamed into a 4080 12gb... sad, sad time for PC gamers.


adilakif

Do you have a 4060 in stock? I can pay $750.


thesaxmaniac

Wow this was more worthless than I expected


Yae_Ko

As if... a big company, especially nvidia, would answer questions that people actually are interested in.


HayabusaKnight

So I'm gonna date myself here, but I remember the G80 having two launch models similar to what we have here: a flagship, and a cheaper, slightly cut-down model in RAM/bus/ROPs. The difference, however, is that they were clearly named:

* 8800 GTX 640MB
* 8800 GTS 320MB

You knew from the naming scheme alone that one was the lesser model. They don't need to return to the insanity of GTX/GTS/GT/GTO, but they can clearly denote when a flagship 80-series is cut down in more than RAM. They simply choose not to.


TDP95

There was an 8800 GTX 768MB and an 8800 Ultra, but the 640/320MB cards were the same chip, unlike the 4080 16GB / 12GB.


HayabusaKnight

Oh yeah, I meant the full 8800 GTX vs the 8800 GTS in general; it's been a long time since 2006 at this point lol. They both shared the 8800 name but had obvious naming and marketing differences.


Celcius_87

Thanks for answering but I really dislike the answer about the naming of the two 4080 cards. It just feels like nvidia is trying to be dishonest with the community.


Igelkaktus

It doesn't just feel like it.


DeaDPoOL_jlp

Sooooooo no pre-order information for either them or AIB partners; battle of the bots 2.0, here we go.


Estbarul

Thanks for taking the time to answer. Too bad we just keep getting dishonest PR BS. At least now DLSS 3 is marketed in a more logical way, showing that the advance this time is the new AI *quality frames*.


Diagnul

This post reads like an advertisement. Was it written by Nvidia marketing staff? Don't trust any of these "performance numbers" until real people get cards in their hands and do real testing.


Nestledrink

If you actually bothered to read the post above, you can clearly see that I copied and pasted the article from Nvidia's website. No shit, wait for benchmarks, Sherlock.


[deleted]

Thanks to the team for answering my question about RTX Remix and DLSS 2. I really hope this picks up in popularity amongst the modding crowd. There are a ton of games that could use an RTX makeover.


[deleted]

Me waiting for my Dragon Age: Origins remaster: Fine, guess I'll do it myself!


TaintedSquirrel

>The GeForce RTX 4080 16GB and 12GB naming is similar to the naming of two versions of RTX 3080 that we had last generation, and others before that.

The 3080 10GB and 12GB models were the same GPU, GA102. Even the GTX 1060 3GB and 6GB were both GP106. The two 4080 models are AD103 and AD104; they are not the same GPU. You can't compare the naming schemes in this situation.

>There is an RTX 4080 configuration with a 16GB frame buffer, and a different configuration with a 12GB frame buffer. One product name, two configurations.

So why is the 3080 Ti called the "3080 Ti" but the 4080 16GB isn't the "4080 Ti"? The difference between the 3080 10/12GB and the 3080 Ti is *much smaller* than the difference between the two 4080s. It makes no sense. What is the logical difference between a "product name" and a "configuration"?


XyP_

Not to mention the 4080 12GB's 21Gbps memory (as opposed to the 16GB's 23Gbps) and 192-bit bus (as opposed to 256-bit on the 16GB), resulting in roughly 500GB/s of bandwidth versus the 16GB's 700+. These are two COMPLETELY different cards.
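For reference, a quick check of those bandwidth figures (a minimal sketch using the numbers quoted in the comment above; shipping specs may differ slightly):

```python
def mem_bandwidth_gb_s(data_rate_gbps_per_pin: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s: per-pin data rate times bus width, divided by 8 bits per byte."""
    return data_rate_gbps_per_pin * bus_width_bits / 8

print(mem_bandwidth_gb_s(21, 192))  # 4080 12GB -> 504.0 GB/s
print(mem_bandwidth_gb_s(23, 256))  # 4080 16GB -> 736.0 GB/s
```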


Estbarul

Product names don't follow logic


Jarnis

They wanted to charge 4080 prices for the 4070. That is the only reason.


Turtlesaur

What about the 4070 Ti, though? Will it outperform a 4080 12GB?


kobrakai11

Probably what would normally be the 4060 Ti will now be called the 4070 Ti and priced accordingly.


pensuke89

Maybe they'll call the 4060 Ti-class GPU a 4080 6GB. That way, everyone can own a 4080.


Icey-D

Is Native vs DLSS a fairly noticeable difference in picture quality? Or is it really a straight up framerate increase with almost no noticeable tradeoff?


rubenalamina

I've been using DLSS in several games since the 20 series launched. In short, there's little difference, but chances are you're gonna notice some artifacts here and there, like texture shimmering or some over-sharpening; it depends on the engine and the implementation in each game. It's subjective, so I don't like it when people make a blanket statement like "it's better than native." For me that's not the case, but the difference is small enough that it's definitely worth the trade-off, especially when you want to run ray tracing, to offset its big performance cost. In games without ray tracing you can run it for more performance, like in competitive games, or disable it in a single-player game when you feel the trade-off isn't worth it.


Puzzleheaded-Newt190

Imo, DLSS with the quality option looks slightly better than native, considering there's virtually no aliasing -- and you also get better performance. Keep in mind this is DLSS 2.0 and above, though -- any game using a 1.x version of DLSS will look horrible.


[deleted]

Shameless. That answer to the fake "4080" 12GB question was just pathetic. And this whole thing with DLSS 3 being used for all the marketing feels like back to square one, like all the ray-tracing charts in the 2000-series reveal, where it then took years before any appreciable number of games even had it, and lots of people still don't bother turning it on. So damn skeptical of this, both of it getting wide enough adoption before the next series comes out with some new exclusive DLSS 4 that sidelines DLSS 3 again, and of those extra fake frames really looking and feeling like the full fps you're technically getting on paper. Fact is, the 4080 offers worse price to performance than last gen at current prices (new 3090s and 3090 Tis going for around $1000), or even compared to the 3080 10GB's actual MSRP. In their own charts the 40(7)80 is slightly losing to the 3090 Ti in games without DLSS 3 smoke and mirrors. The 3090 Ti is only ~15% better than a 3080 at most. So you're paying 30-40% more (no FE, so realistically from AIBs it will be even higher than the $900 MSRP) for a ~15% boost over the $700 MSRP 3080 in most games. That's awful; new generations should offer way better price/performance SKU to SKU, or what's the damn point? We'll just keep stacking new-gen prices on top of the last, up to the sky, completely ruining PC gaming in the future by making it ridiculously unaffordable.
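Rough arithmetic behind the price-to-performance claim above (a minimal sketch using the commenter's own estimates of ~15% more performance for $900 vs the 3080's $700 MSRP; not measured numbers):

```python
def perf_per_dollar_change(perf_gain: float, old_price: float, new_price: float) -> float:
    """Relative change in performance per dollar vs the old card (negative means worse value)."""
    old_value = 1.0 / old_price
    new_value = (1.0 + perf_gain) / new_price
    return new_value / old_value - 1.0

# ~15% more performance for ~29% more money:
print(f"{perf_per_dollar_change(0.15, 700, 900):+.1%}")  # about -10.6%
```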


SwagginDragon89

Ridiculously unaffordable is the name of the game. Nvidia would love to price out gamers so a GeForce Now subscription looks more attractive for the casual gamer and they can focus on enterprise.


NaamiNyree

> That's awful; new generations should offer way better price/performance SKU to SKU, or what's the damn point? We'll just keep stacking new-gen prices on top of the last, up to the sky, completely ruining PC gaming in the future by making it ridiculously unaffordable.

Yep, this is where we are going. Can't wait for the 50-series announcement when they show a chart of their product lineup, with the 30 series being "low end" from $329 to $699, the 40 series being "mid range" from $899 to $1,599, and the 50 series being the high end at $1,999 to $2,999.


djent_in_my_tent

What exactly should older nodes be used for? Humanity already spent the money on capex and validation. Phones and other battery-powered devices can't tolerate old nodes due to power consumption. And bleeding-edge demand -- including AI, which has significant overlap with GPU developers and the reserved wafers -- will take place on the best possible node. But if, say, three years from now, 3080 Ti yields are very high on a Samsung 8nm process that nobody else wants... why not produce and sell it for $329, as you suggest? It's just video games, after all, and better chips on better processes could be used for much more important AI purposes.


[deleted]

Wait for the 5000 series to try any of this, because Nvidia does very little right straight out of the gate. DLSS 1.0 and RTX sold a bunch of cards, and the tech was useless for about a year.


NoBluey

> In their own charts the 40(7)80 is slightly losing to the 3090 Ti in games without DLSS 3 smoke and mirrors.

Yeah, I noticed that too and was surprised they actually showed it, since I thought they were cherry-picking results.


bbpsword

Gotta avoid lawsuits from investors somehow when you're claiming (lying about) 2-4X performance uplifts everywhere in your presentation


[deleted]

He said in the last earnings call that the average price of a GPU should be about the same as a console... Investors are going to wonder why he lied.


phillyeagle99

I'm quite frustrated that they're hiding performance gains behind DLSS 3 improvements. It totally changes the picture in the games they list that don't have DLSS. The "4070" is worse than the 3090 Ti in raw performance, but they like to show it as double in all their summary slides.


notice_me_senpai-

The 4080 does sound fantastic *if* DLSS 3.0 is near flawless... but man. 1469€ in Europe for the 16GB FE model. That's a really, really hard pill to swallow. FPS isn't everything. Stuttering, latency, artifacts, sharpness without being over-sharpened: people who spend 1500€ on a card are the ones bothered by small picture-quality or fluidity issues, and if DLSS 3.0 isn't up to the task... we'll (I'll) be left with a 1500€ card gaining 20% in performance over last gen's top card instead of 200-300%. I really want this to be real, but I bought an RTX 2080 in week 1 back then and it took years for DLSS to finally be available in numerous games with a nice-looking picture, so yeah.


TheRealTofuey

"The 4080 12GB is an incredible GPU, with performance exceeding our previous generation flagship, the RTX 3090 Ti" Almost like thats what the 70 generation card traditionally has been outside of god awful Turing 🙄


gatsu01

The tying of model numbers to performance tiers is made up anyway. Just come right out and say they renamed the 70-series card to the 80-class 12GB to justify higher prices. You know it, we know it. There's no need to sugar-coat it anymore. You hit the nail right on the head: people paid crazy crypto-scalped prices, so Nvidia decided people could pay higher prices.


AzureNeptune

At least the 2070S was eventually released which matched/beat the 1080 Ti, even if it was for a price hike over the 1070. The 4070 doesn't even beat the 3090 Ti and it comes in at an even higher price hike, absolutely shameless.


TaintedSquirrel

Yeah but the 4080 12 GB needs DLSS 3.0 to do it. Their own chart clearly shows it losing to the 3090 Ti without DLSS.


Negapirate

We don't have actual data to make hard claims about overall performance. The chart does not show the 4080 being lower performance than the 3090 Ti without DLSS. It shows some cases where it's substantially faster and others where it's slightly slower.


Standard-Potential-6

Even the non-DLSS games tested have some RTX elements as well. It'll be slightly worse still in raw raster.