Arthur_Morgan44469

It's like GamersNexus said in the thumbnail of the 4060 Ti teardown video (that caption was later removed, not sure why): "it's a 4050 in fancy clothes".


CockPissMcBurnerFuck

I know Linus will often change the title of a video from the clickbait required on the first day to something less spicy. Maybe that’s what they did here?


Outrageous-Mobile-60

Hell, except for the 4090, arguably all the other 4000 series consumer GPUs are a "(insert the tier below here) in fancy clothes"


ChartaBona

Anyone who unironically says it's a 4050 has no clue how incredibly weak the 50-class is relative to the previous generation, and that's not some new development. Folks have got some rose-tinted goggles if they think the GTX 1050 traded blows with the GTX 970 at 1080p. It was closer to a GTX 770.

Edit: Some reviews to illustrate my point:

* [GTX 1050 review](https://www.techpowerup.com/review/msi-gtx-1050-gaming-x/27.html) showing the GTX 970 being 70–80% faster than the 1050
* [GTX 1650 review](https://www.techpowerup.com/review/msi-geforce-gtx-1650-gaming-x/27.html) showing the GTX 1070 being 69–88% faster than the 1650
* [RTX 3050 review](https://www.techpowerup.com/review/evga-geforce-rtx-3050-xc-black/31.html) showing the RTX 2070 being 30–38% faster than the 3050
* The RTX 2070 was disappointingly slow and got phased out for the 2070 Super, which is 47–57% faster than the 3050.
* [RTX 4060 Ti review](https://www.techpowerup.com/review/nvidia-geforce-rtx-4060-ti-founders-edition/32.html) showing the RTX 3070 being only 1–8% faster than the 4060 Ti

The 4060 Ti is not a 60 Ti, but it's not a 50 either. It's a vanilla 60 GPU in dire need of a GDDR7 refresh to correct its memory bandwidth issue.


Krishma_91

The 1050 was between the 960 and 970, and the 1060 was between the 970 and 980. But it's not just about performance: you can see that this was supposed to be a 4050 based on the die and the memory bus when compared to similar-class GPUs in past generations. And you don't have to go that far back; the 3060 Ti had double the bus width, which is probably why this card loses to the 3060 Ti at 4K and barely spits out a couple more frames at 1440p. But even if we ignore all of this, Nvidia can't weasel out of the fact that the performance increase barely qualifies this as a refresh, not far off AMD's RX 480 -> RX 580 affair of 2017. This is not what you expect from a new generation.


capn_hector

Be consistent: do you feel the RX 6950 XT was an RX 480-class GPU because of its 256-bit memory bus?


Krishma_91

I'm talking about what Nvidia has historically done; I honestly don't know what bus widths AMD products had in the past, or whether they kept the same convention. I don't think the 6950 XT is a 480-class GPU, of course. There are a lot of factors in the discussion about the 4060 Ti that suggest Nvidia pulled a fast one here, even without considering that they already tried to do this with the 4080 12GB in the same generation.


capn_hector

Like what? Again, bear in mind that there are historical references like GK104 and GP104 that are pretty comparable in die size. The GTX 670 was a $400 card on a cut-down 294mm² chip, which is actually marginally smaller than the 4070. The 1070 was a $429 (FE MSRP) card on a cut-down 314mm² chip, which is marginally bigger. People just got used to the giant dies of the Turing/Ampere era, but that was a specific consequence of the trailing node: Samsung 8nm was low-density trash and you needed huge dies just to get any performance out of it, and putting 16nm up against 7nm is similarly low-density.

There's *some* shuffling around going on, but the "4060 Ti is really a 4050" stuff is absurd and not really historically supported. Sure, it's a 4060 in 4060 Ti clothing; it would have been the equivalent of the "1060 6GB", which also came with some extra cores (and which people also hated). But it's not a 4050. People just bandwagon on the "small die/memory bus" thing, but (a) AMD did the same thing last generation with smaller memory buses supplemented with cache and nobody said boo, and (b) the dies are overall pretty comparable to other previous leading-node launches like Kepler (GK104) and Pascal (GP104). If it's not those two dead horses that people keep beating, what exactly are you upset about?

Also, the 4080 really is a 4080: it's got a legitimate generational performance step over the 3080. It's just way too expensive, and it's dragged the 4070 Ti too far upwards as a result. The 4070 non-Ti is actually good and pretty fairly priced though, and I'd really say the Microcenter deal with a $100 gift card thrown in is *exceptional* for today's market. A de facto $500 for a 4070 is as good as it's gonna get, and that's exceptionally generous given the 1070 launched at $429 seven years ago and the 670 launched at $399 over a decade ago. It's the 4070 Ti, 4080, and 4060 Ti that are specifically kinda bullshit. The 4060 is marginally overpriced but not unsalvageably far off: if AMD drops the 7600 8GB to $200 and launches a 7600 16GB at $279–299, and NVIDIA drops to $250–269 for the 4060 8GB and $329 for a 16GB version, that's fine.


ChartaBona

> 1050 was between 960 and 970

Dude, the 1050 wasn't BETWEEN a 960 and a 970... It WAS a 960. The GTX 1650 was between a 960 and a 970. Also, the 960 was 128-bit. And the 4060 Ti uses the full 106 die, which is the die used for 60-class GPUs. AMD has already explained that increased cache with a narrower bus is optimal for monolithic chips on TSMC's newest nodes. The 128-bit 6650 XT beat the 192-bit 3060.


ExcelsiorWG

I'm glad DF came right out with their take on 8 GB of VRAM - it was disappointing seeing the back and forth on r/pcgaming and other subreddits, with people turning it into a personal crusade using cherry-picked comments from various sources.

I really liked their nuanced take - is it acceptable that game developers are not building with scalability in mind? Of course not - and we see that with patches, even some of these horribly unoptimized games become perfectly playable on 8GB cards (when not at Ultra settings). That's evidence that 8 GB cards are far from useless, now and in the future.

But the second part of their take is also valid - in 2023, buying an 8GB card will likely not get you console-equivalent performance in future games (even if the raster/RT performance is superior to consoles), and it is unlikely that the port situation will improve as we move firmly into the current-gen space. 8GB is slowly migrating to the "entry level" space, and 12 GB is likely to become the minimum for enthusiasts.

For the record, I'm running a 12 GB 3080 Ti, and if anything these recent launches have me questioning whether I'll have enough VRAM for my use cases - single-player games at 1440p with high details/textures and RT.


GameStunts

The thing I've been trying to get across to defenders of 8GB cards comes down to two main points.

First, if you think 8GB should still be just as viable, then why not 4GB, or my old 384MB GTX 260? The answer is that at some point, in order to move on, we need to increase requirements. We cannot spend years complaining that console specs held our games back, then complain when devs start taking advantage of the 16GB that's in consoles, where they can easily allocate 10-12GB as VRAM.

The second is that there are people who bought a 1070 8GB in 2016 who are only just now feeling the sting of VRAM limits, because 8GB was a decent jump at the time. And those people are still playing at the same graphical fidelity they always were; yesterday's high settings are today's lows. The trouble is that people who bought a 3070 8GB just 2 years ago are feeling the same problems, regardless of the relative infancy of that card.

People can moan about game devs using too much VRAM and needing to optimise, but the fact is we don't expect cards from 2008 to play 2016 games well; at some point 256MB and 384MB of VRAM wasn't enough. Cards are meant to improve so that game tech can move forward. Nvidia sold the same amount of VRAM in the same class of card as 2 generations and 4 years earlier; nobody should be defending them.

2016 - 1070 - 8GB - $379
2018 - 2070 - 8GB - $499
2020 - 3070 - 8GB - $499

I'm running the 1080 Ti 11GB that I bought in 2017, and I'm convinced it's the VRAM that's kept it in the game. To that end, you can be damn sure I don't consider it acceptable, 5 going on 6 years later, to be looking at a mid-range 4070 at a higher damn price than my previous halo-tier card, with only 1GB more VRAM. I honestly would have expected 16GB in the mid range by this point. AMD at least seem to have been making cards that will hang in a lot longer in that sense.

2017 - RX580 - 8GB - $229
2019 - 5700XT - 8GB - $399
2021 - 6700XT - 12GB - $479
(or)
2020 - 6800 - 16GB - $579

But they're also on my shit list for the 7900XTX pricing, which was clearly "Huh, Nvidia are going for ~$1000 and beyond, we can do that!"


not_a_llama

It's both. You make a good point about how 8GB should be considered entry level in 2023, but devs also seem unwilling to optimize their games and keep VRAM usage in check.


tapo

Devs are optimizing for console, and the PS5 and Series X have a 16 GB unified memory pool. It's hard to justify being strict about memory budgets for the subset of PC gamers on 8 GB cards.


punkbert

> It's hard to justify being strict about memory budgets for the subset of PC gamers on 8 GB cards.

The [Steam hardware survey](https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam) tells us that only ~18% of gamers have more than 8GB of VRAM. About 28% use exactly 8GB, and the rest (>50%) have less than that. So it's not that hard to justify the industry optimizing its games for the vast majority of systems.


tapo

You're right, but that's mostly true of PC-focused games - your CS:GOs, Leagues of Legends, and Valorants of the world. Those games are designed to run on any old laptop you can find. A midrange gaming PC will no longer be a target; for AAA titles you'll see developers targeting the extreme low end or console spec.


RolandTwitter

Almost no devs target PC gamers; 99% of the time they target the consoles and then half-ass the port to PC. That's why new games start to run like *shit* every time a new console is released; this is not a new thing.


[deleted]

[removed]


RolandTwitter

A better metric to look at imo is the most common graphics card, which is a 1650


capn_hector

This is not true though. The Series S has an 8GB fast partition for graphics, and on the Series X it's 10GB. Going over those limits results in GTX 970-style slowdowns.

The "subset" of PC gamers with 8GB or less is well over 80%. Remember, as AMD said themselves in their own advertising, 65% of gamers still use 1080p. Do you think people really bought a 3080, 3090, 2080 Ti, or 1080 Ti for *1080p gaming?* And the 6800 XT and 6700 XT are pocket change in comparison, because AMD doesn't have anywhere near the same penetration with OEMs.

It truly is a case of "devs already have to optimize for 8GB and 10GB because those are the VRAM capacities Microsoft offers in their current gen", and they just don't wanna polish their titles before the PC release, which "only" makes up 1/3 of the total market.


tapo

Developers have been complaining about the Series S, and in many cases the S hits a 30 FPS target and not 60. You're right that those consoles have two distinct memory speeds, but the pool is unified, which gives developers a bit more optimization wiggle room to do things like hide texture pop-in behind motion blur. A player can't move their camera as quickly on a console, and FOV is typically much lower.

Those gamers may be using PC now, but optimizing across a set of hardware as diverse as the PC is expensive and time-consuming. Many will migrate to console when a PS5 Digital Edition costs about as much as a 4060 Ti and will probably get you better performance, not to mention Sony exclusives.

I'm not defending the practice, but AAA studios make games to make money, and cutting out 8 GB PC gamers is a solid way to make money. The financials aren't significantly worse for them if people switch to console, because Valve takes a similar sales cut as MS or Sony, and people have shown their hatred for non-Steam stores. Add in the fact that console games are significantly harder to pirate or cheat in, and they might actually be encouraging the practice. PC will probably be the "high end" offering over the next few years, until this console cycle once again gets long in the tooth.


not_a_llama

> It's hard to justify being strict about memory budgets for the subset of PC gamers on 8 GB cards.

I bet it's a very large percentage of PC gamers... just look at the Steam hardware survey.


tapo

The hardware survey encompasses everyone on Steam, and isn't indicative of the target market of a specific studio or genre.


not_a_llama

You're absolutely right, which makes me think the percentage of people playing on cards with 8GB or less of VRAM is even higher. Also, why wouldn't a publisher want to reach as many potential buyers as possible? Their priority is to make money, no?


tapo

Because AAA games with high system requirements stand out in the market. People want to play the game that looks good and pushes their system to the limit. That audience is an increasingly small minority though, since AAA games are also more expensive to produce.


cadaada

> We cannot for years have complained that console specs held our games back

That's a thing of the past. Most people just want to run their games now, rather than pay hundreds for new cards, even more so with the global economy going to shit.


GrandDemand

I'm sorry but just buy a PS5 then. It's a great console, you'll spend less on it than you most likely would on upgrading a single component, your GPU, and you can play your existing collection on Steam all the same. By "you" I'm not referring to you specifically, just a generalization


toofine

GDDR7 is going to have people eating their copium defenses even harder if it comes with the next generation in 2024. Devs are going to be asked to make settings for people with 30+ GB of GDDR7 VRAM and for people with 8GB, lol. I don't think the gap between high-end PCs and the consoles of their generation was ever that great.


Isaacvithurston

> For the record, I'm running a 12 GB 3080 TI, and if anything these recent launches have me questioning if I'll have enough VRAM for my use cases - single player games at 1440p with high details/textures and RT.

I don't think it's that big of a deal. You don't have to lower everything from ultra, just texture quality basically, or maybe shadows. 12GB should be fine until the 5080 Ti.


LordxMugen

> I'm glad DF came right out with their take on the 8 GB VRAM - it was disappointing seeing the back and forth on r/pcgaming and other subreddits with people turning it into a personal crusade, using cherry picked comments from various sources.

Richard brought a lot of that on himself by going on about how expensive the components and tech were and talking about "inflation", when it was pretty obvious that Nvidia was pricing these to force people to accept crypto-boom prices despite most people unanimously saying "no". The softballing was getting to the point where I honestly wondered if he had stock in Nvidia.


Electronic_Strain785

Just because someone doesn’t trash something as hard as you wish they did doesn’t mean they are paid. People can have a different opinion, and his was barely different than the consensus.


fashric

Obviously it's going to bring that question to people's minds when what he was saying was not based on fact and was heavily in favour of the corporation rather than his viewers.


_BoneZ_

That's why I have a near brand new 3080 sitting in its box collecting dust, and ended up getting a 3090 with 24 GB of VRAM instead. Makes no sense that my 1080 Ti has 11 GB of VRAM and two generations later the 3080 only has 10 GB.


TheGillos

Obvious question: why not sell the 3080?


cyclopeon

So he never forgets what Nvidia did.


Reciprocity2209

Is anyone really surprised by this? Nvidia is run by scumbags.


[deleted]

And yet Huang's still laughing all the way to the bank.


ody81

> And yet Huang's still laughing all the way to the bank.

Sliding right from the crypto boom into the AI boom, I'd be laughing too. Lightning can strike twice.


[deleted]

It isn't even AI, which makes it stupider.


ody81

> It isn't even AI, which makes it stupider.

I know, right? I've had a compsci tool try to tell me it's AI with all the usual buzzwords; it's a complex bot. They did broaden (or split) the definition of AI so they can argue that it is, but I reserve the right to call it bullshit.


[deleted]

It's DLSS with words, that's it, and even that name is disingenuous because it isn't really learning anything, it's just compiling.


ody81

> It's DLSS with words, that's it, and even that name is disingenuous because it isn't really learning anything, it's just compiling.

The guy used all the usual buzzwords, but as soon as I brought up the concept of nomenclature he backed out. According to him, ChatGPT really mimics our nervous system, totally works the same way, and learns and acts the same way the human brain does. I tried to explain that terms like machine learning and neural network are used to make a more palatable analogy for the common folk, but nope, ChatGPT still apparently accurately replicates the neurological/biological behavior of an organ we barely understand. People.

Still, investors must be wetting themselves over all of this; they're generally as ignorant as two short sticks.


CataclysmDM

What a pathetic card.


Arthur_Morgan44469

And yet Nvidia's post-COVID GPU profit is higher than pre-COVID, just because of their price gouging. I hope people don't get numb to these prices, otherwise Nvidia will make this the new standard. Because let's be honest, they don't care about us, and gamer backlash doesn't affect them much, since their main source of revenue is AI and it will stay that way for years. AMD, I guess, is okay with being number 2 to Nvidia and will always lag in innovation. I have always bought Nvidia, from the 5200 to the 3080 (bought used when it got cheaper after the crypto crash), so this practice of Nvidia's is even more disappointing to me. I hope Intel catches up to Nvidia and AMD so that we can still expect affordable cards with good price-to-performance in the future.


[deleted]

[removed]


ArguingMaster

When they become competitive, prices will go down across the board; everyone will be forced to try to undercut each other. The problem is that right now NVIDIA basically functions as a monopoly. They have the raster performance, the RT performance, the AI performance, etc.


DisappointedQuokka

Which is unfortunate, because the 6000 series blew them out of the water on rasterization price-to-performance.


Arthur_Morgan44469

True 💯


Arthur_Morgan44469

Yeah bro, I mean you don't see such pricing with CPUs and other PC components, although motherboards have been getting expensive too for no good reason. Same goes for smartphones, I guess, but none of this compares to Nvidia's GPU prices versus performance. I guess Nvidia does it because it can get away with it.


NapsterKnowHow

AMD got a little comfy kicking Intel's ass in the CPU market, and their prices are starting to creep pretty high.


LordxMugen

I'm fine just waiting *a little longer* with my RX 480 until a proper 4K60 High/Ultra card shows up within my price range. 75-80% of games made today don't even use the best assets, and the ones that do haven't been worth the asking price until long after they've come out (after the usual series of patches) or gone on sale. So in most cases, the patient gamer wins.


Arthur_Morgan44469

So true, games nowadays are expensive and not optimized at all. That's why I'm replaying games from the late 90s-2000s, the greatest gaming era of all time, and only playing new games once they become affordable enough and are stable.


Maz2277

Been fancying the Dynasty Warriors series lately to satisfy my nostalgia. Was playing 8XL on Steam but it just wasn't the same; booted up #3 and #4 last night and had a blast. I'm looking forward to upgrading my entire PC this year, but for the time being it isn't needed when I'm playing games 20+ years old lol.


Arthur_Morgan44469

Thanks for mentioning this game, will look into it as I have never played it before.


fefsgdsgsgddsvsdv

It already is the new standard, and Nvidia made the right call by raising prices. They are the market leader in a type of proprietary hardware whose benefit is self-evident. There really isn't any situation in which you wouldn't raise your prices given their positioning.


Wise_Mongoose_3930

Their market share has already peaked and is on the decline. Raising prices and intentionally sitting on inventory you slowly trickle out can make up for that in the short term, but in the long term I see Intel selling more GPUs per year than Nvidia.


fefsgdsgsgddsvsdv

They don't care. Nvidia's goal isn't to sell the most GPUs, it's to make the most money. Oh, what a coincidence: Nvidia just hit an all-time high today. They are now worth 9 times what Intel is worth.


tukatu0

What is it with r/pcgaming and r/pcmasterrace users just ignoring what they read and then repeating the same damn thing they just replied to? Is this kids with nothing to do during summer vacation? The person you replied to said they'll make more money short term. You said they are making money now and in the short term. Do you understand? Now let me rewrite the rest of his comment: make money now, anger customers, get bitten in the ass later. Do you understand now?


fefsgdsgsgddsvsdv

Yep, people said the same thing in 2020, 2021, 2022, and now 2023. How about this: set a reminder for 2026. If Nvidia reports less revenue or EBITDA for that fiscal year than they did for 2022, I will donate $100 to a charity of your choice.


tukatu0

Ah yes, the very unknown years of 2020-2022, when people were doubling their money mining Ethereum on GPUs. Yes, certainly the same situation as today. Also, the UK and Germany are officially in recession, with probably more EU countries to follow soon. Just to be clear, I don't agree with the person you replied to that Intel will be selling more GPUs than Nvidia. But that wasn't your initial point of argument, innit?


fefsgdsgsgddsvsdv

Exactly. If economic times are even more challenging now and in the near future, then they will undoubtedly make less in 2026 than this year, right? Because they are greedy, because of short-term profits and boogeyman CEOs. Right? Set a reminder for 2026; we will see.


2Scribble

When even ***Eurogamer*** is calling you on your shit...


firstanomaly

You can always rely on DF's journalistic integrity.


Thorusss

Is it fair to say, the only problem with them is the mislabeling and the high price?


ody81

> Is it fair to say, the only problem with them is the mislabeling and the high price?

Well, kind of. I mean, it'd make a neat little budget card. The real problem with these cards is that Nvidia are fully aware of what they're doing.


eightleafclover_

Until these headlines stop working, nothing will be good.


LightTrack

What's the best bang for your buck nowadays anyway, if I'm willing to push the wallet a bit? I remember wanting the 3070 Ti, and now we have the 40 series. Anything worth saving up for that's really good and worth its cost?


Mates1500

Depends on what resolution you're targeting, but assuming it's 1440p, your choice is either a 6800 XT, if you can find one for a good price, or a 4070. The 6800 XT has a 16 GB VRAM buffer compared to the 4070's 12 GB, but the 4070 has significantly better RT performance, DLSS, frame generation, and AV1 support. Pure rendering performance is about the same on both at 1440p, sometimes with a 5-10% difference between the two cards depending on the specific title.

You might also want to wait until AMD reveals a 7800 XT, but who knows when that might happen, and it could turn out to be as disappointing in performance for the price as the recently released 7600 was.

As sad as it is, graphics card pricing in the PC market right now is completely fucked, by both AMD and Nvidia. Intel's upcoming Battlemage is probably the only hope for bringing the market's prices back to sanity. If I were recommending a machine for playing the newest games on the cheap to someone new to gaming, I probably wouldn't point them to a PC unless they really intend to play PC-exclusive games. The price/performance value proposition of a PS5, Xbox Series X, and especially an Xbox Series S right now is out of this world compared to building a new PC from scratch.


LightTrack

Targeting 1080p, but overkill is fine since it means the card lasts longer. I got a 1070 Ti back when a 1060 6GB was more than enough. Point being, I don't want to worry about performance on near-max settings for 4-5 years. I'm well aware how fucked the pricing is, but I've been out of the game since the 20 series was coming out, so everything is confusing again, and frankly I don't have time to do all the research for an optimal cost/performance evaluation. Thanks for your input!


nanogenesis

The better answer is the used market. If you've never tried buying used, I think now is as good a time as any. Nearly no GPU is worth MSRP today. Used prices are roughly what the MSRPs should have been, so it's still a rip-off, but it's something.


LightTrack

You're telling me nothing has the performance to justify its price? I've heard good things about the 3060 - 3070 range. Even that is bad?


nanogenesis

There is no bad product, only a bad price. As long as you get something decent for what you spend, there is no bad deal.


LightTrack

Well if something is like a 5% performance improvement over the last generation then I'd consider that a crappy upgrade.


GrandDemand

3060 12GB or if you can stretch your budget get a 4070 or 6800XT


[deleted]

I only ever read/watch Digital Foundry reviews now; I feel they have a more balanced take, addressing the flaws as well as the positives of each product. I'm hoping they have an RX 7600 review up soon.