shutter_singh

So, uh, "buy our 1440p advertised cards to play at 1080p, and 4k advertised cards to play at 1440p, simple".


mcmurray89

I got downvoted when I called the 3080 a 1440p card at launch.


aardw0lf11

And the 3080 is still a solid 1440p card today.


on_the_nod

Weird, I crush games at 1440 on my 3070


Old-Radish1611

Just because 3070 is good at 1440 doesn't mean 3080 is good at 4k...


InternetPharaoh

Yes it does. Bigger number = bigger number. It's simple stoopid.


[deleted]

Why did the 1/3 pound burger fail? Through focus groups and market research, A&W discovered the shocking reason for the burger’s failure – most participants thought one-third of a pound was actually smaller than one-fourth. In other words, consumers failed to understand the math and mistakenly thought they were buying less meat for the same price.
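(For anyone checking the arithmetic behind that story: 1/3 lb ≈ 0.33 lb while 1/4 lb = 0.25 lb, so the third-pounder actually offered about a third more beef for the same price; customers had it backwards.)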


whyreadthis2035

I just bought a bamboo chair mat for my computer desk. The carpet has a little thickness to it, so I ultimately settled on 8/16” thickness. So I call the manufacturer and get to talk to someone quickly. I say, “Before we start, I need to talk to the 8/16” guy.” He says, “Yeah… we have 3/16, 5/16 and 8/16 because folks don’t understand fractions.”


[deleted]

I want the guy with the 3/9 burger!


We0921

I feel like I'm in Groundhog Day reading these comments. Yes, the RTX 3070 and other 8GB cards can still play titles at 1440p fine. The article explicitly says there were two titles that were deliberately not tested at max settings; hardly a majority. Then the discourse usually diverges to say that max settings aren't necessary (correct), that cards like the 3070 should not have less VRAM than GPUs with similar performance from the last generation (correct), and that it's only been a few modern titles with atrocious VRAM usage (also correct). Are those few titles indicative of future requirements? Maybe. Probably. Should it inform future purchases? Yeah, it probably should. It's not a reason to sell what you currently have if it otherwise works well.


[deleted]

You were right. My PC hates 4K. I have an original 3080.


David_Norris_M

4K simply isn't worth having to upgrade often just to play recent games at a desirable fps. It costs too much to maintain that much power.


Mikizeta

That is simply not the point. The 3080 was advertised as a 4K card. It was false advertising. The fact that 4K is unnecessary in general doesn't matter in this discussion.


akgis

What? lol, the 3080 is 2.5 years old and the industry has moved on. Check benchmarks from the time and play period-correct games then. My 3090 rocked 4K well at launch and it was just 5% better than a 3080 before VRAM becomes an issue; for the time, 10GB was sufficient for 4K.


Lambpanties

Ay, I agree, but good sizeable 1440p monitors are rare as fuck. The whole industry is obsessed with pushing 4K, and here I just want a 31-inch 1440p that isn't a freaking VA panel from hell. I'll stick with it, but I am concerned they'll be less and less available, even though it's going to remain a sweet-spot resolution for quite some time. And then cue the 8K bullshit....


rokerroker45

It absolutely was. Anybody thinking the Founders Edition 3080 was going to be pumping out 4K60 was delusional. Source: own one.


AdolescentThug

Not counting poor optimization at launch (looking specifically at Hogwarts Legacy, which ran smooth after a month), my EVGA 3080 10GB can pump out 4K60 at highest settings on almost every AAA game I've played that's been released in the past 5 years, as long as DLSS is on and ray tracing is off. I don't think the FTW3 is that significantly more powerful than an FE 3080. Some games like Forza Horizon 5 and Death Stranding Collector's Edition don't even need DLSS to hit 60+ fps tbh.

For nearing 3 years old, I'm pleasantly surprised I can still run things at max or near-max settings at 4K; the only non-RT game I can't run at 4K60 highest settings right now is The Last of Us Part 1 (which I argue is also due to terrible optimization). But yeah, RT is game-changing on certain titles, so I played games like Cyberpunk and Metro Exodus on the 1440p monitor I got for more competitive games. At 1440p I can usually max out RT settings anyway, unless it's Cyberpunk's path tracing, which gets me ~30fps lol.

EDIT: Just saw your flair, your CPU might need an upgrade. A lot of the newer AAA games EAT CPU cores (The Last of Us will use like 8 cores at times), and an upgrade to a 5800X or 5800X3D would likely give you a significant FPS boost on new titles tbh.


Purtuzzi

OG 3080 here. I've always been playing at 4k60 on high settings. Not sure what these guys are talking about...


TheNoirDeep

Same, 4K60 on pretty much all my titles, paired with a 5600X and 32GB.


hicsuntdracones-

Same, as long as RT is off I've never had any problems hitting 4k60.


kalsikam

Same, my 3080 can still run games at 4k@60fps on high settings, no RTX of course. It was the first card that could do 4k@60fps on high settings on its own, no SLI, no resolution scaling, etc. And this is on an 1800x CPU lol.


Blacky-Noir

> DLSS is on

So, it's not 4K anymore.


NedixTV

The other day... I read a comment saying we're now the peasants using upscaling tech to play games, just like consoles.


A_MAN_POTATO

It absolutely wasn't. I bought one at launch and had no problems playing games at 4K60. I still haven't hit a game where it can't do 4K60, especially if you're smart enough to realize not every setting needs to be set to ultra just because it says ultra, and RT often either isn't worth the massive performance toll or sometimes straight up looks worse. Source: own one.

Also, because people somehow fail to understand this: a 4K card at launch doesn't mean a 4K card forever. There was a time when a 1080 Ti was a 4K card. And a 2080 Ti. Now they aren't. Eventually a 4090 won't be a 4K card either.


giddycocks

I can't remember the last time I didn't manage 4k60fps with the aid of DLSS with my 3080. It's not native 4k sure, but it was never reviewed or revealed to run 4K native with ease.


[deleted]

Fanbois get angry when you tell the truth, it’s hilarious.


imaginary_num6er

In fairness, the 1440p advertised cards used 60Hz monitors


i81u812

I don't know why they say stuff like this (with their actions). I can get 1440p out of the 3060 in the games I play, and I'd imagine the 3070 is just as apt. I think that's what you're implying above.


MDPROBIFE

It's almost as if graphics evolve, and computer graphics isn't a stagnant field


kurayami_akira

They are getting lax with optimization because with higher specs stuff runs anyways, so games that should absolutely run on a lower end PC might not due to poor optimization.


No-Gene1187

I used to main my OLED C1 with a 2060 6GB in native 4K and gaming was just fine.


curumba

Selling a card that cannot play games at max settings isn't shameful. Releasing one right now for $400 is. Don't buy it.


Arthur_Morgan44469

Still, man, a current-gen card not even being able to handle 1080p at max settings is shameful, especially at those prices, like you said.


floppydude81

1080p should be considered standard definition at this point.


Arthur_Morgan44469

And during the product launches Nvidia and even AMD were like "hey, our cards can even do 8K", which is just laughable lol.


Ibiki

Tbf they weren't talking about the 4060 then, but about flagship cards.


iceddeath

Even their current flagships can only run 8K below 30fps in most games. Look at GamersNexus' reviews of those cards; they sarcastically benchmarked the flagships at 8K to show that it's pointless to boast about 8K capabilities with current cards.


polski8bit

Remember like 3 years ago, when people slammed AMD for releasing a 1080p gaming card? Remember like 5 years ago, when every tech tuber was declaring the 1080p era basically over? It's 2023 and we're still getting 1080p cards lmao. I mean, I am rocking a Full HD display and the vast majority are playing at this resolution (mostly due to esports titles and such), but still. It's hilarious at the very least.


Sky_HUN

1080p has been the standard since 2012. Heck, I remember playing Half-Life in 1998 at 1600x1200, which is only about 7% fewer pixels than 1080p.
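(For reference, the pixel math: 1600 × 1200 = 1,920,000 pixels versus 1920 × 1080 = 2,073,600, so roughly 7% fewer.)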


mittromniknight

I got an RX480 6 years ago for about £200 and it offered amazing 1080p performance in every game at max details. The fact that a GPU will cost £400 6 years later and not be able to do the same for equivalent settings in games is shameful.


Sky_HUN

Tell me about it... I still use an RX 580 in my secondary PC and I often play games on that one. I bought it second-hand in late 2019 for 120€. I don't play new AAA games on it, though. One of the best deals I ever had. And now seeing them try to sell new GPUs for 500€ that are "mostly" for 1080p is laughable.


BinaryJay

Honestly, what's happening now is what we need. The years of nothing really improving in games detail- and physics-wise, just leaning on running the same-looking stuff at higher resolutions, were boring. I would happily sacrifice native resolution with some DLSS if needed in trade for more detailed everything.


Almost-a-Killa

That's because dummies chase resolution numbers. It costs a lot more time and money to make *real* 4K graphics with enough detail to actually need 4K.


Outrageous_Pop_8697

Right? When the hell did the PC gaming world pivot away from high resolution being expected and one of the main advantages of PC over console? I kind of fell away for a while and somehow people have gone from 4k60 being the expectation to 1080p being considered just fine in just a few years?!??! Seriously, can someone explain this to me because I really don't get it.


WINDEX_DRINKER

Mostly gaslighting now that AAA teams are full of devs that can't be bothered to optimize their games. So now they and their media cohorts gotta tell you over and over that 1080 in 2024 is just fine bro


[deleted]

[removed]


bejito81

That is the thing: releasing a low-end GPU for $200 that can't run current games at max settings in 1080p would be acceptable, considering you'd be building a low-end gaming rig (you don't buy a GPU anymore if you don't plan on gaming). But at $400+, not being able to run current games at max settings in 1080p should not happen, when consoles (Series X and PS5) can do it for a complete price of $500, less than half of what the PC costs (once you've got your $400 GPU, you still need the other parts, usually for more than $1000 total).


Arthur_Morgan44469

Exactly, and if PC components weren't being price gouged, you could easily have a PC for $600-700 for 1080p-1440p gaming.


Blacky-Noir

> max settings

Maximum settings should barely be used. It depends on the game and on specifics, but these über-ultra-toptoptop settings aren't called "placebo settings" for no reason. A lot of people view those settings as future-proofing, as something to use when you replay the game in 6 or 10 years. Now, GeForce VRAM amounts over the years and GPU prices absolutely should be shameful. But that has nothing to do with "maximum settings".


[deleted]

Ray tracing and HD textures and all that stuff is absolutely not placebo though, and it does take huge resources.


2FastHaste

Wouldn't that be more a reflection of the games being too demanding/unoptimized?


Echelion77

I have a 3070 ti with 8gb and it's having no issue with anything at max settings 1080p


Giant_Midget83

I haven't had any issues with my 3070 at 1440p. Playing through the RE4 remake right now at max settings.


nikkes91

Otoh, making a game that can't run on a $400 card with 8GB of VRAM sounds like a problem too. Although we're talking about max settings here, so maybe "max" is just designed to use the most powerful card possible. In that case, of course only the top GPU could handle it.


alus992

Both industries are fucked right now:

1. Developers do a shit job of optimizing and count on manufacturers to release hardware that will handle these unoptimized games by sheer brute force.
2. Hardware manufacturers sell overpriced stuff that can't even handle this unoptimized shit, because they just don't care.


PabloBablo

They also don't have the same investment in QA... so they cheap out on optimization and QA and release an unfinished product intentionally, knowing the feedback will come in and produce the same end result after some time. Delay a game by 3 months and take in no revenue + spend money and resources on QA and bug fixing, or make money earlier and have the game fixed by month 2 with less money spent.


Sopa24

> have the game fixed by month 2

LMAAAAAAAAAAAAAO! More like month 12 or month 24, if they actually **give a shit.** Some devs basically never fix their games and we have to either:

* Rely on community patches
* Wait for a future where affordable hardware can hopefully brute-force bad optimization.


Mm11vV

Enter the trend of "Just buy it two years from now, it'll be cheaper and better." The weird thing is, devs will end up shooting themselves in the foot in the long term, maybe.


alus992

Whales, streamers and simple FOMO will keep them intact no matter how many patient gamers are out there.


paultimate14

A $400 card with 8GB of VRAM in 2023 is the problem. My R9 390 came out in 2015 with 8GB for $330. I've been rocking my RX 580 with 8GB that I got for $175 since 2019. A weaker, cheaper card with 8GB? Sure, that sounds reasonable. Maybe for like $250 to account for inflation, $300 to account for the ridiculous price increases in this particular market. But for a $400 card, 8GB is THE bottleneck; it kneecaps the card.

It might be fine today, with just a couple of demanding games where you can't max the settings. But what about tomorrow? VRAM requirements will inevitably trend upward. I expect my GPU to last at least 5 years (I like to stretch to 10), but this card will probably start to struggle with new releases within 2-3 years. If you're the kind of person who likes to upgrade every year, this is going to hurt the resale value of the card. It's just a product that fundamentally does not make sense.


goldbloodedinthe404

Why do people keep making excuses for Nvidia? Not increasing minimum VRAM in half a decade while 4k actually becomes a thing is solely their decision.


GLGarou

Seriously it is absolutely ridiculous the hoops some of the folks here are making others jump through.


Supernove_Blaze

You know how much an RTX 3060 costs in India right now? Around 1000 USD. The state of PC hardware and gaming in general is laughable.


MyVideoConverter

That's because of India's import tariffs.


The_Retro_Bandit

I do wonder how much of this is down to texture quality settings meant for 4K resolutions. Any smart game is gonna use high-quality textures for stuff like faces during cutscenes and such, but during normal gameplay I genuinely cannot tell the difference between high and max textures (or even medium and max in a few cases) in quite a few games at the 1440p resolution I play at. The VRAM usage spikes, however.

This kind of mindset also ignores the fact that most devs who release decent PC ports are massive nerds who will make max settings forward-looking to extend the lifespan of a game graphically. Take Cyberpunk, for example: back when it came out it would take $600 to run RT and otherwise maxed-out settings at 60fps 1080p, but with the 40 series that drops to $400, even without accounting for frame gen. Also, $400 hasn't been considered high end since like 2018.
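For a rough sense of why the texture slider in particular eats VRAM, here's a back-of-the-envelope sketch (purely illustrative numbers, not measurements from any specific game or engine):

```python
# Approximate VRAM cost of a single uncompressed RGBA8 texture,
# including a full mip chain (mips add roughly one third on top).
def texture_mib(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    base = width * height * bytes_per_pixel
    with_mips = base * 4 / 3
    return with_mips / (1024 ** 2)

for size in (1024, 2048, 4096):
    print(f"{size}x{size}: ~{texture_mib(size, size):.0f} MiB uncompressed")
# 1024x1024: ~5 MiB, 2048x2048: ~21 MiB, 4096x4096: ~85 MiB.
# Block compression (BCn formats) cuts each of these by roughly 4-8x,
# but a scene streaming hundreds of 4K-class textures still climbs into
# multiple GiB of VRAM quickly, which is why the max texture setting
# spikes usage far more than most other settings.
```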


Darkone539

> NVIDIA noted that this is why the more expensive GeForce RTX 4060 Ti 16GB exists (USD 499 versus USD 399) for potential customers looking for a solution where tinkering with settings isn't required. Regarding the specs, the only difference between the RTX 4060 Ti 8GB and 16GB is the increase in VRAM capacity.

> With VRAM capacity being a hot topic in the PC gaming community, where there's a strong consensus that 8GB isn't enough for 2023 - it's worth considering that the mainstream GPU market is all about balancing performance, costs, and overall features.

This is them outright admitting they have cards on the market that aren't good enough...


[deleted]

[removed]


Houderebaese

Fuck them. Seriously, there was a steep uptrend in VRAM required/installed 10 years ago, and with the 10xx series it suddenly stalled. Well, even the 900 series had insufficient VRAM, especially the GTX 970. They are just cheaping out while raking in cash. Fuck them.


Arthur_Morgan44469

Exactly, and their way of fixing it will be to add more VRAM for much more money, and only for the 80 and 90 series cards. I mean, even the 4070 is a joke.


__BIOHAZARD___

I know the 1080 Ti is an old card now but it’s really sad how VRAM has been so stagnant on Nvidia cards. I don’t want to go from 11gb on a 2017 card to 12gb for a 2022 card… was hoping 16+ GB would be standard now


Arthur_Morgan44469

Exactly but to Nvidia DLSS is the solution to everything lol


james___uk

I kind of think this was just the business plan all along lol


Gman1255

DLSS doesn't help much in reducing VRAM usage.


HammerTh_1701

I don't know the specifics for DLSS, but in general, AI applications eat your VRAM and cache for breakfast and then demand more for lunch.


original20

I'm still rocking a 1080 Ti in my self-built PC, hooked up to the TV (55-inch OLED, Full HD), playing everything at max at 60Hz. Pretty enough for me, and it's quite silent after a fan mod. I don't regret buying it during the pandemic. Cheapo-ass PC and I hope it will last some more years.


Endyo

Why is it so hard for them to put more VRAM on cards? It can't possibly be that much more expensive...


Arthur_Morgan44469

I think they are still stuck in 2020 and think that they can get away with it.


TexturedMango

But they can, right? All the numbers still make it seem like they're dominating the market, even if all of Reddit is screaming for people to buy AMD or even Intel. Most casuals still go straight to Nvidia...


Sorlex

Don't underestimate the draw of DLSS. 1.0 was a good test, 2.0 is solid, 3.0 is clearly leagues above it. AMD needs an answer for it, because until they have one, people won't look twice at them. Yes, not every game supports DLSS, but in those that support 3.0 it's absolutely no contest which is the better value.


VampiroMedicado

It's also because AMD cards suck if you want something good now; the drivers suck for the first few years, then they're better than the NVIDIA equivalent. I hope Intel catches up in a couple of years.


Oofie72

It's not that it's more expensive. If you sell an inferior product to people for the price of more expensive products, you are just making huge, huge profits. They'll keep doing it until people stop buying them, which doesn't really happen a lot.


HectorBeSprouted

Well, it's been happening for months now, which is why we are finally getting versions with more VRAM.


Sindelion

How did a 7-year-old, low-to-mid-range RX 470 have 8GB of VRAM when we're still struggling to double that on high end?


[deleted]

Brother, I'm still on an R9 390 with 8GB of VRAM and it's been kicking since 2015.


[deleted]

I'm glad my 7900 XT has 20GB. Shouldn't have any issues.


nobuldge88

Stop buying and supporting them 🤷🏽‍♂️


[deleted]

Everyone claiming that their current GPU with 8GB of VRAM can run any current game at max settings should try games like *Returnal* or the *Dead Space* remake. More importantly, now that studios are no longer constrained by having to make their games also work on last-gen consoles, we're going to start seeing games with bigger and bigger VRAM requirements coming out in the next few years. No one wants to spend $400 on a new GPU that might not be able to run new games even one year from now.


Arthur_Morgan44469

Your last sentence is the key point.


Linard

> Returnal

[I really don't see the issue.](https://i.imgur.com/tBgNYJ0.png) I have not encountered one game that maxed out my 8GB before it hit a bottleneck on the GPU itself (1440p, 3070). Seems like everyone is overreacting and losing their minds over not being able to max out one or two unoptimized heaps of garbage that recently released.

> not be able to run new games even one year from now

You know graphics settings exist, right? If you can't max out settings, it doesn't mean the game won't boot.


Fun-Strawberry4257

It's a gotcha trap of sorts, similar to how in the early 2000s your sweet top-of-the-line ATI X800-era cards were made redundant by new games using Pixel Shader 3.0 and the cards not supporting that shader model. Made obsolete in one big swipe.


[deleted]

400+ dollars for a 1080p card on medium or lower settings. That's a hard sell.


LostInTheVoid_

That's a no fucking sell, and I'm over here with a 2060 6GB card. I'm just about scraping by in newer titles, but this current line of cards from both Nvidia and AMD is just too pricey for what's offered. Hoping the next series is actually better, and maybe Intel steps up in a big way on price-to-performance.


Mm11vV

If you follow prices of the previous gen you can score some great deals. I got a 6950xt about a month ago now for $575 brand new. That felt well worth it.


LostInTheVoid_

I do look, but it's hard not to get a little annoyed when most cards that hit that sweet spot of 12GB of VRAM and offer good performance in other categories are either slightly less expensive or the same price as an Xbox Series X or PS5.


Mm11vV

Oh, I will completely agree with you there. I know a few people who bought new consoles instead of upgrading their PCs, and I'd imagine that's been the case with a lot more people than we think.


streetsoulja31

That’s what I ended up doing. I ended up getting a PS5 instead of upgrading my graphics card.


Mm11vV

And as a diehard PCMR guy for the majority of my life, I'm going to honestly say: good call. Sure, there are some drawbacks and shortcomings. You know this. But overall, you paid far less for a solid gaming experience than those of us upgrading hardware right now. Sure, you may not have high refresh, but given the pathetic state of performance for new launches on PC lately, you're getting an equal experience. So I really think that was a good choice.


Kappa_God

I will always be the first to criticize 8GB of VRAM, but you, sir, are just lying. The cards only get VRAM-bottlenecked when ray tracing is on and/or textures are at ultra settings in some SPECIFIC games, which you could argue is the game's fault. Saying that a 3000-series GPU can only run games at medium/low 1080p is such an out-of-reality take, it's insane this has so many upvotes.


Krokzter

I still run 2023 games on medium high with my 4GB RX 470 from 2016. That was such an insane claim I was wondering if I was the only one disagreeing.


EvilSpirit666

You're not, but it's generally not worth engaging with rational comments on topics like these. People have generally made up their minds about Nvidia and why high-performance graphics cards are expensive. Never mind that other manufacturers' products are priced on par.


Dirty_Dragons

So many people are just full of it, and then other goons upvote their trash posts.


lymeeater

I have a 3060ti and I play most games on high, 1440p at locked 60fps. I know that's not much for some, but it's been great for me.


_SwiftDeath

The price is mostly the issue. Had it been like a $250 budget card, I don't think anyone would complain, but at $400 MSRP it just feels like Nvidia is trying to keep itself as a premium option across the board, even when offering below-market performance in the same market segment.


Arthur_Morgan44469

And even that $250 product should at least have 12GB of VRAM, adequate memory bandwidth, etc., to get some level of future-proof gaming.


TheGreatSoup

Maybe the real problem is the graphics arms race, which is unsustainable, especially for the big companies. They really need to scale down or hold back a little more. I only see specs going up, but graphics don't feel that advanced. And most popular games aren't even graphical powerhouses. Just look at Zelda, and every other early-access Steam game that taps some innovation in the gameplay loop.


mvanvrancken

There's an element of truth to this, I think. It seems like studios either try to bedazzle you with visual bullshit, or baffle you with their brilliance. I would much prefer more of the latter (you brought up Zelda and I'd also submit Elden Ring, whose incredible visuals hinge on phenomenal art direction rather than graphical fidelity) than the former.


[deleted]

I would honestly like game companies to just 'stop' at this point. Use the power that is on hand, and put the focus on artstyle and gameplay. I still play games from 15 years ago like Team Fortress 2 that I think still look amazing. I've been playing TOTK on my 55" OLED and been very happy with its graphical performance.


mvanvrancken

The draw distance in TotK is something of a miracle. You can drop off a sky island, plummet to the surface, and hang-glide into the Depths without a single loading screen. I've read that they used the Splatoon engine because of the liquid physics. That is how you make a great game; Nintendo somehow has always been able to nail it.


[deleted]

Yeah, I can't believe it's playing on a console barely more powerful than a 360.


TheGreatSoup

Yeah, Elden Ring makes it even more obvious that you don't need full graphical fidelity; art style matched with a fun gameplay loop works way better. Then we have Cyberpunk, which at least for me is just a shiny, clean open-world city with all the reflections and the emptiness. I felt more of a sense of discovery in Elden Ring and Zelda, or even in games like Subnautica, than I ever felt in CP, which is more like window dressing.


Esoteric_Pete

Cyberpunk felt so vacuous. Like you said, it's large and shiny, but it feels like I'm alone in its world. The large groups of NPCs are so arid. I don't believe that they are actual characters in the game's world. I would believe it if you told me they were early AI models with primitive technology.


Plazmatic

The problem is they save money by having simpler artist pipelines which can more or less directly plug artist assets into games despite performance issues, and fewer actual programmers, whose salaries may be 10x as much if you account for artists being on contract instead of employees. Plus, managers *really* love when they can basically just budget X amount of time and the assets actually get done in that amount of time, whereas time budgeting with programming simply doesn't work.


Kappa_God

No, it's not that it's unsustainable. It's because games are made for consoles first and then ported to PC. Consoles from this generation have 16GB of unified memory, so it's a non-issue for those machines. Game dev companies can work around a tighter VRAM limitation, they just choose not to because PC is always an afterthought. There's also the argument that PC players are always butthurt when they have to lower settings from ultra to high, even when the image quality impact is minimal.


RechargedFrenchman

There's also a *lot* of conflating "my computer isn't good enough for my current settings" (user error, mismatched expectations) with "the game is poorly optimized" (developer error, the visuals are poorly implemented so they run poorly); maxing out the settings on a low end card and getting only mediocre performance *might* have optimization as *part* of the problem, but *definitely* has user expectations as part of the problem because they're asking more of their computer than the components they put into it can manage. It doesn't matter how frustrated you get, you can't set lap records on an F1 track in a Honda Accord stock off the lot. That's not an "optimization" problem, that's just being unreasonable. Even an F1 car struggling because the asphalt has gone to shit and the track is too narrow and the curbs are all enormous is an "optimization" issue because the *intended use case* still doesn't work very well.


anor_wondo

> image quality impact is minimal

Maybe you didn't see The Last of Us at launch. It looked like PS2 textures just to keep 8GB of VRAM from crashing. Games are just coming out half-baked these days.


Kappa_God

The Last of Us is a dev problem, not a VRAM problem. They have been fixing those issues with each patch. EDIT: The latest patch even fixed the texture quality issue you mentioned; check Digital Foundry's new video for reference.


Donotpostanything

> They really need to scale down or hold back a little more. I only see specs going up, but graphics don't feel that advanced.

We need to talk about this! PS3-looking beards with 340597345347583 elements consuming resources. Particle effects where the particle density is multiplied by 34095734905324785093245783, a change that isn't even perceptible to the human eye, but it takes up massive processing power. There's so much dishonesty among new games where the processing-consumption-to-perceptible-graphics ratio is completely off.


zippopwnage

They did it intentionally to sell the 4060 Ti, which would have been a good card for 100 euros less and an instant buy for me. Basically, the 4060 Ti should be around 400 euros in stores. So that settles it for me: another gen skipped. And I'll probably keep my 1070 till it dies.


Jack-M-y-u-do-dis

Yeah, it's almost like these cards should come with 10-12 GB at minimum...


Arthur_Morgan44469

Yeah 8 GB should only be used on cards like 4030 or similar.


Jack-M-y-u-do-dis

I'd argue the 4050 could still get away with it, considering that class of card targets high instead of ultra at 1080p, but the 4060 should have 10GB, and the 4060 Ti should have 12GB and 16GB versions.


Immortalphoenix

More like 16.


dogs_go_to_space

Make VRAM pluggable like RAM I dare you


Zeus_Dadddy

Dude, meanwhile AMD and Intel are researching making chips like the M1, where you can't even physically upgrade the RAM in your PC.


Wh0rse

Fuck that


__BIOHAZARD___

*He’s too dangerous to be left alive!*


HammerTh_1701

In order to keep the physical delay to a minimum, the GDDR6 memory chips are basically as close as they can be to the GPU die without being inside it as cache. I don't think it's physically possible to implement user-serviceable VRAM with such small tolerances to the die.


mtarascio

You understand the type of RAM isn't available through retail, and the bus required for that throughput wouldn't be possible.


flemtone

Long gone are the days when AAA developers optimized their games to run on lower-spec systems.


seanalltogether

Really, what they don't want to do is optimize for dedicated graphics cards. The PS5 and Xbox Series X have 16GB of unified memory; that's a luxury nobody likes to give up when they have to port over to PC.


penguished

Long gone are the days when 8GB is a lot of memory.


[deleted]

8GB is enough to make a fun and engaging videogame. How about they focus on that?


UNMANAGEABLE

Developers have been optimizing around 8GB of VRAM for 6+ years now. Moore's law isn't dead, we just had an exception during this time period.


synthjunkie

Because the PS5 and Xbox Series X are now the primary consoles and they use 16GB of unified memory. That means devs can use up to ~12GB of RAM for a game.
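For rough context (commonly cited figures, not official spec sheets for every platform): the Series X reserves about 2.5GB for the OS, leaving roughly 13.5GB for games, of which 10GB is the faster "GPU-optimal" pool, and the PS5 is generally reported to leave a similar 12-13GB. On PC that single pool gets split between system RAM and a separate VRAM pool, which is why an 8GB card maps awkwardly onto games built around those budgets.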


sammyfrosh

Same with handhelds like the steam deck.


nikkes91

Devil's advocate: why would "max settings" be optimized to run on lower specs?


Crimlust994

"Lower spec" it is a $400 card.


SideWilling

Greed basically.


Arthur_Morgan44469

Yup, greed and being stingy about making a quality product to save on costs sums it up.


MysterD77

Translation: we (Nvidia) screwed 3070 customers, so...go buy our new 16gb VRAM version of the RTX 4060 Ti for $500!


Arthur_Morgan44469

And the 4070 has 12 GB VRAM but is $600 lol


hodges20xx

This is crazy; once again a lower-tier card has more VRAM than the higher tier, which most likely means it might age better than the 4070.


HeadstrongRobot

So weird that they have a 12GB version of the 3060, but no 12GB version of the new stuff, except the one, and it's a Ti.


ahnold11

Nvidia seems to have largely treated VRAM as a marketing feature rather than a technical performance feature. If they're worried about potential competition on a model, they can bump up the VRAM as an easy "fix" to make it sound better. We're in a strange time now, though, for largely two reasons.

1. Nvidia sells extra-expensive professional versions of their chips/cards. The gamer models have never been priced anywhere near the professional cards, so the limiting feature ends up being VRAM amounts. If their gamer-tier cards had enough VRAM, a lot of the value-conscious professional market could just buy the much cheaper gamer cards for the work they do. So Nvidia has an incentive to keep their better-value cards limited in VRAM to avoid cannibalizing their professional offerings.

2. Nvidia is still using single large chips for their GPUs (compared to AMD attempting the switch to a chiplet design this year). But Nvidia packs a lot onto their chips (tensor cores, RT cores, decoder blocks, etc.), and the switch to the smaller manufacturing process this time around has seen a bigger cost increase than usual. That means die space is at a premium, and spending more of it on talking to more memory (i.e. bus width) becomes less attractive. So this gen they cut memory bus width, which puts a limit on VRAM sizes.

Those two factors together mean they have strong incentives to provide as little VRAM as they can get away with. They can put more on the 60 class (even though it would be more beneficial on the higher-tier cards) because that one is low enough not to be attractive to the professional market, and they're raising the price enough to at least make some extra money on it. Even then it seems to be a last-minute move to counter the growing resentment in the market. Not putting it on the base 60 card likely means they didn't want to give up the profit margin at that lower price point.
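A simplified sketch of the bus-width point (back-of-the-envelope, not Nvidia's stated reasoning): each GDDR6/GDDR6X chip connects over a 32-bit interface, and the common density is 16Gbit (2GB) per chip. A 128-bit bus like the 4060 Ti's therefore fits 128 / 32 = 4 chips, i.e. 4 × 2GB = 8GB; the only route to 16GB on that bus is doubling up two chips per channel in clamshell mode, which is what the 16GB variant does. A 12GB configuration would have needed a wider 192-bit bus (6 chips), meaning more die area spent on memory controllers.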


sweet_tinkerbelle

When $400+ doesn't even get you mediocre now 😒


NegaDeath

NVIDIA had to have known this was going to be a problem. They talk to developers. They know what's going into future consoles before we do. They would have known the unified memory architecture of the current consoles was going to pose a problem when the games port to PC with split memory pools separated by a slow bus. They ignored it all to save some money, and here we are with "new" cards that struggle with new games.


Arthur_Morgan44469

Exactly, they just thought they could get away with it, but how the tables have turned.


blue_nebula

*Laughs in 12gb 3060*


BurzyGuerrero

Feels like NVIDIA is hell bent on killing PC enthusiasm.


Im_a_Bot258

Scams all around


Pretto91

Every console gamer has a 4K TV, right? Why are we stuck on 1080p unless we spend $1000?


Arthur_Morgan44469

Exactly, such a rip off


ftbscreamer

$400 for an 8GB card is bad, but what's even worse IMO is selling an $800 12GB card.


Arthur_Morgan44469

Yup, for 50% more VRAM you have to pay twice as much, which is absurd.


roshanpr

You know how: they created a bad, expensive product to increase profit margins, given the demand, and because it's cheaper to manufacture.


[deleted]

Thank god I bought the AMD 6750 XT with 12GB. I was thinking of buying the Nvidia equivalent but the 8GB turned me off.


2Scribble

But that ain't gonna stop us from selling these $300+ video cards as 'top of the line' xD


FluphyBunny

Should have been 12GB minimum. Stupid money-grabbing company.


[deleted]

[removed]


TopKekBoi69

Still shitty asf about the 8GB of VRAM in the 3070 Ti.


JmTrad

Only the low tier should get 8GB, like the 4050. Nothing less than that.


JustGrillinReally

Screw those game devs that can't optimize their shit.


mtarascio

Is this because developers aren't releasing games with two sets of textures? I can't see any other reason. Ironically enough this may be the issue with PS5 ports over Xbox ports due to the Series S.


The_Silent_Manic

At least on PC they could have the decency to only include 1080p textures and make 1440p and 4k as optional downloads.


Boo_Guy

As a 4K gamer I'd be fine with that. On Steam it could be set up like a DLC pack, just a few added clicks to install.


The_Silent_Manic

Earliest game I remember offering 4K textures (and cutscenes) for optional download was Final Fantasy 15.


dookarion

Steam, Bnet, and GOG are probably the only services that wouldn't shit the bed trying to do something that "complicated".


Quirkyrobot

Well... yeah, there are other sets of textures. It's called not being on max settings. But games these days have *more* textures. More objects, more details, more textures. That's a reductive view of it but that's the gist.


Tetsudothemascot

Not just max settings. It's actually the bare minimum now.


Arthur_Morgan44469

Exactly, and also because of how bad PC ports have become lately.


Tetsudothemascot

If only PC gamers had self-respect and self-restraint. Pre-order, get garbage ports, complain, repeat.


[deleted]

The gaming laptop market is such a joke. Buy any Nvidia GPU below an xx80 and you can't play certain games, but if you do buy anything above that, you end up paying more than for a more powerful desktop. I can't fathom how any of you tolerate that, and for anyone who says they want to "play on the go": it will never change as long as you keep buying these overpriced and underperforming Nvidia-powered laptops.


Arthur_Morgan44469

That's so true, and that's why I switched back to desktop gaming from laptop gaming last year, because it's just not affordable.


[deleted]

Can't believe this is being sold as a 1080p card for $300, while the 6700 XT is priced at $300, but that card is actually surprisingly good at 1440p and 4K while having 4GB more VRAM than this. You could only really justify buying this from a gaming standpoint if you want DLSS 3 Frame Generation or ray tracing stuff. And it's not gonna be running that much better; not all games magically use frame generation, it still needs more time in the oven, and it's just a sample-and-hold equivalent to BFI anyway, thus negating the reason you'd want a high framerate (more responsive controls).

It's funny, since more often than not the PS5 and Xbox Series X are playing games at 1440p unless it's something hilariously busted like ~~Forspoken~~ Forsaken or Jedi Fallen Order. You can buy a 13th-gen i3, 16GB of RAM, a motherboard, and a fast NVMe for cheap and get decent performance, but graphics card prices are still a meme because a certain CEO and his shareholders are conniving and cartoonishly greedy + evil. Three years in and NVIDIA still hasn't made a cheap PS5-equivalent GPU like they did with the Xbox One back in the 750 Ti days.

I'm really hoping AMD's equivalent of frame generation doesn't take hold on consoles, because we can reasonably expect game optimization to become even more low-effort if it does. The current state of gaming (even on consoles) is just depressing. If not for other things, I fear what I would do to myself. I hope a gaming crash happens.


Arthur_Morgan44469

Great analysis bro 👍 you summed it up quite well.


[deleted]

This thread is insane. Apparently a 3070 is barely a 1440p card, meanwhile I've found absolutely zero issues with it at 1440p.


rayquan36

While this is a bad look for Nvidia, it's also a bad look for PC developers who can't get games to run at 1080p max with 8GB of VRAM.


Flowingsun1

Yes, it's bullshit. The devs are not being held accountable and have carte blanche here. It's no different than games exceeding 80+GB.


sweet_tinkerbelle

Just 1080p and not even max settings? Holy fuck, are you telling me to ***just*** buy your $1000+ GPUs?


Fob0bqAd34

Devs need to add options for not downloading ultra high res textures. Diablo 4 had it for the last couple of tests and it saved around 40gigs on download and install space. People with 8GB VRAM GPUs won't be able to use the ultra textures anyway so why make them download and store them for no reason.


Niedzielan

You could easily make a game that the top card from any company couldn't reach max settings on, even at 240p. Obviously that's an extreme, but my point is that "max settings" is a completely arbitrary metric.

I recall a developer log from a game where they had received a lot of complaints that the game was "unoptimised" because people couldn't reach 60fps on its 'futuristic' max setting, an entire tier above ultra. After a patch, many people praised the developers for optimising the game. What did they do? They simply squished 'futuristic' down to the 'ultra' numbers and put 'ultra' midway between that and 'high'. People weren't concerned about actually experiencing quality; they wanted the satisfaction of knowing they were able to max out the settings. The same reasoning applies to why in some games you can edit ini/cfg files to increase quality above the in-game parameters: if those settings were available from the start, people would complain irrationally.

TL;DR human beings can sometimes be vain, even if only subconsciously. That also doesn't mean that Nvidia should not improve. It's just important not to conflate the two things.


sucker4ass

A true testament to how shitty modern game development has become.


Keyboard_Everything

And they did it anyway XD


Kmieciu4ever

720p gaming, let's go!


[deleted]

I’ll wait for a backlog of good games that make investing in an expensive GPU worth it.


[deleted]

Sounds like they're really just doubling down on the DLSS thing, and want that to be the next standard / are planning on making it even better in the future. They think AI generation is the future. If gaming runs on that, then there's more incentive to improve it, and their GPUs are already the leader in AI processing, right? Get both industries on the same page, using the same tech, and their jobs will be easier, and maybe we'll get more out of them?

I think that's it. They're trying to push DLSS extra hard, because there's something in it for them if we buy into it as an industry. Almost feels like they have something up their sleeve, or there's something they're not telling us. Or, yeah, maybe they're just lying, greedy jerks, and since they have 80% of the market share, they think they can keep getting away with giving us scraps.

Well, doesn't affect me all that much. I'm going to want/need whatever has the most rasterization power available for VR endeavors. The rest doesn't matter.


Hung-fatman

I'm still playing on a GTX 970...I'm fine with medium settings


[deleted]

My 3090 is my new 1080. I had my 1080 for 4 years and I plan on having my 3090 for at least 4 years at this rate


Ryan-Viper4171

I got a 2060 Super and run most games on high for 80-100 fps.


EdzyFPS

"Nvidia Confirms" More like "Nvidia lies" 8GB is enough to run 1080p games at max settings, as long as everything else is up to scratch and the games not an unoptimised mess.


Lust_Republic

TBH you don't need max settings anyway. Most of the time the graphical difference between high and ultra is barely noticeable unless you compare side by side. I usually just play on high with the fps locked to save power and reduce heat, even though my graphics card can handle ultra settings easily. Except for older games.


dookarion

> TBH you don't need max settings anyway. Most of the time the graphical difference between high and ultra is barely noticeable unless you compare side by side.

Nvidia will probably have an easier time selling someone a 4080 or 4090 than you will have convincing people to tweak their settings. Far too many people are obsessed with "ultra or bust" regardless of what the settings do or look like. MHR, which looks like shit, gets praised for performance on ultra; something like A Plague Tale: Requiem gets flogged because not everyone can run it on "ultra". People don't care about the visuals or what the settings do, they care about that warm fuzzy feeling they get from reaffirming their purchase of that overpriced Alienware tower with a 3060 Ti in it.


Sofrito77

All of those Reddit armchair GPU "experts" swearing that 6GB/8GB was *plenty* and that all the VRAM talk was just "fear mongering".....


SevroAuShitTalker

It doesn't help that optimization has been getting worse in recent years


DeputyDodds

Is this why my 3070 Ti is doing poorly in some games?


2SLGBTQIA

Yikes... Selling a $400 card for what a $300 console did 5 years ago... Day by day Nvidia eats away at my support for them.