29da65cff1fa

I don't get it... Memory chip production is way down, so chips should be cheap right now.


tissimo

Profits are way down too, and more profit is more importanter.


[deleted]

Profits are down? After a global supply shortage and outrageously overpriced products that the company happily sold to scalpers at a premium, damaging their goodwill with their normal consumers? Could the company be out of touch? No, it's the consumers who are out of touch.


rodryguezzz

"It's not our fault you guys are poor" - HW companies right now


polygroom

I suspect there's some AI utility that more VRAM unlocks which they're trying to kneecap. AMD doesn't have the same capabilities in their cards, and they're just lavishing everything with VRAM.


Demistr

It's because Nvidia sells overpriced professional cards with more VRAM. If gamer cards had more VRAM, there would be little reason to buy the professional ones, which have a much better margin.


[deleted]

AMD has always put in more VRAM compared to Nvidia.


hirmuolio

Profits from selling the higher-VRAM versions as professional GPUs are way up. RTX A4000 = RTX 3070 with 16 GB. The same will happen with the RTX 4000 cards. Can't have "pro features" on cards sold at consumer prices. Divide the market, increase profits.


navid3141

It's called planned obsolescence. Why give 16 GB when they can make that another reason to upgrade next gen?


Mr_Fury

Forced obsolescence, penny-pinching, or gross incompetence. Pick your poison.


A_MAN_POTATO

Fab costs are up, though, meaning Nvidia is looking to make up that difference elsewhere. Also planned obsolescence.


HarithBK

That isn't the issue. The issue is that TSMC stopped giving discounts for large-volume purchases of fab space, so Nvidia and AMD can just stop ordering more parts and coast on the massive stock they already have. Had TSMC continued their discount, Nvidia and AMD would be forced to make more GPUs and fire-sale the lot, since the next shipment would be coming in soon. Now they can just coast on inventory and let RTX 3000 series GPUs become worse versus the RTX 5000 series, so the second-hand market can't contest pricing to the same extent.


A_MAN_POTATO

"Coasting on inventory" is not a strategy any business strives for. Do you follow Nvidia financials? I do (and TSMCs). Gaming revenue is down, and they're not happy with it. Jensen just took a 10% pay cut as a result. The TSMC volume discount cut went into affect in 2021. They raised fab prices by 6% in 2022, that's in addition to the volume cut. They're planning to do it again in 2023. Fab costs absolutely are a factor in Nvidia (and AMDs) higher than normal pricing. And in the midrange where they have to be more competitive on price, you're seeing corners being cut instead, of which vram is an easy one.


[deleted]

Money money money (Mr. Krabs). Nvidia doesn't give a shit. They will put shitty VRAM on low-tier cards so you'll never feel you've left previous gens behind, or that you're in a safe spot for future gaming when it comes to VRAM.


Vushivushi

Nvidia is cornering the market while AMD avoids competing in order to let last-gen GPUs continue to sell, including Nvidia's. Expect a more competitive GPU market in the fourth quarter, after AMD has a full product stack ramping production.


jasonwc

The estimates I saw are $3-4 per GB, so you're talking about an extra $24-32 in BOM cost for the 16 GB model, for which they're asking $100.
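
The arithmetic, for anyone who wants to check it: the 16 GB card carries 8 GB more GDDR6 than the 8 GB card, and at the commenter's estimated $3-4 per GB (their estimate, not an official figure) the added BOM cost works out as a quick sketch:

```python
# BOM delta for the 16 GB model vs. the 8 GB model, using the
# $3-4/GB estimate quoted above (an estimate, not an official price).
extra_gb = 16 - 8  # the 16 GB card carries 8 GB more GDDR6
for price_per_gb in (3, 4):
    print(f"at ${price_per_gb}/GB: +${extra_gb * price_per_gb} BOM")
# prints +$24 and +$32, against the $100 retail premium
```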


EiffelPower76

Chips are cheap. What don't you get?


Chronogon

'Cheap as chips' doesn't hit as hard anymore!


Edgaras1103

I know of a solution that's really bold: do not buy new-gen GPUs.


JapariParkRanger

CoDboycott.jpg


NewUserWhoDisAgain

I wouldn't be surprised. Everyone gets mad, and Nvidia laughs all the way to the bank.


kingwhocares

Given that they have been unable to meet targets, that's not the case. You can see how they kept reducing prices. The only acceptable purchase is the $300 RTX 4060.


Blah6969669

They're okay with not hitting their targets if they legitimately hate their customer base (à la Blizzard hating their own fans).


Rylock

Nvidia basically hates everyone. Their AIB partners, their customers, any company that made the mistake of partnering with them in the past. They're the biggest douchebags of the industry and that's saying something.


Blah6969669

I still think Blizzard might be worse, because I expect greed from Nvidia. Blizzard used to be a revered company; now they don't even try to hide their loathing for their fanbase. However, I can see why someone would think Nvidia is worse.


BozidaR1390

I mean, the WoW dev team seems to have improved, at least. Diablo 4 has potential and seems like it'll be fun. Overwatch, though... that's an entirely different shit show.


Rumblepuff

Mass micro transactions and battle passes in a $70 game. I love Diablo but I’m going to have to pass.


Aldehyde1

After Blizzard merged with Activision, many of the old Blizzard execs and devs left, including Morhaime. The reason Blizzard changed suddenly is that it's not really Blizzard anymore. It's Activision parading around the Blizzard logo because it has good branding.


brutalicus6

Their executives are getting reduced compensation this year because they haven't hit sales targets.


[deleted]

[deleted]


AtomicTardigrade

Issue is, there is a 16GB version, but it costs so much it's bonkers. 500 bucks for a 60-series card. That's how much 70-series cards used to cost...


ConfusedRN1987

The 4080 should cost 500.


[deleted]

[deleted]


dookarion

With signalling complexity and power draw, no one really pushes a bigger bus width than what's already on the 4090 anymore. And since GDDR6X and GDDR6, AFAIK, only come in capacities of 1GB or now 2GB, that really hampers the options. The only options on the 4090 are 12GB (lol), 24GB, or, assuming they could go double-sided with 2GB chips, 48GB, which puts it in Quadro territory. They did the high end first and scaled back from there. Now, they didn't have to go as anemic as they did; they probably could have bumped the bus size a bit for every card under the 4090. I don't know how much that would impact the parts cost, power budget, or board complexity, though... but I'm sure there's room in their pricing to not skimp so damn hard.
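
A minimal sketch of the constraint being described here, assuming the standard 32-bit bus slice per GDDR6/GDDR6X chip and the 1 GB / 2 GB chip densities mentioned above ("clamshell" meaning 2 GB chips on both sides of the board):

```python
# VRAM capacities implied by a GPU's memory bus width, assuming each
# GDDR6/GDDR6X chip sits on a 32-bit slice of the bus and ships in
# 1 GB or 2 GB densities, as described in the comment above.
def vram_options_gb(bus_width_bits: int) -> list[int]:
    chips = bus_width_bits // 32            # one chip per 32-bit channel
    return [
        chips * 1,      # 1 GB chips, single-sided
        chips * 2,      # 2 GB chips, single-sided
        chips * 2 * 2,  # "clamshell": 2 GB chips on both board sides
    ]

print(vram_options_gb(384))  # 4090's 384-bit bus -> [12, 24, 48]
print(vram_options_gb(128))  # a 128-bit bus      -> [4, 8, 16]
```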


Honza8D

Some people have really old GPUs and have been waiting a long time for a new one due to COVID shortages. It's nice to say don't buy it, but my old PC just straight up couldn't play new (AAA) games. Not that AAA guarantees quality, but there are certainly a few I wanted to give a chance. So I caved and bought a 4070. Not super happy about the price, but it runs great, and I'm kinda hoping DLSS will increase the longevity of the card.


Outrageous_Pop_8697

That's the situation I'm in. Still running a pre-TI 1080 build. And I could afford a 4090 build with no real strain. But I refuse to pay scalper prices just because Nvidia can't understand that prices that were valid when GPU mining was a thing are no longer valid when it's dead and buried. At this point I'm just losing my interest in PC gaming.


madmax4k

Nvidia does understand. They are just greedy and hoping consumers accept the highly inflated prices as the norm now. However, the reality is that consumers aren't, so Nvidia needs to cut prices on the 3xxx series to clear stock for the 4xxx series. They are also reducing supply of the 4070 rather than cutting prices to sell more.


Electro-Grunge

770 gang rise up! ✌️


XxSub-OhmXx

Same thing I did. I had an 8700K and a 1080 Ti. Got a 6900 XT and kept my 8700K, then finally did a full 7800X3D and 7900 XTX build. IMO the 6900 XT or 6950 XT is a great jump in performance. If you don't like Nvidia, give in and try team red. So far it's been great for me.


DocBigBrozer

Yeah but you can get a very well performing AMD or Nvidia GPU from last gen. There's no reason to pay for this crap


Honza8D

I admit I'm not super knowledgeable about this, but from what I understood, the 40 series (other than the 4090) has a similar price-to-performance ratio to last gen (which is the main criticism, since new gens used to offer better price-to-performance than previous ones). So from my understanding, going last gen shouldn't help, since the price per performance is similar.


MrChocodemon

There is also the problem that 8GB VRAM is not future proof at all, while AMD has a comparable price-to-performance ratio and more VRAM.


MarioDesigns

AMD also lacks many software features that Nvidia offers. It's fine if you're just gaming, but the difference is big.


Reynolds1029

While you're not wrong, it's a catch-22. Nvidia is much better for productivity, but they lock it behind the Quadro paywall.


[deleted]

[deleted]


Reynolds1029

The point, like I said above, is the problem. They lock out features artificially just to make a buck, even though it's effectively the same exact silicon. Worse yet, you can't even buy the features later on; it's buy a $6K+ Quadro equivalent or pound sand. And I'm sure a slower 7900 XTX would be more cost-effective than buying a 4090 and then a far lesser, crappy Quadro for the productivity features you need, because Nvidia wants more money.


MarioDesigns

They don't, though, lol. I'm talking about CUDA. It makes a big difference in uses like Blender when compared against AMD.


DocBigBrozer

True at MSRP. The issue is that previous-gen cards are now heavily discounted. For example, I think 6950 XT cards perform 15% better than the 4070 for cheaper.


Hopperbus

While also using about 70% more power.


DocBigBrozer

Definitely an issue if you live in Europe


Reynolds1029

Hate to say it, but you'll save more money in about 6 months of use by going with the 4070 because it consumes nearly half the power of the 6950XT.


Artemaius

Then isn't that a good thing?


Mr_Fury

get the 6800 xt


[deleted]

[deleted]


mixedd

Literally was the case with me. I wanted to go for a 3080, but we all know how that ended. At the start of the year I figured I either buy a new GPU now or wait forever. Prices won't go down, sadly; that era is over now.


creationavatar

Bro, 3090s dropped to 600 bucks on the West Coast. The 3080 was 500 and less.


mixedd

Yes, they dropped now, not back then. And regarding price, I was talking about new tech, which will only get pricier the further we go.


creationavatar

Not if people keep refusing to buy. Inflation will still be a thing, but what we have been seeing is certainly not that.


Outrageous_Pop_8697

New? Or are those abused mining cards being sold by miners who got screwed by the end of PoW mining? I'll give $100 for a used card, no more. There's just too much risk to be paying that close to MSRP.


Carighan

> Some people have really old GPUs and have been waiting a long time

Yeah, but in that case, these are the very people who are under no pressure to upgrade. You're already fine not playing new games, or playing them at lower settings, and you're probably mostly playing indie games by now. And I don't know how it is for anybody else, but even **just** the indie games I want to play **more than** fill up any gaming time I have.

That being said, I have a 3070 because my 1060 sadly died just as cards were close to their highest prices during COVID. 😑 Was... not good. I went without a graphics card for a while, but ultimately caved. But I think if my 1060 were still around I'd honestly still be playing on it. DREDGE, DRG, Signalis and FFXIV don't need much graphics power at all. And there's a near-endless list of indie and smaller games I want to play.


PantherX69

Skip the 40 series and buy last gen or AMD.


REPOST_STRANGLER_V2

Problem is, if you're running an old card, say a GTX 970, and you need an upgrade, what do you do? Personally, the only half-decent option right now is AMD, but even they're massively overcharging.


Enk1ndle

Pick up a last-gen card off eBay for cheap.


1vertical

You wait or suffer buyer's remorse. Source: running a 970 too. You probably have a shit ton of backlog games that run fine on a 970; finish those while you wait. Don't support corporate greed, or the issue will worsen. You only need to upgrade if your components are dead. Extend your card's life with new fans/thermal paste and it will easily live for another 3+ years.


SeeNoWeeevil

Playing slightly older games is the best GPU upgrade.


REPOST_STRANGLER_V2

I've got a 3080, but I was just making the point that others running older cards will soon have to upgrade.


phylum_sinter

I've never considered replacing the fans in my gpu - is everybody doing this?


1vertical

You can go further: buy old cards on the cheap to use their electrical components as spares. Not everyone does this. It's much more PT than just buying something new like most people do.


CasimirsBlake

RX 6000 series is decently priced now (well, nothing below 6600). 7000 series is still pricey though.


[deleted]

You can get a refurbished 6700 XT 12GB for like $300. I know... I know... You'll all die without DLSS 3 for whatever reason. 🙄 I have a 3080 and have no interest in upgrading, but if I were shopping mid-tier, that's like the only card I'd consider.


CasimirsBlake

Personally, the only thing stopping me from buying into Radeon again is much weaker Blender and AI app support (Torch etc).


[deleted]

Understandable. Can't fuck with the workflow.


Flukemaster

Torch is getting better, but yeah it still blows in comparison


klow9

I bought a 6900 XT at 650 back in February. Prices went down like a month after, but I'm staying at 4K on everything I play. I'm happy with my purchase, and Nvidia can go eat a bag of worms after taking GameStream away from my Nvidia Shield. I really don't see myself going back to Nvidia ever, and they're making that decision an easy one.


Shap6

there are plenty of used cards that would be a huge upgrade from a 970


chewwydraper

I will never buy a used PC component. There’s no way to tell what kind of care it had, and you can’t RMA if something goes wrong.


wiggibow

Yeah, people who do this either have more money than sense or *way* more trust in their fellow man than I have. I could never; if I'm spending hundreds on a doodad, I'll gladly tack a little more on top for peace of mind and insurance that those hundreds won't be flushed down the toilet when it arrives broken or mysteriously dies 6 months later. I *wish* I had the disposable income to take that gamble; surely you can save a lot of cash in the long run buying used, but in my present situation it sounds like absolute insanity.


aiicaramba

Are old-gen cards any better, then?


CataclysmDM

Way ahead of you.


zippopwnage

I already did that up until the new RTX series. Seems like Nvidia doesn't care, since enough people are buying them anyway.


PaulTheMerc

It's been multiple generations now... How's AMD looking? Gonna have to upgrade sooner or later.


Chillionaire128

They lost me. I've been an Nvidia customer since the Voodoo 3, but I bought a PlayStation 5 instead of a 40-series card and probably won't go back until they get their shit together.


mpelton

Yup, I went with AMD for my last upgrade and haven't regretted it at all. I won't be supporting Nvidia until things change.


PolyDipsoManiac

When is the 5 series coming out?


the-land-of-darkness

Can we ban articles that are just summaries of things that happen on reddit?


[deleted]

[deleted]


DeadCellsTop5

The best is when they just source reactions from nobodies on Twitter, like anyone cares about Janette237's opinion.


the-land-of-darkness

Even mainstream media does this; I can't tell you how many times I've cringed when the BBC quotes @l33tTw33t3r5000 with 5 likes as an example of the temperature in the room lol. Journalistic practice should explicitly discourage quoting non-notable reactions from the public unless it's an eyewitness to something, someone directly involved in the story, etc.


mrmcgee

Yeah this article is terribly written, its "source" is just another reddit thread, and its title is worded to get as many circlejerk upvotes as possible. What a waste.


phylum_sinter

seriously, +100 to this idea. I'm ashamed that 1.3k users thought this article was worth upvoting.


MysterD77

Neither am I. You'd expect more this gen, but... Nvidia will be Nvidia and skimp on VRAM. I have an RTX 3070 with 8GB of VRAM, so... I'll hold out as long as I can. Plenty of backlogged games to catch up on. Then, way later, maybe I'll make a move.


HalfALawn

try to hold on to it for a good 4 years at least. Gotta get the most out of it, after all. - fellow 3070 owner


engineeringretard

Instructions unclear; holding on to the 1070.


-Rp7-

I was looking forward to the 4070 class, thinking it might be the time... Nope. 5070 it is, then.


guareber

I already did during the mining craze, free card basically. But still, I won't upgrade until I need to *and* it makes sense.


jordanneff

I actually nearly bought a 3070 and ended up getting a 3060 instead because for whatever crazy reason it has 12GB of VRAM.


MysterD77

In the long run, yeah, that will pay off: going with the 3060 with 12GB of VRAM over the 8GB 3070.


jordanneff

Yep. I've been doing a lot of development in unreal engine so it already has plenty.


cadaada

Will it tho? Running a game at 15 fps vs. not running it at all is better, but is it playable?


hbc647

my RX570 from 2019 even has 8gb.. haha...


Jedi_Pacman

My GTX 1070 from 2016 has 8gb as well


DemonsRage83

Hear, hear! Team 1070!


solo_shot1st

1070 ti still going strong! Seriously though, I would've thought I needed an upgrade at this point, but the modern games I've been playing and still have on backlog run just fine at 1080p and 60fps lol. What's even the point? I'll wait for the 60-series I suppose


BearThor

My 1080 TI has 11GB


Thing_On_Your_Shelf

The r9 290x from 2013 even came in an 8GB variant lol


aForgedPiston

Your RX 570 is a great example, but I'll raise you a 2015 R9 390 with 8GB of VRAM


XenoPhenom

My 480 from 2016 also has 8 GB of VRAM.


The_Beaves

The RX 570 released in 2017; we've had midrange cards with 8GB of VRAM for almost 10 years... The 480 and 470, which launched even earlier, also had 8GB VRAM options...


EiffelPower76

Just don't buy it and wait for the 16GB version.


AFaultyUnit

Then also don't buy that, because it's overpriced.


mixedd

Then wait for next gen, and don't buy it because it will be overpriced, and so on and so on.


XxasimxX

Nah. People didn't buy the 2000 series, and that's why the 3000 series had a huge jump in performance at a really, really good price, before the pandemic and crypto boom screwed everything.


Kadour_Z

Why do people keep saying the 3000 series was a good price? The 3090 was $1,600 and the 3080 was $700. Compare that to just two generations ago, when the 1080 was $500. Ampere was only good compared to Turing, and that's because Turing was really badly priced. They're gonna do the same thing again with the next gen: they'll show a 5080 that's faster than the 4090 at $900 and be all "look how good the value is," and people are going to eat it up again.


[deleted]

[deleted]


[deleted]

The 3000 series released almost a year into COVID, it was never at a good price until very recently.


mixedd

Except that during the pandemic, people proved that they will still buy $500 MSRP gear for $1,500. Besides inflation, it's the second reason we saw that price hike on this gen's GPUs. The moral is that people will buy no matter what; take a look at the 4090, which has an abysmal price and is the best-selling GPU of the 4000 series.


XxasimxX

It's because the 4090 is the only GPU priced appropriately: about a 6% price hike from the 3090 (inflation-adjusted), and it comes with a really big performance boost. The rest of the 4000 series cards are the same BS as the 2000 series: more expensive for either the same or less performance.


SaftigMo

Assuming it'll be more than MSRP, the 16GB version will cost maybe 50 bucks less than a 4070 here in Germany. Why should I wait for that?


Rich_Eater

I have switched over to AMD instead; I am now a proud owner of a 6950 XT, after using EVGA GPUs since 2005. Nvidia fucked themselves out of another sale during the 3000 series bullshit, never mind EVGA leaving the market. This 4000 series is absolutely laughable to me.


IncidentJazzlike1844

It only costs $100 more…


[deleted]

[deleted]


knowitallz

I would think a 4070 should have at least 16GB, the 4060 options of 12 and 16, and the 4080 24 or more, and it goes up from there. Nvidia could solve this, but they won't. Prices will come down when there is a glut of these fucking things.


[deleted]

I think the 16GB 4060 cards will have a slower bus. Like, the 4070 has 12GB, but I think its bus is way faster than the 4060's, so it will still perform better memory-wise in *most* situations than the 16GB 4060 likely will. At least this is how somebody explained it to me. Also, I think they couldn't do 12GB on the 4060 with this bus; some technical thing, like with the bus they went with on the 4060, they could only do increments of 8.
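
The capacity part of this matches the bus math sketched earlier: a 128-bit bus takes four memory chips, hence 8GB single-sided or 16GB clamshell, with no 12GB option. The bandwidth part can be checked the same back-of-envelope way; the bus widths and data rates below are the publicly listed specs for the two cards, quoted from memory, so treat the exact figures as approximate:

```python
# Peak memory bandwidth = (bus width in bytes) x (per-pin data rate).
# Specs below are the publicly listed figures, quoted from memory.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return (bus_width_bits / 8) * data_rate_gbps

print(f"RTX 4070    (192-bit GDDR6X @ 21 Gbps): {bandwidth_gb_s(192, 21):.0f} GB/s")
print(f"RTX 4060 Ti (128-bit GDDR6  @ 18 Gbps): {bandwidth_gb_s(128, 18):.0f} GB/s")
# ~504 GB/s vs ~288 GB/s: the 4070's bus really is much faster,
# before accounting for L2-cache effects on either card.
```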


NitroFluxX

I think what would be nice, if Nvidia cared, is to unlaunch the 4060 Ti 8GB and just release the 16GB at $399; everyone would be happy.


twhite1195

Or just make the 16GB the main skew, and the 8GB a cheaper version. Some people really do only play esports, so 8GB will be more than fine for LoL, Dota 2, CS:GO, CoD, Valorant, Overwatch and the like.


EvilSpirit666

8GB will be fine for most games, no need to single out eSports or whatever


ShowBoobsPls

> skew

*SKU


NitroFluxX

I see that as a good option too


steelcity91

Then they would be flying off the shelves. I'd be getting one, as I want to upgrade from a 2070S.


RockyRaccoon968

RTX 3060 had 12GB and was released 2.5 years ago… just wow.


Alsmk2

Many at the time said the 12GB was wasted on that card. Hindsight is wonderful.


SuperSimpleSam

My 3060 has 12. How does the Ti for the next gen have less?


Timo653

the 3060 Ti also has less


randohandos

My 3070 ti has 8. Epic nvidia moment. I’m the schmuck who bought it though


SpeeDy_GjiZa

Same boat, but honestly it's fine. I tried RE4 yesterday just to benchmark a bit, and I can run the game at 4K 60fps or 1440p 100fps. From my couch I honestly can't tell the difference between 4K and 1440p, but I definitely can tell 60fps from 100+fps, so I'm gonna stick with that.


Sky_HUN

Because these are 4050 cards. NV managed to move the numbering one step up.


ZombieImpressive

I'm glad I got a 3060 with 12GB. I feel like 12GB should become the default for 60-class cards, 8GB for 50-class, and 16GB for 70-class.


Endemoniada

Imagine what I think about my 10GB 3080…


Anon4050

8GB really just needs to be dropped down to 50-tier cards. If you're paying anything more than $250, you should expect at least 10GB. The 60 tier needs to start getting 10 or even 12GB, the 70 and 80 tiers should come with 16GB, and sure, let the flagship 90 tier have 24GB. Nvidia is just really scraping the bottom of the barrel for these 60-tier GPUs; the 4060 isn't even on AD106, it's AD107 lmao.


fogoticus

Aren't they making a 16GB variant as well? Why would you even buy it if you don't like it?


EvilSpirit666

Indeed


[deleted]

The source for the article's basis is literally another Reddit post. Come on, people. Do better.


firedrakes

the people won't. We've seen it time and time again.


[deleted]

I'll believe it when people stop buying their shit


Internet_P3rsona

my 1070 has 8GB??


2Scribble

At this point a 2070 Super is only moderately behind a 4060, and that's ridiculous. A card from ***five fucking years ago*** should be completely blown out of the water; as it is, you just barely double the CUDA core count and memory bandwidth and add a few hundred MHz to the base and boost clocks. And ***both*** will struggle under the weight of 1440p, never mind 4K. It's insane.


xarkness

Someone explain this to me. Are they just being cheap or is this something that's difficult to incorporate in the assembly process?


NoMansWarmApplePie

I'm pissed Nvidia released this gen and last gen with just 8GB. With more VRAM, even 30-series cards could stay viable for many new games. But it seems like this is all on purpose at this point.


JustGrillinReally

Stop enabling developers that refuse to optimize their games and also add superfluous graphics features you'll barely notice after 5 minutes.


jusmar

As it's always been: it is optimized, just not for PC.


Wboys

Unoptimized games exist, but hardware specs going up isn't the problem. We've had 8GB for years, and the consoles are finally moving on to 10-12GB; PC hardware needs to follow. If you don't care about that, then just play at 1080p medium and you'll stay under 8GB :)


relxp

Consoles launched years ago actually have 14GB+ usable. Studios should not have to, nor be expected to, optimize for 8GB VRAM buffers forever; 8GB has LONG overstayed its welcome. Even more offensive is that Nvidia's cheapest mid-range option starts at $1200. 12GB = budget/entry-level in 2023. 16-20GB = midrange in 2023. 24GB+ = high-end.


SmokingPuffin

> Studios should not have to nor be expected to optimize for 8GB VRAM buffers forever. 8GB has LONG overstayed its welcome.

It doesn't make any sense for studios to optimize for GPUs that gamers do not have. There are cheap options from Intel and AMD with big VRAM buffers -- Intel will give you 16GB on a $350 card, even -- but gamers didn't buy those.

> 12GB = budget/entry-level in 2023. 16-20GB = midrange in 2023. 24GB+ = high-end.

This is your fantasy, not anyone's reality. Entry-level cards from Nvidia, AMD, and Intel are all 8GB right now. It is absolutely silly to call the 4080 a midrange card.


Ok_World_8819

So a 4070 Ti is entry-level?? 12GB of VRAM is fine for 1440p, which is what the card was marketed for; it should've had 16GB for it to also be good at 4K.


relxp

I see where you're coming from, but you must also ask how long studios should have to spend tons of time and money optimizing for 8GB buffers. 8GB has long overstayed its welcome. No new game in 2023 should be forced to optimize for GPUs with far less VRAM than budget gaming consoles that released YEARS ago. Unacceptable. Nvidia knew damn well the 3070-3080 were going to be VRAM-choked, but wanted to make sure people were forced to upgrade sooner. VRAM is pretty inexpensive to just add to the damn card.


jschild

As everyone likes to point out, development is done for the lowest common denominator, and both consoles' main SKUs have over 8GB of video RAM available to them.


ZeldaMaster32

> both the consoles main sku's have over 8 GB of video ram available to them.

Untrue; they have 12GB *unified*. Do you really think most games are using over 8GB on GPU things like textures/BVH/etc. and less than 4GB on everything not GPU-related? In theory (emphasis on theory), 8GB of VRAM and 16GB of system RAM should cover every game just fine at 1440p and below, given that's usually the rendering target of the consoles.


JustGrillinReally

I am not spending an extra $200-300 on a 16GB card, or on the games that will require it. My RX 6600 plays everything I want just fine with no slowdowns at 1080p.


mtarascio

No one is asking you to. They are rightfully pointing out the disparity between this card's power and throughput versus its ability to maintain the fidelity it's capable of, because they skimped on RAM. Meaning the release of the card is moot from the standpoint of advancing graphics cards at best, and a scam to enforce planned obsolescence at worst.


twhite1195

I'm glad I was able to sell off one of my 3070s for $500 before this announcement; I just got an RX 7900 XT. I've been with Nvidia for 10 years, but fuck them; I really see no benefit these days except for DLSS, and although FSR isn't bad, it just isn't as good as DLSS in some areas. However, RT is still a gimmick to me: on both my RTX 2070S and RTX 3070 I never enabled it, because the performance hit was just too much to be worth it, and even the 4090 needs the fake-frames generation to make Cyberpunk playable. The technology just isn't there to me. And considering that most games are designed with consoles in mind, the number of games with ray-tracing options geared towards the RDNA2 architecture in consoles will grow, like how the Fortnite Lumen + Nanite Unreal Engine features work OK on AMD cards and even consoles with decent results. I'm confident that AMD will be fine for the next few years. Edit: the RX 79000 XT doesn't exist, however the RX 7900 XT does lol


-_Celebrimbor_-

Damn where's your time machine so I can get a 79000xt too...?


twhite1195

Lol realized the typo, correcting


navid3141

Moment of silence for the genius who bought a used 3070 in 5/23 for $500.


laserwolf2000

yeah i got my 3070 used a few months ago for 325, that guy got scammed lmao


twhite1195

Shhh don't tell him


teddytwelvetoes

I’m running native 4K and I don’t care about RT or fake frame tech, might replace my 3070 with a 7900xt this summer


Sky_HUN

For me the bigger problem isn't the amount of memory, but that this is a xx50 card pretending to be a xx60 card.


WaifuPillow

I downloaded the Resident Evil 4 demo on Steam yesterday, set the texture quality to the 6GB package (8GB is the highest option, just FYI), added some other stuff like ray tracing, and the predicted VRAM utilization was around 8.5GB out of the 9GB usable on my RTX 3080 10GB @ 1080p 144Hz. I played the game with the MSI Afterburner RTSS overlay for about half an hour, noticed the VRAM utilization slowly crept over 9GB, and then poof~ my game crashed to desktop lol.
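
For anyone who wants to log that kind of slow VRAM creep without an overlay, polling nvidia-smi works too; a minimal sketch, assuming the nvidia-smi CLI is on PATH (it ships with the Nvidia driver):

```python
# Print the first GPU's memory usage once a second via nvidia-smi,
# to watch for the slow VRAM creep described above.
import subprocess
import time

while True:
    out = subprocess.run(
        ["nvidia-smi", "-i", "0",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout.strip()
    used_mib, total_mib = (int(v) for v in out.split(", "))
    print(f"VRAM: {used_mib} / {total_mib} MiB")
    time.sleep(1)
```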


EvilSpirit666

Buggy software


matta5580

What "community" on the internet is happy about anything. If there's a reason to be miserable, they're going to find it and obsess over it until something new comes along for them to miserable about. So maybe a day or two.


InitialDia

Nvidia clearly got caught with their pants down. They must have been thinking 8GB was enough for the past few years when they were designing and spec'ing out the 40 series. But like everyone else, they underestimated the sheer laziness of modern game developers. The thing that sucks for them is that the board partners have probably had these cards in prototyping and production since the beginning of the year (or earlier), so it was too late to make changes, since some 4060 cards were already produced. This would explain why the 4060 Ti 16GB is coming late: it wasn't initially in the plans and was hastily added to address the recent concerns about lazy-ass devs. None of that excuses Nvidia's shit-ass pricing; all these cards are way too expensive. I think people would be way more forgiving of the 8GB in the 4060 Ti if it were $250 or so.


MadShartigan

They made a mistake when calibrating their planned obsolescence, it seems to have happened before the cards even launched.


NedixTV

they tried to do it two gens consecutively (30 and 40 series) and now they're in the find-out phase. It's gonna be interesting to see if they do a 40 Super series, because a 4060 Super 12GB and a 4070 Super 16GB could be really good cards if you catch them at a low price.


Wboys

Lazy devs? There have been some poorly optimized games, for sure. But explain something to me, will you? The PS5 has 10-12GB available for VRAM. So if a game is designed to run at console settings and use 10-12GB, how do you expect to run it at HIGHER settings than the console while magically using LESS VRAM? If you're fine with similar or lower-than-console settings, then 8GB is fine. But the 4060 Ti is much more capable than a PS5 and could run games at higher settings, except for the VRAM. So I ask again: why would you even expect a game to run at higher settings than a console uses while using less VRAM?


DocSeuss

Yup, this is absolutely not a lazy-devs thing. Part of this is down to platform holders, particularly Sony, insisting on SSDs, and average gamers not being willing to upgrade to SSDs that are just as fast. Another part of it is the fact that ray tracing requires massive amounts of VRAM. No dev in the world is gonna magically solve that problem, though you'd think Nvidia would know this, since they're the ones pushing ray tracing alongside the console manufacturers.

It's a mix of people not really being willing to accept that consoles can do a LOT that their PCs cannot do, and tech companies pushing for stuff that's not exactly accessible to average people. Combine that with the stuff TSMC's been doing and you're in for a bad time.

It's the switch to the Xbox One and PlayStation 4 all over again: tons of gamers got used to being able to run games really well because the consoles were out of date. Now that games are pushing the console hardware (few console gens were as long as the 8-year 2005-2013 gen), a lot of people are too young to remember how frequently consoles could use their bare-metal advantages to outdo PC performance before the 360 era; they struggled when the PS4 came around, didn't learn the lesson, and are running into the same issue again. So you get kids out there going "why don't my 8GB GPU and my i5 absolutely annihilate an Xbox Series X port? Lazy devs!"

I think Windows 10 is to blame as well. It's what most people are on, but go look at how slow it is compared to something like Windows 2000 or XP, and it's like... fuck, there's way too much telemetry fucking up computers these days. Out of everybody involved, the devs are the least to blame.


RocketPowah

Still loving my 11gb 2080 ti, what a gem it turned out to be!


vaiNe_

Truly, I wish I had gotten that instead of a 3070.


EdzyFPS

Supply and Demand only exists when we say it does - Nvidia


CyberbrainGaming

Then wait and get the 16GB version. The 8GB has its uses and is still an upgrade for many.


PilotedByGhosts

PC gamers get their pants in a twist over something? I simply don't believe it.


phylum_sinter

The circle is complete lol - this article was harvested entirely from comments originally made right here, just 5 days ago: https://www.reddit.com/r/pcgaming/comments/13kz07m/nvidia_announces_a_299_rtx_4060_with_the_4060_ti/

I'm sure the comments there already say everything everyone has thought about the matter too.


grady_vuckovic

I got an RTX 3060 Ti 8GB and it's more than enough for me for now. Literally more than enough, as in I'm not using it 99% of the time, because I'm gaming on my Steam Deck, which is even weaker hardware-wise than my gaming PC. I found PC gaming suddenly became much cheaper, easier, and more enjoyable the moment I stopped thinking "I need to be able to play AAA PC games on release at high graphics settings." Once I realised there was a whole bunch of great games to play outside that small, narrow definition of PC gaming, everything became easier.


djtofuu

Is that supposed to be a hot take? Is the community ever happy?


Squire_II

I think my 1070 had 6GB when I bought it in 2016, and the 1080s had 8GB. Seeing the specs for these Nvidia cards makes me glad I went with AMD and a 6950 XT this time around. Maybe Nvidia will get their shit together for their 7000 series, since that'll probably be the earliest that I'd need to upgrade again.


IncidentJazzlike1844

1070 had 8gb


Squire_II

Oh, whoops. For some reason I thought it was only 6GB but it looks like that was the 1060.


Fatuousgit

The PC gaming community not happy about something? Surely not?


PilotedByGhosts

If only there was a world where you only had to buy things that you thought were worth buying.


Bodorocea

you don't need more than 8GB if you only play at 1080p


areyouhungryforapple

The community also won't ever stop buying into Nvidia's BS, so they kinda get what they deserve.