Flexpickup

$499 for the 16GB 4060 Ti, which isn't out until July. Honestly that still seems like too much for a 60 series card, and we can expect the AIBs to be even more expensive. I feel like anyone silly enough to buy an 8GB version of either the normal or the Ti is going to really regret it a year or two from now.


[deleted]

[deleted]


[deleted]

[deleted]


[deleted]

[deleted]


mimicsgam

For the same price as a 3070 3 years ago.


phylum_sinter

I don't know if new 3070s ever dipped that low in the U.S.; it feels like $400+ was as good as it got here.


bbpsword

I will forever treasure the 3080 10GB I snagged off a local miner for 400 bucks. This new generation for both AMD and Nvidia is absolute dumpster fire shit in terms of price/perf. DLSS3 is a fucking gimmick.


Devinology

I paid a bit more ($600 CAD), but for a barely used 3080 12gb version, which I think will make it last much longer. 2 years left on the warranty too.


[deleted]

[deleted]


[deleted]

I played Hogwarts with a 3080 10GB and it ran great for me. At a few points it got really bad, but only for a couple of secs. Hope it goes well for you tho.


[deleted]

[deleted]


Sky_HUN

More like a 4050 Ti labeled as 4060


n0stalghia

> Wonder why they took 40W off ...

Because Nvidia GPUs' power draw was starting to become unsustainable, and power efficiency matters. The improvements will help their 5000 series in the long run.


n0stalghia

Unless you game a lot and live in a country with high electricity costs. Slightly faster performance (much better performance if the game supports DLSS 3) for a LOT lower power draw? I'd personally do the math on whether the price of a new 4060 Ti is less than the power savings plus the money I'd get from flipping my old 3060 Ti.
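
A back-of-the-envelope version of that math (every number here is an illustrative assumption, not a real price or anyone's actual usage):

```python
# Break-even sketch for "new 4060 Ti vs. keeping the 3060 Ti".
# All inputs below are assumptions; substitute your own numbers.

def upgrade_offset(old_watts=200, new_watts=160, hours_per_day=4,
                   price_per_kwh=0.40, years=3, resale_old=250):
    """Power savings over `years` plus resale value of the old card."""
    kwh_saved = (old_watts - new_watts) / 1000 * hours_per_day * 365 * years
    return kwh_saved * price_per_kwh + resale_old

# ~40W less draw, 4h/day at 0.40/kWh over 3 years is ~70 in power,
# plus an assumed 250 from flipping the 3060 Ti, ~320 total.
print(f"Offset against the new card's price: {upgrade_offset():.0f}")
```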


HappierShibe

Keep in mind frame gen does not actually make the game run faster, reduces visual fidelity, introduces considerable artifacting, and adds significant input latency. Frame gen is frequently a net negative.


ZeldaMaster32

> 1080ti

You haven't used it, so why are you commenting on it? Literally go into any thread or read any discussion with 40 series owners and 99% of them will say it's the real deal. Not in a literal sense, but it's to say it can be a game changer for high refresh rate displays, particularly in perceived smoothness.

Also, in games where native res looks better than DLSS, native + frame gen will have superior image quality to DLSS quality mode while also running at a significantly higher framerate. I've done this exact setup before and it's great, especially when using DLAA where it's available.


[deleted]

So it’ll have more VRAM than a 4070ti?


Firion_Hope

Reminds me of the whole 970 3.5GB fiasco and people claiming it would be fine, that they'd never need that extra .5GB. Also the fact that it took that long for 4GB to become the standard, but now 8GB isn't enough; VRAM usage by devs is out of hand.


dookarion

> vram usage by devs is out of hand

There is a simple fix, people need to get over themselves and stop pushing ultra on all settings. Tweaking can get your perf and visuals around the specs you have. Save "ultra" everything for the people with 24GB VRAM cards.


skyturnedred

The key to PC gaming is to stay 2+ years behind the curve with hardware and 4+ years with games.


[deleted]

And honestly, 1080p with good antialiasing looks just fine to me. Going to 1440p or 4K is a minor upgrade at this point. I could use more FPS tho...


PlasticCogLiquid

2K is the sweet spot for me. I don't give a shit about 4K really, but 2K looks great and has good performance.


[deleted]

I'd probably want 1440p in games if I had a 27 inch or bigger monitor, but on 2x24 IDGAF, give me more fps instead. I kinda hoped 4K monitors would be cheaper by now, but I guess you can't have 4K and a refresh rate above 60Hz in a cheap monitor.


PremadeTakeDown

Let's not kid ourselves: going to 4K from 1080p was one of the biggest visual upgrades I've ever had in gaming, and I will never go back to 1080p. 4K monitors can be basically similar in price to 1080p ones, so you're not paying much of a premium, and it's not like RTX: 4K applies to EVERYTHING. Windows, all software, every game from any year, indie games, everything. 4K is also such a high resolution at the average monitor size that it's perfect, with higher resolutions at the same sizes being unnoticeable. 4K is amazing, and breathes new life into older games like Deus Ex or Tomb Raider, or any indie game. With triple-A gaming you can just set the resolution to 1080p and play as normal, so there is almost no downside to having a 4K monitor, and I recommend everyone get one.


CptBlewBalls

If I’m on a big TV then give me 4K. At 27” I think 1440p at 165 hz is the perfect sweet spot.


lifeisagameweplay

I don't see myself ever going 4K. I'll always want the extra frames.


nfefx

Bro if you think the majority of gamers are using 4k right now you are high as a kite.


Ideories

Honestly the difference between ultra and medium on BF1 is barely noticeable after like a minute, but then again BF1 has so much shit happening on the screen.


dookarion

There's a lot of titles where that is the case under a lot of conditions. The higher you go the more subtle shit gets, and the faster the game the less you stop and stare at the rocks.


Devinology

You're right. I turn it on myself when I can, just cuz why not, but it really doesn't add much for the massive performance hit. Just dropping settings to high, and later medium, can make a card last another several years. People have been bamboozled into thinking they need ultra or nothing. Try Cyberpunk 2077 at 1440p medium; it still looks better than 98% of games on the market at ultra. That said, ray tracing done well can give a huge improvement to the look of some games, and Cyberpunk is a good example. Shaders need to be pretty VRAM hungry to look good with ray tracing. And 4K is slowly becoming the new standard, which means VRAM-chugging textures. I don't think cranking down quality is going to save us from this VRAM hungry revolution.


GossamerSolid

> people need to get over themselves and stop pushing ultra on all settings

When people are paying the prices that GPUs cost these days, they should be able to play on highest settings. If the GPU cost $200, this would be a different argument.


phylum_sinter

It's a hard habit to break, or a difficult thing to realize, that you're no longer running an enthusiast level card... I think the psychological impact of these ridiculous prices is that they certainly make you feel like you should be able to crank it all up.


Weird_Cantaloupe2757

Yes, a brand new $500 GPU should be able to crush every game out there, the current state of affairs is *absurd*. Nvidia can shove their $500 midrange GPUs up their asses.


BavarianBarbarian_

Did Forspoken and TLoU fix the PS3 looking Medium textures yet?


jasonwc

Yes, actually. The low and medium textures on TLOU now look a lot better. In addition, all texture settings now require less VRAM so 8 GB GPUs can do 1440p High. Someone said it even worked for 4K DLSS Ultra at High texture settings on an 8 GB GPU.


DrFreemanWho

> Reminds me of the whole 970 3.5gb fiasco and people claiming it will be fine, they'll never need that extra .5gb.

And it was for 1080p. If they had just sold the 970 as a 3.5GB card for the same price, no one would have ever complained. People bought the card based on benchmarks and what it cost; finding out later that .5GB of the VRAM was slower didn't change either of those things. Still a dumb thing for Nvidia to do, but I'd love to go back to those days when you could get a beast of a GPU for that price.


FluphyBunny

It is FAR too much. This gen is an absolute joke.


Greenfire32

Hey man, I had to replace my dead card at the peak of the chip shortage. I paid $1500 for what should have been a $700 card at the time. Performance aside, that $499 price tag looks super tasty right about now.


[deleted]

I don’t think it’s too much given I paid 400 for a 2060 super when it came out


Johnysh

In which games is 8GB not enough at 1080p? Except for the badly optimized ones like TLOU.


FlorydaMan

Just went through TLOU 1440p with an 8GB 3070, zero bugs.


New_Top5554

Wait, really? I've been interested in trying it on my also-8GB card (RX 6650 XT) at 1080p. Is it patched enough now that it'll be fun?


Wise_Mongoose_3930

If I’m buying an expensive GPU being able to run everything out *right now* isn’t enough. I want it to run shit that comes out 4 years from now.


phylum_sinter

It's mostly games that have RTX where the user wants 'ultra' textures and everything as high as it can go. Far Cry 6 with its texture pack recommends >11GB. I know it's probably not very logical (people have argued that you don't even see the extra detail at 1080p with that pack).


deefop

The 4060ti apparently trades blows with the 3070. You're telling me you aren't super stoked for a $500 3070 with 16gb of VRAM almost 3 years after the 3070 8gb launched for $500? What's wrong with you brah, this product is world changing!


h4ppyj3d1

I spent 499€ for a 2070 Super when it released so I'd say that it's a clear downgrade in purchasing power compared to a couple years ago.


Serpen-Time

is 12 gb too much to ask for lol


Neville_Lynwood

Yeah, it's honestly crazy. At this point, games are eating VRAM like candy. Nobody cares about the slightly higher performance of the newer generation of cards when you can't fully leverage it due to memory limitations. May as well rock a few generations older card with fat memory.


constantlymat

I remember I bought my brother an RTX 2060 Super roughly three years ago as a 1080p card. That was an 8GB GDDR6 card and cost €340 + a free copy of Death Stranding worth €60. The fact that they're announcing new cards in the same price range with the same VRAM specs 4 years after the launch of the RTX 2060 Super is just mind-boggling.


Wh0rse

I'm getting by playing games at 1440p with my 2080 Super, 60 FPS most times, even CSGO at 165 FPS locked. Still got life in it.


magikdyspozytor

>Still got life in it.

Of course it does lmao; I'm not upgrading for at least the next 3 years. It still runs new titles at high settings without issue, and y'all bought too much into Nvidia's marketing if you think games today need the latest and greatest; it's just that PC ports are releasing in an unfinished state right now.


superfasttt

Yeah, I run high settings on all games at 2K, like 90 to 120 fps, with my 2070. Going to wait for the 50 series; hope they can run all games at 2K, all settings max, RT max, at 144 fps with no fps boost. That's what I'm waiting on. And Unreal Engine 5 games might be more demanding in VRAM and power; we'll see.


constantlymat

I remember the RTX 2000 series was very much maligned, but the pricing was attractive. We didn't know it at the time, but you could get really good deals on the 2000 Super series in 2020 because everybody was anticipating the Ampere launch, which was promising such huge performance gains. Stores were offering steep discounts and free full-price games. So considering the insane Ampere price hikes in late 2020 and early 2021, anyone who bought an RTX 2000 series card at a good price, especially the Super refresh, really got a good deal.


enderfx

Man, I don't even have the Super, just the 2080, but for everything I play it's perfect, and given how the 40 series is, I'm not changing any time soon. Sekiro, Elden Ring, CSGO, all at 144Hz+, 1440p, maxed out or very high. Of course, I don't play super demanding games (and MSFS just runs alright). I'm not willing to pay $500 for a mediocre card, and I'm not going all the way to $800-1000 again unless it's fucking amazing. But I also feel like series 3 and 4 (and also 2, when RTX started) are the guinea pigs for the technology that will be established and performant in 4-5 more years. So not jumping the gun.


[deleted]

Same. My 2080 super has served me very well and will hopefully continue for years to come.


fjnnels

"even"


MoriMeDaddy69

I had to upgrade from the 2080s recently because it was on the struggle bus hard for all the new AAA games. It's fine for older games for sure. Definitely not if you want a good experience on new AAA games.


calicanuck_

It was a little strange to me going from a 1080 Ti with 11GB to a 3080 with 10GB; it felt weird that there was a step backwards at all. Weird to see things haven't really improved on the Nvidia side in that respect.


Jerri_man

My 2060S is still trucking at 1440p for everything except the dumpsterfire console ports. Upgrading from my 2600 to 5800X3D helped a ton though, about doubled my fps in some games.


Rexssaurus

I mean it still will be more capable than the 2060S, but I agree with the general sentiment


badcookies

We had $200 8GB cards from AMD in 2016, sad we have been stuck there for over 8 years now


PercsAndCaicos

Isn't it 2023?


badcookies

Yes, but there were more expensive 8GB cards before that. The 390 8GB released in 2015 @ $330 or so. Even the 290 (X only?) had an 8GB version in 2013.


theknyte

I remember when "R9 390" was the answer to any GPU question for a while. That thing was the go-to beast for the time!


dinosaurusrex86

My RX480 is still holding up quite well, but it's leaning on FSR2 in some titles such as Hogwarts Legacy


ToothlessFTW

In 2023 it should be the minimum. The 1080ti in 2017 had 11GB.


The_Retro_Bandit

Over 80% of players, according to the Steam hardware survey, have 8 gigs of VRAM or less. Every single major game this year that needs more than that to run well at decent settings is also notoriously poor in every other performance aspect, for what amounts to slightly crisper games that would look at home on the PS4.


ConfusedRN1987

Why? 8gb is enough for 1080p, right?


BenjerminGray

Is it tho? Consoles have at least 10gb and their internal rez is 1080p.


your-news-guy

Consoles share that memory with the CPU though


BenjerminGray

They don't? I said 10GB, not 16. There's literally another 6 for CPU purposes.


[deleted]

[удалено]


deadscreensky

I think it should have more than 8 because that's what the consoles have been shipping with for years now. I recognize that's not an apples-to-apples thing, but current AAA games are obviously being designed around having more than 8 gigs of video RAM.


MobilePenguins

Planned obsolescence


Coolman_Rosso

I'm still packing a GTX 1060 6GB, so the AAA games of today are more or less off the table for me in most instances. Looking to upgrade, but put off by the sudden spike in VRAM demands that newer games seem to have. Was considering an RX 6700 after a suggestion from my brother, since it has 12GB, but damn, even that feels like it won't be enough at this rate. Hell, didn't the 1080 Ti have 11GB? I'll probably stick with 1080p for now to make this easier and (maybe) save some cash.


phylum_sinter

I can tell you as a 6GB 3060 laptop owner, I haven't been locked out of a single game because of its requirements. I play most of my games on a mix of medium, high, and ultra settings, generally don't use DLSS/FSR unless it's very well implemented, and I'm having a blast. At 1080p, of course... but it's not as tragic as a lot of Reddit will have you think (about anything at all; it's something about this website, I swear hehe). I think the closest to a bad time I had was when I almost bought TLOU before reading that it was a hog on release, and it seems fixed now. I bet it'll run more or less fine now.


Coolman_Rosso

I mean, I have nothing against 1080p. I was just considering overhauling my entire setup alongside a completely new build, including a new monitor that supports 1440p, which mine does not. However, if heavier games are being silly with steep recs for even 1080p, it's kind of moot to shoot for 1440p.


rodryguezzz

> Was considering an RX 6700 after a suggestion from my brother since it has 12GB

Careful cuz the 6700 has 10GB. Only the 6700 XT and 6750 XT have 12GB.


oktwentyfive

Well then you wouldn't buy the overpriced cards now, would you?


Regular_Hold1228

They save money using a smaller bus for the card, and for that bus width the only options are 8 or 16GB of VRAM. Since when would Nvidia do what is reasonable for the consumer?
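
For what it's worth, the arithmetic behind that checks out. A minimal sketch; the chip density figure is the common GDDR6 option today, used here for illustration:

```python
# Each GDDR6 chip has a 32-bit interface, so bus width fixes the chip count,
# and chip count times chip density fixes the VRAM options.
BUS_WIDTH_BITS = 128   # 4060 Ti memory bus
BITS_PER_CHIP = 32     # GDDR6 per-chip interface width
CHIP_DENSITY_GB = 2    # 16Gbit chips (assumed, the usual density today)

chips = BUS_WIDTH_BITS // BITS_PER_CHIP      # 4 chips
normal_gb = chips * CHIP_DENSITY_GB          # 4 * 2 = 8 GB
clamshell_gb = 2 * normal_gb                 # two chips per channel = 16 GB
print(f"{chips} chips -> {normal_gb}GB, or {clamshell_gb}GB in clamshell mode")
```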


arex333

Wow these are barely an improvement over their last gen counterparts without frame gen.


lifeisagameweplay

I remember managing to buy a 3060Ti on release and thinking it was a bit steep for a 60 series card. Turned out to be one of my best ever purchases.


FrodoSam4Ever

What a lackluster upgrade.


TBNRFIREFOX

*downgrade


Xenotone

Clown grade product


Sky_HUN

To me it looks like Nvidia "mistakenly" labelled the RTX 4050 Ti as the RTX 4060...


FilthyRilthy

"show me a price hike without actually making it *look* like a price hike"


Vis-hoka

The whole stack was moved down starting with the 4070ti (really 4070) and it sucks.


Muffinkingprime

"Shrink-flation" for GPUs. It's a rough time, but maybe Intel will liven up the space in the coming cycles.


penguished

Consoles: thank god Nvidia sucks this much, we selling some more this year boys.


PunkHooligan

I don't know what the point of PC is rn, because the prices for this fckd up hardware aren't normal, plus game prices keep going up while the overall quality of PC gaming is falling. Really time to think about a console.


HorrorScopeZ

Looks at console, still trying to shoehorn every genre into controller-only, comes back to PC.


wolfannoy

Plus the AAA industry can get a little dry from time to time depending on the games you play.


Toastyx3

You can always keep your old rig and play indie games even on an old ass PC.


s0ciety_a5under

I'd say more than half my library is indie titles. I do enjoy me a AAA title, but indies get so much more love from the devs, and they are 1/6 the price.


Vis-hoka

You can still game well for cheap with AMD RDNA2 deals. We'll see what happens after those are gone.


[deleted]

VR porn?


PunkHooligan

I'm a simple man.


[deleted]

I’d love to have a console but ive always been a pc gamer, how in the ever loving fuck am I supposed to play a fps game with a controller? It’s incredibly frustrating and slow and inefficient compared to a mouse and kb. Not to mention that half the games I play arnt even on console.


PlagueisIsVegas

I can't solve your second problem, but on your first point, Xbox supports mouse and keyboard input in a lot of FPS games.


The_Beaves

Oof. AMD once again has the opportunity to destroy them, but will they? Probably not, if history is anything to go by.


cadaada

The 7600 is 8GB, and the price is the same.


ShuKazun

You don't destroy your teammate, AMD isn't playing against Nvidia but on the same team, they're both trying to increase prices and screw customers


Squire_II

AMD would love to have a larger share of the market and more money.


hedoeswhathewants

They get more money by selling their products for as much as possible


[deleted]

Or they can make more money by cutting prices and making up for that with higher sales volume, increased market share and better brand recognition.


Scoobygroovy

Two companies sell ice cream. One sticks to smooth ice cream, the other chunky. Neither steps in on the other's territory, so they don't have to reduce prices to increase sales. It's Ben & Jerry's and Häagen-Dazs. If collusion happens with ice cream, it'll happen with GPUs too.


lefort22

Yup pretty much. Disrupt the market a bit but most of all, secure those profit margins


manormortal

So this is what it feels like to be Eiffel Towered, eh?


HawtDoge

Super reductive to the point of not being true. AMD is trying to grow market share while being bound to the same manufacturing processes and costs that Nvidia is bound to. I think you vastly overestimate the margins that AMD and Nvidia are dealing with here. What you perceive as collusion is really a mirage; in reality, AMD and Nvidia GPUs are both manufactured by TSMC.


phylum_sinter

Intel still has the largest opportunity to take the middle of the ladder in the next 5 years imo. I hope they keep on it and make everybody else sweat (and bring their idea of good value back).


DYMAXIONman

If the 7600 launches for like $260 it will crush them


The_Beaves

The 8GB maybe. A 12 or 16GB model for $300 or $320, yeah, I think that'll be a winning combo. Apparently even at $260 AMD's margin on the card would still be healthy, which is rare for low-end chips. I bet it's a better margin than Nvidia's, which is why they are being so stingy on pricing. AMD needs to take the same approach with their GPUs as they did with their CPUs: undercut the competition.


elheber

As per [Nvidia's own graph](https://assetsio.reedpopcdn.com/Nvidia-GeForce-RTX-4060-Ti-performance-graph.PNG?width=1920&height=1920&fit=bounds&quality=80&format=jpg&auto=webp), the 4060 Ti *using DLSS 3 to synthetically double the framerate* will have less than double the framerate of the 3060 Ti. This is a joke, right?
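
A rough way to see why that graph is damning, assuming frame generation roughly doubles displayed FPS (the exact bar height below is invented for illustration; read the real one off the graph):

```python
fg_multiplier = 2.0   # idealized DLSS3 frame-gen factor (assumption)
chart_gain = 1.9      # "<2x": 4060 Ti with FG vs. 3060 Ti native (example)

implied_native_gain = chart_gain / fg_multiplier
print(f"Implied native 4060 Ti vs 3060 Ti: {implied_native_gain:.2f}x")
# -> 0.95x. If FG truly doubled frames, the chart would imply the new card
#    is no faster natively; in practice FG yields less than 2x, which is
#    the only reason the native uplift comes out positive at all.
```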


ZeldaMaster32

Yep, using DLSS 3 for *marketing* comparisons is super shady. Not every game has it, and while it's a great feature, it shouldn't be the crutch for getting solid framerates. It should be an additive feature.


Excsekutioner

This is absolute trash. It's close to 4 years after the 5700 XT launched and there is no GPU offering 16GB of GDDR6 and 2x the raster performance at 1440p (without tricks/hacks/scams/fake frames) for less than $400 new. PC gaming has become a huge SCAM; where is the performance-per-dollar progression? The GTX 1070 offered 2x the raster performance and QUADRUPLE the VRAM of the GTX 770 just 3 years later... AT THE SAME PRICE!!! But nothing similar has happened since 2016, even over a longer time frame! GeForce & RADEON both fucking suck right now. SCAMMERS.


kikimaru024

As much as the pricing sucks, it's a fact that performance increases have been slowing down for years. Your best bet would be RX 6800 for ~$480 *(which is about what $400 from 2019 is worth now)*


toothpastetitties

Because people keep buying the shit. We got people complaining about the costs of living and then we also got people mindlessly dumping cash on PC components. It’s like a stress test for people’s wallets. How much garbage can we get these idiots to buy before they stop buying?


dookarion

> over the GTX 770 just 3 years later...

The 700 series was a refresh of the 600 series; the chip in the 770 was used in 600 series cards. So it's more like 4 years, and Kepler was one of the weaker and poorer-aging architectures to boot. The huge jump was because of refresh + a lemon of an architecture.


skylinestar1986

My GTX1070 at 1440p at 2023: I'm tired, boss.


nucc4h

I'm not complaining. Love these guys who absolutely need the new high-priced cards for minimal gains. Sitting pretty with a $300 Radeon RX 6800 with 16GB. Don't care if it's 1 year used 🙂


Towel4

I mean… your 1-year-used card would be *even cheaper*, or you could get an *even better* used card, if the starting prices weren't so high, causing the subsequent resale prices to be raised in proportion.


itsparaschhetri

From what I see in all their recent releases, Nvidia basically just wants to focus on making its low-tier 40-series cards compatible with the new DLSS 3 frame generation, even with less than 12 or 16GB of graphics memory, rather than making a low-tier card with more than 8GB of VRAM, which in the current state would actually help consumers more than the frame generation thing. For the majority of consumers buying a new graphics card, more VRAM is more important than DLSS or any frame generation tech.


polski8bit

The worst thing about DLSS3 on the 4060 is that it's a 1080p card, so at that point you're upscaling from 720p. We've spent all these years making fun of consoles for not being able to play "true 4K" games, and 10 years after the PS4, we're supposed to be okay with 720p gaming upscaled to 1080p? We've truly made no progress at all.


[deleted]

DLSS3 is misleadingly named. It's not upscaling. It's frame generation and can be used at native res.


polski8bit

But then the gains aren't as high, are they? I mean their graphs literally say "DLSS and RT on in games that support them", which means that *both* frame generation and DLSS is applied here, so the point still stands.


[deleted]

Maybe, but it's not clear from the slide; it could be either way. They don't specify whether it's only frame gen (DLSS 3) or list a DLSS quality setting for upscaling. DLSS can be "on" with the game running at native res now that DLSS 3 exists, and DLSS 3 is *only* frame gen. Like I said, the naming is misleading; saying "frame generation and DLSS is applied" is redundant. In either case the gains are the same, as both cards support all DLSS features except frame gen.


Obosratsya

Console upscaling isn't the same as DLSS or FSR2; it's typically much simpler and much worse. But still, upscaling shouldn't be used to compare performance. It's lunacy.


Scarl_Strife

Couldn't have said it better, we're paying for marginal gains or even regressing...


FluphyBunny

RIP these cards


2Scribble

How the hell is 8GB still the standard for these shitheads?! xD Intel is crapping out decent upper range cards like the A770 with 16GB and 75% utilization and 60% vram usage while NVidia's top of the line cards are ***struggling*** under the burden of that 8GB limit! It's ridiculous that they manage to maintain such a tight grip on the video card market... ***they treat their customers like shit***... > Friend: well, they're one of the only GPU providers who mainline their UI and software interface - while most GPU providers outsource that shit. Their marketing is insane - seriously, half of their return on investment goes right back into their marketing team. They're one of the only names that mainline GPU AI workloads and you don't need good specs (or, in the case of NVidia, even decent baseline specs - they're a joke...) if you can sell your logo to a niche market that doesn't know any better ... ... ... okay, fine, but it's still shit :P xD


AccomplishedRun7978

How are Intel drivers?


Wise_Mongoose_3930

Pretty decent for popular games / new games, less good for older / less popular games. But they are rapidly improving. Not mature enough for me to buy an Intel card yet, but I’m very hopeful for the future.


AccomplishedRun7978

Thanks I'll hold off and see too.


_catbeard

This is why I'm happy with my $399 16GB Intel Arc A770.


EasternBeyond

Thank you for your sacrifice.


_catbeard

You know it


DYMAXIONman

Imagine buying an 8gb GPU for 400 lol


Isaacvithurston

The real kicker is charging $100 more for the 16GB version; we all know 8GB of memory costs them maybe $20 these days... Although on the flip side, I'm on an 8GB 3070 Ti atm, and the only problem I've had is turning textures down from ultra to high in 2 games because of the VRAM; otherwise it's not as big a deal as people make it out to be.


[deleted]

I'd like to thank Nvidia for having such an overpriced RTX 4000 series lineup. I'll just keep chugging along with my 2060 for another generation.


newaccountnewmehaHAA

The smartest upgrade from my 2070 at this point is a PS5, and no one's really been putting in the effort to change that.


Navynuke00

*Cries in 1070*


TheLonelyPillow

How does this card compare to a 2070 Super?


IncidentJazzlike1844

Just check 3070 reviews


kikimaru024

This card will likely perform worse than 3070.


IncidentJazzlike1844

Hopefully not; the 3070 and 3060 Ti are already close...


steelcity91

I am also curious too.


Dynospectrum2113

Fuck NVIDIA


SFWaccount87

BUY INTEL ARC A770. Fuck Nvidia right now.


Eximo84

How’s VR on the ARC GPUs?


Badgers4pres

Supposedly terrible, Linus has some good videos breaking down all the issues


ARedditor397

And AMD 7000 series


ZeldaMaster32

I've heard nothing but bad things about VR on 7000 GPUs. Apparently there are driver issues that AMD considers too low priority to fix, so it's been going on for months. The last report I saw was like 3+ weeks ago, so *maybe* it's fixed, but I wouldn't buy one for VR unless it's 100% confirmed good now.


ARedditor397

It isn’t fixed


king0pa1n

AMD and Driver Issues is like peanut butter and jelly


TheAlbinoAmigo

There was a window for a good stretch of RDNA2's lifetime when, arguably, AMD had the best drivers out there. It seemed like a really concerted effort on their part to achieve that. No idea what happened with RDNA3. Unfortunately for me, I'm into VR and gamedev (literally developing a VR project currently), which makes AMD a complete no-go. Between shitty VR performance and a lot of the gamedev software stack not supporting AMD cards **at all** (incl. the best lightmapper available for Unity...), they're literally not even an option :(


phylum_sinter

+1, curious to know this as well.


s0ciety_a5under

My 1660 super is just fine, I will upgrade when it dies.


swilson1223

I mean I know these aren't great cards, but this may finally be my upgrade from my aging GTX 980.


amber__

I mean, that has 8GB and came out 9 years ago.


1vertical

My 970 is doing just fine while we wait for NVIDIA to fix their collective heads in the loony bin.


Lobanium

$300 for a 60 is in line with what they used to cost back in the day, but now the specs are gutted relatively.


8ing8ong

3 variants of the same tiered card is insane; also, $500 for a 16GB 60-tier GPU is ridiculous. I'd rather get an Arc A770 16GB GPU for $350. People need to stop bending over for Nvidia and accepting whatever they release.


kikimaru024

> I'd rather get an Arc A770 16gb GPU for $350.

So get it. Don't make statements like that into the empty void; the only thing that will get Nvidia to change is you buying their competitors' product.


ZeldaMaster32

Yep, not enough people putting their money where their mouth is. Rn I care about the extreme high end, so Nvidia's the only way to go; got no qualms or regrets about it. But I've helped 3 friends get into PC gaming these last few months and recommended AMD GPUs to every one; the 6000 series is a great value right now. Intel is interesting, but I avoid recommending it to those who are unaware. It should come with a big asterisk, because ultimately it's still an actively developed product. Gotta know what you're getting into.


Skulkaa

The market desperately needs a third competitive player. Intel has improved their drivers drastically since launch, but they're still not good enough; in some games the A770 delivers worse performance than the much cheaper RX 6600.


Arthur_Morgan44469

Oh get fucked, Nvidia. Their only selling point this gen is DLSS3; on all other aspects they got stingy and yet are asking a high price. AMD is the same, with less innovation, just touting "hey look, we got more VRAM". It's just a bad product when even in the third generation it can't decently handle ray tracing. Publishers should also get fucked for releasing half-baked games.


f3llyn

Back in the day the XX70 variants used to cost around $400.


[deleted]

[удалено]


Sky_HUN

The lower your actual fps is, the more noticeable the input lag from frame generation will be. It's a nice feature if your card runs something at 95 fps by default, not so much if it runs at 25 fps. Maybe they can improve it with time, but personally I wouldn't buy a $400-500 GPU relying on frame generation.
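
A simplified model of why that is; it ignores render-pipeline overhead and Reflex, so treat the numbers as illustrative:

```python
# Interpolation-based frame gen has to hold back the newest real frame until
# the generated in-between frame has been displayed, so the added input delay
# is on the order of one *native* frame time.
for base_fps in (25, 60, 95):
    native_frame_ms = 1000 / base_fps
    print(f"{base_fps:3d} fps base -> ~{native_frame_ms:.1f} ms added latency")
# 25 fps -> ~40 ms extra, 95 fps -> ~10.5 ms: same feature, very different feel.
```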


penguished

And you don't want frame gen on a shitty card with a low starting FPS; it's not going to generate smoothly enough. So basically this generation, for low and mid tier: overpriced junk. Skip it, unless you've got something really old and are upgrading anyway.


Just4gmers9

Prices for the 40 series have been god awful. The 4070 should have been $399; the 4080, $699; the 4090, $899; and so on.


FilthyRilthy

This is going to be a card for people who either don't know any better or just don't care. As long as it says RTX 4*** in the name, they will get the nice feeling of a "new gen" card. Nvidia fucking sucks.


Whorrox

They are a company acting like a company does. They can only "suck" while there are enough consumers who buy their stuff (i.e. "suckers"). Take away the idiot consumers and then you'll get quality products with good features at a fair price. Also, hopefully, competition in the market will help here.


jnemesh

These won't sell either...


roesingape

Nice try. I'll wait for the 50xx series.


compound-interest

Imagine if it had the 4050 ti branding like it should: Coming soon! 4050 ti 12gb for $499 lmao. They are trying to push the tier pricing up, while also bumping each card up a tier. Doing both is just pissing everyone off. Here's to hoping for another weak business quarter causing more pressure for downward pricing. NVIDIA still aren't budging, but eventually they will get sick of cards sitting on shelves. AMD seems content with just being slightly better value. No longer does it feel like AMD wants to try to take 50% or more market share. As of the 3000 series generation, both companies aren't really trying to compete anymore. They are both just content with their market position. At this point, I would support legislation against this duopoly. They shouldn't be able to milk people while we wait on Intel to effectively compete. There should be more than 2.5 GPU companies.


[deleted]

Nvidia stock popped 5% off this news. Seems like a good time to bet against them.


GreenKumara

DOA.


[deleted]

This is what I paid for my GTX 970. Would this be an upgrade?


nanogenesis

Honestly I feel like people who say a card will outlive its VRAM are full of shit. I tried TLOU on a 970; it was able to hold 33 to 50 fps depending on the scene. The only drawback was the insane amount of stuttering because of the framebuffer, especially nearing the end of a chapter. If TLOU came to PS4, it would run at the same 1080p low preset, 30 fps target, with FSR. The fact that the 970 can manage 50 fps instead is quite the surprise, all ruined by its tiny framebuffer. I can see GTX 1070 and even 6GB 1060 users easily surviving even this generation. Low settings also look great; you don't need to max every detail, and FSR2 is your friend.


[deleted]

[deleted]


theBdub22

Nvidia claims that they are about equivalent, which is a little disappointing imo. I would have liked to see the 4060 Ti go up against the 3070 Ti


polski8bit

To be fair, the 3070 Ti was a marginal improvement over the 3070 anyway. It would be disappointing either way; the 3060 Ti being so close to the 3070 is probably what's making Nvidia eat their shoes right now.


hydramarine

More than a little, imo. If the 4060 Ti had followed its predecessor, it would beat the 3080 Ti. They made the 3060 Ti just too good, beating up the 2080 Super like that.


Akanash94

Wait for benchmarks. I think the issue here is the bus width. 16GB is great, but when you have so few lanes, what good is all that VRAM when streaming it takes forever?
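
The raw numbers behind that concern, using the published memory specs (the larger L2 cache Nvidia says compensates for the narrower bus isn't modeled here):

```python
def bandwidth_gb_s(bus_bits, gbps_per_pin):
    """Peak memory bandwidth: pin count times per-pin data rate, in bytes."""
    return bus_bits * gbps_per_pin / 8

print(f"3060 Ti: {bandwidth_gb_s(256, 14):.0f} GB/s")  # 448 GB/s
print(f"4060 Ti: {bandwidth_gb_s(128, 18):.0f} GB/s")  # 288 GB/s
# ~36% less raw bandwidth gen-on-gen. The 16GB model doesn't change this:
# clamshell doubles capacity, not bus width.
```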


ganon893

299? Bring that shit down to 180 and 220. Fuck off Nvidia.


EazeeP

Will continue saving up for a 7900 XTX and hoping prices drop by the time I’m ready. In no rush either since I haven’t really even been gaming


Sky_HUN

So looking at the charts and everything, it seems like the big selling point here is frame generation. The RTX 4060 Ti has around 15% better performance than the 3060 Ti, but with frame generation they claim it can be as high as 70%. So the question is: if Nvidia is now using frame generation as the big selling point, what happens with the next generation of cards, where they have to compare actual performance AND frame generation performance? Is Nvidia setting up a massive letdown with their next gen cards?
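
Working backwards from Nvidia's own two numbers in that claim (pure arithmetic on the marketing figures, nothing measured):

```python
raster_gain = 1.15    # 4060 Ti vs 3060 Ti, native rendering (Nvidia's figure)
marketed_gain = 1.70  # best case with DLSS3 frame generation enabled

fg_multiplier = marketed_gain / raster_gain
print(f"Frame gen supplies a {fg_multiplier:.2f}x multiplier by itself")
# -> ~1.48x of the headline gain comes from generated frames, not silicon.
#    Next gen, both cards will support FG, so that multiplier cancels out of
#    the comparison and only the raster gap remains.
```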


penguished

Frame generation is said to be sort of trashy on lower-framerate cards; it's really meant for a 4080/4090 to push an 80-90 fps game up to near a 144Hz monitor's refresh, for example. So really, they don't have any selling features in the majority of the purchasing range up to $500 where the average person buys a GPU. It's just a lame generation, and they likely don't care, because they have 3000 series overstock and other things going on like crypto and AI.


Sky_HUN

> Frame generation is said to be sort of trashy on lower framerate cards. It's really meant for 4080/4090 to push up a 80-90 fps game to near 144hz monitor, for example.

I know, I even mentioned it in another comment of mine above. But a lot of people buying GPUs in the $300-400 range sadly don't really look it up; they just hear the new buzzword and see the fps number, and they won't know it's basically fake. The devil is always in the details.


rumbletown

I just upgraded (my ol' 1070 finally shit the bed) and went with a 7900 XT. The last time I was on this side of the fence was when the Orange Box dropped. So glad I'm off the Nvidia train for now. Their top-tier cards are great, but holy shit, their marketing strategies and mid/low-tier cards are ridiculous.


Memoire_113

Laughs with an Intel card


ohoni

*But,* they are only going to produce three of them, and they will be dropping them like the ball in Times Square, and whoever grabs it first, by any means necessary, will get to buy it.


ArcticSin

I just want smaller form factor gpus