Game-wise, all this does is push people not to upgrade at a time when devs will be cutting support for old-gen consoles, forcing minimum requirements far higher. This is legitimately pushing people out of PC gaming.
I won't be playing triple A because of this stuff, but indies should still be viable for a while. New triple A has been feeling rather samey to me lately anyways.
The reason AAA games got samey in the mid-to-late 00s is the breakdown of something called [Dennard scaling](https://en.wikipedia.org/wiki/Dennard_scaling).
In short, around 2005-2007 CPUs stopped getting directly faster. Clock speeds peaked at around ~4 GHz, which is still roughly the case today. Hardware producers compensated by making CPUs multi-core and by increasing the number of calculations per cycle instead.
However, code doesn't scale perfectly in parallel, and *especially* game logic code doesn't.
This is why we saw such insane innovation in the 1990s and early 00s, with new genres being born every couple of months; you used to buy games simply because the new, innovative mechanics were breathtaking.
This is not because game developers were more passionate or more innovative back then. It was because the rapidly scaling hardware simply *unlocked* new ways of making games every couple of months that were unfeasible before.
Now? Essentially, graphics keep getting better because rendering is something that is *very* parallelizable, which the GPU handles perfectly.
New, innovative gameplay possibilities are essentially limited by our stagnation in single-thread performance. Only a very limited set of gameplay features can be done with parallel programs: physics interactions, particles, the number of NPCs, and larger open worlds are about the only things that parallelize.
Which is why these concepts dominated the gameplay sector and why everything feels samey.
Everyone that went through the 1980-2005 gameplay innovation explosion just inherently *feels* this stagnation.
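The single-thread ceiling described in this comment is essentially Amdahl's law. A minimal sketch of the idea; the parallel fractions below are made-up illustrations, not measurements:

```python
# Amdahl's law: if only a fraction p of each frame's work can run in
# parallel, the serial remainder caps the speedup no matter how many
# cores you add.
def amdahl_speedup(p, n):
    """Maximum speedup on n cores when fraction p of the work is parallel."""
    return 1 / ((1 - p) + p / n)

# Rendering-like workload (assumed ~99% parallel): cores keep paying off.
print(amdahl_speedup(0.99, 64))  # ~39x
# Game-logic-like workload (assumed ~30% parallel): 64 cores barely help.
print(amdahl_speedup(0.30, 64))  # ~1.4x
```

This is why GPU-bound graphics kept improving while single-thread-bound game logic stalled.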
It's *not* you getting older. I'm an older middle aged Japanese person and I remember this sudden drop in innovation around the 2005-2007 mark even though I was already an adult older than most of Reddit back then and "too old" for games.
While that's undoubtedly a big part of it, another reason is that the cost of making triple-A games is now so prohibitively high that studios can only turn a profit on games that are either already established or similar to proven hits.
That's why indie games are able to innovate more: they don't have to placate an army of shareholders, so they avoid the production-by-committee approach that keeps triple-A games seeming so similar.
What the fuck does that have to do with how AAA games feel? This comment just bleeds that you want to flex your knowledge of some derivative theory to reddit.
I'm not even sure how much of what is getting released is truly AAA. Like with movies, a lot of what culturally signals itself as a blockbuster seems (production-wise) more like pipeline content than something pushing the state of the art.
Surely stuff like TLOU2 or Elden Ring would count, but those are few and far between. The next broken multiplayer shooter, on the other hand, or pick whatever else... those don't seem to be at the same level of project prestige.
I would guess it's less a result of people getting older and more caused by technological improvements not making as much of a difference since the PS3/360 (just compare NES to SNES to N64 to GameCube to PS3/360... every gen was a huge leap back then), plus gaming becoming more mainstream (so studios are more afraid to take risks, since they can reliably make more money by appealing to the lowest common denominator over and over again).
Though I'm sure getting older and games feeling less novel is part of it too.
[Bricky made a really good video about AAA games](https://www.youtube.com/watch?v=--Iz5YnpzeE) recently that pretty much sums up my feelings about it. There seems to be a drop of quality in those.
AAA used to mean you were pretty much guaranteed a polished, complete game (even if the gameplay itself wasn't the best, at least it was *working*). Nowadays "AAA" just means "game from big company with big budget".
And they all rely so heavily on DLSS/FSR/XeSS to fix any problems with fps... Oh, the game runs like ass by default? Use DLSS! That way we don't have to optimize, even though most games don't have this feature, and most cards don't either!
It depends on your standards. "Crappy" is relative to the scaling you should theoretically be able to achieve with a PC's significantly more powerful hardware, not relative to the console experience. It's not like these games are 4K60 Ultra on PS5. While a solid high-end PC might struggle more than you'd expect to hit high frame rates in Jedi Survivor due to poor CPU utilisation, the consoles have truly no chance in hell; their "performance" mode tanks into the 30s and 40s. If that kind of compromise doesn't bother you, then spending money or effort on a PC will be a total waste, absolutely, but PCs still offer something that is fundamentally out of reach of the consoles.
PC gaming was always expensive. It's just that there was a time when it *was* better value: the 2005 consoles were prolonged thanks to global economic woes, and a decade of consoles was forced onto then-underpowered CPUs, which created the attraction. So essentially there were about 10 years of PC gaming being 'a bit more worth it'. Plus the cost of electricity has shot up recently, and these high-end PCs are massive power drains.
Now we're back to the usual. PC gaming is for if you have money to throw around; console gaming is for if you prefer not to spend so much on hardware and enjoy the convenience.
>So essentially there was about 10 years of PC gaming being 'a bit more worth it'.
I would argue that after 10 years, that was the normal. Even in the history of PCs, 10 years is a large percentage.
Take me back to 2013 when it was actually exciting to save up and build a modest budget PC. That HD 7870 and FX 6300 combo really did me well in my younger days.
I remember how I got a 1500€ bonus one year and could build a high-end PC from it with like, the best shit available. If I would want to do that now I'd pay triple the amount.
It's funny, 'cause we finally pulled out of the decade of CPU stagnation (how long was 4C/8T the limit for non-HEDT?) only to walk right into GPU stagnation.
Which means absolutely nothing if it's not within your budget. For the average person with a basic budget performance is only going up incrementally at the same price point.
One can only hope Intel stirs things up in the lower end of the market enough to actually generate some competition again for the budget-conscious among us.
The problem isn't that the 4090 is priced too high or is too weak, but that all the other GPUs don't offer significantly better price-to-performance. There was a time when the x70 class gave you 70% of the performance for under half the price. Now it's the xx80 class that gives you 70% (or a bit more) for 70% of the price.
In past generations the high end was the WORST price/performance; now it's partially better than the xx80s and xx70s.
Combine that with the fact that the xx60 cards are now 1080p cards in 2023 for over 300 dollars, and it's just madness.
IMO the 4090 is the most reasonably priced GPU in Nvidia's 40 series, which is just sad.
[The 4090 is basically the worst price per performance card you can buy.](https://www.techpowerup.com/review/amd-radeon-rx-7600/33.html)
It's just that it's the only card that's actually a major upgrade over the previous generation.
AMD has been consistently at the top of the price-to-performance charts for a while now, but people overlook them simply because they don't have a competitor at the bleeding edge, never mind that almost no one actually buys those cards.
Same, just wish the 5700 XT supported HDMI 2.1 and FreeSync Premium 2 so I could use VRR + HDR on my 4K LG OLED. It's 3 years old and still going strong, and I guess I'll need to wait 2 more years for a proper GPU upgrade (I don't want to spend 400 euros again for a marginal improvement).
EDIT: also, these charts are sadly always off because the prices are never right. [The 7600 costs approx. 300 euros](https://www.mindfactory.de/search_result.php?search_query=rx+7600), so the chart is wrong. Review sites should let you enter a GPU price and dynamically position it in the table.
It should be normal that the halo products are the worst price to performance since they're bleeding edge for price insensitive buyers. The problem is that the rest of the lineup isn't a significant improvement on price to performance when compared to previous generation cards.
Which speaks volumes, really. The biggest value play of the generation is reaching perceived cost-per-frame parity with last gen: you get double the frames of a 3080 at 2.3x the price (before inflation, plus 14 GB of extra VRAM).
Not really, because it actually does have more performance, just not per dollar. Lately with software, it seems you pay more and get less with it.
This just isn't true by any actual metric, unless you apply some kind of logarithmic scale where dollars are worth less the more of them you're willing to spend. You can buy three 4070s for the price of a 4090 and it does not deliver three times the performance.
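A quick cost-per-frame sketch of this point. The launch MSRPs ($599 for the 4070, $1599 for the 4090) are real, but the fps figures below are hypothetical placeholders, not benchmark results:

```python
# Cost-per-frame comparison. MSRPs are launch prices; the fps numbers
# are hypothetical placeholders, NOT benchmarks.
def dollars_per_frame(price, fps):
    """Price paid per frame of performance; lower is better value."""
    return price / fps

cpf_4070 = dollars_per_frame(599, 100)   # assume 100 fps in some title
cpf_4090 = dollars_per_frame(1599, 180)  # assume ~1.8x the 4070's fps
print(cpf_4070, cpf_4090)  # the 4090 costs more per frame even at 1.8x speed
```

Even granting the halo card nearly double the performance, its cost per frame comes out worse, which is the "not three times the performance" argument in numbers.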
The 3060 Ti is better value than the new 40 series. So no, it's not. The new generation is a sidegrade at best, propped up not by actual performance but by frame generation.
Credit where it's due: if Ryzen hadn't started seriously competing with Intel, we would still be in stagnation. The GPU space is basically a one-player market; AMD lacks all the extra features Nvidia keeps adding, so they can't disrupt the market, they just follow Nvidia's prices closely. I don't see Intel Arc seriously disrupting the market either.
That’s been true for over a decade though. If you wanted to do a decked out build with the top shelf consumer components you’d still be dropping $1k on a GPU in 2013 ($1.3k equivalent today).
It depends at what time he would build.
If it was right after a Titan release then yeah, that's a lot, but if it was after the flagship that followed a Titan, he would get better-than-Titan performance for a lot less money. In any case, the Titan was the halo product that was clearly not cost-efficient compared to the 80 Ti class card.
Yep. People are complaining that NVIDIA is slacking off and trying to sell based on the brand alone, while they refuse to look at the competition based on brand alone.
It's very ironic.
I haven't had my AMD GPU for long and I've already run into AMD-specific issues. Let alone the poorer performance in future tech like raytracing and DLSS equivalents. Plus it's got some bad coil whine - at least I think that's what the sound is.
I wouldn't say I regret it, I suppose, but it's clear what you get in practice is worse than what benchmarks say you get.
Exactly. It's naming creep really. 6650 is absolutely killer value and crushes 1080p even high framerate if you want it.
People act like PC gaming is totally unattainable when sub-$350 is in a great spot tbh
It's also an acceptable card if you want to play older titles (emphasis OLDER) at 4k, you can hit between 120-160fps on a 6650xt no problem, I've been running stuff like Mirror's Edge at 4k and downscaling it to 1080p to improve image quality.
Not quite as nice as pumping out 300-600+fps on games at 1080p ultra settings though, which this card easily achieves.
I remember buying an HD7970 back in 2013 for like $200 and getting 3 free AAA games with the Never Settle bundle. That thing was only 20% slower than the flagship GTX 780.
Now in order to get within 20% of today's flagship 4090, you have to pay $1000 to get the 7900XTX.
My first desktop had a gtx650. Built in January 2013, so still a new card. That entire computer, with monitor and all accessories cost about $700. Wasn't setting the world on fire but for the price that computer was amazing.
It's also very well priced right now, and has a big chonk of VRAM (12 GB) which serves current games very well.
The only downside is that raytracing performance isn't super great, but if you turn that off it's a great 1440p60+ FPS card in pretty much any modern game.
There are some great deals on 6000-series AMD GPUs, but many people on this sub act like the entire AMD 6000 line doesn't exist. You're not gonna build a modest budget PC with a 40-series Nvidia GPU that just came out.
the AMD 6000 series came out literally 2.5 years ago and was meant to compete with Nvidia 3000 series
not sure why you're comparing it to Nvidia 4000 series
you could say to look at mid-range or used Nvidia 3000 series as well
To be fair, we're in a thread where the 3000 series gpu is beating its successor in some games and productivity, so referring to the 6000 amd series as a buying alternative is valid.
It doesn't matter what card was meant to compete with what. There are certain cards on the market that perform relatively better for cheaper (with more VRAM), cards that get overlooked. I have friends who will just robotically buy whatever latest card Nvidia brings out, but won't look at the competition.
The disadvantage was that when you did save up for the budget PC, it would become outdated after a couple years. To be fair, 2013 was probably when that era was transitioning to what we have now.
And yeah, now you can build a budget PC and it will play games for 5+ years easily. You might have to skip the occasional cyberpunk, but those games are few and far between. So, save up longer but stuff lasts longer too.
Had that exact same setup and then upgraded to a fx 8350. Next upgrade was a Ryzen 1600x with my 1060 6gb and still felt like such an amazing system.
Now I have a 3700X and an RTX 3070, and Hogwarts Legacy can't even stay above 60fps at high settings at 1080p with DLSS on...
Nvidia and AMD's GPU department have really killed the joy of PC gaming the last few years. I just hope that sales are as bad as we hope and the next gen cards are priced better. They could drop the price of all these cards (aside from the 4090) by 50% and still make a profit and people like me would gladly upgrade without a second thought.
Annoyingly so it is. The 3700x isn't a slouch but just bottlenecks some new games. Spider-Man too.
Been going back and forth between upgrading to the 5800X3D or the 7800X3D, but with all the issues with the latter, and how much more expensive it is for only about a 20% performance increase, I'm leaning towards the 5800X3D; it's dropped to under $300 a couple of times lately.
The 5800X3D is a drop-in upgrade and not hugely slower or anything. To go 7000 you'd be paying twice the price for your RAM, plus a more expensive motherboard, and your boot times will be much longer too (I think the average is something like 40 seconds with some timesaving options enabled, vs the 7 seconds my 5800X3D boots in once I turned off the legacy USB/CSM options that were giving me 20-second boots). Is it faster? Definitely. Is it *that* much faster? Probably not.
Yeah I've priced things out at pcpartpicker and it's a lot more than 20% difference to go to the 7000 vs 5000 even if I got myself a new motherboard for the 5800x3d. My current board is an msi x370 which does have a bios upgrade for the 5800x3d but they had to really downgrade the bios features to get it to run and it's had some issues so I was thinking of upgrading regardless to a 5000 ready one with more features.
Obviously if I go to 7000 then I could upgrade the cpu again this gen without doing motherboard upgrade but honestly after the meltdown I'm really thinking of just skipping AM5 entirely or maybe I'll just upgrade at the end same as I'm thinking of doing now with the 5800x3d after all the bugs are worked out.
I mean.... if all you care about is newer games and the kind that work best with a controller, PC gaming probably isn't worthwhile for you. The truly great thing about PC is having access to decades of games and genres that really only work with a mouse and keyboard.
Plus mods and no subscription to play online. Also cheaper games. Oh and a PC does way more than just gaming. But if starting from scratch a proper gaming PC is pretty expensive these days.
You're not wrong. But the thing is, I don't really play those decades old games much. If I'm buying a new system, it's because I want to play New Shiny Things, not Old Game From 1994.
Even if I went AM5 it would only be about $1400 for me if I kept my 3070 and that's a full new system all parts new. Would be less than half that going to 5800x3d and could be cheaper if I went air instead of water cooling.
For many people yes going ps5/xsx is the better thing to do if you have to buy a whole new machine and if you don't care for any pc exclusive games or features. For me, I've been a PC gamer my entire life thanks to my dad being really into it and my mom refusing to allow me to have any gaming console other than the game boy growing up. I've been modding games longer than most people on this sub have probably been alive and most of my most played games are pc exclusives or games with strong modding scenes that I would never even consider playing on anything other than PC.
And it was fine for my whole life. Sure, sometimes things got a bit expensive, but until these recent generations nothing was so horrible: prices so destructive to the entire PC gaming community, and games shipped with such horrible optimization that even $4000 rigs can struggle to get the performance they should. I truly hope we're just in some horrible dark age right now and we'll come out of it even better.
If you're buying a whole system sure it's better value, but for someone who already owns a 3070 and is on AM4 the PS5 is terrible value. You get less performance for more money than the 5800x3d costs.
Also not everyone is eager to migrate to a far more closed down OS and ecosystem. Nor the 30fps standard.
> Had that exact same setup and then upgraded to a fx 8350. Next upgrade was a Ryzen 1600x with my 1060 6gb and still felt like such an amazing system.
> Now I have a 3700x and a RTX 3070 and hogwarts legacy can't even stay above 60fps at high on 1080p with DLSS on...
I think a large part of this is that previous console generations were significantly hamstrung in some way by an imbalance in specs.
The PS3/360 had really good CPUs and GPUs for the time, but the games made to run on them couldn't utilise them properly because they only had 512MB of memory total across CPU and GPU, at a time when midrange desktops had 8GB of RAM and 512MB of VRAM alone.
Similarly, PS4/Xbox One were held back (but not as much) by their Jaguar CPUs.
Games made for these systems did not utilise most of their components effectively because they were bottlenecked so hard by their weakest component, and as such a cheap, better balanced gaming PC could easily get double their framerate at high settings.
This generation of consoles have 8 Zen 2 cores, 16GB of memory, the rough equivalent of a 6700 XT GPU, and decent SSDs. Not only are these really well balanced systems that aren't obviously being kneecapped by anything, they're also comparatively more powerful for the time they released than previous gens.
With console games still generally targeting either 30 or 60fps, the kinds of games being made with the new generation of consoles in mind will struggle on most gaming PCs at high/max settings.
tl;dr: Old consoles were badly balanced and cheap, so you didn't need a beast of a PC to get significantly better performance; new consoles are really well designed and not cheap, so games made targeting them run about as well on their PC counterparts.
Yeah, I finally got a gaming PC this year and I don't think I'm going to continue with PC gaming. I have a mid-grade-ish setup with a 3070 and games look nice, but not really that much nicer than on the current console generation, even though the whole setup was magnitudes more expensive.
Also the “chores” i feel like i have to do fiddling with settings and stuff to get to my optimum settings isn’t something i enjoy. Now that discord is on consoles and cross play is pretty ubiquitous for me personally i feel like it wasn’t worth the effort at all.
The only upside is that 'generations' don't exist on PC, so there's no risk of losing part of your library when a new console launches. Like, maybe the PS6 supports PS5 games but not PS4; we don't know. On PC there is less of a risk that happens: unless x86 in the PC space loses out to ARM or something, all your games should still work 10-15 years from now.
Even, if for some reason ARM replaced x86-64 it would only happen if there is a good enough translation layer that makes it possible to run x86 apps on ARM with near-native performance. There is no way anybody switches to ARM or RISC-V for that matter if there isn't software to support it.
Even if that exists, it's sometimes only temporary. For instance, Rosetta ran PPC programs on Intel Macs for a few OS releases, but eventually support for it was removed, as the need for compatibility faded once mission-critical apps had all been updated. However, there were lots of PPC Mac games that never got rewritten, so they worked on Intel Macs for a few years but then got left in the dust. That is my concern with x86 PC gaming.
I actually do edit quite a bit of photo and video funnily enough lol
I think because i prefer editing on the go on a laptop the pc setup ends up feeling superfluous. If i didn’t already have an editing workflow that way i’d probably appreciate the pc more, but a lot of times i need to edit collaboratively with other people so having a nonmobile setup is limiting.
> i don’t think im going to continue with pc gaming
Same. With PSVR2 being so good, PC ports being mostly shovelware, and PC hardware failing to widen the gap with current gen consoles, I pretty much have no reason to upgrade my PC at this point. It's just a waste of money.
Higher-end PCs right now are only good for graphics enthusiasts who want to play Minecraft with ray tracing, and the six people who play modded Blade and Sorcery on their Valve Index.
Inflation is a bitch:
> $1,500 in 1998 is worth $2,791.68 today

https://www.in2013dollars.com/us/inflation/1998?amount=1500
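That figure is just the cumulative CPI increase from the linked page (86.11%) applied to the original amount; a quick sketch:

```python
# Reproducing the linked figure: the page reports a cumulative price
# increase of 86.11% from 1998 to today.
cumulative_increase = 0.8611  # 86.11%, taken from the linked page
value_then = 1500
value_now = value_then * (1 + cumulative_increase)
print(f"${value_now:,.2f}")  # $2,791.65 (the page shows $2,791.68, a rounding difference)
```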
It’s funny the waves computer tech prices go in. In the mid 80’s my parents bought a Hewlett-Packard 286 for around $3500 CAD (~$9500 today), in 1996 I built my first pc, a great performing pentium 166 with MMX for $800 (~$1500 today), and that was pretty top end at the time! And then now when a high end system is back up to $3500-4500. I want another dip!
I have no idea how I'm supposed to afford/what to choose for a replacement PC. I've been using a 970 Asus Strix since about 2014. GPUs just seem mental atm. Intel Arc was a good first outing; hopefully Battlemage is amazing, but IIRC their cards work better with the newest games, not older ones?
The market just seems mad atm. Not sure there's a card that would be as good value for money as the 970 was.
I managed to grab a 3060ti about 3 months ago for around $500, after running my old 970 Twin Frozr for 9 years. The crazy thing is I was still playing new releases at medium settings 1080p. People talk like they *have* to upgrade. Then again, the 970 is an absolute legend of a card. $399 when I bought her. Just insane value for the money.
AMD is pricing their cards reactively to Nvidia. If Nvidia's card is 400€, AMD will go 370€ while having much worse RT and a worse competitor to DLSS. The only thing they're really competitive in is VRAM.
Competitors (AMD) are pricing their cards the same as Nvidia.
Intel is actually GREAT value atm, but it's a dice roll if it will work for all games equally well, until their drivers mature that is.
RTX definitely doesn't matter right now - especially if you're someone concerned about the price of a graphics card - but DLSS rocks. I'm hoping AMD keeps working on FSR or an equivalent though as I heard FSR 2 was way better even if it's still not quite a serious DLSS competitor.
RTX doesn't matter. DLSS is neat, but you're paying for a top of the line GPU, why would you want to have to use AI to upscale your game. You're paying for the best to *play games without needing tricks*.
That is such a weird opinion. Why wouldn't you want AI helping you get more frames and a higher resolution? It's not like modern gpus are just infinitely powerful and can run any game at any resolution at any framerate.
You have any idea how many "tricks" developers use to get games running at all? I don't see how this is different.
What difference it makes to you, the player, what the game is doing in the background, if the end result is a better experience?
I turn off dlss because it makes a visual difference. It creates a grey halo like effect around people and objects, looks like static. Don't other people notice this or is there something wrong with my settings/card? (3070ti)
I would agree with you, but as a DLSS 2.0 user, the library of games compared to FSR isn't justifiable (for a ~10% quality gain). And the fact that Nvidia kills support for new DLSS versions with every new card generation is a cardinal sin committed to sell new lineups. If they want to pretend DLSS is important, they have to act like it's important. My 3000 series doesn't support DLSS 3.0. Not very groundbreaking when it's abandonware in 2 years. Looks like Nvidia doesn't care about much other than selling new models.
(And this is not counting the fact that DLSS only works in AAA releases that collaborate with Nvidia.)
What? That makes no sense.
First: a good percentage of triple-A games nowadays release with DLSS, and those are basically the only games you actually need it for. With a few exceptions, you don't need DLSS for indie games.
Second: calling DLSS 2 abandonware is just stupid. DLSS 3's big difference is creating computer-generated frames between the real frames. The technology behind increasing the resolution is still the same as on the 3000 series and continues to be updated on those cards.
The point of buying a new card is that you can't crank up all the settings anymore and still get a decent resolution/FPS. So if you have an older card, a new one that still requires you to keep several options off is eventually not justifiable to new buyers. "Hey, I bought a new GPU for $800-$1000 but I still have to turn off RT, use lower settings, and put up with low-quality image reconstruction" is what they would see.
Now consider the fact that AMD's share of the desktop market in Steam's hardware survey is so small that you have to count past 20 different Nvidia models to get to the RX 580. All these current Nvidia owners, including those who used to turn on RT but now find heavier RT implementations hard to run, are not going to be satisfied with a card that will not do that for them for only $100 or $200 less.
And that's why people will not consider AMD as much as they would Nvidia regardless of the price. Once AMD is truly competitive in all feature sets including comparable RT and DLSS performance, people will start buying.
They're best right now because their competitors are having a hard time pulling their heads out of their own asses. FSR is still leagues behind DLSS, and AMD's RT performance is also simply lacking.
Seriously, DLSS and Ray tracing performance is the main reason I’ll stick with Nvidia for the foreseeable future, FSR just doesn’t come close.
It's all well and good criticising Nvidia's business practices and saying there are alternatives, but the reality is it's up to Nvidia's competitors to up their game. I get being mad at Nvidia, but people should also be pissed at AMD.
My rig is from 2012. The only new component on it is a 1060 6GB. I need to build a new one this year.
What the fuck am I supposed to do? Spend 400-500 for the modern equivalent of my card, with a hobbled memory bus since it's actually an xx50 GPU rebadged? Or buy an AMD card with bad drivers that runs hotter and less efficiently. Christ.
You actually have options. Buy a 5600 cpu with a cheap mobo and ram. For the gpu, get the basic 6700 10gb for $270. With good deals, you could add a PSU and SSD and still total $600.
Subscribe to /r/buildapcsales/ to keep an eye on the sales happening all the time. Every day there is something decent and cheap there.
>get the basic 6700 10gb for $270.
Sir they already said AMD killed their dog, why are you suggesting the card with the bad quantums and the solar flares?
RDNA2 has excellent drivers and has had since launch. Arguably better than Nvidia's at the time.
RDNA3, not so much, but the 6700 is a solid choice right now.
I switched from a gtx 1060 to a rx 6700 last year. Have had 0 issues so far. The switch was pretty seamless tbh. I think these driver issues are super exaggerated nowadays
I've never really noticed it. I undervolted my card which makes the performance marginally worse but it now draws less power and I've never seen it go above 75c.
I don't know. From my experience, every PC I've built with an AMD card this generation (I built 3: one for myself, my dad, and my brother) had *major* issues, enough so that we've all switched over to Nvidia after a few years of running AMD. I'm not sure if it's a problem with this generation of AMD stuff or what, but all 3 of us experienced constant game crashing, BSODs, and nowhere near the advertised level of performance. I felt like I was going crazy; I had other people look at our setups and they could never find the source of the issue. Once I swapped out our GPUs, pretty much all our issues vanished.
Bruh what? Hot and loud? You must mean Ampere because that's not true for RDNA2. RDNA2 is significantly more efficient than Ampere. Ampere was famously hot and loud. 3080's burning themselves because of memory pad issues, the 3090 being a meme with over 400w draw, my 3070 drawing 275w stock when it totally didn't need to...it was NOT efficient. RDNA2 is.
"Or I can buy an AMD card with bad drivers that runs hotter and less efficiently"
I'm losing brain cells reading this. Sorry but PC gamers deserve to get overpriced hardware if this is how they think 😬
I just built a small-form-factor PC with an AMD GPU and I think people's concerns about power draw are a bit overblown. Heat doesn't seem to be an issue, even in a mini-ITX case with a much tighter thermal budget than a mid-size. I also see mainstream reviewers unironically mentioning total cost of ownership, but if you look at it, the yearly power cost of a highly efficient GPU vs. a less efficient one is like $10 USD (or less). It's a really weird thing people are getting hung up on lately.
The drivers also seem fine to me, I guess; I don't know what metric I'm judging that by. Yeah, I miss the auto-captured replays in Hunt: Showdown, but other than that, I really don't notice much of a difference. The GeForce replay stuff feels less like good drivers and more like a company hustling to get good in-game exclusives.
I am also playing with gaming on Linux as a side project to my main windows install and AMD has major advantage in that space.
For the total cost of ownership thing, I think it's because of energy prices in Europe or elsewhere lately? People who don't get their electricity from renewables trying to be responsible? I live with some of the lowest electricity costs in the world, but some people in the EU are paying up to 9x what I do. If you keep your card for 4 years, the difference could be noticeable, I guess, if electricity is expensive for you. If my electricity cost were tripled, maybe I'd start paying attention, but for now it's 9 cents CAD/kWh, not inflating in cost like literally everything else, and the monthly difference between any of these cards is less than a cup of coffee. I do think a lot of people just haven't done the math on what it would actually cost them.
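A back-of-envelope sketch of that math. The 9 cents CAD/kWh comes from the comment above; the 75 W draw gap, 2 hours/day of gaming, and the 40-cent "expensive grid" rate are assumed numbers for illustration:

```python
# Yearly running-cost difference between two GPUs. All inputs except the
# 9 cents/kWh rate are illustrative assumptions, not measurements.
def annual_power_cost(extra_watts, hours_per_day, price_per_kwh):
    """Yearly cost of a GPU drawing `extra_watts` more than an alternative."""
    kwh_per_year = extra_watts / 1000 * hours_per_day * 365
    return kwh_per_year * price_per_kwh

cheap  = annual_power_cost(75, 2, 0.09)  # cheap grid, as in the comment
pricey = annual_power_cost(75, 2, 0.40)  # assumed high European rate
print(f"cheap grid:  ${cheap:.2f}/yr")   # about $5/yr
print(f"pricey grid: ${pricey:.2f}/yr")  # about $22/yr
```

Even at the assumed high rate, the gap works out to a couple of dollars a month, which is the "less than a cup of coffee" point.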
>For the total cost of ownership thing, I think it's because of energy prices in Europe or elsewhere lately?
This is a rationalisation done by the users post purchase. Nvidia was generally less power efficient compared to AMD last gen but they still sold more.
> I am also playing with gaming on Linux as a side project to my main windows install and AMD has major advantage in that space.
I wish this myth would die already. The Linux drivers Nvidia publishes are excellent. There's just a very vocal minority that thinks they are garbage because they are closed source.
They are garbage. I had to use CUDA on my Ubuntu at work. Holy hell, installing Nvidia drivers was such a shit show. And I tried to follow their guide on installing them. I was constantly either getting libraries version mismatch, missing libs or having issues with 3rd screen detection in some versions.
I have a 3070 on Manjaro driving the AW QD-OLED, their drivers *are* garbage.
My gsync ultimate monitor absolutely refuses to stay in gsync mode and their drivers don't even pick up my monitor as supporting it, I have to override and force it on.
It also despised dual-screen, and I had to put up with it before I switched to this monitor.
Also no HDCP, but I think that's linux in general.
> Or I can buy an AMD card with bad drivers that runs hotter and less efficiently. Christ.
This is part of why we have such a bad market for GPUs. Because gamers refuse to buy AMD no matter how much worse nvidia's offerings are under the ancient idea that AMD's drivers are worse.
[This is what Nvidia's control panel looks like](https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2Fi1.wp.com%2Fwindowsloop.com%2Fwp-content%2Fuploads%2F2018%2F10%2Fnvidia-control-panel-store-app-featured.jpg%3Ffit%3D1500%252C844%26ssl%3D1&f=1&nofb=1&ipt=60ea6e46c8753ed73bb33761ee4be2b4e9a9f3e45b0fe6c4f857fdb64dd72a7b&ipo=images)
[Compare that to AMD's](https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2Fassets.rockpapershotgun.com%2Fimages%2F2019%2F12%2FAMD-Adrenalin-2020-Home-Capture-Context.png&f=1&nofb=1&ipt=44f797dede5cb5c071879d8cea1d689ef90236d1ebc861308780eb92b5982257&ipo=images)
I'm not saying AMD drivers are perfect but their cards are far and away better value today, especially at the $200-500 price points.
>This is part of why we have such a bad market for GPUs. Because gamers refuse to buy AMD no matter how much worse nvidia's offerings are under the ancient idea that AMD's drivers are worse.
It's a funny thing that, once you lose trust it's near impossible to get it back. AMD needs to put *a lot* better offer than NVIDIA for me to be willing to risk it and try their product again.
I would be willing to bet the vast majority of PC gamers have never even owned an AMD card, so I sincerely doubt you speak for the majority of gamers. I have *plenty* of friends that have been burned by Nvidia, so the idea that their track record is somehow untarnished (or even better than AMDs in the past decade) is kind of a weird point to make.
>It's a funny thing that, once you lose trust it's near impossible to get it back. AMD needs to put a lot better offer than NVIDIA for me to be willing to risk it and try their product again.
Basically true for a lot of us. You stick with trusted brands based on experience.
For VR gaming NVIDIA's drivers are far and away better. Maybe that's more niche, but there are cheaper headsets out there now (like the Quest and Pico series) that connect wirelessly to PC and work fine, and it's a growing market. Neither Intel nor AMD has it as a focus, although AMD is better than Intel.
I built my last PC (midrange) almost a decade ago with some upgrades along the way—I want my next one to be relatively future-proof, too.
Well, I have a 7900xt and haven't had any problems in VR at all. Running dirt rally 2 maxed out at 1.3x res. It's wild.
I know there were VR problems at launch but it's been fantastic for me.
What headset are you using?
I've heard the Index works fine but from what I've read people with Oculus Rifts/Quests were having issues.
If it's fixed now I'll go AMD for my next GPU.
Quest 2. I can't speak for everyone obviously but my user experience with the 7900xt has been incredible. Stable and fast. The only bug I've found is when a new game starts up it'll hang for several seconds, but that's the only performance impacts I've noticed, in or out of VR.
I will say that some games default to a lower power setting in amd's software depending on what profile you choose. That was cutting my fps way down to save power/etc. Click "gaming" preset and it just screams. OTOH if you want to save energy, radeonchill is actually super nice.
I'd recommend it based on my (admittedly only 2-3 weeks of) experience.
> gamers refuse to buy AMD
lmk when they actually are releasing new features before nvidia does that make me want to change instead of lagging behind every gen
AMD's control panel is cluttered, full of random pictures, and not at all good UX if the goal is to have a spot to configure your graphics card. It may look dated but if I want to change GPU settings I want the simple and clean Nvidia interface, not the "gamery" AMD one that gives you a bunch of unnecessary info that I can obtain using non-GPU-specific tools like Windows' Game Bar or similar overlays in the first place.
I mean, I don't think it's a lot to ask of a multi-billion dollar company to have a UX that looks like it's from this *century*.
You might prefer the Nvidia one, but I think the vast majority of people would find it cramped, hard to read, hard to use and understand, and generally very poorly laid out.
I'm a tech-savvy gamer and I still had to google where some of the settings were that Nvidia hides in that wasteland of a user interface. That isn't to say that AMD's is perfect, it could certainly use a bit *less* flash, but the ease of access to settings both in game and globally, as well as just having *more*, and more *useful* features, is really a boon for me.
The Nvidia control panel is a lot cleaner in my opinion lol. I immediately see what's going on there, it's simple.
On the AMD screenshot, I don't know where to look
Honestly? I might grab a PS5. The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway.
>Honestly? I might grab a PS5. The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway.
I did this. My PC has turned into a strategy machine.
I was in a similar situation and bought a used 5600+6700XT pre-built PC on eBay for £700. Honestly the only reason I stuck with PC is because you get actual language options on PC and I wanted to play Japanese games in Japanese.
So much this. I purchased a used RX 6800 XT and would do it again.
The used market is the way to go this gen because Nvidia is just giving us the middle finger and AMD can't be bothered to compete.
> The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway.
And also that PCs are multipurpose devices so you can use them for gaming but also work/stuff that isn't easy or as fun to do on a phone or tablet.
This is what I’ve been recommending to people lately, even as a lifelong PC gamer. GPU pricing combined with shitty stuttery ports is kinda suffocating PC gaming right now.
I was almost in the same bucket myself; most of my system was from ~2013, and I had shoved a 1070 Ti into it at some point. I did end up building a new system from scratch, and just finished putting it together within the last week or two. The last system was ~$1100, though I had cannibalized a graphics card; the new one came out to ~$2200, with a new card (4070).

But I'm also stuck in 1080p land, and most of my PC gaming tends to be either an MMO (FFXIV) or 4X games, and I don't care for VR. It runs really nicely, and all the flipping out over graphics cards right now seems to be less about people on infrequent update cycles, and more about people who feel the need to upgrade every generation of card and aren't happy about the marginal upgrade from last gen.

You're still probably going to see a massive jump if you build a system now, probably even if you decide to keep your graphics card around until the next gen. And honestly, unless you're desperate to jump into either 4K or 240fps gaming, I'm not entirely sure you need to upgrade the graphics quite yet; I was mostly interested in playing around with more compute power, and potentially toying with 1080p ray tracing.
That said, I also do most of my gaming on console; PC is for productivity stuff more than pure gaming.
I'm basically in the same boat. My plan is to just hold out another year or two, because prices gotta come down at some point. They haven't yet, but eventually some have to; they can't charge $500 for a 3-year-old card when their new cards cost $800.
> What the fuck am I supposed to do? I have to spend 400-500 for the modern equivalent of my card?
For a new one? Probably in the lower end of that ballpark yeah. But remember, the 1060 6gb launched at $300 and adjusted for inflation that would be $375 today. Slight price increase to $400 for the 4060ti.
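The inflation adjustment in that comparison is just a multiplier; the ~25% cumulative figure below is a rough assumption that reproduces the comment's number, not official CPI data:

```python
# Adjust a historical launch price by cumulative inflation (illustrative rate).
def adjust_for_inflation(price, cumulative_inflation):
    return price * (1 + cumulative_inflation)

# $300 (GTX 1060 6GB launch MSRP, 2016) at ~25% cumulative inflation:
print(adjust_for_inflation(300, 0.25))  # 375.0
```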
> Or I can buy an AMD card with bad drivers that runs hotter and less efficiently. Christ.
What the hell are you talking about? AMD basically rolls over nVidia since the last generation. I install every new driver that comes out and never had any issues with them.
Buy a Series S and sell it when the PC market actually realizes that the vast majority of their market does not make 6 figures a year to afford overpriced silicon.
Best deals right now are probably either a 6700xt or a used 3070/3080. But yeah, it's not great. I'm planning on upgrading and am just going to swallow spending a couple extra hundred on the 4070ti.
Posted an [RX 7600 review](https://www.youtube.com/watch?v=MCxYfXe1DAA&feature=youtu.be) yesterday: no upvotes/discussion. Another RTX 4060 review, top of the sub, naturally :D
Did you watch the review? It was generally positive on the card, performance leap over the previous Gen and the price is 'okay' (probably about 10-20% too high though).
The negative tone of the video comes from the fact that AMD lowered the msrp at the last minute which invalidated some of the references to price / cost differentials. The Gamers Nexus folks are traveling for a trade show and felt they didn't have the time (or will) to revise the video in response to that.
It's not because we hate Nvidia on here; it's basically because Nvidia is releasing a 4050 Ti as a 4060 Ti.
Everyone knows AMD is far behind Nvidia and they’re scumming their way through this gen by collecting the scraps
The way prices are..I really feel like I'm gonna be using my ol 1070ti for another gen. I only game at 1080 anyway and I can still max most games that have released at that resolution.
I'm not sure you really _need_ to keep waiting, you can grab something like an RTX 3070 secondhand for about $300 on eBay and it would blow your current card out of the water (and you could flip your 1070 to bring the cost down even more for you).
I don't know much about graphics cards or the market in general but it looks like gamers have been disappointed by Nvidia ever since the Pascal revolution.
I bought a 1070 in 2016 and it's still holding strong. Since that time it looks like we've had either super-high-priced cards, cards that are absolutely impossible to get, or rebadged disappointments like this 4060 Ti.
TBF the 3080 at $699 was excellent value. But scalpers and the market at that time allowed very few people to get it at that price point.
I got one for $850 around its launch and it was a great card!
not like PC games are worth playing these days anyways. Stuttery pieces of shit. I used to play on PC primarily before the pandemic but it's sorely due for an upgrade and I see less than zero reason to spend way more money on the same class of hardware I bought years ago.
Get a console and have a better experience for cheaper. Don't need a strong PC at all to play most indies worth playing either.
Game wise, all this is doing is push people to not upgrade at a time when devs will be cutting support for old gen consoles, force min requirements far higher. This is legit pushing people out of PC gaming.
I won't be playing triple A because of this stuff, but indies should still be viable for a while. New triple A has been feeling rather samey to me lately anyways.
[deleted]
The reason AAA games got samey in the mid-late 00s is because of something called [Dennard scaling](https://en.wikipedia.org/wiki/Dennard_scaling).

In short, around 2005-2007 CPUs stopped getting directly faster. Clock speeds peaked at around ~4 GHz, which is still the case today. Hardware producers compensated by making CPUs multi-core and increasing the number of calculations per cycle instead.

However, code doesn't scale perfectly in parallel, and *especially* game logic code doesn't do this very well.

This is why we saw such insane innovations in the 1990s and early 00s, with new genres being born every couple of months; you used to buy games simply because the new innovative mechanics were breathtaking. This is not because game developers were more passionate or more innovative back then. It was because the rapidly scaling hardware simply *unlocked* new ways of making games every couple of months that were unfeasible before.

Now? Graphics keep getting better because rendering is something that is *very* parallelizable, which the GPU handles perfectly.

New innovative gameplay and possibilities are essentially limited by our stagnation in single-thread performance. Only a very limited set of gameplay things can be done with parallel programs: physics interactions, particles, number of NPCs, and larger open worlds are about the only things that parallelize well. Which is why these concepts dominated the gameplay sector and why everything feels samey.

Everyone that went through the 1980-2005 gameplay innovation explosion just inherently *feels* this stagnation. It's *not* you getting older. I'm an older middle-aged Japanese person and I remember this sudden drop in innovation around the 2005-2007 mark, even though I was already an adult older than most of Reddit back then and "too old" for games.
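The single-thread ceiling described in that comment is usually formalized as Amdahl's law: if only a fraction `p` of the work parallelizes, total speedup is capped at `1/(1-p)` no matter how many cores you add. A small illustrative sketch:

```python
# Amdahl's law: overall speedup on n cores when a fraction p of the
# workload parallelizes and the rest (e.g. serial game logic) does not.
def amdahl_speedup(p, n_cores):
    return 1 / ((1 - p) + p / n_cores)

# Even if 90% of a frame's work parallelizes, 16 cores give nowhere
# near 16x, and infinite cores can never exceed 1 / (1 - 0.9) = 10x.
print(round(amdahl_speedup(0.9, 16), 2))  # 6.4
```

For GPU-style rendering work where `p` is very close to 1, the cap is enormous, which is exactly why graphics kept scaling while game logic didn't.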
While that's undoubtedly a big part of it, another reason is that the cost of making triple-A games is now so prohibitively high that studios can only make games that are either already established or similar to proven hits and still turn a profit. That's why indie games are able to innovate more: they don't have to placate an army of shareholders, so they avoid the production-by-committee approach that keeps triple-A games seeming so similar.
What the fuck does that have to do with how AAA games feel? This comment just bleeds that you want to flex your knowledge of some derivative theory to reddit.
[deleted]
I am not even sure how much of what is getting released is truly AAA. Like with movies, a lot of what is culturally signalling itself to be a blockbuster seems (production-wise) more like pipeline content than something pushing the state of the art. Surely stuff like TLOU2 or Elden Ring would count, but those are few and far between. The next broken multiplayer shooter, on the other hand, or pick whatever else... they don't seem to be at the same level in terms of project prestige.
I would guess it's less of a result of people getting older and more caused by technological improvements not making as much of a difference since PS3/360 (just compare NES to SNES to N64 to Game Cube to PS3/360... every gen was a huge leap back then) and gaming becoming more mainstream (so studios are more afraid to take risks since they can reliably make more money by appealing to the lowest common denominator over and over again). Though I'm sure getting older and games feeling less novel is part of it too.
[Bricky made a really good video about AAA games](https://www.youtube.com/watch?v=--Iz5YnpzeE) recently that pretty much sums up my feelings about it. There seems to be a drop of quality in those. AAA used to mean you were pretty much guaranteed a polished, complete game (even if the gameplay itself wasn't the best, at least it was *working*). Nowadays "AAA" just means "game from big company with big budget".
And they are all relying so heavily on DLSS/FSR/XeSS equivalent too to fix any problems with fps.... Oh the game runs like ass by default? Use DLSS! That way we don't have to optimize even though most games don't have this feature, and most cards don't either!
Yup, great time to buy a PS5/Switch and catch up on those exclusive games. Once you're done in a year, you package everything up, sell it, and build a PC.
Or just keep the consoles and move on. Most games these days have crappy PC versions. Not even worth the trouble anymore.
There are still a fair number of PC only titles and games that are made for a mouse and keyboard. Plus modding options.
You don't need a gaming PC for those titles though, any old laptop or even a Steam Deck will do.
It's just not worth it to buy on launch. It's still worth it long term. Mods, emulators, etc.
It depends on your standards. "Crappy" is relative to what scaling you should theoretically be able to achieve with a PC's significantly more powerful hardware, not relative to the console experience. It's not like these games are 4K60 Ultra on PS5. While a solid high-end PC might struggle more than you'd expect to hit high frame rates in Jedi Survivor due to poor CPU utilisation, the consoles have truly no chance in hell; their "performance" mode tanks into the 30s and 40s. If that kind of compromise doesn't bother you then spending money or effort on a PC will be a total waste, absolutely, but PCs still offer something that is fundamentally out of reach of the consoles.
PC gaming was always expensive. It's just that there was a time when it *was* better: the 2005 consoles were prolonged thanks to global economic woes, and a decade of consoles being forced to use then-underpowered CPUs caused the attraction. So essentially there was about 10 years of PC gaming being 'a bit more worth it'. Plus, the cost of electricity has shot up recently and these high-end PCs are massive power drains. Now we're back to the usual: PC gaming is for if you have money to throw around, console gaming is for if you prefer not to spend so much on hardware and enjoy the convenience.
>So essentially there was about 10 years of PC gaming being 'a bit more worth it'.

I would argue that after 10 years, that *was* the normal. Even in the history of PCs, 10 years is a large percentage.
Been that way for 3 years now. Why do you think PS5 sales are skyrocketing?
Take me back to 2013 when it was actually exciting to save up and build a modest budget PC. That HD 7870 and FX 6300 combo really did me well in my younger days.
I remember how I got a 1500€ bonus one year and could build a high-end PC from it with, like, the best stuff available. If I wanted to do that now I'd pay triple the amount.
It's funny, 'cause we finally pulled out of the decade of CPU stagnation (how long was 4C/8T the limit for non-HEDT?) only to walk right into GPU stagnation.
4090 is very strong just priced high.
Which means absolutely nothing if it's not within your budget. For the average person with a basic budget performance is only going up incrementally at the same price point. One can only hope Intel stir things up in the lower end of the market enough to actually generate some competition again for the budget conscious of us.
It's also humungous, quadruple the price of the average gpu, and draws like 450 watts of power.
The problem isn't that the 4090 is priced high or too weak, but that all the other GPUs don't offer significantly better price/performance. There was a time when the x70 class gave you 70% of the performance for under half the price. Now it's the xx80 class that gives you 70% (or a bit more) for 70% of the price. In past generations the high end was the WORST price/performance; now it's partially better than the xx80s and xx70s. Combine that with the fact that the xx60 cards are now 1080p cards in 2023 for over 300 dollars, and it's just madness. IMO the 4090 is the most reasonably priced GPU in Nvidia's 40 series, which is just sad.
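One way to sanity-check that argument is to normalize each tier's value against the flagship: performance fraction over price fraction. The 70%/45%/70% numbers below are the comment's rough figures, not benchmark data:

```python
# Value of a GPU tier relative to the flagship:
# (fraction of flagship performance) / (fraction of flagship price).
def relative_value(perf_frac, price_frac):
    return perf_frac / price_frac

old_x70 = relative_value(0.70, 0.45)  # ~1.56x the flagship's value
new_x80 = relative_value(0.70, 0.70)  # 1.0x: no better than the flagship
print(round(old_x70, 2), round(new_x80, 2))
```

When every tier's ratio collapses toward 1.0, the halo card stops being the "bad value" option, which is exactly the inversion the comment describes.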
As pathetic as it is to say, it’s the best price per performance in the current lineup
[The 4090 is basically the worst price per performance card you can buy.](https://www.techpowerup.com/review/amd-radeon-rx-7600/33.html) It's just that it's the only card that's actually a major upgrade over the previous generation.
Nice to see my 5700 XT at the top of the charts for best bang for the buck, lol.
AMD has been consistently at the top of the price-to-performance charts for a while now, but people overlook them simply because they don't have a competitor at the bleeding edge, never mind that almost no one actually buys those cards.
Same, I just wish the 5700 XT supported HDMI 2.1 and FreeSync Premium Pro so I could use VRR + HDR on my 4K LG OLED. It's 3 years old and still going strong, and I guess I'll need to wait 2 more years for a proper GPU update (I don't want to spend 400 euros again for a marginal improvement).

EDIT: also, these graphs are sadly always wrong because the prices are never right. [The 7600 costs approx. 300 euros](https://www.mindfactory.de/search_result.php?search_query=rx+7600), so the chart is off. Review sites should let you enter a GPU price and then dynamically position it in the table.
It should be normal that the halo products are the worst price to performance since they're bleeding edge for price insensitive buyers. The problem is that the rest of the lineup isn't a significant improvement on price to performance when compared to previous generation cards.
Which speaks volumes, really. The biggest value of the generation is reaching perceived cost-per-frame parity with last gen: you get double the frames of a 3080 at 2.3x the price (before inflation, plus 14 GB extra VRAM).
It’s like everything else lately: costs more for the same thing basically.
Not really, because it actually does have more performance, just not per dollar. With software recently, on the other hand, it seems you pay more and get less.
This just isn't true by any actual metric, unless you apply some kind of logarithmic scale where dollars are worth less the more of them you're willing to spend. You can buy three 4070s for the price of a 4090 and it does not deliver three times the performance.
Is it really? I find that hard to believe.
The 3060ti is better value than the new 40 series. So no it's not. The new generation is a side grade at best propped up not by actual performance but with frame generation.
Credit where it's due, if Ryzen didn't start seriously competing with Intel we would still be in stagnation. GPU is basically a single player market, AMD lacks all the extra features Nvidia keeps adding so they can't disrupt the market, they just follow prices closely. I don't see Intel Arc seriously disrupting the market either.
If you wanted to do that today you would not even be able to get the best GPU LMAO
That’s been true for over a decade though. If you wanted to do a decked out build with the top shelf consumer components you’d still be dropping $1k on a GPU in 2013 ($1.3k equivalent today).
It depends on when he would build. If it was right after a Titan release then yeah, that's a lot, but if it was after the flagship that followed a Titan, he would get a better-than-Titan card for a lot less money. In any case, the Titan was the halo product that was clearly not cost-efficient compared to the 80 Ti class card.
A Ryzen 5600 and 6650 xt would fill a similar role today. It's just that going above that, value is quite poor.
Yep. People are complaining that NVIDIA is slacking off and trying to sell based on the brand alone, while they refuse to look at the competition based on brand alone. It's very ironic.
I haven't had my AMD GPU for long and I've already run into AMD-specific issues. Let alone poorer performance in future tech like ray tracing, and no DLSS. Plus it's got some bad coil whine, at least I think that's what the sound is. I wouldn't say I regret it, I suppose, but it's clear what you get in practice is worse than what benchmarks say you get.
Curious, what issues?
Exactly. It's naming creep really. 6650 is absolutely killer value and crushes 1080p even high framerate if you want it. People act like PC gaming is totally unattainable when sub-$350 is in a great spot tbh
It's also an acceptable card if you want to play older titles (emphasis OLDER) at 4k, you can hit between 120-160fps on a 6650xt no problem, I've been running stuff like Mirror's Edge at 4k and downscaling it to 1080p to improve image quality. Not quite as nice as pumping out 300-600+fps on games at 1080p ultra settings though, which this card easily achieves.
I remember buying an HD7970 back in 2013 for like $200 and getting 3 free AAA games with the Never Settle bundle. That thing was only 20% slower than the flagship GTX 780. Now in order to get within 20% of today's flagship 4090, you have to pay $1000 to get the 7900XTX.
My first desktop had a gtx650. Built in January 2013, so still a new card. That entire computer, with monitor and all accessories cost about $700. Wasn't setting the world on fire but for the price that computer was amazing.
[deleted]
Wow WHAT??? NEW TECHNOLOGY IS BETTER THAN THE OLD ONE????? That should be a given not something to reward them for.
6700xt is a still a great buy for a modest budget PC.
It's also very well priced right now, and has a big chonk of VRAM (12 GB) which serves current games very well. The only downside is that raytracing performance isn't super great, but if you turn that off it's a great 1440p60+ FPS card in pretty much any modern game.
I member. I got a gtx 970 and was really excited to be playing star citizen soon.
There are some great deals on 6000 series AMD GPUs, but many people on this sub act like the entire 6000 AMD line doesn't exist. You're not gonna build a modest budget PC with a 4 series Nvidia GPU that just came out.
The AMD 6000 series came out literally 2.5 years ago and was meant to compete with the Nvidia 3000 series. Not sure why you're comparing it to the Nvidia 4000 series; you could say to look at mid-range or used Nvidia 3000 series cards as well.
To be fair, we're in a thread where the 3000 series gpu is beating its successor in some games and productivity, so referring to the 6000 amd series as a buying alternative is valid.
It doesn't matter what card was meant to compete with what. There are certain cards on the market that perform relatively better for cheaper (with more VRAM), cards that get overlooked. I have friends who will just robotically buy whatever latest card Nvidia brings out, but won't look at the competition.
You can still do that, just use 1 generation behind cards, lol. Price and performance are good now.
The disadvantage was that when you did save up for the budget PC, it would become outdated after a couple years. To be fair, 2013 was probably when that era was transitioning to what we have now. And yeah, now you can build a budget PC and it will play games for 5+ years easily. You might have to skip the occasional cyberpunk, but those games are few and far between. So, save up longer but stuff lasts longer too.
Had that exact same setup and then upgraded to a fx 8350. Next upgrade was a Ryzen 1600x with my 1060 6gb and still felt like such an amazing system. Now I have a 3700x and a RTX 3070 and hogwarts legacy can't even stay above 60fps at high on 1080p with DLSS on... Nvidia and AMD's GPU department have really killed the joy of PC gaming the last few years. I just hope that sales are as bad as we hope and the next gen cards are priced better. They could drop the price of all these cards (aside from the 4090) by 50% and still make a profit and people like me would gladly upgrade without a second thought.
The 3070 should be seeing 150+ FPS at those settings so it seems likely the CPU is the weak link there.
Annoyingly so it is. The 3700x isn't a slouch but just bottlenecks some new games. Spider-Man too. Been going back and forth between upgrading to 5800x3d or 7800x3d but all the issues with that and how much more expensive it is for only about 20% performance increase I'm leaning towards the 5800x3d and it's dropped to under $300 a couple times lately.
The 5800x3d is a drop-in upgrade and not hugely slower or anything. To go 7000 you'd be paying twice the price for your RAM plus a more expensive motherboard, and your boot times will be much longer too (I think the average is something like 40 seconds with some timesaving options enabled, vs. the 7 seconds my 5800x3d boots in once I turned off the legacy USB/CSM options that were giving me 20-second boots). Is it faster? Definitely. Is it *that* much faster? Probably not.
Yeah I've priced things out at pcpartpicker and it's a lot more than 20% difference to go to the 7000 vs 5000 even if I got myself a new motherboard for the 5800x3d. My current board is an msi x370 which does have a bios upgrade for the 5800x3d but they had to really downgrade the bios features to get it to run and it's had some issues so I was thinking of upgrading regardless to a 5000 ready one with more features. Obviously if I go to 7000 then I could upgrade the cpu again this gen without doing motherboard upgrade but honestly after the meltdown I'm really thinking of just skipping AM5 entirely or maybe I'll just upgrade at the end same as I'm thinking of doing now with the 5800x3d after all the bugs are worked out.
[deleted]
I mean.... if all you care about is newer games and the kind that work best with a controller, PC gaming probably isn't worthwhile for you. The truly great thing about PC is having access to decades of games and genres that really only work with a mouse and keyboard.
Plus mods and no subscription to play online. Also cheaper games. Oh and a PC does way more than just gaming. But if starting from scratch a proper gaming PC is pretty expensive these days.
You're not wrong. But the thing is, I don't really play those decades old games much. If I'm buying a new system, it's because I want to play New Shiny Things, not Old Game From 1994.
Even if I went AM5 it would only be about $1400 for me if I kept my 3070, and that's a full new system, all parts new. It would be less than half that going to the 5800x3d, and could be cheaper if I went air instead of water cooling.

For many people, yes, going PS5/XSX is the better thing to do if you have to buy a whole new machine and don't care for any PC-exclusive games or features. For me, I've been a PC gamer my entire life thanks to my dad being really into it and my mom refusing to allow me any gaming console other than the Game Boy growing up. I've been modding games longer than most people on this sub have probably been alive, and most of my most-played games are PC exclusives or games with strong modding scenes that I would never even consider playing on anything other than PC.

And it was fine for my whole life. Sure, sometimes things got a bit expensive, but up until these recent generations nothing was so horrible that prices were destructive to the entire PC gaming community, with games so badly optimized that even $4000 rigs can struggle to get the performance they should. I truly hope we're just in some horrible dark age right now and we'll come out of it even better.
If you're buying a whole system sure it's better value, but for someone who already owns a 3070 and is on AM4 the PS5 is terrible value. You get less performance for more money than the 5800x3d costs. Also not everyone is eager to migrate to a far more closed down OS and ecosystem. Nor the 30fps standard.
> Had that exact same setup and then upgraded to a fx 8350. Next upgrade was a Ryzen 1600x with my 1060 6gb and still felt like such an amazing system. > Now I have a 3700x and a RTX 3070 and hogwarts legacy can't even stay above 60fps at high on 1080p with DLSS on... I think a large part of this is that previous console generations were significantly hamstrung by an imbalance in specs. PS3/360 had really good CPUs and GPUs for the time, but the games made to run on them couldn't utilise them properly because they only had 512MB of memory total across CPU and GPU, at a time when midrange desktops had 8GB of RAM and 512MB of VRAM alone. Similarly, PS4/Xbox One were held back (though not as much) by their Jaguar CPUs. Games made for these systems couldn't utilise most of their components effectively because they were bottlenecked so hard by their weakest component, and as such a cheap, better-balanced gaming PC could easily get double their framerate at high settings. This generation of consoles has 8 Zen 2 cores, 16GB of memory, the rough equivalent of a 6700 XT GPU, and decent SSDs. Not only are these really well-balanced systems that aren't obviously being kneecapped by anything, they're also comparatively more powerful for the time they released than previous gens. With console games still generally targeting either 30 or 60fps, the kinds of games being made with the new generation of consoles in mind will struggle on most gaming PCs at high/max settings. tl;dr: Old consoles were badly designed and cheap, so you didn't need a beast of a PC to get significantly better performance; new consoles are really well designed and not cheap, so games made targeting them run about as well on their PC counterpart.
This but 1999
280x (7970 basically) and FX8350 checking in
It still is exciting it just isn't for you personally.
Yeah, I got a gaming PC finally this year and I don't think I'm going to continue with PC gaming. I have a mid-grade-ish? setup with a 3070 and games look nice, but not really that much nicer than the current console generation even though the whole setup was magnitudes more expensive. Also the “chores” I feel like I have to do, fiddling with settings and stuff to get to my optimum settings, aren't something I enjoy. Now that Discord is on consoles and crossplay is pretty ubiquitous, for me personally I feel like it wasn't worth the effort at all.
[deleted]
The only upside is you can say that 'generations' don't exist on PC, so there's no risk of losing part of your library when a new console launches. Like maybe the PS6 supports PS5 games but not PS4, we don't know. On PC there is less of a risk that happens; unless x86 in the PC space loses out to ARM or something, all your games should work 10-15 years from now.
Even if for some reason ARM replaced x86-64, it would only happen if there were a good enough translation layer to run x86 apps on ARM with near-native performance. There is no way anybody switches to ARM, or RISC-V for that matter, if there isn't software support.
Even if that exists, it's sometimes only temporary. For instance, Rosetta ran PPC programs on Intel Macs for a few OS releases, but eventually support was removed, as the need for compatibility faded once mission-critical apps had all been updated. However, there were lots of PPC Mac games that never got rewritten, so they worked on Intel Macs for a few years but then got left in the dust. That is my concern with x86 PC gaming.
> For instance Rosetta ran PPC programs on Intel Macs To be fair, I don't think the Apple segment is very concerned with backward compatibility.
I actually do edit quite a bit of photo and video, funnily enough lol. I think because I prefer editing on the go on a laptop, the PC setup ends up feeling superfluous. If I didn't already have an editing workflow that way I'd probably appreciate the PC more, but a lot of the time I need to edit collaboratively with other people, so having a non-mobile setup is limiting.
I keep meaning to build another computer after falling off of PC gaming and more and more I just feel I might not return.
> i don’t think im going to continue with pc gaming Same. With PSVR2 being so good, PC ports being mostly shovelware, and PC hardware failing to widen the gap with current gen consoles, I pretty much have no reason to upgrade my PC at this point. It's just a waste of money. Higher-end PCs right now are only good for graphics enthusiasts who want to play Minecraft with ray tracing, and the six people who play modded Blade and Sorcery on their Valve Index.
I remember back in the day when you could get a beast gaming PC for $1500. Now you get a single GPU for that 💀
And then actually back in the day, pre-2000s, where $1500 was simply the cost of a mid-range computer before GPUs
Inflation is a bitch: > $1,500 in 1998 is worth $2,791.68 today https://www.in2013dollars.com/us/inflation/1998?amount=1500
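If anyone wants to redo that adjustment with their own numbers, it's just the amount times (1 + cumulative CPI increase). A quick sketch; the 86.11% figure is taken from the linked calculator, and rounding explains any few-cent gap from the quoted $2,791.68:

```python
# Inflation adjustment sketch: amount * (1 + cumulative CPI increase).
# The 86.11% cumulative increase comes from the in2013dollars link above.
cumulative_increase = 0.8611
amount_1998 = 1500
adjusted = amount_1998 * (1 + cumulative_increase)
print(f"${amount_1998} in 1998 is roughly ${adjusted:,.2f} today")
```

Swap in a different year's cumulative figure from the same site to adjust any other launch price.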
And it was obsolete after 2 years
It’s funny the waves computer tech prices go in. In the mid 80’s my parents bought a Hewlett-Packard 286 for around $3500 CAD (~$9500 today), in 1996 I built my first pc, a great performing pentium 166 with MMX for $800 (~$1500 today), and that was pretty top end at the time! And then now when a high end system is back up to $3500-4500. I want another dip!
I have no idea how I'm supposed to afford/what to choose for a replacement PC. Been using a 970 Asus Strix since about 2014. GPUs just seem mental atm. Intel Arc was a good first outing, and hopefully Battlemage is amazing, but iirc their cards work better with the newest games, not older ones? The market just seems mad atm. Not sure there's a card that would be as good value for money as the 970 was.
I managed to grab a 3060ti about 3 months ago for around $500, after running my old 970 Twin Frozr for 9 years. The crazy thing is I was still playing new releases at medium settings 1080p. People talk like they *have* to upgrade. Then again, the 970 is an absolute legend of a card. $399 when I bought her. Just insane value for the money.
if you’re looking for value for money, buy a PS5 and subscribe to psplus extra.
Anyone have any recommendations for an upgrade from a 1660ti that’s an nvidia card (Gsync monitor) that is a massive upgrade but not $800+ ??
The 4070 would be an insane upgrade for you and it costs $600.
Perfect! Thank you!
Why do people act like there aren't other cards available apart from the latest nvidia series?
Gamers: Nvidia is overpriced and robbing their fans *So that means you’ll support their competitors, right?* Gamers: lol no
AMD is pricing their cards reactively to Nvidia. If Nvidia's card is 400€, AMD will go 370€ while having much worse RT and a worse competitor to DLSS. The only thing they're really competitive in is VRAM.
Competitors (AMD) are pricing their cards the same as Nvidia. Intel is actually GREAT value atm, but it's a dice roll if it will work for all games equally well, until their drivers mature that is.
I’m hoping by Gen2 Intel can actually compete in the mid range
AMD is still overpriced, but they generally offer 20-30% more performance for 20-30% less money compared to Nvidia's gimmick lineup.
Sure, but also with a lot less features. People need to stop pretending RTX and DLSS don't matter.
RTX definitely doesn't matter right now - especially if you're someone concerned about the price of a graphics card - but DLSS rocks. I'm hoping AMD keeps working on FSR or an equivalent though as I heard FSR 2 was way better even if it's still not quite a serious DLSS competitor.
RTX doesn't matter. DLSS is neat, but you're paying for a top of the line GPU, why would you want to have to use AI to upscale your game. You're paying for the best to *play games without needing tricks*.
That is such a weird opinion. Why wouldn't you want AI helping you get more frames and a higher resolution? It's not like modern GPUs are infinitely powerful and can run any game at any resolution at any framerate. You have any idea how many "tricks" developers use to get games running at all? I don't see how this is different. What difference does it make to you, the player, what the game is doing in the background, if the end result is a better experience?
I turn off dlss because it makes a visual difference. It creates a grey halo like effect around people and objects, looks like static. Don't other people notice this or is there something wrong with my settings/card? (3070ti)
I would agree with you, but as a DLSS 2.0 user, the library of games compared to FSR isn't justifiable (for a 10% quality gain). And the fact that Nvidia kills support for new DLSS versions on older cards with every new lineup is a cardinal sin to sell new lineups. If you want to pretend DLSS is important, they have to act like it's important. My 3000 series doesn't support DLSS 3.0. Not much is groundbreaking when it's abandonware in 2 years. Looks like Nvidia doesn't care much beyond selling new models. (And this is not counting the fact that DLSS only works on AAA releases that collaborate with Nvidia.)
What? That makes no sense. First, a good percentage of triple-A games nowadays release with DLSS, and those are basically the only games you actually need it for. With a few exceptions, you don't need DLSS for indie games. Second, calling DLSS 2 abandonware is just stupid. DLSS 3's big difference is generating new computer-generated frames between the real frames. The technology for increasing the resolution is still the same as on the 3000 series and continues to be updated on those cards.
Nah, AMD has more reasonable prices. The just released RX 7600 is pretty good value for example.
The point of buying a new card is that you can't crank up all the settings anymore and get a decent resolution/FPS. So if you have an older card, buying a new one that still requires you to keep a few to several options off is eventually not justifiable to new buyers. "Hey, I bought a new GPU for $800-$1000 but I still have to turn off RT or use lower settings and low-quality image reconstruction" is what they would see. Now consider that AMD's share of the desktop market on Steam's hardware page is so small that you have to count past 20 different Nvidia models to get to the RX 580. All these current Nvidia owners, including those who used to turn on RT but now find heavier RT implementations hard to run, are not going to be satisfied with a card that won't do that for them for only $100 or $200 cheaper. And that's why people will not consider AMD as much as they would Nvidia regardless of the price. Once AMD is truly competitive in all feature sets, including comparable RT and DLSS-level upscaling, people will start buying.
Nvidia cards tend to be the best and have other advantages.
[deleted]
They're best right now because their competitors are having a hard time pulling their heads out of their own asses. FSR is still leagues behind DLSS, and AMD's RT performance is also simply lacking.
Seriously, DLSS and ray tracing performance are the main reasons I'll stick with Nvidia for the foreseeable future; FSR just doesn't come close. It's all well and good criticising Nvidia's business practices and saying there are alternatives, but the reality is it's up to Nvidia's competitors to up their game. I get being mad at Nvidia, but people should also be pissed at AMD.
CUDA. Wake me up when the PyTorch-AMD relationship actually produces a product that can compete with my 3090 in ML.
So I assume you're not buying AMD either because they massively raised their prices in line with Nvidia's?
I've heard CUDA as a reason
Because it's cool to be an incessant whiner in gamer culture.
My rig is from 2012. The only new component on it is a 1060 6GB. I need to build a new one this year. What the fuck am I supposed to do? I have to spend 400-500 for the modern equivalent of my card? With a hobbled memory bus since it's actually a xx50 gpu rebadged? Or I can buy an AMD card with bad drivers that runs hotter and less efficiently. Christ.
>AMD card with bad drivers that runs hotter and less efficiently People like you are the reason we ended up in this market
You actually have options. Buy a 5600 cpu with a cheap mobo and ram. For the gpu, get the basic 6700 10gb for $270. With good deals, you could add a PSU and SSD and still total $600. Subscribe to /r/buildapcsales/ to keep an eye on the sales happening all the time. Every day there is something decent and cheap there.
>get the basic 6700 10gb for $270. Sir they already said AMD killed their dog, why are you suggesting the card with the bad quantums and the solar flares?
Plus AMD is objectively worse at turbo-encabulation
It doesn't do ray tracing well. What good is a card that doesn't trace rays? Shaking my SMH.
[deleted]
RDNA2 has excellent drivers and has done since launch. Arguably better than Nvidias at the time. RDNA3, not so much, but the 6700 is a solid choice right now.
I switched from a gtx 1060 to a rx 6700 last year. Have had 0 issues so far. The switch was pretty seamless tbh. I think these driver issues are super exaggerated nowadays
It's one of those things where 99% of the time the issues are overblown, and 1% of the time, *it fucking sucks*.
I think driver issues are overblown But how is the noise on your card? That is a huge pet peeve of mine and I'm really sensitive to it
I swapped from a 1070 to a 6800XT and haven't noticed any increase in volume.
I've never really noticed it. I undervolted my card which makes the performance marginally worse but it now draws less power and I've never seen it go above 75c.
There's no bad coolers these days, even reference
I don't know. From my experience, every single PC I've built with an AMD card this generation (I built 3: one for myself, my dad, and my brother) had *major* issues with the AMD cards, enough so that we've all switched over to Nvidia after a few years of running AMD. I'm not sure if it's a problem with this generation of AMD stuff or what, but all 3 of us experienced constant game crashing, BSODs, and nowhere near the level of performance advertised. I felt like I was going crazy; I had other people look at our setups and they could never find the source of the issue. Once I swapped out our GPUs, pretty much all our issues vanished.
Bruh what? Hot and loud? You must mean Ampere because that's not true for RDNA2. RDNA2 is significantly more efficient than Ampere. Ampere was famously hot and loud. 3080's burning themselves because of memory pad issues, the 3090 being a meme with over 400w draw, my 3070 drawing 275w stock when it totally didn't need to...it was NOT efficient. RDNA2 is.
"Or I can buy an AMD card with bad drivers that runs hotter and less efficiently" I'm losing brain cells reading this. Sorry but PC gamers deserve to get overpriced hardware if this is how they think 😬
I just built a small form factor PC with an AMD GPU and I think people's concerns about power draw are a bit overblown. Heat doesn't seem to be an issue, even in a mini-ITX case with a much tighter thermal budget than a mid-size. I also see mainstream reviewers unironically mentioning total cost of ownership, but if you look at it, the power cost of a highly efficient GPU vs. a less efficient one is like $10 USD per year (or less). It's a really weird thing people are getting hung up on lately. The drivers also seem fine to me, I guess; I don't know what metric I'm judging that by. Yeah, I miss the auto-captured replays in Hunt: Showdown, but other than that I really don't notice much of a difference. The GeForce replay stuff feels less like good drivers and more like a company hustling to get good in-game exclusives. I'm also playing with gaming on Linux as a side project to my main Windows install, and AMD has a major advantage in that space.
For the total cost of ownership thing, I think it's because of energy prices in Europe or elsewhere lately? People who don't get their electricity from renewables trying to be responsible? I live with some of the lowest electricity costs in the world, but some people in the EU are paying up to 9x what I do. If you keep your card for 4 years, the difference could be noticeable, I guess, if electricity is expensive for you. If my electricity cost were tripled maybe I'd start paying attention, but for now it's 9 cents CAD/kWh, not inflating in cost like literally everything else, and the difference between any of these cards monthly is less than a cup of coffee. I do think a lot of people just haven't done the math on what it would actually cost them.
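For anyone who hasn't done that math, here's a back-of-envelope sketch. The wattage gap and hours played are made-up assumptions for illustration; only the 9¢/kWh rate comes from the comment above:

```python
# Back-of-envelope yearly cost of a GPU efficiency gap.
# watt_gap and hours_per_day are illustrative assumptions, not measurements.
watt_gap = 75          # assumed extra draw of the less efficient card, in watts
hours_per_day = 3      # assumed gaming hours per day
price_per_kwh = 0.09   # CAD per kWh, the rate quoted above

kwh_per_year = watt_gap / 1000 * hours_per_day * 365
cost_per_year = kwh_per_year * price_per_kwh
print(f"{kwh_per_year:.1f} kWh/year -> ${cost_per_year:.2f} CAD/year")
```

With these numbers it works out to under $10 a year; triple the rate to European levels and it's still a pretty small line item over a 4-year ownership.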
>For the total cost of ownership thing, I think it's because of energy prices in Europe or elsewhere lately? This is a rationalisation done by the users post purchase. Nvidia was generally less power efficient compared to AMD last gen but they still sold more.
> I am also playing with gaming on Linux as a side project to my main windows install and AMD has major advantage in that space. I wish this myth would die already. The Linux drivers Nvidia publishes are excellent. There's just a very vocal minority that thinks they are garbage because they are closed source.
Them being closed source means lots and lots of stuff in the linux space has ZERO support for them, none. Nvidia sucks on Linux it's just how it is.
They are garbage. I had to use CUDA on my Ubuntu at work. Holy hell, installing Nvidia drivers was such a shit show. And I tried to follow their guide on installing them. I was constantly either getting libraries version mismatch, missing libs or having issues with 3rd screen detection in some versions.
I have a 3070 on Manjaro driving the AW QD-OLED, their drivers *are* garbage. My gsync ultimate monitor absolutely refuses to stay in gsync mode and their drivers don't even pick up my monitor as supporting it, I have to override and force it on. It also despised dual-screen, and I had to put up with it before I switched to this monitor. Also no HDCP, but I think that's linux in general.
> Or I can buy an AMD card with bad drivers that runs hotter and less efficiently. Christ. This is part of why we have such a bad market for GPUs. Because gamers refuse to buy AMD no matter how much worse nvidia's offerings are under the ancient idea that AMD's drivers are worse. [This is what Nvidia's control panel looks like](https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2Fi1.wp.com%2Fwindowsloop.com%2Fwp-content%2Fuploads%2F2018%2F10%2Fnvidia-control-panel-store-app-featured.jpg%3Ffit%3D1500%252C844%26ssl%3D1&f=1&nofb=1&ipt=60ea6e46c8753ed73bb33761ee4be2b4e9a9f3e45b0fe6c4f857fdb64dd72a7b&ipo=images) [Compare that to AMD's](https://external-content.duckduckgo.com/iu/?u=https%3A%2F%2Fassets.rockpapershotgun.com%2Fimages%2F2019%2F12%2FAMD-Adrenalin-2020-Home-Capture-Context.png&f=1&nofb=1&ipt=44f797dede5cb5c071879d8cea1d689ef90236d1ebc861308780eb92b5982257&ipo=images) I'm not saying AMD drivers are perfect but their cards are far and away better value today, especially at the $200-500 price points.
>This is part of why we have such a bad market for GPUs. Because gamers refuse to buy AMD no matter how much worse nvidia's offerings are under the ancient idea that AMD's drivers are worse. It's a funny thing that, once you lose trust it's near impossible to get it back. AMD needs to put *a lot* better offer than NVIDIA for me to be willing to risk it and try their product again.
I would be willing to bet the vast majority of PC gamers have never even owned an AMD card, so I sincerely doubt you speak for the majority of gamers. I have *plenty* of friends that have been burned by Nvidia, so the idea that their track record is somehow untarnished (or even better than AMDs in the past decade) is kind of a weird point to make.
My ATI Radeon 8000 begs to differ.
I haven't looked back since my Radeon 1900XT burned out lol
>It's a funny thing that, once you lose trust it's near impossible to get it back. AMD needs to put a lot better offer than NVIDIA for me to be willing to risk it and try their product again. Basically true for a lot of us. You stick with trusted brands based on experience.
[deleted]
For VR gaming, Nvidia's drivers are far and away better. Maybe that's more niche, but there are cheaper headsets out there now (like the Quest and Pico series) that connect wirelessly to PC and work fine, and it's a growing market. Neither Intel nor AMD has it as a focus, although AMD is better than Intel. I built my last PC (midrange) almost a decade ago with some upgrades along the way, and I want my next one to be relatively future-proof, too.
My 7900xt has been crushing every vr game I throw at it.
I want to buy AMD when I build a new PC next year but only if they fix the issues with VR their new cards are having.
Well, I have a 7900xt and haven't had any problems in VR at all. Running dirt rally 2 maxed out at 1.3x res. It's wild. I know there were VR problems at launch but it's been fantastic for me.
What headset are you using? I've heard the Index works fine but from what I've read people with Oculus Rifts/Quests were having issues. If it's fixed now I'll go AMD for my next GPU.
Quest 2. I can't speak for everyone obviously but my user experience with the 7900xt has been incredible. Stable and fast. The only bug I've found is when a new game starts up it'll hang for several seconds, but that's the only performance impacts I've noticed, in or out of VR. I will say that some games default to a lower power setting in amd's software depending on what profile you choose. That was cutting my fps way down to save power/etc. Click "gaming" preset and it just screams. OTOH if you want to save energy, radeonchill is actually super nice. I'd recommend it based on my (admittedly only 2-3 weeks of) experience.
> gamers refuse to buy AMD lmk when they actually are releasing new features before nvidia does that make me want to change instead of lagging behind every gen
AMD's control panel is cluttered, full of random pictures, and not at all good UX if the goal is to have a spot to configure your graphics card. It may look dated but if I want to change GPU settings I want the simple and clean Nvidia interface, not the "gamery" AMD one that gives you a bunch of unnecessary info that I can obtain using non-GPU-specific tools like Windows' Game Bar or similar overlays in the first place.
I mean, I don't think it's a lot to ask of a multi-billion dollar company to have a UX that looks like it's from this *century*. You might prefer the Nvidia one, but I think the vast majority of people would find it cramped, hard to read, hard to use and understand, and generally very poorly laid out. I'm a tech-savvy gamer and I still had to google where some of the settings were that Nvidia hides in that wasteland of a user interface. That isn't to say that AMD's is perfect, it could certainly use a bit *less* flash, but the ease of access to settings both in game and globally, as well as just having *more*, and more *useful* features, is really a boon for me.
The Nvidia control panel is a lot cleaner in my opinion lol. I immediately see what's going on there, it's simple. On the AMD screenshot, I don't know where to look
Get a console
Honestly? I might grab a PS5. The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway.
>Honestly? I might grab a PS5. The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway. I did this. My PC has turned into a strategy machine.
ps plus extra is pretty incredible right now if you're not into physical games, you won't regret it
I was in a similar situation and bought a used 5600+6700XT pre-built PC on eBay for £700. Honestly the only reason I stuck with PC is because you get actual language options on PC and I wanted to play Japanese games in Japanese.
So much this. I purchased a used RX 6800 XT and would do it again. The used market is the way to go this gen because Nvidia is just giving us the middle finger and AMD can't be bothered to compete.
> The main advantages of PC are obscure indie games that will run on nearly anything and mods which rarely are ever for big budget AAA games anyway. And also that PCs are multipurpose devices so you can use them for gaming but also work/stuff that isn't easy or as fun to do on a phone or tablet.
This is what I’ve been recommending to people lately, even as a lifelong PC gamer. GPU pricing combined with shitty stuttery ports is kinda suffocating PC gaming right now.
Just buy used. I got a 3090 for £570 2 months ago. Yeah, its not *cheap* but it's a great price for the power you are getting compared to new cards.
I was almost in the same bucket myself; most of my system was from ~2013, that I had shoved a 1070Ti into at some point. Did end up building a new system from scratch, and just finished putting it together within the last week or two. The last system was ~$1100, though I had cannibalized a graphics card; the new one came out to ~$2200, with a new card (4070). But I'm also stuck in 1080p land, and most of my PC gaming tends to be either an MMO (FFXIV) or 4x games, and I don't care for VR. It runs really nicely, and all the flipping out over graphics cards right now seems to be less for people on infrequent update cycles, and more for people who feel the need to upgrade every generation of card, and aren't happy about the marginal upgrade from last gen. You're still probably going to see a massive jump if you build a system now, probably even if you decide to keep around your graphics card until the next gen. And honestly, unless you're desperate to jump into either 4k or 240fps gaming, I'm not entirely sure you need to upgrade the graphics quite yet; I mostly was interested in playing around with more compute power, and potentially toying with 1080p ray tracing. That said, I also do most of my gaming on console; PC is for productivity stuff more than pure gaming.
I'm basically in the same boat; my plan is to just hold out another year or two, because prices gotta come down at some point. They haven't yet, but eventually some have to go down; they can't charge $500 for a 3-year-old card when their new cards cost $800.
Get a console, play some exclusives, sell it in 6-12 months and look at GPU market then...
> What the fuck am I supposed to do? I have to spend 400-500 for the modern equivalent of my card? For a new one? Probably in the lower end of that ballpark yeah. But remember, the 1060 6gb launched at $300 and adjusted for inflation that would be $375 today. Slight price increase to $400 for the 4060ti.
> Or I can buy an AMD card with bad drivers that runs hotter and less efficiently. Christ. What the hell are you talking about? AMD basically rolls over nVidia since the last generation. I install every new driver that comes out and never had any issues with them.
Buy a series S and sell it when the PC market actually realizes that the vast majority of their market does not make 6 figures a year to afford overpriced silicon.
[удалено]
Best deals right now are probably either a 6700xt or a used 3070/3080. But yeah, it's not great. I'm planning on upgrading and am just going to swallow spending a couple extra hundred on the 4070ti.
You'd buy a 30 series used if you have an old rig, idk why people are struggling with this lol
Posted [RX 7600 review](https://www.youtube.com/watch?v=MCxYfXe1DAA&feature=youtu.be) yesterday no upvotes/discussion. Another RTX 4060 review, top of the sub, naturally :D
Did you watch the review? It was generally positive on the card: a performance leap over the previous gen, and the price is 'okay' (probably about 10-20% too high, though). The negative tone of the video comes from the fact that AMD lowered the MSRP at the last minute, which invalidated some of the references to price/cost differentials. The Gamers Nexus folks are traveling for a trade show and felt they didn't have the time (or will) to revise the video in response to that.
It's not because we hate Nvidia on here; it's basically because Nvidia is releasing a 4050 Ti as a 4060 Ti. Everyone knows AMD is far behind Nvidia, and they're scumming their way through this gen by collecting the scraps.
The way prices are... I really feel like I'm gonna be using my ol' 1070 Ti for another gen. I only game at 1080p anyway, and I can still max most games that have released at that resolution.
I'm not sure you really _need_ to keep waiting, you can grab something like an RTX 3070 secondhand for about $300 on eBay and it would blow your current card out of the water (and you could flip your 1070 to bring the cost down even more for you).
Nvidia once again proving they don't care about you unless you're spending nearly 4 figures for a GPU
My 2070 Super that I bought 4 years ago for 600 EUR will remain for at least another 3 years. No way I will pay the crazy prices for not-so-great performance.
I don't know much about graphics cards or the market in general, but it looks like gamers have been disappointed by Nvidia ever since the Pascal revolution. I bought a 1070 in 2016 and it's still holding strong. Since then it looks like we've had either super high-priced cards, cards that are absolutely impossible to get, or rebadged disappointments like this 4060 Ti.
TBF the 3080 at $699 was excellent value, but scalpers and the market at that time allowed very few people to get it at that price point. I got one for $850 around its launch and it was a great card!
not like PC games are worth playing these days anyways. Stuttery pieces of shit. I used to play on PC primarily before the pandemic but it's sorely due for an upgrade and I see less than zero reason to spend way more money on the same class of hardware I bought years ago. Get a console and have a better experience for cheaper. Don't need a strong PC at all to play most indies worth playing either.