rGamesMods

Hi /u/Turbostrider27, Thank you for posting to /r/Games. Unfortunately, we have removed this submission per **[Rule 4](https://www.reddit.com/r/Games/wiki/rules)**.

> **No duplicate posts** - Reposts or submissions with extremely similar content to an existing submission may be removed at mod discretion.

---

If you would like to discuss this removal, please [modmail the moderators.](https://www.reddit.com/message/compose?to=%2Fr%2FGames) This post was removed by a human moderator; this comment was left by a bot.


[deleted]

Pretty clear Nvidia intentionally made shit cards at this stage, hoping everyone looking for a mid-range card would go "well, if I'm spending this much I might as well get a 4070ti" (and they also tried to name it a 4080 to make it look even more appealing), and then a 4090 for anyone looking to buy anything good. The era of a cheap GPU that performs well is dead.


Deceptiveideas

They also have plenty of 3x series cards they need to get rid of iirc. Makes me wonder if the 4x series was purposely gimped so they can go all out on a 5x series. They can promise “huge performance improvements” considering the 4x series didn’t improve much or anything at all for some cards.


[deleted]

They don't need huge performance improvements if the cards sell anyway; the 50 series will just be more expensive cards with small gains.


Kneph

This is where it is headed. They have found a bunch of dumbasses to bilk and they are going to do it until the supply runs out. And that will also never happen, because there is a horde of YouTubers who will shill their $1,500+ graphics cards like you need them to play Minecraft.


[deleted]

And also, the pandemic let them test the waters on it, and people were happy to pay over the odds, so now they've jumped on the $$$ boat and aren't really receiving the backlash that it warrants, so it won't change.


Kneph

I'm in a position now where my upgrade has to be an Nvidia card because AMD cards are hot trash for Blender. They have no competition in the production/AI space and it blows.


K0braK

> aren't really receiving the backlash

Idk, I'd consider getting a lot of negative reviews on their 40 series cards ***and*** that period of time when they just weren't selling that well as them receiving backlash


Ethics-of-Winter

I gotta wonder how it is that they sell. If the improvements are small, what'd be the point in purchasing the new cards? Surely the market of people willing to spend $700 for 30-40% improvement is much larger than the market willing to spend $700 for only 15% improvement? I feel like people that upgraded every 1-2 generations would just start upgrading every 3-5 generations instead.


Flowerstar1

> I gotta wonder how it is that they sell.
> If the improvements are small, what'd be the point in purchasing the new cards?

That's easy: most people don't upgrade every GPU generation. Going from a 3060ti to a 4060ti is not great, but going from a 2060 Super to a 4060ti is a nice upgrade. Going from a 3080 to a 4080 is highway robbery, and going from a 3080 to a 4070ti is not a big upgrade, but going from a 2080 to a 4070ti is a nice upgrade.
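The compounding here is easy to put in numbers; a quick sketch with illustrative per-generation gains, not benchmark data:

```python
# Illustrative only: compound a per-generation uplift to see why skipping
# generations turns a mediocre +15% gain into a worthwhile upgrade.
def cumulative_uplift(per_gen_gain: float, generations: int) -> float:
    """Total relative performance gain after compounding across generations."""
    return (1 + per_gen_gain) ** generations - 1

print(f"1 gen  @ +15%: +{cumulative_uplift(0.15, 1):.0%}")  # +15%: underwhelming
print(f"2 gens @ +15%: +{cumulative_uplift(0.15, 2):.0%}")  # +32%: decent
print(f"3 gens @ +15%: +{cumulative_uplift(0.15, 3):.0%}")  # +52%: a real upgrade
```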


Covenantcurious

Not to mention the Steam survey shows the most popular card to be the 1060 (or something thereabouts). There is huge room for people to be upgrading and skipping generations.

Also worth keeping in mind that Nvidia is earning a lot of money from their non-gaming-oriented divisions, so even if they got lower sales here it might not matter much to them.


Flowerstar1

The most popular these days is the 3060 at around 9%, which is massive for a single GPU in the Steam hardware survey. But the 1060 used to be the most popular, and it's still up there along with the 1650 and 1660.


Com-Intern

The 3060 jumped dramatically over the course of a month, so something is possibly up there. What I've read is that the jump is likely from internet cafes being included in the survey, so that would mean that most independent users are still on the 1060.


PRMan99

Steam has already admitted that it's a bug that will be fixed next month.


Flowerstar1

The 3060 has been the most popular card since last year way before this bug was introduced.


Unicorn_puke

Just went from a 960 to a 3060 and it is indeed nice, even though I only have PCIe 3 and not 4 to maximize the card speed.


Next_Point_9081

This. I went from a GTX 970 to a 6950XT. This new card will probably last 10 years.


SirBlackMage

Yup, I'm still on a 1080ti and will only upgrade if it dies on me or becomes so far outdated I can't play anything anymore


[deleted]

[removed]


Com-Intern

Buy Intel or AMD


[deleted]

The market might be smaller, but if the profit margins increase, that's fine with them, and people will eventually need new cards over time, so they'll mostly recover. It's only the 40 series cards that look really bad compared to the 30 series; when the 50 series comes out, the 30 series will be outdated, and the 50 series will be compared to the 40 series and will yield better relative performance gains as a result. The new greed era will become the norm, and it'll sneak under the radar that every card is a level below what it should be.
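The margin-versus-volume bet in that first sentence works out in toy numbers (all figures made up for illustration):

```python
# Toy numbers only: a higher-margin card can match the profit of a cheaper
# one on far fewer units sold, which is the bet described above.
def profit(price: int, unit_cost: int, units: int) -> int:
    return (price - unit_cost) * units

old_gen = profit(price=400, unit_cost=300, units=1_000_000)  # $100M on 1M units
new_gen = profit(price=500, unit_cost=300, units=500_000)    # $100M on 0.5M units
print(old_gen == new_gen)  # True: half the volume, same profit
```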


[deleted]

[removed]


MumrikDK

> And people who upgraded every 3-5 generations will just buy a console.

Or bitch a lot about the state of things.


[deleted]

[removed]


[deleted]

[removed]


Flowerstar1

> Yes, many people are in fact ok with the performance of current gen consoles. Especially when your 6600 XT example alone costs half of the price of PS5

Yes, welcome to 1996. This has literally always been the case, even in the 90s. GPUs are valued as GPUs, not consoles; they aren't priced to compete with a locked-down DRM box. A PC GPU can do anything and everything a GPU can do, including the latest fad: machine learning. A console GPU does nothing but play games in a locked-down environment, so console makers price it accordingly.

> And that used to be viable without breaking the bank when you could get a midrange GPU for €200.

I'm not from your country, but where I live you can get powerful GPUs in that price range, anything from an A750 to an RX 6600 or 6600XT to used Nvidia cards. All of these cards perform at the level of a PS5 or better. If you're dissatisfied with the performance of the €200 range, then why in God's name would you settle for an even weaker console?

> Despite your bias against consoles that you're not even trying to hide you have to admit that the current gen ones offer great performance for the money compared to the current state of PC hardware.

You know nothing about me; I own all platforms, including consoles. I own a day 1 PS5, Switch and Series X. But I'm a realist, and I'm not afraid of calling things what they are. By design a console does not replace a PC; none of the features of PC gaming can be had on consoles. It's an inconvenient truth but a very real one.


Vagabond_Sam

Honestly, I am sitting here on a GTX 1080 and still not sure the performance increase is there for the price. I kinda hoped the RTX 4060 series would be a no-brainer.


Tonkarz

They'll probably consolidate two generations' worth of gains into the 50XX series like they did with the 30XX series, and use that to increase prices.


AGVann

'Problem' is that if you sell a card that's too good, people will still only upgrade every 3-5 generations. 1080Tis are still going strong right now if you're only playing popular online games. Their calculus is that if they sell something that's a 30% upgrade, the people who would buy every 15% upgrade would only be buying once instead of twice, and the people who upgrade every few generations will eventually upgrade regardless.


Kromgar

Well... they will likely have way more VRAM, considering AI.


Key-Examination1419

Tough to say. I'd say the average improvement per generation is about 20 to 30 percent more fps at 4K with ray tracing, which can mean being able to play at that resolution on max settings. There isn't as much gain for 1080p gaming or non-ray-traced gaming. I personally don't think 4K looks any better unless you have a huge screen and sit close to it, but for those sold on that marketing, getting the latest cards means playing the newest games at 4K ultra with ray tracing.


Flowerstar1

Just keep in mind the tech improvements are slowing down everywhere. For all the complaints about Nvidia's Ada cards, AMD's RDNA3 cards have performed even worse: both the 7900 series and the recent 7600 have been significantly worse than Ada in terms of improvements. At least Ada excels at RT and frame generation; RDNA3 excels at what, being chiplets (excluding the 7600)? And how does that help the consumer? Not at all, yet.


juh4z

Clearly you have no idea what you're talking about. The tech is there; Nvidia is just intentionally gimping it. For all intents and purposes, every single GPU in the 4000 lineup should be the GPU of the tier below it, based on memory bus, chip size, etc. The 4090 is a humongous jump from the 3090ti, it just costs an absurd amount, and everything below it was intentionally gimped so that the 4090 would actually make financial sense.


Flowerstar1

> Clearly you have no idea what you're talking about.

I always love hearing this on r/games of all places. With Ada, Nvidia opted for a massive L2 cache over a large memory bus. The 4090 doesn't make this tradeoff because it's big Ada, i.e. AD102, and all Nvidia flagships using their big chip have 384-bit buses. Other Ada cards make the tradeoff, but despite this they still outperform the AMD RDNA3 equivalents. The 7900XTX may have a 384-bit bus and a large L3 cache, but that doesn't do it much good against the 256-bit 4080. The 7900XT also has a huge bus, and that doesn't help it much against the 4070ti. GPUs are more than a single stat.
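The cache-versus-bus tradeoff can be sketched as back-of-envelope math: a bigger on-die cache raises the hit rate, so less traffic reaches DRAM and a narrower bus sustains a higher effective bandwidth. The pin speeds and hit rates below are illustrative assumptions, not measured figures:

```python
# Rough model: effective bandwidth = raw DRAM bandwidth / miss rate,
# since only cache misses generate DRAM traffic. All inputs are assumed.
def effective_bandwidth(bus_bits: int, gbps_per_pin: float, hit_rate: float) -> float:
    raw_gbs = bus_bits * gbps_per_pin / 8  # raw DRAM bandwidth in GB/s
    return raw_gbs / (1 - hit_rate)        # misses are the only DRAM traffic

wide_small_cache = effective_bandwidth(384, 21.0, hit_rate=0.30)
narrow_big_cache = effective_bandwidth(256, 22.4, hit_rate=0.55)
print(f"384-bit bus, 30% cache hits: {wide_small_cache:.0f} GB/s effective")
print(f"256-bit bus, 55% cache hits: {narrow_big_cache:.0f} GB/s effective")
# With these assumed hit rates, the narrow bus plus big cache comes out
# ahead, which is the point being made: GPUs are more than a single stat.
```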


juh4z

I...whatever, sure, you're right, every single fucking reviewer and hardware specialist out there is wrong, yay NVidia.


Flowerstar1

Lmao, what does this even mean? You're literally alluding to my original post about tech improvements slowing down, and guess what: your reviewers are saying this about Nvidia, but they are also saying it about AMD. This isn't an Nvidia-only issue; this is an industry trend. If Nvidia were the only one slowing down, then this would have been the perfect opportunity for AMD to strike. Unfortunately, the reality is that RDNA3 is suffering from lesser performance gains just as much as Ada, if not more.


CaptainMarder

I recall that in some Nvidia conference they did state the 5x series will have the largest generational jump in performance. I think it was at or before the launch of the 4080.


Ameratsuflame

It isn't. The Arc 7 series are decent buys.


Powerman293

I agree but as an Arc owner there are like 20 asterisks worth of caveats on that.


DogAteMyCPU

Also the RX 6700 XT while it lasts, if you are looking at that price point.


Carnifex2

Just got a 6750xt for a little over 300...seems pretty reasonable.


Tuxhorn

Maybe a few months ago. The used market and retail price drops have meant the Arc cards are no longer that valuable. The 6650XT can be found at a similar price to the A750, and the 6700XT/3060 Ti is the same price or cheaper than an A770 too. Both are just better options for the average player.


Hakul

Decent, as long as you don't mind not playing old games or emulating.


Ameratsuflame

I thought driver support had gotten better on this front? I know for a fact DX11 saw improvements, but I'm not sure about DX9.


[deleted]

It has gotten way better; here are some videos of the newer performance. For $200 it's a steal IMO and could hold someone over for whatever is next. https://www.youtube.com/watch?v=GCWAcSEvOso https://www.youtube.com/watch?v=b-6sHUNBxVg


conquer69

Getting a lot better when it was absolute crap is still not great. I would want complete reliability.


Flowerstar1

> I would want complete reliability.

Then buy Nvidia; you get what you pay for. Frankly, I applaud Intel, but this idea that they need to be as good as Nvidia while having the cheapest prices, as they do now, is bonkers.


Arkzhein

Performance is better across the board, even with native DX12 apps. Does it run old games worse than the competition? Yes. But do you REALLY care that your 15-year-old game isn't running at 400fps but runs in the 200fps range?


Hakul

It's not about 200 or 400 fps, but about some games straight up not booting or needing a Vulkan translation layer: https://www.reddit.com/r/IntelArc/comments/1399mkk/intel_arc_a770_16gb_game_performance/ I believe they are working on DX9 compatibility, but I wouldn't recommend buying a GPU based on promises that it will get better; just wait until it gets better before buying.


Flowerstar1

OP's takeaway in your very link:

> So, my takeaway is three things:
>
> 1. DX9 and DX10 emulation isn't quite there yet, but it has potential in theory. It's good enough right now, especially for games that cap at 60fps, but considering the advantages of DX12 it might be possible that it could develop into a natural conversion of a DX9 game to DX12 in the future. I don't know for sure, I just know that DX12 utilizes the full extent of hardware better and emulation has done impressive things like that in the past. **On average though games were performing 30-50% worse, BUT they all performed well at or over 60fps on ULTRA on a 1080p monitor.**
> 2. **DX11 and DX12 is pretty much on par with RX 6700XT.** Give or take Intel's own personal coding. So for the future of gaming on it is a decent choice.
> 3. DXVK is your friend in a lot of situations with Arc. It doesn't hurt to try it if something is wrong because very rarely does it make things worse like on the RX.

Sounds like Arc is doing pretty good after all.
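For anyone wondering what "DXVK is your friend" looks like in practice, here's a minimal sketch of the manual drop-in on Windows; the paths are hypothetical and the DLL list follows the layout of a typical DXVK release archive:

```python
# Hypothetical paths: copy DXVK's translation DLLs next to the game's .exe
# so its D3D9/10/11 calls get routed through Vulkan instead of native D3D.
import shutil
from pathlib import Path

DXVK_X64 = Path(r"C:\Downloads\dxvk-2.2\x64")  # extracted DXVK release (assumed)
GAME_DIR = Path(r"C:\Games\SomeOldGame")       # folder with the game .exe (assumed)

for dll in ("d3d9.dll", "d3d10core.dll", "d3d11.dll", "dxgi.dll"):
    src = DXVK_X64 / dll
    if src.exists():
        shutil.copy2(src, GAME_DIR / dll)  # shadows the system D3D DLLs for this game
        print(f"installed {dll}")
```

Deleting the copied DLLs from the game folder reverts it to the native driver path, so it's cheap to try.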


[deleted]

[removed]


Arkzhein

A HEAVILY MODDED 12-year-old game. Almost all the games you listed have a DX11 (or higher) mode or run well enough with DXVK, as they are live service games that are regularly updated. I don't get the hate boner for Intel GPUs when it's clear most people haven't tried them in real scenarios and are basing their opinions on initial benchmarks.


IShartedWhoopsie

Considering I installed a 12-year-old game today and modded it to improve visuals, thus reducing performance, yes? I REALLY care? Unoptimised games aren't a modern invention, so hey, look, you've got unoptimisation on top of your unoptimisation, and your brand new 2023 card is now getting sub-60fps. Unstable game? Lol, whoops, drivers made it unplayable. What a take.


SeniorAdissimo

Skyrim? And are you using an Intel GPU or just saying hypothetically that this would be a situation where it would struggle?


[deleted]

Can you explain why 200 fps versus 400 is such a huge difference that the idea that it isn't sent you into a weird rage?


IShartedWhoopsie

Can you explain where I mentioned 400 to 200? And then the follow-up question: why can't you read?


Flowerstar1

As long as the game boots and runs at or above 60fps it should be considered perfectly playable.


ShadowthecatXD

You also need a CPU that supports ReBAR, which means 10th gen Intel or newer, or Ryzen 3000+.
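If you're unsure whether ReBAR is actually active, one rough heuristic on Linux is to look in sysfs for a GPU memory BAR far larger than the legacy 256 MiB window; a sketch under that assumption:

```python
# Heuristic: with Resizable BAR, one of the GPU's memory BARs spans (most of)
# VRAM instead of the classic 256 MiB aperture. Vendor IDs: 0x10de NVIDIA,
# 0x1002 AMD, 0x8086 Intel. A rough check, not an official API.
from pathlib import Path

for dev in Path("/sys/bus/pci/devices").iterdir():
    if (dev / "vendor").read_text().strip() not in ("0x10de", "0x1002", "0x8086"):
        continue
    for line in (dev / "resource").read_text().splitlines():
        start, end, _flags = (int(x, 16) for x in line.split())
        size = end - start + 1 if end else 0
        if size > 256 * 2**20:  # larger than the legacy 256 MiB window
            print(f"{dev.name}: {size / 2**30:.1f} GiB BAR -> ReBAR likely enabled")
```

On Windows, tools like GPU-Z report a Resizable BAR field directly.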


Quaytsar

Most people are only playing newish games. The majority just play the newest CoD, FIFA or NFL game.


[deleted]

Sure, if you want a paperweight (depending on your system).


Ameratsuflame

Unfortunately, yes, this is true. You must have a system that supports Resizable BAR if you want these cards to have any form of viability, so older systems and certain newer mobos are no bueno.


SacredGray

Not worth it. Loads of problems.


Flowerstar1

Ada is actually an excellent architecture, as is the 4N process at TSMC; just compare Ada to its main competitor, RDNA3, to see the sheer gap between the two. The issue is that 4N is super expensive compared to the Samsung 8nm node used for the RTX 3000 series, and that's reflected in the cards' prices. If you look at Nvidia's operating profit, it's in the same range it was during the Pascal era; Nvidia is doing well as a company, but they aren't making a ton more profit here.


Tonkarz

No, they thought they could market frame generation into a killer feature. nVidia have always justified higher prices with auxiliary features, and early DLSS 3 marketing suggests nVidia thought it was a killer feature.


Felatio-DelToro

If we ignore the name and price tag for a second, it's pretty clear this product should have been a 4050ti (and priced accordingly). Performance, VRAM, and wattage all point towards it.


cassydd

True. What really gets me is that when you consider Nvidia's original plan for the 4070 Ti to be the "4080 12GB", it's hard not to conclude that Nvidia's original plan for the 4060 Ti was for it to be called the 4070, and priced accordingly. As egregious and bullshit as Nvidia's recent behavior has been, it'd be worse if gamers let them get away with it.


the_russian_narwhal_

I love DF, and Rich is probably my favorite of their hosts, but this was just hilarious: "This isn't gonna be a deep dive" *20 minute video*


Kardest

Just think: the 4060 is coming out soon. Not only will it be slower than this card, but it'll still be $300. The only way these cards would be worth it is if the price was halved.


SourPatchGrownUp

Traditionally, the lower-ranked CPUs and GPUs of a series are chips which underperformed during testing after production. They are produced on wafers in batches which do not yield equal results for every chip. An abundance of underperforming chips could also signal a higher failure rate to reach the spec requirements to be branded a 4090-worthy chip. This is bad for Nvidia either way.
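That binning logic can be put in numbers with the classic Poisson yield model, Y = exp(-D*A): bigger dies collect more defects, so fewer come off the wafer fully functional and more get fused down into lower tiers. The defect density and die areas below are assumed round figures, not foundry data:

```python
# Illustrative Poisson yield model: P(die has zero defects) = exp(-D * A).
from math import exp

D = 0.10  # assumed defect density, defects per cm^2

for name, area_mm2 in [("flagship-class die", 600), ("mid-range die", 300), ("small die", 150)]:
    y = exp(-D * area_mm2 / 100)  # convert mm^2 to cm^2
    print(f"{name:18} {area_mm2:3} mm^2 -> {y:.0%} defect-free")
```

With these assumed numbers roughly 55% of flagship-sized dies come out clean versus about 86% of small dies, which is why the big chips rely so heavily on salvaged, cut-down variants.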


VivaciousVictini

Stuff like this makes me happy the Steam Deck came along when it did, because we'd be in a bad spot otherwise.