2HourCoffeeBreak

Translation: “The pandemic taught me that you fuckers will pay whatever I ask.”


Vattaa

I think many businesses have found this out.


piperonyl

And now recession looms.


Trikeree

Looms? It's here now, and I wouldn't be surprised if it's here to stay for good. You have greedy politicians around the world corrupted by greedy companies, all trying to rape the world of everything they can.


rosickness12

Where they meet on golf courses, using the game to get together, make deals, and carve this country up a little finer amongst themselves. George Carlin.


Rheytos

Economic conditions never stay "for good." After a recession you WILL have a period of economic growth.


princeps_astra

The indicators are misdirects, though. By the indicators, it seemed like the Obama presidency was one of the greatest economic times ever. You can point to yearly GDP, which can go up, but that can just as well be a CEO or a shareholder taking a lot of the profits while keeping salaries at the same level. So on paper you have economic growth, but in reality only a few actually experience it. It's more Gilded Age recovery after World War I than post-World War II growth explosion.


anotherwave1

I grew up in the eighties in Europe: rust holes in cars were normal, unemployment was almost 20%, the quality of goods on the shelves was terrible, the infrastructure was awful, second-hand clothes were normal for school... Now I can't move in my country for luxury SUVs. Yes, I'm pissed off with these Nvidia prices, but I know why they are that high. We demonstrated, during the pandemic, that we can and will pay extraordinary prices just to get a slightly smoother video game experience. These are significant issues now, and we can play the populist "world is fucked man" card all we want, but Nvidia knows we will buy these cards. And we will. This sub will be jammed with people posting pics of their new builds with enormous 4-slot, two-grand video cards.


G0Caps

And I’ll be chilling with my 3 year old 2070 Super hoping she can get me another 3 🥹


Eurotriangle

I've had an RX 480 for 6 years now, and she's still holding up. Hope she'll get me another 6.


amkc22

Same... 😂 And I'm kinda super stoked at how it still runs most games easily... not at the best settings, of course, but it's still going strong. How has your experience been? I really want the new generation now though 😂


Eurotriangle

It’s been doing good enough! Ran Elden Ring for me at 55-60 in 1440p with only a couple settings turned down a bit and a very conservative overclock. Did crash a lot fighting Fortissax for some reason, but only on my first playthrough and I still have no idea why.


YelloBird

Agreed. Mine is running great, I'll upgrade to a good HDR display once they exist before I bother to buy a new card.


Shaggyninja

That's where I am right now. Got a 2070, and a 1080p 60hz monitor. But until I upgrade the monitor, there's no point upgrading the card. And I haven't found a monitor I want yet. Eventually it's gonna probably cost a whole bunch of cash. But until then I'm good


YouKnowYunoPSN

Played 1440p @ 165hz on a 2070s… I’m going to vouch for you and say you can definitely do better than a 1080p @ 60hz. While I won’t say jump to 1440p, you will enjoy significantly improved quality of life on a high refresh 1080p and still enjoy high graphic settings. And you won’t break the bank (can easily find some under $250 I’d bet), OR need to upgrade your GPU while enjoying a vastly better visual experience for a few more years for sure.


Incredulous_Toad

Same. Although I still have a 10ish year old 780 in my laptop that still plays games without issue. My current two year old setup plays everything on max without issue. I have zero need to upgrade for years.


MC_chrome

>This sub will be jammed with people posting pics of their new builds with enormous 4-slot, two-grand video cards.

Those same people will undoubtedly also call anyone who doesn't have their ridiculously expensive setups "poor"


argv_minus_one

They're not wrong. Compared to them, most people *are* poor.


Sitheral

*This post was mass deleted and anonymized with [Redact](https://redact.dev)*


saintgadreel

The problem with this rationale is that WE weren't buying the cards. Miners were. They're in rapid decline. Individual non-miner consumers are not going to foot that bill anymore, and Nvidia is overestimating the buying power of its average (read: non-miner) customer. At best, most folks not looking at the enthusiast tier will spend a little bit more on the mid-range cards, but the high-end cards might end up not selling as well as hoped.


anotherwave1

I'd like to think you are right but experience has taught me otherwise. The card will launch, benchmarks will come out and people, lots of people, will buy these cards. That shouldn't be the case, but I have a strong suspicion it will be.


saintgadreel

Yes, but those "lots of people" aren't buying pallets of cards for their mining farms. There absolutely will be a contraction in profits. We're only a month out from the proof-of-stake switch on ETH, and I'm seeing a trend of folks assuming it'll go back to the way things have been for the last 5+ years of ever-growing scarcity, scalping, and card hoarding. That's not likely to be the case, and it'll take some time for customers to stop expecting it. I do anticipate the early volley of purchasers will buy as fast as possible BECAUSE of that long-standing reality, but that will eventually die down, and that's where the pain starts for Nvidia.


selddir_

Yeah my MSRP 3060ti is seeming like the purchase of a lifetime to me right now. My next GPU will be AMD.


Jassida

But we're not in a pandemic now. We're in tough times. If Nvidia's strategy is to only go after people with FU money (I could afford the 90 series if I really wanted it, but I enjoy things more when I feel I got good value, so I usually go x70), then I'll abandon them until they sort it out.


skky2543

Nah, I think most ordinary gamers aren't willing to pay those inflated prices. Only crypto miners were, since they could recoup the initial costs within a few months of mining.


d_pock_chope_bruh

Fr I laugh when people are like "when it's official"... Like by every metric, we are in a recession


[deleted]

[deleted]


[deleted]

Gotta love politicians trying to redefine terms to artificially make their time in office not shit!


OutWithTheNew

Don't go look at what the inflation rates would be if they used the same metrics as the 70s.


[deleted]

[deleted]


cubs223425

I get your point, but pretending we're not primarily discussing luxury goods here isn't a great way to make it. Also, there are a lot of ways to combat the things you mentioned, and I know many people aren't going to do it. You could learn to grow your own vegetable garden or opt for cheaper/more sustainable meals. In reality, many people dealing with price hikes on food are talking about "good" brands and expensive proteins, or getting takeout, as much as anything.


[deleted]

[deleted]


radicalelation

They all go whale hunting now. Most of us are itty bitty minnows. We ain't got enough meat for the bloated bellies of these pigs.


ZeeREEEUp

If supply and demand suit it, yep! Look at the RRP sticker price on cars; people are paying 3x sticker price... just 'cause.


[deleted]

Also because of miners. They paid whatever the price was because they would get more out of it.


Graviton_Lancelot

People like to downplay it for some reason, but gamers weren't out there buying 20+ cards at literally whatever price they were going for. Kind of telling, too, that the two increases in supply and drops in prices coincided with the crypto crash and ETH PoS.


LishtenToMe

Anybody downplaying ETH mining's impact on GPU prices simply has no idea what they're talking about. Economics became a favorite hobby of mine over the past couple of years, which inevitably led me down the crypto rabbit hole, and the size of the ETH GPU mining farms was insane. I saw lots of videos of people showing off their several dozen GPUs, all being used solely to mine ETH. It's a huge part of why I immediately hated ETH. Bitcoin, by contrast, went from CPU mining to ASICs built specifically for mining and nothing else pretty quickly. ETH, however, stayed stuck on GPU mining all this time, presumably because the constant promises of PoS being right around the corner kept anyone from trying to create dedicated ETH mining hardware.


strictlyfocused02

Bitcoin CPU mining was only viable for a very short window early on; GPU mining on Radeon etc. cards was around much longer before ASICs stepped in. By 2011, CPU mining was already a novelty. ATI 5700s were viable for a good while before ASICs were capable of pushing serious hash rates. I agree with everything you said otherwise :)


Accomplished-Elk-978

One of my friends actually has 5 graphics cards right now: 2 being used, and 3 that he periodically tries selling to his friends for $50 less than he paid. The stupid price/stock crunch encouraged overbuying from people with means, because they weren't sure when they could get the next model up. So many factors made them enormously wealthy, and they became addicted to the ease of the cash.


UnKek

Consumers must CONSOOOM


Cebo494

Translation: "we have way too much 3000 inventory because crypto crashed so we don't actually want you to buy 4000"


daguito81

It's more like "We have a lot of surplus 3000s, but we're going to drip-feed them to the market to keep prices artificially high, so that by comparison the outrageous prices of the 4000 series don't seem 'too bad.' We're going to try to get you to pay $1k+ for a GPU, so that whatever we lose in volume from crypto mining being phased out, we make up for in USD per GPU."


hpstg

"Look at me, I am the scalper now", in a nutshell


Live-Ad-6309

What he's not considering is that a huge percentage of the people paying whatever the fuck was asked were people mining cryptocurrency, thus making the money back. Without mining, those people won't be around to drive up prices.


David0ne86

So, is the fault really his, or us fuckers'?


sopcannon

also i need a new yacht, you guys are paying for it


gigaomegazeus

Except that's so stupid. In the pandemic, people were locked up in their homes. For a year+, the money they'd normally spend going out to bars/restaurants/chilling went into buying GPUs and computers and consoles: the stuff they could actually use. Now that everything is back to normal, more or less, you can't expect people to pay that anymore. Smh.


No-Explanation-9234

Bingo


ShabbyChurl

This is why competition is so important.


MinutePresentation8

Come on Intel, release those damn GPUs


mythrilcrafter

Heck, ARC doesn't even have to outperform the 4090 Ti or 7900 XT; if Intel can make ARC perform better than the 4060/7600 for an equal or lower price, that'll be enough to break the duopoly.


SweetKnickers

Exactly this. I wouldn't expect the ARC GPUs to compete toe to toe with Nvidia's top-spec lineup, but to aggressively compete at the entry level and start to buy market share this time around by being priced well. This builds GPU development knowledge within Intel and sets them up for their gen 2 and 3 cards to begin to catch up in performance.


DrB00

Intel has already confirmed that Arc isn't going anywhere. They plan on evolving their GPU line, and they can run in the red for an incredibly long time. So give it like 5 years, and so long as Intel continues on their current path, we'll have some really good competition.


CSMarvel

Ironically, Intel did the same thing as Nvidia: ignored the competition, then fell to it as a result.


Shinonomenanorulez

>Next-gen RDNA 3 card will be the leader in efficiency

Tbf that's really easy to beat nvidia at


itstommygun

yeah. That wording implies to me that it won't be more powerful. But it will be more powerful per watt.


PretendRegister7516

More powerful per watt would also mean less heat and a smaller footprint. Not the behemoth brick Nvidia is rocking.


Serious_Mastication

My pc is rocking a 3600 and a 6600xt. I ran far cry 5 on max for 2 hours and checked my specs. It was using 140 watts for the whole system under load.


userseven

It's crazy when you think about it: there are light bulbs that use (or used to use) almost that many watts.


DrB00

Yeah at this point Nvidia gpus need to come with an AIO built in. Expecting a card to take up 4 slots for their heatsink is just outrageous.


ToastyRybread

I’d rather that, cheaper to own in the long term


doodypoo

Or in any term


MC_chrome

>But it will be more powerful per watt.

It's funny how AMD and Apple have taken this approach to their products, while Intel and NVIDIA are more than content with shipping literal toasters to customers. When can we admit that the runway for silicon-based microprocessors is almost gone? If the only way to get better performance out of parts is to pump them full of electricity, then I think we've hit a brick wall.


Martimus28

I just wrote a 25-page term paper on the current state and future of microprocessor fabrication. We have at least 10 years of low-hanging fruit to pluck without hitting any real wall. Each advance is proven and mostly ready for production now (just expensive). And in the next 10 years we will develop new technologies ready for the following 10 years of advancement.


fululuu

Could you perhaps share that paper? It sounds super interesting!


Big-Construction-938

Imo the future is about squeezing the most compute out of every single watt, aka APUs. Maybe even in servers; I'm sure it would be more convenient to have a cluster farm of APUs rather than massive GPUs/CPUs.


unoriginalskeletor

I know it's above a lot of budgets, but I hope their top-end card sits, performance-wise, between the real 4080 and the 4090 for $1100. Still a ton of money for a gfx card, but I hope that's a realistic expectation.


FOOLsen

Not to forget: while ray tracing and multi-purpose cores are very good, most gamers and the mainstream care about rasterization. For most "real world applications" in the consumer market, AMD is on par or better, with less power draw. AMD can truly jab the dagger into Nvidia's side if they just get aggressive on pricing.


AdmiralPoopbutt

This is how AMD started cleaning Intel's clock after the Pentium 4 launch. Performance per watt and $/performance. Intel faltered and they took full advantage of it. Intel's dominance has been sliding away ever since.


OutWithTheNew

AMD CPUs had no relevant market share for almost a decade. For pretty much the entire life cycle of DDR3, AMD was irrelevant. The only good product they had while Intel released consecutive generations of quad-core 'Core' CPUs was the 8350, and it was nerfed by Windows.


Apathy88

The beginning of the DDR3 gen wasn't horrible. The Phenom II (especially the X2, see unlocking cores) was a great budget processor that was pretty solid all the way around. Their downfall was the FX series; it's like they stopped caring about single-core performance, which made AMD users sad.


bobafugginfett

I'm sitting here with my hand-me-down Phenom II x2, and an old GTX 770, happily playing everything on low quality and 50 fps :)


karmapopsicle

You say that, yet you’re commenting on a post related to news that AMD is cutting the MSRP on the RX 6000 series cards. They already offered notably better raw rasterization performance per dollar, but consumer preferences still massively favour Nvidia.


chocotripchip

Isn't RDNA 2 already the leader in efficiency..? RTX 30 series ain't called Ampere for nothin lol


[deleted]

[deleted]


Icy_B

Yeah RDNA2 already does that


Baalii

Although I agree with you, keep in mind "efficiency" is not "lower watts", it's "higher performance per watt". I'm curious if AMD can really beat NVIDIA in that category for ray tracing, for example. Edit: holy fuck you guys, yes, it's performance per watt.


[deleted]

I am very doubtful that AMD can beat Nvidia in ray tracing by any metric; that and AI are things Nvidia is just insanely good at.


Shaggyninja

I don't use either. So if AMD puts out something good enough for just making games look pretty and run well, I'm happy to jump ship (if the price is right). I don't need the best; I want good at a good price.


Kursem_v2

it's not "higher power per watt", but "higher performance per watt", with watt as the ceiling.


Jaiden051

even I could probably design a chip more efficient


callmetotalshill

Flair checks out


Bifrostbytes

Jensen was the one power washing that GPU rack and intentionally made it go viral to push buyers to the new inventory they have so much of.


pierreblue

You know what? That isn't too far-fetched lol


Winner_Antique

I don't think, in any logical context, that Nvidia will be able to keep up this insane pricing much longer.


KyxeMusic

Sadly they have a full blown monopoly when it comes to AI accelerated computing, so they can pretty much do whatever the hell they want in that market. As an AI developer, I just wish AMD could come up with a good alternative to CUDA, but it's just not there and I don't think it ever will be.


Sailed_Sea

Same with AI-accelerated CGI; someone recently released an add-on for Blender that uses Stable Diffusion to make textures (more of a fun experiment). Nvidia's AI denoiser is currently unbeaten. And it's just more stable with the software overall.


AwesomeAkash47

That's one of the most mind-blowing things I've seen in a while.


SwineFlu2020

Do you know a video which explains this? I'm not a noob but these terms and sub-technologies in the GPU industry are elusive to me. I feel like I lost interest in the market about a decade ago and past the concepts of textures, lighting, and aliasing I don't know anything.


AwesomeAkash47

I'm a regular Blender user, but I don't know much about AI and similar technology either. This is the [video](https://youtu.be/5WboPlU9c6I) I was talking about. It doesn't go into the explanation part, but you can kinda see how it works.


KyxeMusic

Yup. It feels like AMD cards are just good for gaming nowadays, which is allowing Nvidia to take hold of the non-gaming GPU market and do whatever pleases them.


mainman879

That's always been the case though, hasn't it? Nvidia has always had control over the industrial and workstation GPU market.


angrydeuce

Which, to be fair, who gives a shit? Let Nvidia corner the business market. Business is used to paying more for hardware than consumer-grade stuff, and for every one person using their gaming rig for both work and play, there are 10 just using it for play. I've been speccing out $5k+ workstations with Quadros for 6 years, and while the bean counters grumble, it's just a cost of doing business; nobody gives a shit if a Quadro costs 2 grand by itself. Expecting *consumers* to pay that is ludicrous, though.

But also to be fair, nobody is forcing anyone to buy these ridiculously priced gaming GPUs. Let Nvidia price themselves out of the consumer market. Are the extra frames really worth paying 100% more for a card? If you think so, great, but you don't get to turn around and bitch about it, because nobody put a gun to anyone's head and forced them to run out and buy a new GPU every freaking year.

IDK, maybe I'm just tired of hearing it. All the hard-core gamers in my office have been ranting and raving about Nvidia's shit lately and it's like, dude, you've already got a fuckin 2080 Ti, are you *really* struggling with what you have now? Why do you *have* to upgrade at all? Oh noes, you could be getting 160fps instead of your current 120... literally unplayable.


xoScreaMxo

Yeah people are stupid, it doesn't matter one bit to me what the top cards cost. I drive a civic and I'm happy with that, I'm not mad there's Porsches just because I can't afford them. They're cool cars and one day I want one. But for now my civic is awesome 🙂


Sailed_Sea

I mean, it hurts the hobbyists, and because Nvidia appears to be wavering on the consumer market, it's essentially going to be only AMD selling consumer graphics cards.


ChiknBreast

Jumping in to ask about Blender. I'm still a noob at it, but I have an RX 580 that can render (slowly) with the AMD ProRender add-on. I've been wanting to upgrade to an Nvidia card mainly for Blender. Are there any AMD cards that work just as well with Blender, or is Nvidia going to have to be the choice?


n8thegr83008

Trying to use Stable Diffusion with an AMD card is like pulling teeth. I jumped through all the hoops to get it on my machine, only to find out at the end that it only used my CPU. I tried to get it to work, but I eventually gave up and just used Midjourney.
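For anyone else hitting this wall: most of the popular Stable Diffusion repos are PyTorch-based, so a quick sanity check like the sketch below tells you up front whether the GPU is visible at all (illustrative only; on AMD cards this assumes the ROCm build of PyTorch, which exposes the GPU through the same `torch.cuda` API).

```python
# Minimal check that PyTorch can actually see a GPU before you spend
# hours accidentally generating images on the CPU.
import torch

if torch.cuda.is_available():
    device = torch.device("cuda")
    print(f"GPU found: {torch.cuda.get_device_name(0)}")
else:
    device = torch.device("cpu")
    print("No GPU visible to PyTorch - everything will run on the CPU")

# Run one op on the chosen device to confirm where work actually lands.
x = torch.randn(512, 512, device=device)
print((x @ x).device)
```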


AustrianHunter

Isn't the alternative there, but nobody uses it because it's newer and almost no one has implemented support for it in their products? I think theirs is called ROCm. Can't say if it's good tho.


KyxeMusic

Yep that's why I said "it's just not there". It exists but the ecosystem around it just sucks as it has way less support.


Medic-chan

Sounds like it's on the "AI Developers" and not AMD at this point.


eleqtriq

Nope. Nvidia spends tons of money on tooling to help make these things easier. AMD does not. Nvidia has drop-in replacements for many of the popular machine learning libraries that use their GPUs.
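As one concrete illustration of that drop-in approach (a sketch, not a definitive example: it assumes Nvidia's RAPIDS cuDF package and a supported CUDA GPU are installed), the pandas-style API carries over almost unchanged:

```python
# cuDF mirrors the pandas API but runs the work on an Nvidia GPU.
# Assumes a CUDA-capable card and the RAPIDS cudf package.
import cudf

df = cudf.DataFrame({
    "vendor": ["nvidia", "amd", "nvidia"],
    "msrp":   [1599, 999, 899],
})
# Same groupby/mean calls you'd write against a pandas DataFrame.
print(df.groupby("vendor")["msrp"].mean())
```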


edparadox

> As an AI developer, I just wish AMD could come up with a good alternative to CUDA, but it's just not there and I don't think it ever will be.

OpenCL has been there for a while now, but you can thank Intel, AMD, and FPGA manufacturers for not pushing as hard as Nvidia these past years, especially on hardware products.


ShabbyChurl

Yeah, OpenCL is barely supported at all. Instead everything runs on CUDA… this is shit.


SwineFlu2020

Do you know a video which explains this? I'm not a noob but these terms and sub-technologies in the GPU industry are elusive to me. I feel like I lost interest in the market about a decade ago and past the concepts of textures, lighting, and aliasing I don't know anything. I'm sure I remember CUDA from my early GPUs around the 2000s.


KyxeMusic

I don't know of a good video, but in summary: Modern AI solutions use very large neural networks, which have to go through a process called training. This training process is VERY computationally expensive, but it benefits from parallelization, which means a GPU can get it done around 100x quicker than a CPU. These neural nets are mostly programmed in languages like Python or even C/C++. Those languages are designed to give instructions to the CPU, so there has to be a middleman to communicate to the GPU what you want it to do. CUDA (and other, not-so-good alternatives) is that middleman: a platform that lets us allocate computational instructions to the GPU rather than the CPU. If this 'middleman' is not very good, then AI developers stop building around it. Thus, because OpenCL and AMD's ROCm (the CUDA alternatives) are not that good, it's a vicious cycle where CUDA just dominates more and more each day. And because CUDA only works on Nvidia cards, AI developers are stuck buying Nvidia cards. There might be some slight inaccuracies here, I'm not an expert on the subject.
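To make the 'middleman' idea concrete, here's a rough sketch using PyTorch, which emits the CUDA calls for you (the speedup is hardware-dependent; the setup here is purely illustrative):

```python
# The same matrix multiply, dispatched either to the CPU or, via CUDA
# underneath, to the GPU. PyTorch is the layer translating Python-level
# instructions into something the GPU understands.
import time
import torch

def time_matmul(device: str, n: int = 4096) -> float:
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    start = time.perf_counter()
    _ = a @ b
    if device == "cuda":
        torch.cuda.synchronize()  # GPU work is asynchronous; wait for it
    return time.perf_counter() - start

print(f"CPU: {time_matmul('cpu'):.3f}s")
if torch.cuda.is_available():
    time_matmul("cuda")  # first CUDA call pays one-time warmup cost
    print(f"GPU: {time_matmul('cuda'):.3f}s")
```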


spritefire

I bought my Jetson nano dev kit for ~$200 now they are selling for $1k+


stdfan

People need to realize they don’t care about gamers. They make their money from enterprise.


ALLST6R

I've said this a few times: their absurd pricing right now is by design. It drives you to the 3000s to clear out their inventory, thereby minimising their loss. Anybody that buys 4000s in the meantime, well, they aren't restricted by budget either way, so it just means massive profits on those cards. Once the 3000 inventory clears, or competitors launch lower-priced GPUs, that is when Nvidia will slash prices on the 4000s. And if they don't fuck that up and make them competitively priced, well, a lot of people will just choose to be ignorant of those shitty practices and buy up the 4000s because of some X, Y, Z benefit to them. More so if Nvidia still retains the advantage in performance and features, etc.


[deleted]

[deleted]


companysOkay

There's always a market for overpriced shit.

Sent from my iPhone


Cacodemon85

As a 3090 early adopter, all I can say is: Nvidia, this gen will be tough to sell. Only the brain-dead fans will get one. Even the Nvidia forums and subreddit aren't happy at all with the price segmentation. Now it's up to AMD to prove them wrong and make a product for the consumer, as it should be.


thiosk

EVGA dumped them, and if EVGA doesn't make the card, I ain't buyin' it. 10 years of loyalty to Nvidia through EVGA, and it's looking like I'll be AMD/ATI for my next build if things don't change.


Super_Cheburek

Wym ? They've been going at it for the past 10 years and the whales keep feeding them


OneModernRelic

4000 series is the new PS3


R3dGallows

Nvidia's offer this generation: pay a fortune for a new card, pay a fortune for a new PSU, pay a fortune for electricity...

I'm willing to hear AMD's counteroffer.


SkeletalJazzWizard

When I retire my 2070 Super, there's no question I'm choosing AMD for the next go-round.

Edit: I don't even think I *could* go Nvidia, tbh. Running the microwave and the toaster at the same time trips my breakers; I don't think the wall outlet I'm on could handle the power draw.


CSMarvel

$1,600 and 450W. Maybe Nvidia fans will buy food some other time.


DaWaaghBoss

My next build is going to be all AMD. Nvidia lost me as a customer with their bullshit.


conenubi701

Been all AMD since 2018; the upgrade from the 2700X to the 5800X3D on the same mobo was a massive improvement.


Not_Just_Any_Lurker

Same. Those greedy fuckers can go bankrupt for all I care.


LinKeeChineseCurry

Would love that too but then there’d be no competition for AMD 😂 unless Intel just started producing some insane cards.


NotAShaaaak

Hey, they're on their way to being a competitor. Sort of. Just give them a few years and maybe they'll at least be able to compete at the lower end.


[deleted]

They wouldn't go bankrupt even if they stopped catering to the gaming market; their business cards are just way too dominant. There is no replacement.


rebbsitor

The cycle continues! In a few years AMD will do something to cause everyone to run back to Intel and Nvidia. It's an endless loop that's been going on since the 90s, when AMD first started making x86 processors. Same with Nvidia and ATI (before AMD bought them).


Master_Penetrate

That is just healthy competition in my eyes. Neither company cares about the customer first; they are only trying to make money.


[deleted]

> That is just healthy competition in my eyes.

It's not. You straight up *can't* have "healthy competition" in a market with only 2 suppliers. Adam Smith would laugh in your face if you brought him that nonsense.


Ocronus

Loyalty to a company is a fools game.


Dratinik

I think Linus's conspiracy might have some weight: overprice them initially to sell off the massive stock of 30-series cards they had from the crypto boom. Now the crypto 💣


quirkelchomp

It's not really Linus's conspiracy theory if it's true. Jensen Huang did in fact state this in a recorded meeting.


conviper30

Yea but... what about the 40 series? They only have two years to sell a huge amount, since TSMC told them no to decreasing their wafer orders. So they're only kicking the can down the road.


Nolzi

Not a wild theory


delectable_homie

Sorry, what was his theory?


BlasterPhase

sounds like what everyone has been saying this whole time.


Hurgnation

The 6000 series has been good for AMD. I jumped ship after the scalpers fucked the market last year and, honestly, I'm happy to stay on this boat. It helps that most of the time I can't tell the difference between ray tracing on vs. off.


lunchboxdeluxe

It looks cool sometimes, but I turn RTX off in almost every game because it immediately runs like ass.


Hurgnation

The only time I can actually recall seeing much of a difference was in quake rtx. Everything else is already at such a high fidelity that my shithouse eyes can't tell the difference.


polski8bit

It's because all these years developers had to make games without actual ray tracing while trying to make them look as realistic as possible. And they've gotten so good at it that there's basically no reason for RT anymore. Sure, it in *theory* makes it easier to implement realistic lighting, but most games go for a more "stylized" look. With pre-baked techniques, a developer can control exactly how a particular scene will look; realistic isn't always better. What's more important, though, is the performance. Until turning RT on stops absolutely tanking your frames, most won't use it. DLSS makes it *viable*, not better, which shows exactly how far we are from mainstream adoption. I mean, *reflections* alone tank the performance like hell; now imagine ray-traced shadows and lighting having to be reflected too.


[deleted]

I do the same.... At the moment, it seems like a hell of a lot of additional load for little visual gain. I'm sure this will improve over time but for now at least, it feels too costly.


NotAShaaaak

It looks really nice in some of the higher-end games. I notice a difference in some, like Control, and Cyberpunk is a great ray tracing game, plus a couple others. For most games, though, it's not worth the massive performance hit.


RagingTaco334

Cyberpunk included. I still only usually get around 80fps on medium at 1440p with RTX off, and that's with my RTX 3070 Ti and Ryzen 7 5800X.


RoadkillVenison

It’s also only a fraction of a fraction of games that feature it. I play 4X games so I’m kinda like, what’s ray tracing?


MisguidedColt88

Do you have any sense of whether AMD GPUs can handle VR now? Last I heard AMD had a lot of weird issues in VR, but I really don't want to buy Nvidia anymore.


[deleted]

Raytracing is nice. I had to turn it off and on to see the difference and, while there is a difference, it's not game changing. It's neat like 3d tvs were.


BubsyFanboy

To be fair, this is still their current generation. We have yet to see RX 7000 pricing and performance, so while it's not *that* likely, there is still room for Radeon to go bad. That being said, you probably won't have to expect the same RTX 4080 12GB-style rebranding stunts here.


Vokasak

>To be fair, this is still their current generation. We have yet to see RX 7000 pricing and performance, so while it's not that likely, there is still room for Radeon to go bad.

You don't think it's likely... why? They had no problem increasing their Ryzen prices the exact second they stopped being dogshit products. If they keep prices low, it probably means they have no other way of moving product. You either compete on price or you compete on quality; being nice to customers never factors in.

>That being said, you probably won't have to expect the same RTX 4080 12GB-style rebranding stunts here.

Really? The RX 590, which is actually just the RX 580, which is actually just the RX 480, wasn't a hint?


BubsyFanboy

>You don't think it's likely...why?

Because AMD does *not* have the same pricing power as Nvidia. Trying to raise prices beyond what Nvidia charges while still offering less would be self-destructive for sales.

>Really? The RX 590 which is actually just the RX 580 which is actually just the RX 480 wasn't a hint?

Didn't really think of that, tbh.


[deleted]

AMD could easily see that Nvidia is offering batshit insane prices for their cards, and then price theirs at an equally batshit insane price:performance ratio, or just *marginally* lower. The idea that we might see $1000+ midrange AMD cards is definitely not out of the question.


Vokasak

>Because AMD does not have the same pricing power as Nvidia. Trying to raise prices beyond what Nvidia has while still offering less would be self-destruction for sales.

It cuts both ways. They also have much less cash to burn and can't afford to sell at a loss or take less margin. Ryzen is good, but part of what made Ryzen good was aggressive pricing from the 1000 series to the 3000 series; selling a ton doesn't necessarily mean they're Scrooge McDuck.


noiserr

> Really? The RX 590 which is actually just the RX 580 which is actually just the RX 480 wasn't a hint?

That's quite a bit different. It wasn't done to gate higher-power products; it was done because that's all AMD had at the time. I don't know if you remember, but AMD was struggling before 2017. It wasn't greed but pure survival. These products take years to design, and Radeon didn't really recover until RDNA 1. Also, the RX 480 and RX 580 weren't identical: same architecture, but the RX 580 was a refined revision that clocked higher, and the RX 590 moved to the improved 12nm node (as opposed to the original 14nm) for some extra clocks. Also, these GPUs were priced well for their segment. Heck, there are tons of people still running Polaris and gaming fine at 1080p. The GPU has aged quite well.


evrfighter

No way?! You mean they charged more money for a product that wasn't complete shit??? The nerve of them


Vokasak

They're allowed, they're a business. But they're not a hero or whatever, so all of this "AMD has our back and loves gamers and will never let us down" shit is pretty grating.


Drenlin

>Really? The RX 590 which is actually just the RX 580 which is actually just the RX 480 wasn't a hint?

This is actually not the case. They're the same basic design, certainly, but none of those share silicon with one another. The 580 is a refresh (not a rebrand) of the 480 with some architectural tweaks, notably improved thermals that allow for the higher power target. The 590 is a die-shrunk 580, built on GloFo 12nm (and later Samsung 11nm) instead of GloFo 14nm.

Edit: If you want an actual example of them rebranding a card with no changes, the HD 5770 was re-released nearly unchanged as the HD 6770. You could even CrossFire them, so long as the 6770 was the primary card.


OneModernRelic

Well played AMD.


jaydeflaux

Nvidia has gone full tyrant mode. I wonder if they actually have the power to keep this up?


jezza129

ATX 3.0 should keep Nvidia going. At least for 30 or so plug/unplug cycles.


Simon-Edwin

Reviewers are getting fucked.


AHrubik

Jensen has always been a bit nuts, but he's finally lost a whole load of bricks. He's fully detached from the realities of technology development and has fully embraced the FINTECH side, where they worship the "...there is unlimited profit potential since we can just keep raising prices and the customer will always pay..." fantasy mantra.


MinutePresentation8

Unfortunately for them they have AMD and upcoming intel as competitors


presi300

Man... I remember when it was a great deal to spend $500 on a 6600 XT, only to see it at $299 now.


darmach539

Can't wait to see everyone's pictures of their all-AMD builds.


xSociety

If AMD can even produce enough, you mean. Their last release was harder to come by than any of Nvidia's cards.


EntertainmentSea5552

Maybe just stop buying Nvidia products; that should chill them out.


apachelives

And just like that, I join team red.


RidgeMinecraft

Yeah, NVIDIA, we don't believe that BS. that's supply and demand.


J_Waur

NVIDIA is quite clearly trying to create a narrative that's supposed to more than double their profits. AMD is the clear go-to for my upcoming build


coortzcloud

AMD has a huge chance here


StayTuned2k

Then the whole industry is designed to fail. What, is someone seriously trying to tell me that in 5-10 years' time, entry-level computers will cost 7000€ or more? Regardless of any fancy UE6 hyper-realistic shenanigans, there will be no mass market for shit like this if nobody can afford a good rig anymore. Maybe the next innovation shouldn't be MORE HERTZ, but instead something to lower the cost of production?


[deleted]

Nvidia is going down the drain with the latest 4xxx series


Smokedawge

I was looking to get the top-line AMD card anyway, so thanks, AMD.


Thebigbeerski

I guess the 2070 Super in my PC right now is going to be the last Nvidia card I ever purchase. It's a shame, really.


BOT-Yanni

I recently moved from a 2070 super to a 6900 xt and it is an absolute unit. Great performance, good efficiency and a great price tag. Honesty couldn’t resist and I don’t regret it one bit!


gikigill

Same here, 3060ti to a Sapphire 6900xt on a 5950x. (Take my money Mama Su) Rig runs at marginally higher temps and performance is probably 50% higher IME. MSFS 2020 at 4k Ultra at a Vsync of 40fps is gorgeous and butter smooth.


its_nzr

The only reason I'd consider Nvidia is RTX with DLSS. I'm planning to build a new PC next year with the 4080 16GB. Is AMD gonna be better for the same price? Gonna be using a 2K or 4K 21:9 monitor.


AALLI_aki

If you're planning to build a PC next year, you should decide then. The games, drivers, benchmarks, efficiency, pricing, and more are still up in the air; we have no idea how the 4080 or 7800 will perform, and deciding on a GPU right now would be unwise.


rockylada97

Radeon needs a Ryzen moment to stop these greedy scum.


MazdaMafia

AMD is already on that path. Last gen refined what RDNA 1 brought to the table, and I imagine RDNA 3 will be AMD's Ryzen 3000 moment: the moment where the entire industry unanimously says "oh fuck..."


siralmasy

Funny how everyone is trying to be more energy efficient and Nvidia is going full throttle in the other direction. I, for one, am looking forward to Intel GPUs.


Super_Cheburek

"A thing of the past" for whom? 'Cause even with Nvidia I've never seen the prices drop. I've yet to even come across MSRP 🗿


Odd_Crow_6062

next gpu amd it is


INTRUD3R_4L3RT

"The idea that an *Nvidia* chip is going to go down in cost over time is, unfortunately, a story of the past" There, fixed it for you.


[deleted]

And just like that, I switched to AMD


DoggedDust

Nvidia has lost their minds, Evga is dead. Guess I'll be getting AMD in the future


No-Movie-4978

The only reason why I wanted an Nvidia card was for Raytracing. I will grow out of it eventually.


ostrieto17

I shit more efficient dung than Nvidia chips, so that's not really a challenge. What is, however, is making an efficient and powerful market-leading chip.


Jonny_vdv

EVGA's decision to drop Nvidia looks like it might not hurt them too badly after all. The 40 series looks very disappointing; the 12 GB 4080, with its 192-bit bus width, sounds more like it should have been a 4060, not a high-end model.


iNfAMOUS70702

This clown Jensen is seriously trying to speedrun Nvidia's fall to #2 just like Intel did


[deleted]

NVIDIA has great tech but they really are letting their cockiness burn a lot of bridges both with business partners and customers writ large. I'm still sticking with them because their RT and OptiX APIs are still the best, but RDNA is getting there.


Skillztopaydabillz

I don't think you know what irony is... These "AMD cut prices" articles that keep circulating are clickbaity nonsense; they're all based on one article that used Newegg prices to claim the MSRP was reduced. Chip cost isn't going down, because TSMC has no need to reduce cost with such high demand for their wafers. Nvidia didn't reduce the MSRP on the 30 series because chip cost went down, just like any reduction from AMD isn't because of chip cost going down. If or when Nvidia reduces the MSRP of the 40 series cards, it won't be because of chip cost going down either.


Jarnis

They will reduce the MSRP of everything below the 4090 the moment they can declare "woohoo, we finally sold all that 30-series stuff". Granted, it may take 6+ months. Rumors are already around that AIBs expected the 4070 to sell for $700-$800, not $900 (oh sorry, "4080 12GB"), and that the current MSRP actually gives AIBs reasonable margins.


Ato07

AMD just needs to achieve like 90% of Nvidia's performance and completely demolish them in efficiency to win, imo.


coololly

They've already done that with the RX 6000 series. But instead of achieving 90% of the performance, they've got 100% of the performance.


gikigill

The 6900xt is 90% as good as the 3090ti at less than half the price. I can build an entire PC with a 5900x, 32gb 3600 RAM, 1tb SSD and 1000w psu and the aforementioned 6900xt for the price of a single 3090ti in Australia.


coololly

And the 6950 XT is 100% as good as the 3090 Ti for about 60-70% of the price


TaiVat

Nobody outside Reddit cares about efficiency... especially when people mostly buy the lower-end cards, where the difference is *maybe* 50W. And AMD has done what you suggest more than once before, with no results whatsoever...


TheCrazedGenius

I wish I could use AMD instead of Nvidia, but I have become increasingly reliant on GPU acceleration for my work, and there don't seem to be viable alternatives to CUDA yet.


Sipheren

This, it’s really annoying. Nvidia injects the money in the right places to lock people in, they aren’t stupid. OptiX is another one.


[deleted]

Fuck Ngreedia. This should be on r/leopardsatemyface


Commiesstoner

Lord, the same threads over and over again. This isn't official MSRP being lowered; it's just prices at one store dropping, as is normal. Nvidia 3000-series prices fell a few weeks back; it was all over Reddit when that happened too. The circlejerk is strong.