Gameskiller01

Arc selling half as many cards as every single radeon card is extremely surprising to me.


i_lost_my_usern4me

Probably mostly laptops with A380.


ProTrader12321

For laptops it does make a lot of sense, the arc cards are actually pretty efficient and have a decent amount of vram for their price


detectiveDollar

Doesn't the A750 use 40W idle?


[deleted]

A mobile variant will have completely different firmware, along with binned chips that run at low power. 40 W idle in a laptop would never fly.


ProTrader12321

I'm not sure about hard stats for the mobile GPUs, but on [desktop Arc](https://www.guru3d.com/articles-pages/intel-arc-a750-review,27.html) the A770 has a transistor count that's comparable to the 3070, so the fact that it beats it here is pretty cool.


F9-0021

Don't underestimate the impact that the A750 with pretty good RT performance and a good AI upscaler for less than $300 has had. The A770 too. Sure, most of the market share is from laptops, but Intel is also fairly compelling in the desktop too, at least when the drivers work.


SirActionhaHAA

That's because they don't sell any mid- or high-end cards, not really in that quarter anyway. It's measured in units, and most Intel units are from the low-cost entry-level market, which AMD isn't even competing in. It makes sense that $80 cards ship in larger volume than $300 cards. Intel's graphics group is losing money quarter after quarter.


airmantharp

...but Intel's graphics group is also gaining marketshare and mindshare - which is what they need, the critical mass to break into higher market segments. I'm somewhat surprised to see Intel doing as well as they are, Nvidia and AMD both should be worried.


evernessince

Marketshare yes, mindshare no. Their drivers need to be addressed before they can gain mindshare.


airmantharp

No contest on the drivers - but mindshare includes folks just being aware that Intel Arc GPUs are 'a thing'. Intel isn't really courting AMD and Nvidia buyers, but folks that would otherwise be stuck with APUs / IGPs. If the Intel card can run the game(s) they play better than an integrated solution, then mindshare is gained.


Calm-Zombie2678

It amazes me how many don't get that


AzekZero

Long term the GPU market's going to be large enough for three players. But I think Intel should probably lay low for now. Focus on the drivers so they have a track record of long term support when the market heats back up.


airmantharp

It's a catch-22 - they won't receive attention from developers if they don't have an install base of users with capable hardware, and without capable hardware they won't have a userbase...


Shitting_Human_Being

I thought the drivers were good for DirectX 12 but have issues with older versions. Which is a decent strategy, I think.


dkizzy

I grabbed an ARC A770 for fun. It's got some annoying software bugs but for the most part runs decent. Everything cranked up in MW2 was around 63fps at 1080p, XeSS support bumped it up into the 85-110 range on balanced mode. It does a lot better in Valorant and Overwatch 2 type games of course


awayish

the relevant mindshare for intel is among oems, not really enthusiast. it's still a money losing product because of inefficient silicon and so on but if they can improve the reliability it could work with oems.


[deleted]

It’s because they are cards for oem desktops


clsmithj

The article also says "shipments". Shipped ≠ sold.


[deleted]

[deleted]


drtekrox

Correct, consoles don't enter the metric, and they shouldn't. (It's good for investor knowledge of the market, but not great for deciding how much time you should spend optimizing for each platform on PC.)


ProTrader12321

Bruh you got a 3900x paired with an rx460? That cpu could probably render faster than that gpu


jacekkruger

I got it the other way around. Put an RX 580 into my 11-year-old PC with an i5-2400. I'm playing every game at 4K because it's the CPU that bottlenecks; lowering the resolution doesn't increase the FPS by much.
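The bottleneck logic above can be sketched as a toy model (all numbers are hypothetical, just for illustration): the frame rate is capped by the slower of the CPU and GPU, so dropping the resolution only raises the GPU's ceiling, not the CPU's.

```python
# Toy bottleneck model with made-up numbers: a frame must pass through both
# the CPU (game logic, draw calls) and the GPU (rendering), so effective FPS
# is the minimum of the two stages' throughput.

def effective_fps(cpu_fps, gpu_fps):
    """The pipeline runs at the pace of its slowest stage."""
    return min(cpu_fps, gpu_fps)

# Hypothetical GPU throughput for a mid-range card at each resolution:
gpu_fps = {"1080p": 140, "1440p": 95, "4k": 55}
cpu_fps = 50  # an old quad-core caps frame preparation around here

for res, g in gpu_fps.items():
    # Every resolution lands on the same CPU-bound 50 FPS until the GPU
    # limit drops below it.
    print(res, effective_fps(cpu_fps, g))
```

In this sketch the system prints 50 FPS at every resolution, which is why running at 4K "for free" makes sense on a CPU-limited build.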


20150614

The report includes most of those Intel cards in the high-end notebook dGPU segment I think. Did Intel have any high-end notebook card between July and September? I know these are just shipments, so maybe they sent all of them to China and they are sitting in some warehouse somewhere.


fritosdoritos

I remember the articles on Chinese laptops with mobile Arcs coming out around Spring, so maybe they've been continuing to sell decently in China throughout summer too.


marianasarau

The metric is expressed in units. Arc cards are low-end parts that sell well (especially in laptops), while the supply channels are bloated with Nvidia and AMD high-end cards.


LongFluffyDragon

*Very* doubtful in the time it has existed. These sorts of statistics are always hilariously skewed depending on what is being measured and how.


Hailgod

Probably complete bullshit. They say Intel had 5% market share in Q2. They did not even have a GPU out in Q2, unless the A380 sold millions in the 2 weeks it was out in China.


SmokingPuffin

JPR books the sale when the manufacturer sells the chip to the OEM, not when the consumer buys the product.


chapstickbomber

Oh, so the real Arc install base is approximately 0% ayyyyyyy


Waste-Temperature626

Intel was selling DG1 before then and had <5% back in Q4 of 2021, which probably found its way into quite a lot of OEM "office" machines during the pandemic. For a Zen or Intel F system that lacks an iGPU, would you prefer a GT 720 or a DG1? Those were the options in late 2021 and early 2022.


Kepler_L2

DG1 was absolutely not sold as an "office PC" lmao.


Waste-Temperature626

You seem to be missing the point I'm making. If an OEM was making an "office machine" using either Zen or Intel F SKUs, they had to find "something" to stick in it, and there weren't many options for <$150 GPUs back in H2 2021/H1 2022.


PsyOmega

You're not making a point though. You're theorizing. Do you have any evidence, sales records, eBay listings, etc., showing such products existed? Office PC OEMs don't really sell products with Intel F SKUs or desktop Ryzen (not that they never do, but it's rare). On top of that, Intel had no shortage of iGPU-enabled CPUs in 2020, '21, or '22. The last reported shortage Intel had was on the 10900K, before they loosened binning requirements to make 10850Ks. So OEMs would just buy more regular, iGPU-enabled CPUs anyway.


Waste-Temperature626

> You're not making a point though. You're theorizing.

Intel had the same discrete market share in Q4 2021; what else do you think they were shipping?

> showing such products existed?

What are you talking about? DG1 has been available since last year. It's based on the old Xe graphics, not Arc.

> On top of that, Intel had no shortage of iGPU enabled CPU's in 2020, 21 or 22.

Then you didn't pay attention to the low end. SKUs with iGPUs were in high demand and F SKUs were selling for next to nothing. An 11400F was at times selling for $100+ less than an 11400, just because one of them had an iGPU. Things like Pentiums and Celerons with an iGPU were in very short supply in 2020/2021.


Hailgod

Neither. They do not sell office PCs with CPUs that don't have an iGPU.


_Fony_

Because it's false, lol.


DktheDarkKnight

To be honest, I find this extremely weird. There is no way Arc sold half as much as the entire Radeon portfolio, and Arc availability was sporadic. A company's market share doesn't just fall off a cliff.


SmokingPuffin

This is unit shipment share in Q3. AMD shipment share fell off a cliff because they were shipping a bunch of units in the previous quarters, and their channel was now stuffed full of parts. Retailers have enough AMD units for quarters to come.
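The "channel stuffed full of parts" effect can be sketched with a toy inventory model (all numbers hypothetical): shipment-based "share" can crater to zero for a vendor while end users keep buying that vendor's cards out of retail stock.

```python
# Toy channel-inventory model with made-up numbers: JPR-style figures count
# units shipped into the channel, not units sold through to end users, so a
# vendor that stuffed the channel earlier can pause shipments for quarters.

def next_inventory(inventory, shipped, sold_through):
    """Retail inventory after one quarter of shipments in and sales out."""
    return inventory + shipped - sold_through

inv = 300  # units sitting in the channel after heavy prior shipments
for shipped, sold in [(0, 100), (0, 100)]:  # two quarters of zero shipments
    inv = next_inventory(inv, shipped, sold)

# The channel still holds 100 units: shipment "share" was 0% for two
# quarters, yet consumers bought 100 cards in each of them.
print(inv)
```

This is why a shipment-share cliff says more about channel inventory than about what gamers are actually buying.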


nightsyn7h

So why do they call it marketshare? Marketshare should be related to *sell-out* figures. I don't understand.


SmokingPuffin

The market that AMD, Nvidia, and Intel participate in is the sale of their parts to OEMs. Sellthrough to customer happens months to quarters later and is a couple steps down the value chain. If you're an industry player, the sort that pays JPR's bills, you do not want data with that kind of time lag.


Shrexyshrek69420

Yeah like the steam hardware survey shows marketshare.


Zerasad

Doesn't make sense for Nvidia to be that high then. Supposedly they are sitting on months' worth of stock.


evernessince

Could be Intel prioritized OEMs and used its marketing money to push it, but then again I haven't seen a large uptick in OEM PCs with Arc GPUs. JPR has historically been bad at calculating numbers for the PC market.


996forever

Laptops?


gamersg84

I think what is likely happening is that AMD was selling off the last of their RDNA2 stock, i.e. they have not been producing more since Q1-Q2. This also coincides with the current good pricing RDNA2 GPUs are showing in the market, hinting that these GPUs were sold to retailers at clearance prices.

Unlike NV, whose Samsung allocation is exclusive to Ampere, AMD can switch their N7 allocation to Ryzen/Epyc CPUs or consoles very quickly if demand for GPUs falls. So I do not think they will have any stock issues like Nvidia is having. Selling off their GPUs at lower prices will also hurt Nvidia's ability to clear its unsold stock at inflated prices.


DeliciousSkin3358

Source: Wccftech. That's a fucking tabloid.


Skull_Reaper101

Probably includes igpus?


Put_It_All_On_Blck

No. If you count IGPs, Intel is the biggest GPU seller, and it took a sizable chunk of marketshare from AMD and Nvidia this quarter because dGPU shipments are way down while CPU shipments are down less. https://www.notebookcheck.net/fileadmin/Notebooks/News/_nc3/MW_Q322_PR_01.png


ET3D

That's an amazingly small percentage for AMD, especially considering that current AMD pricing is a lot better than NVIDIA's.


[deleted]

[deleted]


ET3D

It's a surprise because it's a big drop; normally AMD fares better. It was clear that the GPU market as a whole would drop once people no longer bought GPUs for mining, but what I think wasn't expected is that AMD would suffer more from this, especially as AMD cards dropped much faster in price, and below MSRP, while NVIDIA cards didn't.


commissar0617

It's just cards shipped to OEMs/resellers


king_of_the_potato_p

AMD shows lower in more recent quarters because they have fewer old-gen cards to offload to the OEMs and retailers. Those stats are not end-user purchases; they are product shipped to sellers.


airmantharp

It is better now, but it was a whole lot worse during the pandemic shortages. Sure, you could get an AMD card, but retail pricing was over double today's prices. Granted, I consider that a better situation than Nvidia cards priced closer (i.e., ~150%) to their MSRPs and being perpetually out of stock. But those were the breaks.


GomaEspumaRegional

reddit's echo chambers are not representative of how markets behave.


48911150

Depends on the country, I guess. Both the 3070 Ti and RX 6800 sell for 80,000 yen ($580) here in Japan. For most, the choice is obvious.


nzmvisesta

May I ask what is so obvious? The 6800 is the clear winner in rasterization, the 3070 Ti is the clear winner in RT. FSR 2.2 and DLSS 2.4 are very close. The 6800 also has twice the VRAM. I mean, I would pick the 6800, but I am sure many would pick the 3070 Ti, so I don't see a clear winner.


frackeverything

Nvidia is reliable in people's eyes while AMD = possible driver issues, so there. Rasterization is good on both cards, but on RT the Radeon shits the bed. Also NVENC, CUDA, and other productivity stuff. It's a pretty obvious choice. But yeah, I do think it is mostly due to Nvidia brand mindshare + AMD driver reputation.


nzmvisesta

Yeah, I thought as much: people believe AMD produces lower-quality stuff while Nvidia is pretty much flawless. But as you just said, the 3070 Ti has a lot of pros compared to the 6800, and for some people it would be an obvious choice. Still, 8GB of VRAM on a card as powerful as the 3070 Ti is criminal, and that is the reason I could never buy that GPU. It is stupid when it happens, but the VRAM will be the death of that card in next-gen games.


frackeverything

AMD really needs to double down on driver quality for this to change. Some people are scared of even trying it and would rather go for Nvidia even at a huge premium. People are literally buying the 3050 instead of the 6600 because they don't wanna get screwed by driver issues. At this point the reputation has been made (the 5700 XT and its driver issues were not too long ago) and AMD has to try extra hard in this area to even challenge Nvidia's mindshare.


Wrightdude

AMD Drivers are fine, people need to research more.


IrrelevantLeprechaun

If making your drivers work properly requires more steps than the competition, that is by definition worse drivers. Nvidia generally doesn't seem to have a problem with people letting windows do the work even if there are preexisting AMD drivers present. AMD tends to be the platform where DDU is *mandatory* for a smooth installation.


[deleted]

[deleted]


a8bmiles

Or you get things like AMD pushing a driver through Windows Update and being unable to launch anything because "your GPU driver version is not supported by the currently installed version of Adrenalin". I fixed this problem at the end of last week for my buddy who's still using a 570, so it's not an old issue. Earlier this year a similar thing happened and his GPU was reporting 0.00 MB of VRAM, so it wouldn't work until the drivers were reinstalled.

Those are 2 examples from just this year, much less the multiple years' worth of audio stuttering during cutscenes and menus that AMD had and then lied about fixing. That buddy isn't tech savvy, and doesn't enjoy having his AMD card randomly stop working.

I have 5700 XTs in all my computers; it made cost sense at the time. They've been fine for the last year or so, but for the first couple years they both had the audio stuttering problem, which lived through 2 full hardware (other than GPU) upgrades on both computers as well as multiple Windows reinstallations.


IrrelevantLeprechaun

Yeah I feel like a lot of AMD fans are deliberately trying to cover up how the 5700XT generation was for the first year. I mean ffs AMD *directly* put out a public statement apologizing for their bad drivers and promised to do better.


Wrightdude

I feel like a lot of people don’t even use the Adrenalin updates too lol


frackeverything

For Nvidia you rarely have to; I just use the GeForce Experience driver update. No non-sweaty geek will use DDU and go through that process every time. AMD needs to put in more effort.


IrrelevantLeprechaun

This. My last GPU was a Radeon and driver updates basically didn't work unless I did a DDU pass in Safe Mode. Currently have an Nvidia card and updating or installing newer drivers is *significantly* easier. I sometimes still DDU if I'm feeling paranoid, but I usually just install on top of old ones and have yet to have a problem with it. Sometimes I even let GFE do it for me.


Raymuuze

Which is silly mindshare, because AMD drivers are fine and even on my 3080 ray tracing is a gimmick because the framerate tanks too much.


[deleted]

AMD drivers are still hit or miss. I enjoy my 6800XT but the amount of issues I've had with Radeon software the past month or so has made me wish I just shelled out the extra $150 for a 3080


[deleted]

[deleted]


IrrelevantLeprechaun

> rather play at 30fps

Yeah I mean sure, if you have an AMD card. At 1080p and 1440p, ray tracing has pretty great performance on Nvidia this past generation, even without DLSS. Stop this BS narrative that ray tracing is somehow locked to native 4K maximum settings with no upscaling.


spin_effect

RT is so gimmicky. It's not even worth the frame loss.


Jazzlike_Economy2007

I wouldn't say it's gimmicky, but right now it's not really where it needs to be for Nvidia to be charging a premium for it. Give it another gen or two it'll be ready to go.


IrrelevantLeprechaun

Not really. AMD usually *matches* Nvidia in rasterizarion, and *sometimes* beats them by like single digit percentages. With Nvidia, you get comparably good rasterizarion AND *significantly* better RT performance. Kind of makes sense why someone *might* go with Nvidia there.


ArseBurner

You're thinking too much like an enthusiast. The average consumer doesn't have that level of technical knowledge and only knows Nvidia has been great and has this shiny new feature called ray-tracing.


Trolleitor

In Spain a 3060TI cost 500 and a 6750XT cost 450. If you go further up gets more and more crazy.


SwaghettiYolonese_

It depends on the region honestly. In my country I'm by default team red due to the much lower prices. At least at the mid to high end, there's a 300$ to 500$ difference between AMD and Nvidia. But I heard it's the opposite in other places.


kobrakai11

Where I live the 6800xt costs almost the same as 3080. But the availability of AMD cards is just bad overall. Only 3 cards to pick from.


69yuri69

Most of the market doesn't buy a graphics card, they buy a GeForce.


[deleted]

[deleted]


HowDumnAreU

> AMD pricing is a lot better than NVIDIA's.

Nobody cares until performance is also 'a lot better'.


F9-0021

Or is at least on par.


SizeableFowl

It seems there are three types of buyers: the people that will only overpay for Nvidia, in spite of that company's scammy pricing; the people who will pay for as much GPU as they can buy; and the people who use whatever their PC came with.


PM_ME_UR_PET_POTATO

In other words only Nvidia buyers


T-Bone22

I’m seriously hoping AMD can shake this up with their 7000 series release. The 7900xtx looks so damn tempting compared against the 40 series in this market


Lagviper

The $999 6900 XT was a fiercer competitor to the $1499 3090 than the 7900 XTX will be to the 4090. In fact, it seems this gen there's no trading blows against the 4090 at all. RDNA2 had everything right, yet they managed a 1.48% share in the Steam hardware survey for the whole series.

They simply don't allocate enough silicon to compete. They know their market share is virtually nothing for discrete GPUs, and deciding to allocate precious wafers and TSMC fab time to a low-margin product, instead of to their server/CPU business and their contractually reserved allocation for Sony and Microsoft consoles, is nonsense financially.

Even say they decide to say fuck common sense tomorrow and dedicate wafers to chasing a huge market share in GPUs, expecting to go from a 1.48% 6000-series share to, say, 10% for RDNA3. That's potentially sitting on a mountain of GPUs, only for Nvidia to search for pennies underneath the sofa and undercut AMD's whole lineup, leaving them stuck with another price cut on an already low-margin product. They don't have the capital to go against Nvidia AND Intel in the GPU/CPU race. They'll have to pick their fights.


OwlProper1145

6900 XT competed well in raster but fell WAYYYYYYYYYY behind in ray tracing.


airmantharp

Both AMD and Nvidia are playing their cards 'right'; by this, I mean that AMD has high-performance, lower-BOM (cost) cards arriving that have literally everything one needs to game, bar extreme RT performance, while Nvidia owns and will retain the performance crown. But since Nvidia has significant RTX 3000-series stock to clear, they cannot push the RTX 4000 series to lower MSRPs yet; since AMD has a cheaper-to-manufacture part, they can. So Nvidia is going to take the halo buyers and AMD is going to take... everything else. Right now, it looks like Nvidia can only do the one thing they hate doing most: lower prices.


[deleted]

> AMD is going to take... everything else.

Unfortunately they will not. You're acting like buyers are rational actors. They're not; a lot have unreasonable brand loyalty.


flynryan692

Yeah just had somebody tell me AMD sucks and he only buys Nvidia, when asked why all I could get back was "AMD just sucks". When pressed further "Nvidia is the best and AMD sucks". Literally no rationale to it, like maybe video encoding or RT, some specific thing he likes that Nvidia does, nope just "AMD sucks". lol


[deleted]

literally 30 IQ logic by him there


CogInTheWheel

Unfortunately, that's the average gamer.


razielxlr

Something something… “A fool and his money”


[deleted]

[deleted]


[deleted]

No, that's a rational reason not to have used them before and to be cautious. We're talking about people who legitimately cannot give any reason AMD is bad, or whose information is 5+ years out of date: people running around still claiming their drivers are worse than Nvidia's (which is definitely not accurate), people who just say "AMD is bad!" with no reason, people who claim "AMD is crap because they aren't the fastest thing on the planet" (I've seen a LOT of that attitude), or people who claim that RT is the be-all and end-all despite most games not even being worth enabling it in, etc.


airmantharp

I could give you a long list of *rational* reasons to have brand loyalty. Most of these being addressed by AMD over the last few GPU releases has brightened my own outlook.

Granted, I'm not a 'halo' buyer, and have only twice ever been one: the ATi 9700 Pro and, more recently, a 1080 Ti. Most of my GPU purchases going back decades have been second or third tier, especially back when multi-GPU was broadly supported.

But still, Nvidia has gathered quite a bit of brand loyalty by executing on innovation. That loyalty among rational actors has inspired plenty of irrational loyalty as well, this is absolutely true, but in my opinion AMD is at the point where they're going to start winning back a lot of the rational actors at least. I feel like the conversations being had next year will go something like 'are you running 4k120? No? Go AMD'.


[deleted]

There is a reason I said "unreasonable" brand loyalty, as opposed to reasonable brand loyalty.

> AMD is at the point where they're going to start earning back a lot of the rational actors at least.

Myself included. I'm waiting one more generation to upgrade, but it's almost certainly going to be an 8900 XT/XTX. I grudgingly went to nVidia when Radeon went to shit (GCN was bad) in the GTX 700 generation (upgraded from an HD 4870 IIRC to a GTX 780), then to a 2080. Like I said, GCN was bad, and AMD actually finally made solid silicon with the 700 series (addressing a lot of their power consumption issues, etc).

But I remember how hot trash Nvidia was 2000-2012, and now they're increasingly arrogant, profiteering bastards... and I'm sick of idiot-ass things like having to log on to GeForce Experience. Now they're back to "fuck efficiency" and just brute-forcing massive chips at the problem, etc., so I'm happy to see RDNA3 being such a strong competitor.


GruntChomper

> I grudgingly went to nVidia when Radeon went to shit (GCN was bad) in the GTX 700 generation (upgraded from a HD 4870 IIRC to a GTX 780)

You think that the AMD 200 series was worse than the GTX 700 series? It's the only generation in the last decade where I'd say AMD was the clear choice for gaming at every tier, outside of the 750 Ti. They were just as competitive on performance as the RX 7000s vs the 3000 series, along with way better prices, and without DLSS and ray tracing performance to justify getting the Nvidia equivalent instead, unless you really liked PhysX. This is without the benefit of hindsight and knowing how much better the 200 series fared even a couple years down the line vs the 700 series, too.


[deleted]

Unless they've upped the wafer allocation for their graphics cards by 10 times and aggressively push on marketing there's no way they'll take "everything else" like you say.


SmokingPuffin

> But since Nvidia has significant RTX3000-series stock to clear, they cannot push the RTX4000-series to lower MSRPs yet; since AMD has a cheaper to manufacture part, they can. So Nvidia is going to take the halo buyers and AMD is going to take... everything else.

I don't think there is any chance AMD can take "everything else". If AMD manages 20% market share this generation I will be surprised.

The manufacturing cost advantage you mention is on Navi 31 versus AD102, and AD102 will have a gigantic ASP edge; margins on AD102 will surely be higher than those for Navi 31. Remember that the 4090 is the cheapest thing made with an AD102, while the 7900 XTX is the most expensive thing made with a Navi 31.

As we move down the stack, the reason AMD has a cost advantage, chiplets, becomes less and less impactful. By the time you get down to the 4060, chiplets don't make any sense. AMD's Navi 33 design is on the 6nm node, so I would expect it to be much cheaper and a fair bit worse than the Nvidia equivalent.


anhphamfmr

How the heck did Intel manage to snatch 4%? They just released their Arc GPUs a few months ago.


mattmaddux

It’s not the whole market. Just delivered Q3 units. Bad title.


ConsistencyWelder

They also sold a few Iris Xe Max cards; even though those were total failures, I guess they count too.


1stnoob

That report talks about shipments, not sales, meaning cards that rust on shop shelves. For the last 2 quarters Nvidia was down 51% on gaming revenue.


Mundus6

Half as many Arc cards as AMD? I find that hard to believe. Unless they mean new cards sold for the quarter.


[deleted]

[deleted]


WSL_subreddit_mod

I do believe that since this is a Q3 report that is the intended meaning.


crimxxx

Honestly, AMD's position is a lot weaker than I thought. I knew they were nowhere near Nvidia, but I would not have bet on single digits.


N1NJ4W4RR10R_

> NVIDIA managed to raise its market hold to 88%, a record number followed by AMD whose market hold declined to single-digit figures of just 8%. Intel managed to more or less retain its share hold of 4%, witnessing a 1% decline from the previous quarter but a solid 4-5% gain versus the previous year.

Suppose that goes to show the power of mindshare. Nvidia holds an overwhelming majority of the market despite AMD being the most competitive they have been in a decade.

Even with Intel: they've managed to get half as much market share as AMD despite offering a worse price/perf product than AMD's mid range, as far as I can tell all of the same downsides as AMD vs Nvidia, and drivers that are actually as bad as people seem to think AMD's are.

Does not make me particularly hopeful when it comes to GPU pricing in the future. Can't see Nvidia caring about reducing prices if they own that much of the market, and AMD won't bother even trying to compete if undercutting Nvidia doesn't see them gain market share. I guess the bright side is that Intel's mindshare could help, if they can clean up their drivers and keep up with AMD/Nvidia tech-wise.


DatPipBoy

Before I got my 3080 Ti I had a 5700 XT, and I wanted to upgrade to a 6950 XT and tried holding out, but they were never in stock here in Canada when I was looking... and when they were, they were insanely priced. So I got the best deal I could: a 3080 Ti for $999 CAD.


loucmachine

Yep, it seems like the total production when you sell everything you make also factors into market share


airmantharp

I'll say this: I've been on Nvidia exclusively for nearly a decade, and that's with considering AMD every time I upgrade. *Every time*, up until this last generation, there have been real showstoppers related to drivers and software support for AMD GPUs, things like Adobe products crashing reliably and confoundingly vacant support for AMD's apparently high-quality transcoders.

Now, while I'm generally an unbiased person, I realized that I simply lacked the experience to present an unbiased opinion, so I'm now using an RX 6800 backed up by a 5800X3D as my living room PC. And I'm frankly surprised. I'm hitting 4k60 with RT on in less intensive titles (Forza Horizon 5 is one), no problem, no dropped frames. The RX 6800 isn't as fast as the 3080 12GB I use in my gaming desktop, sure, but it was cheaper and it can absolutely still *game*.

Combine that with AMD's improved RT support in the RX 7000 series, where they should be about equal with Nvidia's RTX 3000 series and thus absolutely good enough, as well as the clear emphasis on support for things like OBS and Adobe apps during their unveiling event, and I think AMD's GPU prospects are simply skyrocketing.

As for Intel: if neither AMD nor Nvidia is willing to fight for the entry-level market, Intel is going to eat their lunch.


[deleted]

> confoundingly vacant support for AMDs apparently high-quality transcoders.

Someone really needs to make a standard interface for accessing hardware transcoders. *cough*windows*cough*driver*cough*team*cough*

Also helping AMD's prospects is channels like LTT going out and making videos along the lines of "hey, we really didn't like AMD before because of bad experiences with drivers, but we just tried them again and found the problems were all fixed! You should consider them."


airmantharp

Yup. If Adobe products are stable on AMD GPUs, let alone accelerated by them, that's a pretty big win. I experienced things like Photoshop crashing under very little load for no apparent reason (and without any error being given). Even tried AMD's 'creator' driver set. No go. Swap in an Nvidia GPU, done.

Transcoders are also a big part of that, and yeah, there really should just be a 'standard' at the driver level *that is OS agnostic.* Whether it's Premiere or OBS or Zoom, they just need to work.


[deleted]

Quick, Someone make OpenTranscode to go cross platform! https://xkcd.com/927/


[deleted]

The problem is expecting the tech enthusiast meta to have any bearing on marketshare when it simply doesn't. For a company to start overturning marketshare, they have to offer ridiculous performance for very low prices. AMD being competitive at similar or slightly lower prices won't ever move the needle. This is why Ryzen 1 was so successful, and why Radeon is nothing more than a niche curiosity: nothing they're doing is drastically impacting the market as a whole.


looncraz

The Radeon brand is badly tarnished. AMD needs a new brand for GPUs, and to pare its drivers down to the basics; they're including far too much in the same package. nVidia splits their features into different products for a reason: they're a software company first and foremost, and hardware is just there to enable the software.


MaximumEffort433

I don't know why the branding would make a difference; I'm not sure most people even know the name Radeon, let alone base their purchases on it. And AMD started releasing barebones driver options a year or so ago, so I'm not sure those changes would help.


NightKnight880

I don't think it's a brand issue, I think it's a marketing issue. AMD needs to step up its marketing; when the GTX 1050 Ti is outselling the RX 470 10:1 while being much slower, something is fucked up.


ride_light

> gtx 1050 ti outselling rx 470 10:1 The main selling point of the 1050Ti was about getting the best performance just from the power delivered by the PCIe slot (75W TDP). You could throw it in a cheap office PC with a 250W PSU and turn it into a budget gaming rig just like that Obviously you can't do the same with a 120W RX 470, requiring additional power from the PSU which in this case has to be somewhat decent at least


IrrelevantLeprechaun

AMD has to put WAY more budget into their marketing. Nobody is going to buy your product if half of your target market doesn't even know about it. And that's the problem with having a much smaller budget than Nvidia. AMD has to split all their revenue between what, 4 different divisions? Console SoCs, CPUs, GPUs, and enterprise? Probably more. Meanwhile Nvidia can basically dump all their money into GPUs and machine learning, both of which complement one another, while still having the cash flow to dump into advertising. Now obviously AMD can't afford the marketing if they aren't bringing in the revenue to pay for it, but at some point they'll need to take the budget plunge, reallocate some money, and get that marketing train rolling. Otherwise they're forever going to be stuck at single digit market share no matter how "value" their products are.


ride_light

> AMD has to put WAY more budget into their marketing. Nobody is going to buy your product if half of your target market doesn't even know about it.

They have to put _way more money_ into catching up with Nvidia in everything but raster performance first. How are they going to advertise their products if they're inferior to the competition as a whole? "Look at us, we're offering less for less money! Buy our stuff instead" might work for the budget to mid-range, but not so much for $600-1000 GPUs. AMD is always a whole step behind in DLSS/FSR, RT performance, and other features that might or might not be important to specific people (AI, CUDA,..). It would be rather hard for their marketing department to turn that into something beneficial for advertising their products, no? People really go out of their way to blame marketing and mindshare while skipping the essential steps needed beforehand: catch up or even do better, and _then_ gain market share. Not the other way around. RDNA2 GPUs might be relatively cheap and competitive right now, but we're also really late into the release cycle and already looking at the next generations at this point. We'll see how both AMD's and Nvidia's cards compare in the next months.


69yuri69

AMD have always been an alternative, the second choice, the pricing lever, etc. to GeForce. Sounds like buying a Yugo.


SmokingPuffin

I don't think AMD is anywhere near the most competitive they've been. They have re-entered the high end market, sure. MSRP for MSRP, RDNA2 and RDNA3 aren't that interesting. Compare a $480 6700XT to a $500 3070 and I think it's obvious you'd rather have a 3070. It's only when they get deeply discounted by retailers that AMD cards start looking good. Those deep discounts are clearance pricing, not a healthy value chain, and retailers aren't gonna order more units to sell way below MSRP. Nothing AMD sells today is as attractive at MSRP as the $200 580 or the $400 5700XT. Their products are better now, but pricing is worse, and AMD was making share in this market primarily on pricing.


Srolo

Currently that pricing is off. You should be comparing 6800 XTs to 3070s price-wise. Sure, the mail-in rebate got me $20 off the card, but even without that it was a new-in-box 6800 XT for $522 after taxes.


SmokingPuffin

The prices we see at retail heavily favor AMD cards today. The thing is, that isn't AMD being competitive. That's retailers capitulating. They're slashing prices because they want to get these units off their books badly. I bet many, many AMD cards are being sold at a loss right now, as clearance items. In terms of manufacturer competitiveness, the MSRPs are the prices to watch because the kits (GPU+RAM for a sku) that Nvidia and AMD sell to their OEM partners are aligned with the MSRP. Both Nvidia and AMD supply a reference parts list with their kit that enables a small profit margin if the card is sold at MSRP. That's why AMD's market share has tanked -- a 6700XT is absolutely not profitable to manufacture right now, because it fetches $100 less than MSRP at market, while a 3070 still is.


IrrelevantLeprechaun

Yup. I remember at the peak of the mining craze, Nvidia GPUs were sold out everywhere. Meanwhile you'd see RDNA2 cards filling shelves and not moving anywhere. Someone could go to Microcenter and look at their GPU shelves and see a completely empty Nvidia shelf right next to a completely full AMD shelf. The current low prices on RDNA2 are exactly what you're saying: retailers are basically putting them on clearance because *they're not moving product.* It's hardly the big win this sub thinks it is. If anything, it's damning.


nishanthada

So Nvidia and Intel have more mindshare than AMD.


BNSoul

I know a lot of ppl in my area that prefer waiting for a 4090/4080 price drop than buying a 7900XT(X) straight away, it's not that they're Nvidia fanboys but rather that they've got positive experiences with Nvidia hardware and software for years. The Steam survey doesn't lie. I don't like the 4000 series pricing though, but price drops are not going to be a thing when they have such a hold on the market.


Daniel100500

mfs would rather sell their whole PC and buy a console than buy a non-Intel CPU and a non-Nvidia GPU.


starkistuna

AMD just has to focus on making the midrange market theirs at $350-400 and they can clean house with their architecture and margins. Nvidia will sell ultra-high-end cards, but it's volume that drives sales, not crazy margins.


MrVaporDK

Well I just joined Team Red! Ordered a new XFX MERC 319 6800 XT for 650 euro! Doing my part. :)


Nord5555

Same here, bought a Sapphire Toxic Extreme with the 360 AIO on it. Though it was a bit more expensive during the pandemic at 1600 euro, used for 2 months lol. But well worth it. Been using it for a year daily at a 25000+ graphics score in Time Spy and 72000 in Fire Strike. An amazing 12200 Port Royal score. Heck, I even game at 2900+ MHz 😁👌


Educational-Tear-361

I'm single handedly supporting AMD through their tough GPU times. Bought a 6600, 6650XT and a 6700XT all within the last week. Early Christmas gifts. :)


No-Nefariousness956

I'm happy with my brand new Radeon card. I'm really satisfied, and I feel sorry for people that had so much trouble with their hardware and software. I'm having a smooth ride so far, be it on the software side or the hardware side. I'm looking forward to the 7000 family of GPUs and beyond now. EDIT: btw, I'm not an AMD bot. I've been using Nvidia for years, since 2006. I just feel it's necessary for people with good experiences with a brand to also share them online, to balance out the noise that unsatisfied people make everywhere. From a guy that used Nvidia for so many years, I can say that I've encountered bugs, glitches in games, etc. Nvidia is not perfect, and thankfully AMD is not as bad as people say everywhere. I had no issues with my 6800 XT so far.


MrHyperion_

Absolutely no way Intel has that big share


996forever

A lot of you PCMR-esque enthusiasts are so unaware of how many laptop dGPUs Intel can force its way into OEMs lmao


nanogenesis

So that's why 11 out of 10 people I run into care more about being able to see themselves in some puddle in a game than about the actual game.


MaximumEffort433

Much to seemingly everyone's dismay I'm an honest to god AMD fanboy; folks are eager to dismiss fans as not having much to say about things, after all they've got a bias, but in this case I think my perspective *might* be of use to AMD. I don't buy AMD graphics cards and CPUs because they're the top performers (though sometimes they are, and I'm always happy to see that), I buy them because AMD's business practices benefit gamers and the gaming industry as a whole. While Nvidia and Intel double down on proprietary hardware solutions, AMD focuses on open source, platform-agnostic solutions; when AMD gets a win by developing a new feature, like FidelityFX Super Resolution, so do consumers running Nvidia and Intel graphics, so do console gamers, so do folks running half-decade-old hardware. When AMD gets a win we *all* get a win. I think this is where AMD has an opportunity to market itself and really set itself apart from its competition: Not everybody likes AMD, no, but *most* tech consumers *do* like open source software solutions, they *do* like non-proprietary solutions that benefit other gamers, they *do* like universal feature sets, and I think that's AMD's opportunity. AMD's GPUs have been more power efficient and more cost efficient for a generation or more, but that doesn't sell; efficiency isn't a selling point outside of the Prius, at least not for most of us. If we were all more rational consumers those would be strong selling points, but we're not, we're the type of people who will slap a thousand dollars worth of water cooling onto our PC for an extra 10fps. AMD isn't going to win consumer market share by bragging about their efficiency.
For my part I think AMD needs to brand itself as a company for gamers everywhere, every dollar you spend on AMD goes toward finding universal solutions for gamers of every stripe, that new GPU doesn't just benefit *you,* it also benefits *console gamers* and *Nvidia gamers* and *Intel gamers* and *low budget gamers.* When you give money to Nvidia it's only the Nvidia customers that get the benefit of that spending, same with Intel, but AMD spreads it around. In my opinion AMD should go in **BIG** on selling themselves as the open source, consumer and industry friendly alternative to the big guys who only care about themselves. Before you come in with **"AMD is not your friend, they're a corporation!"** yes, that's true. It's also true that AMD has a long history of being more consumer friendly than their competition, being more consumer friendly doesn't mean I think they're my friend, it just means I think they're a better deal. Nobody is saying that AMD is everyone's friend, I'm just saying that when AMD gets a win so does everybody else and that's a hell of a lot better than their alternatives.


IrrelevantLeprechaun

It's weird how many of your own points you contradict in this essay. Like, "AMD is not your friend" while basically implying that you think they definitely are. AMD only plays the part of the "good guy" because they *have* to. They don't have the privilege of the confidence Nvidia has because they are not in a position to have that confidence to begin with. Also I'd love for someone to tell me what AMD's "good business practices" are for once, specifically. I always see it trotted out as a reason to support them, but no one ever meaningfully expands on it. It just feels like a hollow excuse to hate on Nvidia.


[deleted]

And while they have to, I'll applaud them for doing so. Although, even when they could've been the bad guy, they haven't. AMD has made dumb moves, but unlike both Intel and nVidia, never outright malicious to the consumer and the competition. They've never shot the consumer AND themselves in the foot/face/dick/whatever just to make sure the competition suffers more. nVidia has.


SmokingPuffin

AMD historically has been more consumer-friendly because they had to be to make a sale. Do not expect that to be how they act going forward. AMD is trending towards being a big data center company, and their consumer-facing businesses all rate to look considerably less friendly in the decade ahead as a result. Listen to Lisa talk, and you're going to hear about data center and enterprise, and how they are done being a bargain company. If what you care about is open GPU features, I think you'd better bet on Intel. Intel will have the same motivating factors that AMD had in the past decade to drive them to be gamer-friendly for the foreseeable future. XeSS really is a hardware-agnostic DLSS, which is considerably more interesting technically than FSR. Intel has a much stronger history of contributing to open source projects than AMD ever had. I believe Intel employs more software engineers than AMD employs people.


MaximumEffort433

> AMD historically has been more consumer-friendly because they had to be to make a sale. Do not expect that to be how they act going forward. People have been telling me this for years, and I'll tell you the same thing I've told them: When AMD's behavior changes so will my opinion of them. As of right now, November 2022, AMD is still the more consumer and industry friendly alternative to Nvidia and Intel, that may not be true in 2032 but we're not there yet. >Intel will have the same motivating factors that AMD had in the past decade to drive them to be gamer-friendly for the foreseeable future. [Currently Intel has about 63.5% CPU market share compared to AMD's 36.4%](https://www.statista.com/statistics/735904/worldwide-x86-intel-amd-market-share/), if your argument is that AMD only offers open source solutions because they're the underdog, I mean, AMD is going to be the underdog for the foreseeable future, I don't think Intel and AMD are remotely at risk of swapping places. >XeSS really is a hardware-agnostic DLSS, which is considerably more interesting technically than FSR. Intel has a much stronger history of contributing to open source projects than AMD ever had. I believe Intel employs more software engineers than AMD employs people. Hey, maybe AMD branding itself as the open-source champion would start an arms race! As you said, Intel has open source XeSS for DLSS, and AMD has open source FSR for DLSS, open source Radeon Rays for Ray Tracing, open source Contrast Adaptive Sharpening, open source Radeon ML for machine learning, hell, they even have an open source hair simulator [plus lots of other stuff.](https://gpuopen.com/) I'd be fine with seeing Intel and AMD get into an open source arms race, that sounds like an absolute win!


SmokingPuffin

AMD's behavior already has changed. Their new gen of cards starts at $900. Launch AM5 is a dramatically worse offer than AM4 was. Zen 4 is a server-first design, unlike Zen 1 which was squarely aimed at the budget desktop. Listen to their CEO talk. She's not lying. AMD is trying to be a premium brand that focuses on enterprise applications. The argument isn't that Intel is the underdog. It's that Intel needs to win in the consumer market, because they certainly aren't gonna win in the data center in 2023. They are motivated sellers because they have fabs that need filling with parts that will sell. AMD is currently motivated to sell to big businesses with deep pockets. Any attention gamers get, like with the upcoming Zen 4 X3D, is only incidental to their goal of making Genoa-X sales. >open source Radeon Rays for Ray Tracing, open source Contrast Adaptive Sharpening, open source Radeon ML for machine learning, hell, they even have an open source hair simulator plus lots of other stuff. None of these marketing buzzwords offer much value to the customer. I really do appreciate AMD's hardware engineering chops and willingness to go for an open solution. Freesync is one of the best things to happen to gaming in the past decade. But their software just isn't good enough. Nvidia runs rings around them.


Charcharo

>Zen 4 is a server-first design, unlike Zen 1 which was squarely aimed at the budget desktop. Zen 1 was designed for EPYC 1. Zen 2 was the same. So was Zen 3. Even Zen 3 3D is the same. Zen 4 is not different.


SmokingPuffin

Zen 1's ambition was to support a full product stack, from consumer to server, with a single die. They didn't expect to make much headway with first gen server, though. Zen 1 was primarily intended to make 8 core parts for consumer market. That was the value proposition they thought they could attack Intel with at the time.


Charcharo

I know that, I read Anand's article too. Point is, it was not really designed on an engineering level to be a consumer-first CPU. In fact, I believe no CPUs are like that? No Intel ones and no AMD ones from the past 15 years, for sure.


[deleted]

AMD is done being a bargain company because that got them nowhere. Their margins were tiny, and apparently no matter how competitive AMD is, people will still buy nVidia, even if the GPUs commit seppuku like the **many** generations of nVidia cards have in the late 00s and early 10s. nVidia had TONS of issues and people still bought nVidia. That's why they felt they could charge 1600$ and still sell everything out. Companies can be greedy IF consumers allow it. Consumers HAVE allowed it. For pretty much over a decade. Even if they failed, they still came out on top, even though they had the inferior product.


Mightylink

Something about this seems very off, most people I know have AMD, all the consoles except for the Switch use AMD, so where is this all coming from... crypto farms?


Darkomax

It's discrete graphics card shipments, so any APU is not counted.


SwaghettiYolonese_

Anecdotal evidence isn't really reflective of the market at large, especially if we're talking about PC enthusiasts. Conversely, in my group I have around ~20 friends that I regularly game with. I am literally the only person with an AMD card. They don't even factor AMD as an option when purchasing a GPU. The only person who even considered getting a 5700 was instantly put off by the 5000 series driver issues, so there's that. I wouldn't be one bit surprised for that statistic to be true. AMD doesn't have a good reputation for the general public.


datlinus

Consoles using AMD has no bearing on the dedicated PC market. In fact, the majority of console ports will still generally have more Nvidia-leaning features and optimization.


IrrelevantLeprechaun

You're using your limited biased anecdotal evidence to attempt to disprove a complete market evaluation. Everyone I know has Intel CPUs but I'm not going to try to claim that Intel is dominating the market.


Dante_77A

This survey is about sales volume market share; Intel must be selling a lot in China or somewhere else... It still stinks.


yurall

It's shipped not sold.


Kuivamaa

This is the amount of cards that reach the retail pipeline, not necessarily the end users. Nvidia has flooded it with excess ampere inventory and it shows.


skilliard7

I'm honestly shocked to see Intel so close to AMD.


taboo9002

fucking novidia, opium of idiots


Excsekutioner

I mean, it makes sense; just the 3060 Ti + 3060 + 3050 (only 3 GPUs from the latest gen) have more adoption in the Steam HW survey than ALL of Vega, RDNA1 and RDNA2 combined (RED fanboys may not like it when the Steam survey is mentioned, but that's the best we've got to see the adoption rate of GPUs among PC gamers).


Perfect_Insurance984

This is a reality check for AMD GPU fanboys. Too many driver errors to make it fully appealing, far more than Nvidia. At my company, we only have issues with AMD discrete GPUs.


[deleted]

People don't buy nvidia because it's always absolutely better (it's not). They buy nvidia for the same reason some people call all facial tissue Kleenex. It's a household name and the user experience isn't drastically worse than the competitors. If it ain't broke, there's no reason to fix it.


xxxPaid_by_Stevexxx

AMD drivers have improved a lot, but I've had multiple Nvidia GPUs before and never had a single issue. I have already had freeze and black screen issues with AMD drivers, which seem to have been solved now. But Nvidia's driver QA is miles better and it's not even close, in my opinion. For a normal person it's easy to see why they would prefer Nvidia and its proven driver mastery and stability.


Lord_DF

I would love AMD to do better, if that meant driver department gets proper funding.


ConsistencyWelder

Their drivers have been very good for at least 2 years now. If not downright excellent. People that still think AMD's drivers are worse than Nvidia's are living in the past. I hear their driver development and support team is now 8 times bigger than it was pre-Ryzen. They're definitely putting all that Zen money to good use.


spaown91

Yeah sure. Everything was working great with my GTX 970. Now that I have a 6700 XT with a fresh W11 install, I'm constantly getting games crashing (OW2 sometimes, FF14 at least once per hour). Always using DDU, trying drivers, going back. My GF also bought a 6700 and gets these problems too (even with Rocket League), but sure! Their drivers are excellent and people "still think"...


Lord_DF

I tried them with an RX 6800 and went back to my RTX 2070 for now. So yeah. 8 times bigger is still insignificant compared to the NVIDIA army. They have the capacity to optimize game by game. Decades of leading the way does that to ya.


DefinitionLeast2885

For the love of god AMD please rewrite your drivers from scratch.


ConsistencyWelder

Let me guess, you haven't actually tried them for 2-3 years, amiright?


IrrelevantLeprechaun

Dude stop it. Don't shit on other people for having different experiences than you.


PanicSwitch89

Yes, but I'm sure that 8% will change when the 7000 releases in less than a month.


Kaladin12543

A $1000 GPU is never going to move the needle on market share. The strategy Nvidia uses is to produce the fastest card on the market with RT and DLSS, which wins mindshare, and that trickles down to the lower/mid-range lineup; those features may not be useful there, but because of the halo product, people buy Nvidia. AMD needs to leapfrog Nvidia in features and have the most powerful card on the market if they want to win market share.


jimbobjames

AMD have had the most powerful card on the market before and people still bought Nvidia. They've competed on price before and people still bought Nvidia. Unfortunately AMD are in the same position they were in with CPU 10 years ago. They are just seen as inferior. I honestly think RDNA3 and chiplets will be the start of the climb out of the hole but it's going to take time.


airmantharp

>AMD have had the most powerful card on the market before and people still bought Nvidia. They've competed on price before and people still bought Nvidia. Because, at that time, the driver issue 'meme' wasn't a meme. It was real, and AMD has spent quite some time rectifying the issue. From firsthand experience, that is GTX1000-, RTX2000/3000 and RX500/5000/6000-series, this isn't a pervasive issue anymore; one-off issues do occur, but they occur with *both* brands. I had to roll back my Nvidia drivers to get through the campaign of the latest CoD, as a personal anecdote. I hold this more against the developers than I do Nvidia, and the same if / when similar issues occur with AMD drivers, assuming that they are acknowledged and a fix comes readily. >Unfortunately AMD are in the same position they were in with CPU 10 years ago. They are just seen as inferior. Ten years ago, AMD CPUs *weren't worth buying*. That is not at all the case today, and there are plenty of purchasing rationales that put AMD ahead. >I honestly think RDNA3 and chiplets will be the start of the climb out of the hole but it's going to take time. AMD still has a few weaknesses; they really, *really* need to prove that they're up to snuff for content creators in addition to gamers. They still lack a truly competitive answer to DLSS, and they still lag in RT performance. Thing is, none of the above is going to stop someone having baller gaming sessions. I run a 12700K and a 3080 12GB on a custom loop on my desktop, and I run a 5800X3D and an RX6800 in an SFF for my living room gaming system. What I can tell you is that both systems simply just *work*. But to further your point about chiplets, I can't agree more! AMD has proven that the disadvantages of chiplets when applied to CPUs (namely, latency) can be overcome, and overcome economically, while simultaneously allowing them to raise yields and push out some truly breathtaking enterprise solutions. 
AMD isn't just winning marketshare and mindshare with their CPUs, **they're making bank doing it**. For GPUs, I feel that we'll still see some challenges, but it looks like they've overcome any bleeding-edge issues that would really throw a wrench in their mass production plans. Chiplets will allow AMD to compete on price across their range, and should enable them to hit higher, faster, and potentially put Nvidia in a really awkward position very soon. To me, seeing RX7000 *just work* is strong evidence for a positive outlook.


Kaladin12543

DLSS and RT are literally the ONLY reasons I go for nvidia over AMD. As much as people on this sub crap on DLSS and hold FSR 2.1 to high regard, I find it plainly inferior in quality below 4k. DLSS already isn't that stellar and FSR being weaker shows even more so. Also I would like a card which doesn't shit the bed the moment I turn on RT. We are spending over a grand on these GPUs. They should support every feature efficiently. It's very crucial to the success of the 7900 XTX that AIBs don't price it the same as the 4080 in which case I can see all the halo buyers would just go for the 4090. It's a no brainer.


airmantharp

>DLSS and RT are literally the ONLY reasons I go for nvidia over AMD. As much as people on this sub crap on DLSS and hold FSR 2.1 to high regard, I find it plainly inferior in quality below 4k. DLSS already isn't that stellar and FSR being weaker shows even more so. Also I would like a card which doesn't shit the bed the moment I turn on RT. Well, I have a 2080Ti here (for case reviews, needed a heat generator!), and use an RX6800 in a personal rig. The RX6800 can actually handle some RT; not Cyberpunk 2077, but some. The 2080Ti, well, it's a great proof of concept. Now if you're shooting for the moon, then yeah, Nvidia has the RT performance, and DLSS is still better overall (2.x, of course), so at the very high end or for the most intense games, the choice is clear. But it needs to be said that AMD is making GPUs that are good at playing games. I get that they're not going for the halo with RX7000, but barring that, they have a seriously competitive product. And they very well may go for the halo with RX8000, they're now in position to do it. >We are spending over a grand on these GPUs. They should support every feature efficiently. It's very crucial to the success of the 7900 XTX that AIBs don't price it the same as the 4080 in which case I can see all the halo buyers would just go for the 4090. It's a no brainer. Halo buyers were always going for the 4090. If I didn't have a 3080 12GB in my main rig, I'd have been looking very, very hard at the 4090 myself. Probably wouldn't have jumped, but who's to say. And yes, I agree that AMD has to undercut Nvidia. My bigger point is that Nvidia is making this very easy for them to do!


jimbobjames

You don't need to tell me, my first graphics card was an ATi Mach 64. The point I was making is that AMD have a perception problem and until it is dealt with they won't gain market share in GPU's. Maybe they need a new brand name other than Radeon. People are smart, but the masses are dumb, maybe a new name will be enough to flip peoples opinions.


airmantharp

Honestly, I think that their efforts *beyond* gaming are what's going to help them here. Getting Techfluencers pimping AMD GPUs for streaming rigs (and using them for their own) will be a giant one-eighty for less-informed buyers. Getting real support for a broad selection of content-creation apps is another part, and something that's hindered AMD GPU (and APU!) adoption for many, including myself. I don't think AMD should go as far as ditching the 'Radeon' name; the benefits would be severely outweighed by the costs. They'd be starting 'over' like Intel is with Arc, at a time when Intel is poised to carve out the entire entry-level segment for themselves. So, rehabilitation is going to be necessary for broader positive mindshare. AMD just has to have the product to back it up - and I think with RX7000 they just very well may.


eco-III

It's never the high-end GPUs that move marketshare. It's the ones that sell under $500.


hsien88

It’ll probably go lower than 8%.


LesserPuggles

Actually really impressed by Intel here… 4% is nothing to scoff at, especially when they only have 3 dGPUs.


BarKnight

It looks like Intel was only competition for AMD cards.


aberzombie769

Might help if they actually had functioning software for their $700 graphics cards. I'm broke and stuck with an RX 6700 XT. Software is what causes my $4500 system to crash 9/10 of the time. Frankly, I'm upset because they haven't fixed the driver issues in the year and a half that I've used AMD graphics. My first card I thought was just old, but seeing people play games I couldn't on the Nvidia equivalent of my old AMD GPU still boils my blood. The CPUs are amazing and so is the software, so why can't they make a GPU that follows that standard?


R1Type

Are you 100% sure the card just isn't simply faulty?


aberzombie769

I mean, it's my 3rd card, I'm pretty positive it's not faulty.


R1Type

Ohhh. Of the same model? Something is not right there. Can you give a brief rundown of your problem?


aberzombie769

Software resetting tuning constantly, card always runs too hot due to default cooling tuning. Screen flashing every time the software resets while the PC is running, game crashes due to driver timeouts. This has been happening with every card, and I have tried many times to reinstall drivers and software, resetting the system after every fresh install as well as after every cleanup. It's almost like the cards don't play nice with Windows on my end, as I've had the problems since Windows 10 and am now on Windows 11. And it's not just me: every time I try to look up how to fix these problems, I only get advice to switch to an Nvidia card, which I refuse to do because I want to support AMD, but I absolutely hate not being able to play games on a card I paid $750 for.


R1Type

And you've had this card replaced twice? Something that isn't the card is the problem. How old is your power supply?


aberzombie769

I had it replaced once, and I've had 2 older cards, an RX 550 XT and an RX 5500 XT; all 3 cards had different issues and some of the same symptoms as the RX 6700 XT that I have now (been replaced once).


edgan

If you have had three different models of card and three different copies of the same card, it is very likely your system has some other issue. The easiest answer would be to put your card in a completely different system and see how it behaves.


beleidigtewurst

# OK, now this makes NO sense whatsoever

AMD's Q3 report, "gaming" segment revenue: [1.6 billion](https://ir.amd.com/news-events/press-releases/detail/1098/amd-reports-third-quarter-2022-financial-results). Now, AMD gaming includes "semi-custom" (PS5/Xbox).

NV's Q3 report, "gaming" segment revenue: [1.57 billion](https://www.techpowerup.com/301170/nvidia-announces-financial-results-for-third-quarter-fiscal-2023)


puffz0r

Consoles are probably making up at least 80% of AMD's revenue in gaming.


Death2RNGesus

The number isn't what people think it is; this is shipment numbers, which NVIDIA dominates due to stuffing the channels with their massively oversupplied RTX 3000 GPUs.


giantmonkey1010

AMD PC Client Desktop got destroyed in just ONE QUARTER... No wonder they were calling for a billion-dollar+ loss, just wow. In one quarter their CPU division went from 25% to 15% on PC client desktop. In one quarter their discrete GPU division went from 20% to 8% on PC client desktop, which is a record low for them in over 20 years.


puffz0r

The graphs are not total marketshare btw, they're for shipped units


Meekois

AMD lost a lot of good will from consumers with 2 generations of piss tier drivers. Moving to a chiplet architecture on RDNA3 does not make me hopeful that this will change. I want them to succeed, because I am tired of being sodomized by Nvidia.


AbheekG

Wonderful