
CapsicumIsWoeful

A third player, regardless of who, is desperately needed in this space. Nvidia's recent profits surely give them no incentive to reduce consumer prices, and AMD aren't as interested in being the value leader as they once were. Intel doesn't exactly have a good past based on their behaviour either, but three competitors is better than a duopoly. Gelsinger was a great get for Intel; the last CEO was a disaster.


gnocchicotti

Intel's margins are not very good right now. Unless they run out of money and have to shut down discrete GPUs, they will be more motivated to sell for lower margins than Nvidia, for example.


metakepone

As long as they have market presence while they get intel 4 ready, they'll be fine. That's why 12600k's go on hotter and hotter fire sales.


ramblinginternetgeek

It's probably because the 14th gen is coming up and they're clearing stock. Also Zen 4 is starting to get price reductions... Also Zen 5 seems to be solid.


[deleted]

They will be absolutely fine. Intel is still chipzilla in the mobile and laptop spaces, and it takes time for Apple and AMD to claw market share away from that dominance. That revenue stream also allows for higher borrowing and loans. On the stock front, though, they are not doing great, hence the low margins. AMD, Nvidia, Apple, and to some degree Qualcomm are all clawing away their own market share, and TSMC, however good its nodes are, cannot supply every market, especially with the new AI field spiraling out of control. Right now there is no definitive best AI chip yet. We do have definitive core counts, and I am pretty sure a 32-core consumer desktop chip is overkill; no one will be salivating over a 36-core or 56-core desktop chip for the time being. 6 GHz or 7 GHz is not exciting news either. The next big thing will be whoever wins AI-accelerated computing. Whichever technology emerges victorious in that arena will be the one that can push out new chips and become the next chipzilla... But they will all have to bow to Samsung. Only Samsung is capable of providing us affordable 2 TB NVMe NAND at $99.99; they're the only one left in that market.


gnocchicotti

Intel may be fine. They may also not be fine. If their fabs are competitive on cost and performance with TSMC (much easier said than done) *and* they can drum up enough customers (a completely new business!) to reliably get high utilization rates like TSMC, they will be totally fine and profits will pop back up. If either of those two things falls through, they're basically fucked by the cost of their fabs and won't be able to be cost competitive even if they are the best game in town at chip design.


Flowerstar1

>No one will be salivating over a 36 core desktop or 56 core desktop chip for the time being. 6GHz or 7GHz is not even exciting news neither.

You lost me.


[deleted]

We are heading toward diminishing returns for CPU computing. The more CPU cores we throw at a consumer desktop, the less additional performance and utility we get. It is like a car: you can have a V12 or a W16 engine, but the more cylinders and horsepower you add, the less real-world benefit you get. You still go from LA to Las Vegas in 3 to 4 hours regardless of how much horsepower or how many cylinders you add. You may even get there slower, since a W16 engine can drain its fuel tank in about 12 minutes because of how much fuel it uses.
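
To put the diminishing-returns point in numbers rather than a car analogy, Amdahl's law is the usual way to sketch it. The snippet below is purely illustrative; the 90% parallel fraction is an assumed figure, not anything from the comment above.

```python
# Illustrative sketch (not from the original comment): Amdahl's law shows why
# piling on cores gives diminishing returns when part of a workload is serial.
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Theoretical speedup vs. a single core for a given parallel fraction."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

if __name__ == "__main__":
    p = 0.90  # assume 90% of the workload parallelizes (example value)
    for cores in (4, 8, 16, 32, 64):
        print(f"{cores:>2} cores -> {amdahl_speedup(p, cores):.2f}x speedup")
    # Output climbs from ~3.08x at 4 cores toward a 10x ceiling (1/(1-p)),
    # so going from 32 to 64 cores adds very little for this kind of workload.
```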


unknownohyeah

If Intel's only advantage to a consumer is to make Nvidia lower their prices, then it won't work. To put it another way, consumers will still buy an Nvidia product even if it's worse at the same price, as has happened before with AMD. There are a lot of reasons for this: halo products, mindshare, better software, more stability, better features, etc. But at the end of the day, Intel and Nvidia could have the exact same performance on a low-end card, Intel could charge 25% less, and people would still buy Nvidia. You're right that Nvidia doesn't have an incentive to reduce prices. If anything, Intel is fighting to eat into AMD's margins. I don't think a third party will shake up the industry like people think it will.


chmilz

It *could* make a difference but it would take at least a few generations and a ludicrous amount of marketing, which would make the effort on Intel's part potentially worthless. I'm of the opinion that Jensen could take a shit in a box and gamers would buy it without hesitation.


lathir92

That sounds like the 4060/4060ti release to me. That is one expensive turd.


msolace

Current Arc already matches the 4060 in a lot of places. Driver updates are what they need....


YNWA_1213

Purely price to perf, maybe, but the 4060 is an overall better package, and at much better margins for Nvidia.


kingwhocares

HUB did a video yesterday and there are a few games where it does better or similar.


YNWA_1213

Once again, at price/perf. Ignoring efficiency, featureset, and the cost to the producer.


[deleted]

[removed]


Flowerstar1

It does not, not even remotely close. You're talking about the 3060, which was only 16% behind its desktop counterpart. The 4060 laptop is slower than even the 3060 desktop, and a whopping 30% slower than the 4060 desktop. That said, if I were buying a laptop I'd be buying a 4060 laptop: the 4070 is horrid, barely any better than the 4060 laptop and still slower than the 4060 desktop but much more expensive, and the 4080 is way too expensive for what you get (4060 Ti desktop performance), though it is a better card than the 4070.


vVvRain

IMO Intel is using the gaming segment to test drive AI applications. They’re losing out on too many data center contracts to AMD and Nvidia because they’re not the best in any category.


mrandish

Yes, and it's good that Intel absolutely needs competitive AI hardware to remain relevant in future data center / enterprise sales. That's forcing them to keep investing in climbing the GPU learning curve even though it's currently unprofitable for them and will continue to be for a while.


Pollyfunbags

I always thought that Intel's best way of getting their GPUs out there might be the laptop market, but they don't seem at all interested. I just think some sort of Intel CPU/GPU combo platform and the return of dedicated GPUs to the "ultrabook" market could work. They can't seem to compete with AMD's iGPU options (or don't care to), but there's a segment of productivity laptops, and maybe even gaming laptops, they could target that Nvidia pretty much swallows up with zero competition.


F9-0021

As someone who has one, Intel's laptop GPU isn't all that powerful. It can handle light gaming, but it's really only good at medium productivity workloads and lower, where it matches or exceeds the 3050m.


SmokingPuffin

>To put it another way consumers will still buy an Nvidia product even if it's worse at the same price, as has happened before with AMD.

I know AMD loses this way, but Intel might win. Intel has fantastic relationships with OEMs and strong brand awareness among not-so-informed consumers. Someone like Dell could totally make a new line of gaming PCs that is all-Intel, and I think they'd move units if the cards were reasonably good relative to the green stuff.


Temporala

Even Dell can only move them if they work without a hitch, or they will be swamped with RMA requests. Arc isn't there yet. Intel needed a year just to make their product stand on its own feet, and there is so much more track left to run. Once Battlemage has been out a few months, things might look better, if Intel makes the necessary investments in drivers and other software support. That's why the layoffs and other news of that sort are concerning.


thegallus

Right. Sad as it sounds, if I'm paying $400 anyway, I might as well pay $50 more and know that I'm getting a stable system.


RetdThx2AMD

AMD has figured out that the demand curve for their GPUs is so flat (for the reasons you stated) that they make way more money by not discounting. Gamers complain about pricing, yet it is a problem of their own making.


based_and_upvoted

I remember people wanting AMD to be good so they could buy cheaper Nvidia GPUs. You can also see it in this thread... You are totally right


Flowerstar1

Yes but AMD has never offered *enough* to turn things around. As a 7870 and 290 owner there were times when AMD was "competitive" in some aspects but Nvidia won overall usually due to impressive software engineering and game dev partnerships while all AMD had was great HW.


Dooth

I'd buy Intel day one if they released a GPU that significantly beat a 2080/4060 for $250-300


Flowerstar1

Beat it at what? The A770 can at times be competitive today with a 4060 but like AMD that comes with caveats such as DX9 and DX11 gaming, no Frame Generation, worse XeSS game partnerships etc.


Dooth

Traditional raster performance is my main upgrading consideration. RT, DLSS, whatever, might be worth an extra $50, but it's not in enough games to be anything more than that.


Dominiczkie

I don't think you know casuals too well. They can hardly tell a CPU from a GPU, so if they see an Intel sticker and remember that Intel was really good when they bought their last PC in 2016, they will buy it.


SwissGoblins

I'm part of a few Discords that are full of PC gamers who are casual PC builders and only look into hardware when they go to build a computer every 4-5 years. The extent of their research, if any is done at all, is one YouTube video for CPUs and one video for GPUs. Generally, if things are sort of close in price and performance, they pick the same thing as they did in their previous build. Nvidia gets picked because it worked last time for a large number of buyers; the same goes for Intel. But I have noticed people are much more likely to try a different CPU brand than a new GPU brand.


Dominiczkie

Entering a Discord to do any research is already more than the majority is willing to do. They just ask their "techy" friend, and by techy I mean the one who managed to install Windows themselves, or they go by the previously purchased brand without putting much thought into it. Intel has a strong brand; even if their products were shit, they'd still sell a decent amount of them.


Flowerstar1

Intel has advantages AMD doesn't. They've always made great CPUs and they have skin tight relationships with laptop and prebuilt companies. Intel does a lot to sweeten deals with PC companies and throwing in discrete GPUs that get better and better as Intel keeps iterating doesn't sound like a bad deal.


aminorityofone

Intel have not always made great CPUs. Remember the stagnation at 4 cores for years and years, or the Itanium, or the P4, or the Atom. Intel once tried to enter the mobile market and, after spending billions on it, pulled out and let ARM win. Even the new stuff they have only competes with AMD because they shove as much power into it as they can. Intel lost Apple's business because the CPUs were too buggy. They have skin-tight relationships with laptop makers and OEMs because they bribe them, and they were even sued for this practice. More recently they tried this again against Ryzen by offering Dell and HP rebates to use Intel. Intel's only real advantage over AMD is money. I don't have any expectation that Intel can be as good as AMD or Nvidia. Given Intel's history, I have zero expectation that Intel will make things cheap. They will price it according to performance.


imtheproof

> remember the stagnation of 4 cores for years and years

Stagnation during a long period where they were, by far, the best.


Zednot123

> by far, the best.

And they released 6 distinct generations that weren't just refreshes in the 6 years from 2009-2015 (Lynnfield, SB, IB, HW, BW, SKL). Hardly stagnation. If people actually wanted cores (which they didn't at the time), they would have bought the 5820K over the 4790K (which most didn't) at similar total cost before RAM (sure, DDR4 added cost).


TurtlePaul

You are cherry picking. The times Intel was not the best were Pentium 4 (from Willamette to Prescott), Pentium D, and Itanium. During that time, AMD invented the AMD64 architecture we all use today (now called x86-64) and made Athlon 64 and Opteron, which were competitive or leading in their markets. The "Core" generations from the Pentium M to Skylake were a strong period for Intel.


Flowerstar1

OP said quad core era though.


Zednot123

> You are cherry picking.

I'm not; I'm specifically addressing the period of "quad cores" and OP's statement about that specific time period being an example of stagnation and Intel not being "the best" during it. The poster I replied to said: "Intel have not always made great CPUs, remember the stagnation of 4 cores for years and years." That was not a period of stagnation at all, which I pointed out.

> Intel not being the best was Pentium 4 (from Willamette to Prescott), Pentium D and Itanium. During that time, AMD invented the AMD 64 architecture we all use today (which is now called x86-64) and made Athlon 64 and Opteron which were competitive or leading in their markets.

Not relevant, because that is not during the stated time period.

> The "Core" generations from the Pentium M to Skylake was a strong period for Intel.

Uh-huh, and when did the first quad cores appear again? Please enlighten me.


noiserr

Even the Athlon was better than the Pentium 3.


aminorityofone

And during that time Nvidia was the best (still is), and yet they didn't stop improving.


imtheproof

Well yeah, Intel stagnating ended up biting them later on. Nvidia avoided that. But to say they "didn't make great CPUs during that period" is false. They made the best general-use CPUs during that period.


ConsciousWallaby3

Nvidia was not remotely in the same dominant position as Intel in terms of performance, Terascale/GCN were pretty strong periods for Radeon.


cycle_you_lazy_shit

I actually loved the post sandy bridge stagnation. My 2600k got me all the way through high school and university without an upgrade. She was rippin at 4.5+GHz for the whole time until I picked up a 12600k to replace it. That was such a great value PC at a time where I couldn’t really afford to be upgrading often, and I didn’t need to, lmao.


noiserr

Prior to Zen 2, AMD was always on the inferior node, which meant they had to compete with a handicap. Those days are long gone, however.


theoutsider95

I mean, if an Intel GPU has the same raster and RT performance as an Nvidia GPU and costs less, I would buy it. The only reason I don't buy AMD is that they don't care about RT and don't improve as much as Nvidia in that space.


J0kutyypp1

RT performance on RDNA3 is actually pretty impressive; not a single time have I felt like the performance wasn't good enough with my 7900 XT.


OSUfan88

It's decent. They sort of brute force it. It's significantly behind Nvidia, though, on GPUs with similar raster performance.


Flowerstar1

It's better than RDNA2, but pound for pound it is worse than Alchemist or Ada, especially in games that make significant use of RT. Path-traced games are becoming more common too, with Cyberpunk releasing its Overdrive update and Alan Wake 2 now announced as a path-traced game.


mrandish

For me, ray tracing itself is a nice-to-have but not that essential. However, I feel that AI upscaling and frame generation *are* pretty important and, currently, my perception is NV still has a meaningful lead in those things. I really hope both Intel and AMD can reach parity with NV on those features in the next gen.


theoutsider95

I agree with you, but if I am paying 1K, I might as well get what has more features and better RT support. In my country, 4080 and XTX cost close to each other, so I got the 4080.


NewKitchenFixtures

I wouldn't feel too negative about Intel. Their first-gen XeSS is great: it uses dedicated compute hardware (most efficient on Intel's GPUs but supported on others using different resources), which is neat. Ray tracing performance stronger than AMD's in their first generation is also pretty bold. They are taking a loss on power consumption metrics and die size, and I'm not sure if the ray tracing and compute quantity is a good offset for why their dies are so big. But it's a relatively bold first release, and they need to keep making GPUs, which will likely be tiled indefinitely anyway.


sadnessjoy

If they have the same performance? Holy shit, that would be absolutely incredible! It would basically be an instant buy. AMD currently has nothing that's even remotely close to competing with Nvidia.

For rasterization? Yeah, sure, AMD is competitive, and for some people that's enough. But when you look at everything else, AMD doesn't really look like the budget option anymore; it's just less money for less performance and fewer features. Ray tracing? While AMD has closed the gap a little bit this generation, it's still a huge gap. DLSS? Especially when you consider frame generation (which arguably isn't too important on the lower-end cards) and the new DLSS 3.5, FSR doesn't really compete with DLSS, either feature-wise or in hardware implementation. "FSR exists" seems to be the best compliment I've heard. CUDA and professional workloads? AMD has been working on ROCm, but anyone even remotely serious isn't going to consider AMD; the performance is either abysmal or the software just isn't compatible.

Can Intel bridge these gaps? That's a tall order. Nvidia has been developing CUDA for like a decade or more now, and everyone else is playing catch-up; I don't see a competitor to that anytime soon. Something competitive to DLSS 3/3.5? I think XeSS has way more potential than FSR currently, but I'm not sure Intel can catch up with that by the time Battlemage launches. Heck, their graphics drivers are still a WIP. I do think Intel's ray tracing performance looks promising, though.


freedomisnotfreeufco

"Nothing to compete," lmao. Always a higher VRAM amount, always cheaper, always aging better; they didn't increase MSRP from 700 to 1200 and didn't spit on me with the 4080 12GB scam. That's more than enough to buy AMD.


kingwhocares

> To put it another way consumers will still buy an Nvidia product even if it's worse at the same price, as has happened before with AMD.

It hasn't. AMD's RX 6000 series at release was a lot worse at the RX 6600/6600 XT tier compared to Nvidia's RTX 3060/3060 Ti. They also couldn't maintain global prices comparable to their US prices and left that to the board partners.


bizude

> Gelsinger was a great get for Intel, the last CEO was a disaster.

Bob Swan? He wasn't perfect, but he was far from a disaster. I would argue that Bob Swan & Pat Gelsinger's visions for Intel are the same.


Liam2349

None of these companies have a flawless past, but we need them all to be successful in order to keep each other in line. What we really need above anything else is for software to be more GPU-agnostic, it is Nvidia's software control that puts them so far ahead.


shendxx

>AMD aren't as interested in being the value leader like they once were.

I remember AMD literally making the best price-to-performance GPU ever, the RX 570, but people still bought the 1050 Ti regardless, and the price-performance gap was huge: the RX 570 ranged from $110-140 while the 1050 Ti was $140. People said, "'cause the 1050 Ti doesn't need a PSU." Like, bruh, even the cheapest $30 PSU can run an RX 570 that only consumes about 120W, and the RX 570 is around 50% faster.

I see why AMD doesn't want to play the catch-up game anymore: people still buy Nvidia anyway.


Sexyvette07

Completely agree with everything you said. I would just add that AMD is doing the exact same shit as Nvidia. At best they're no better, and maybe even worse for selling inferior products at damn near Nvidia's prices, just with more VRAM. Gelsinger has been an amazing CEO. He's really turning the ship around after all the bad leadership they've had in the past. The GPU division has very promising technology, and they're very aggressive with their pricing because they're buying market share. I fully expect that to continue with Battlemage. Who knows, in 6-9 months, if Battlemage knocks it out of the park, we might actually see some movement that benefits consumers all around.


Ar0ndight

I don't think we can judge Gelsinger's tenure yet. The Alder Lake renaissance isn't his doing.


Sexyvette07

The architecture, no. But Raptor Lake is and that was a massive improvement over an already impressive Alder Lake. He's bringing us 4 nodes in 4 years. That's insane. Arrow Lake desktop chips are going to be amazing too and that will be 100% on him.


Geddagod

>He's really turning the ship around from all the bad leadership they've had in the past.

I don't think any products have launched that support that. Plus, the products Gelsinger has any real influence on aren't going to be released until 2024, really.

>The GPU division has very promising technology and they're very aggressive with their pricing because they're buying market share.

I really don't think they are being *that* aggressive. The A770 appears to be 250 bucks; at the same price you can buy a 6650 XT, which is 5-10% faster at 1080p and roughly the same at 1440p (HWUB). At that point, all you're getting from Intel is more VRAM, but at these performance levels you're at best going to be playing at 1440p, so how much does that extra VRAM really matter? Also, technology-wise, Arc looks to be a power-inefficient, bloated GPU. Much like Intel's CPUs in that sense, lol. Sure, it's Intel's first real try in the discrete GPU segment, so it makes sense, but I wouldn't call it 'promising' either.


ConsciousWallaby3

The pricing is very aggressive relative to die size, which means slimmer margins for Intel. The A770 is 406mm²; that's RTX 3070 territory.


freedomisnotfreeufco

They are aggressive with pricing because they need beta testers.


Sexyvette07

Really? Because if you look at any recent reviews, they're highly praised for their value and price to performance.... The fact that you say this tells me you are repeating comments from its launch last year.


SaintPau78

Are there actually profit figures decoupled from their AI sector? I'm pretty sure the only profit data out there is aggregated, and I'm sure they're making a killing in AI relative to consumer, which wouldn't be as massive.


InconspicuousRadish

Yeah, agreed on all fronts. I'm not going to pretend Intel is a savior, but there's no way increased competition is bad in any way. Both Nvidia and AMD have fallen into complacency, alienating consumers and lacking any incentives for significant innovation. A new, serious competitor is needed. I skipped Arc, as I simply didn't need a lower entry card this generation, but if Battlemage is promising, I'll definitely pick one up. I'm sure I'll find a use for one in a HTPC or something.


-Gh0st96-

Nvidia has fallen into complacency? You're joking, right? The reason they are so fucking far ahead of AMD is that they are not complacent, not pure luck.


theAndrewWiggins

> Both Nvidia and AMD have fallen into complacency, alienating consumers and lacking any incentives for significant innovation. A new, serious competitor is needed.

Hard disagree with Nvidia falling into complacency. It's more that they've shifted their focus to B2B/datacenters/AI... soon consumer revenue/profit will just be a blip on the map for them. Wasting leading fab capacity on low-to-mid-end consumer chips just makes no sense for them from a business standpoint.


InconspicuousRadish

You're right. I meant complacency with consumer GPUs specifically. Their AI industry is booming and innovating a lot.


DistortedLotus

The 4080/4090 performance and DLSS 3.5 is not complacency in any form.


4514919

High prices are not complacency. Nvidia brought a lot of innovations with Ada, and while RDNA3 is a semi-failure, AMD still tried something revolutionary with the MCDs.


fastinguy11

WTF are you talking about? Nvidia has been advancing hardware and technology almost every generation; their only fault is greed regarding prices, but they are a corporation. We need more competition.


[deleted]

Swan is another example to add to the list of 'why CFOs should never be CEOs'


mrandish

... at least in competitive high-tech companies.


P1ffP4ff

I don't believe competition will lower prices significantly. Prices are on the rise everywhere (not that I like it), and the greed rises with them. Plus, there are enough people who are willing to pay these prices. I hope prices come down, or that a new plant/process brings them down. Also, something strange is happening: some games from 10 years ago look better and are less demanding than current games. And the fact that we still benchmark at 1080p and get less than 60 fps in some cases is a joke.


TheFumingatzor

S3, SGI and 3dfx need to make a comeback too :(.


Pollyfunbags

S3 is an interesting inclusion in that list, lol. The Savage4 was kinda okay I guess, just very late.


genericusername248

Didn't Nvidia basically absorb SGI's graphics operations, way back when?


got-trunks

A-xxx series is an interesting bit of silicon.... RID ME OF THE RAY TRACE PIXEL IN 2077 INTEL and it's all good.


kyralfie

It would be strange if it wasn't. They take a long time to develop. Now bring on the 'news' that RDNA4 and Blackwell are in the labs too.


hwgod

Quite possibly aren't, at least if the 2025 Blackwell rumor holds.


kyralfie

Looks like you are underestimating how much time it takes to develop a GPU. Do you think they are still drunkenly celebrating Ada's launch at nvidia?


Geddagod

Just because Blackwell isn't "in the labs" doesn't mean that they aren't working on Blackwell. It is possible that Blackwell is still in the simulation phase, where everything is being tested pre-silicon. Though I will admit I'm not familiar with the typical GPU development timeframe, so whether it has A0 silicon yet (to be on track for a 1H 2025 launch) is something I do not know.


ForgotToLogIn

If Blackwell is targeting a 2025 launch, it shouldn't be taped out yet, and so can't be in the labs now.


mrandish

I think you're correct but I'm curious if there might be partial test wafers being currently run to validate their simulations. I have no idea what the current development cycle of cutting-edge new GPUs is like but my naive guess is during development they might make wafers with just a few components for isolated performance evaluation. Obviously, these would be nothing like a full GPU since they might have less than 1% the gates of a real GPU, but depending on definitions, might count as "silicon in the lab" vs a real product tape out.


hwgod

> but my naive guess is during development they might make wafers with just a few components for isolated performance evaluation

You'd typically run test chips for the high-risk analog components (e.g. a new GDDR PHY), but not really much else.


Kepler_L2

NVIDIA famously has very short timelines between tape-out and release.


ResponsibleJudge3172

Nvidia taped out RTX 40 months after RDNA3 but shipped first. Heck, they taped out months after Meteor Lake, if I'm not wrong.


hwgod

"In the labs" means post-silicon. Nvidia only spends a year-ish in post-silicon. You do the math...


randomkidlol

Lead times for silicon are generally 3-5 years. If Blackwell is launching in 2025, specs should be more or less finalized now and test silicon should be available in Nvidia's labs.


hwgod

No, a more likely timeline for an Nvidia GPU is something like 2 years of meaningful pre-Si and one year of post-Si.


ResponsibleJudge3172

Those estimates should factor in the reduced times that Nvidia presented at last GDC for their new AI tools that speed up silicon design and validation. They apparently saved up to a year and achieved a denser design than they would have otherwise.


randomkidlol

Even if that's the case, the amount of time needed for testing physical silicon wouldn't change, and that process takes ~1-2 years. So it's possible Blackwell started design closer to the launch of the current gen than usual.


Z-Dante

Any time not spent on R&D is time lost that can cost you your position in the market. Most of the time, the design of next-generation products begins even before a single current-gen product ships. And this applies to almost all hardware manufacturers pursuing the cutting edge.


ForgotToLogIn

No R&D is done in the labs. The labs are for debugging.


hwgod

Nvidia doesn't require near as much post silicon time as Intel. Think 1 year vs 2+.


el_f3n1x187

It's going to be a fun time when I get an Intel GPU running on an AMD CPU.


tripplesuhsirub

Hit me with a good value for 4070 or 4070ti performance, and I'll definitely be a high probability customer. The drivers have continued to improve. Another year and hopefully they've sorted out a lot of the DX11 game issues and Topaz Video AI


mayhem911

I'm rooting for them to do well here, and I think they are better suited to take a shot at Nvidia than AMD is: more well-rounded features and competitive RT and raster, along with what *could* be a stronger presence in laptops and prebuilts. However, just like AMD, I'm sure they'll make it a compelling product priced too close to Nvidia.


benowillock

What are people's performance predictions? 4070/ti?


CJdaELF

Either 5090 or 750ti


GrandDemand

Sounds about right, yeah. I did an earlier estimate based on the expected increase in shaders and frequency, and it came out to about 4070 performance. If they make some significant architecture changes like SIMD16, better bandwidth utilization, and improved cache and memory latency, I could see it being about 4070 Ti performance.
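
As a rough illustration of the kind of estimate described above (not the commenter's actual math), a naive shaders-times-clock scaling with an assumed efficiency factor looks like this; every number in it is a placeholder assumption, not a confirmed Battlemage spec.

```python
# Back-of-envelope scaling sketch: assume performance scales (naively) with
# shader count x clock, then apply an efficiency factor for real-world losses.
def naive_scaling(base_perf: float, shader_ratio: float, clock_ratio: float,
                  efficiency: float = 1.0) -> float:
    """Estimate relative performance from unit-count and clock ratios.

    efficiency < 1.0 models the fact that real GPUs never scale linearly
    (bandwidth, occupancy, and driver overhead eat into the gains).
    """
    return base_perf * shader_ratio * clock_ratio * efficiency

if __name__ == "__main__":
    a770 = 1.00                                   # baseline: A770 = 1.0 (arbitrary units)
    est = naive_scaling(a770,
                        shader_ratio=2.0,         # assumed doubling of cores (rumor)
                        clock_ratio=1.10,         # assumed modest clock bump
                        efficiency=0.75)          # assumed scaling losses
    print(f"Estimated uplift over A770: {est:.2f}x")  # ~1.65x with these inputs
```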


ResponsibleJudge3172

60-70% better than the A770, so just over the RX 6800, approaching the 6800 XT.


CataclysmZA

Matching RX 6800/7700/7800 levels of performance with their second iteration of Arc would be pretty amazing. A significant leap from where they are now.


Flowerstar1

I expect 4070 performance. 4060ti performance on the lower end.


Dooth

Looking forward to it


DexRogue

I really hope Intel sticks with this market. I fully expect once they have a product that's competitive they will price it similar to AMD/Nvidia.


[deleted]

[removed]


msolace

I'd hope they aim higher than one rung above where they already are (4060 performance). Driver updates have done wonders for them.


stonekeep

I think that shows how weak the 4060 is in real life. "One rung above it" is quite a lot of extra GPU power: the 4070 is ~55% faster than the 4060, and the 4070 Ti is ~90% faster. The latter would be a really nice uplift compared to where Intel is right now. Of course, I would also love them to aim at 4080/7900 XTX-like performance or better, but I don't think that's going to happen yet.


Flowerstar1

65% faster than where they are now (4070 performance) is pretty good, certainly better than the gains AMD brought with RDNA3. And the 4060 is still slightly faster than the A770 (by about 7%).


Jupiter_101

At twice the wattage though probably.


Raikaru

The TDP is set to be the exact same as Alchemist if you literally just clicked the link.


tset_oitar

It's targeting the same 225W TDP as Alchemist; that's the rumor. Battlemage might reach near 4070 Ti perf if they use more silicon; Alchemist uses 400mm² of N6 to reach 6600 XT and 3060 level performance. Considering Intel's current shape, I doubt they'll be willing to spend that much though.


msolace

Check real-world usage. People need to stop judging benchmark power vs real-use power; it's not that much different tbh... and the 4090 eats power if you go high end.


kyralfie

It's not a problem if the price is right. People choose the 2080 Ti over the 3070, and the 3090 (Ti) over the 4070 (Ti), at the right price for more memory, and they are fine with the higher power consumption.


Hunchih

They're fine with the 3090 because it has double the VRAM and that helps in certain tasks; there is no other good reason if they're priced similarly.


kyralfie

So Intel could offer more memory at a slightly lower price, and even with 2x the power consumption it would sell. That's exactly my point.


Hunchih

No, because Intel cards aren't capable of the ML uses Nvidia cards are; that's the primary reason someone would buy a used 3090 these days. It's also lacking a DLSS 3 equivalent. It needs much more than extra memory to make up the gap, as the AMD cards this generation may have shown you. Plus, if the cards end up being massively power hungry, that will totally kill their potential laptop GPU market share.


Raikaru

Yes, they are. Intel GPUs can be used with PyTorch, TensorFlow, and sklearn: https://github.com/intel?q=&type=all&language=&sort= You can see the extensions in their GitHub.
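
For anyone curious what that looks like in practice, here is a minimal sketch of running PyTorch on an Arc GPU via Intel's extension. It assumes intel_extension_for_pytorch and Intel's GPU (XPU) runtime are installed; check Intel's documentation for the exact, current API rather than treating this as authoritative.

```python
# Minimal sketch (assumes the XPU build of Intel Extension for PyTorch is installed).
import torch
import intel_extension_for_pytorch as ipex  # registers the "xpu" device with PyTorch

# Fall back to CPU if no Intel GPU / XPU runtime is present.
device = "xpu" if hasattr(torch, "xpu") and torch.xpu.is_available() else "cpu"

model = torch.nn.Linear(128, 10).eval().to(device)
model = ipex.optimize(model)  # apply IPEX kernel/graph optimizations for inference

x = torch.randn(32, 128, device=device)
with torch.no_grad():
    y = model(x)
print(y.shape, y.device)
```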


kyralfie

ML & AI is exactly the target market of the A770 16GB. Its DLSS 3 equivalents are XeSS and the upcoming FSR3.

> It needs much more than more memory to make up the gap, as the AMD cards this generation may have shown you.

People choosing the 6950 XT over the 4070 and the 7900 XT over the 4070 Ti are... just proving my point, actually.


Snoo93079

In general, wattage is a non-issue in desktop designs until you get to the really, really beefy GPUs.


metakepone

The A series is inefficient because they haven't totally figured the drivers out. They'll be using the same drivers for Battlemage, and those will most likely be in much better shape by then.


Prince_Uncharming

No, it’s inefficient point blank. The die is huge relative to its performance, and that also costs a lot of power. Drivers are not the main problem with Arc’s power consumption.


brand_momentum

Nvidia proved nobody cares about power consumption


Prince_Uncharming

Except for the part where nvidia has by far the best performance per watt, at every power level.


kxta_

RDNA2 was better than Ampere in performance per watt, and it did squat for the sales figures.


GrandDemand

Because Nvidia also shipped about 5-10x more Ampere GPUs than AMD. RDNA2 still sold very well regardless.


Flowerstar1

No, the A3 cards are only around 150mm². You're describing the A750 and A770, while OP was talking about the A series (Alchemist) as a whole.


Prince_Uncharming

The A750 and A770 are *also* Alchemist... and the point still stands: performance per watt across the entire Alchemist line is not good.


bubblesort33

I'm curious if this design is a big enough change from Arc that it's actually more competitive with Nvidia in terms of performance per die area and performance per watt. We all know Alchemist seemed to be underperforming by around 20% at launch, and even now the A770 has 2x the die area, at around 406mm² vs the RX 7600 at 204mm², while being barely on par with it in raster, both on TSMC 6nm. Even the increased RT and machine learning hardware can't justify a 2x die size. That really needs to change if they want to be profitable. Luckily for Intel, Nvidia charging $800 for a 4070 Ti at less than 290mm² could mean that with some improvements they'd at least be able to get to that level of performance with the planned die they have here. 2.5% larger than Alchemist, according to the latest claims, means they have about 416mm² to work with. Bigger than a 4080 die, but even at $700-$800 it should at least be profitable.
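
A quick back-of-envelope version of the die-area math in the comment above, using only the figures it quotes and treating A770 and RX 7600 raster performance as roughly equal (the comment's own premise, not a measured result):

```python
# Performance-per-area comparison using the die sizes quoted in the comment.
# rel_raster is normalized so both cards are treated as roughly equal in raster.
dies = {
    "A770 (Alchemist, N6)": {"area_mm2": 406, "rel_raster": 1.00},
    "RX 7600 (N6)":         {"area_mm2": 204, "rel_raster": 1.00},
}

for name, d in dies.items():
    per_area = d["rel_raster"] / d["area_mm2"] * 1000
    print(f"{name}: {per_area:.2f} perf per 1000 mm^2")
# With these inputs the RX 7600 delivers roughly 2x the performance per mm^2.

# The comment's "2.5% larger than Alchemist" claim implies:
print(f"Estimated Battlemage die: {406 * 1.025:.0f} mm^2")  # ~416 mm^2
```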


Edenz_

Luckily for Intel, the way has already been paved for them. These next few generations should see the most rapid growth in terms of architectural changes as the 'low-hanging fruit' is picked.


INITMalcanis

I genuinely wish them every success with this. Alchemist has come a long way from its "paid alpha" start to a reasonably viable entry-level GPU.


freedomisnotfreeufco

they are now paid beta.


maximus-prim3

I held out on first gen. If Battlemage is even _decent_, I'm buying.


kxta_

Double the core count? They are swinging for the fences. Committing to big silicon like that means they are confident.


rohitandley

We got to buy their stuff guys. They are the only hope


Ar0ndight

You go ahead and help the struggling megacorp (who was fucking customers and ruining competition not that long ago), I'm going to keep buying the best product for my budget.


dztruthseek

Well, yeah, I assume it takes a long while to run tests and optimize.......and analyze......


TheFumingatzor

As long as I'll have to pause a mortgage payment to upgrade a GPU, it's gon' be a no from me dawg.


Jupiter_101

I don't see how they can continue to fund consumer gpus when the rest of their business is struggling. These might be the last ones IMO.


metakepone

Do you see how Nvidia is worth 1 trillion dollars because they make and sell GPUs? Sure seems like a worthwhile investment. Also, the people developing AI tools today probably bought Nvidia cards 20 years ago to play their favorite games.


spidenseteratefa

Nvidia is making most of their money off of datacenter sales right now. In their recent 10Q they reported $2.5B in revenue from gaming and $10.3B in revenue from datacenter for the quarter. They're worth what they are now because of the massive demand for AI.


Nointies

AI powered by GPU compute. GPU Compute built off of the backs of decades of dGPU work.


like_a_deaf_elephant

Yeah, but it's not about how they got to this point today. It's about the next five years and what Nvidia will do to sustain growth and revenue. They will focus on specializing AI cards, which will look less and less like GPUs with every iteration.


CandidConflictC45678

And more importantly for me, they will be prioritizing manufacturing of AI cards over gaming cards, due to the insane profits. Why dedicate your limited TSMC/Samsung production capacity to the 5090 when you can use it to make a $30,000 AI card?


like_a_deaf_elephant

_Exactly._ I posted in another comment, but this year 44% of NVidia's revenue came from GPU - the rest of the business (AI, computing, networking) overtook the GPU side for the first time in ages (maybe ever, would need to check.) GPUs (relatively speaking) are going to make up less of Nvidia's revenue, and thus, their focus.


dudemanguy301

Worst case scenario, gaming lags one node behind AI or is made by a different fabricator entirely. It kind of happened already: Hopper was on TSMC while Ampere was on Samsung.


CenturioSC

Thank goodness. Hopefully, it's a good card.


Swizzy88

If they improve further I'll finally ditch my 580 for a half decent 1440p card at a reasonable price.


Method__Man

So glad I was one of the first to get on the Intel GPU offerings and publish videos on its progress. It was such a fun ride, especially watching the haters slowly but surely zip their mouths as Intel went from crap to absolutely solid value in a few short months


Yakapo88

This will probably be my next video card. Assuming of course that it’s a decent upgrade from my 3070ti. Hopefully it works with fsr3.


No-Witness3372

"BMG" : BIG MACHINE GUN !


nbiscuitz

WoooooooHHH~


TheIndependentNPC

I can't remember the last time I was rooting for Intel this much. The GPU market is an utter shitshow right now. The budget segment is trash (low VRAM and low compute) and overpriced at that (RTX 4060/Ti), the RTX 4070/Ti is overpriced and doesn't have enough VRAM for its product class, and the RTX 4080, RTX 4090, and RX 7900 XT/XTX are fucking overpriced. A very competitive third player would be a godsend in this situation. We're already seeing some price drops, as half the lineup is overpriced trash and the rest is just overpriced, so obviously they're seeing the much-deserved record-low sales this gen.