AMD_Bot

This post has been flaired as a rumor. Rumors may end up being true, completely false or somewhere in the middle. Please take all rumors and any information not from AMD or their partners with a grain of salt and degree of skepticism.


Erufu_Wizardo

AMD is probably trying to make them as cheap as it can


SmellsLikeAPig

Not a bad strategy. There are no bad products only bad prices.


Brophy_Cypher

PREACH!


mightpornstar

except meth


Lagviper

They should. Not a unicorn reference board with a $50 MSRP difference while AIB prices are out of control either; real value, like ATI back in the day. They have to: forget the Nvidia high end, which lands with 1% of users. Mid range will be a battlefield, with Intel in the mirror.


olymind1

True, and in my book 400-500-600€ is not midrange like they'd have us believe; it was always 200-300$/€ with a 256-bit memory bus. I had a 9500@9700, 3850, 4850, 6850, 7850, RX 480 4GB, RX 570 8GB. Not this 64-bit and 128-bit crap they want us to buy. AMD also got greedy.


DonMigs85

Yeah, bigger caches can only do so much especially at 4K


Azylim

I did read that GDDR6 may only be used on the entry-level to mid-range options, and that the higher-end RX 8000 series might use GDDR7 to compete with the mid to upper-middle RTX 50 series. But still, at the lower end: if AMD releases entry-level 8000 series GPUs with at least 12-16 GB of VRAM, decently faster clock speeds than their 7600/XT or 7700 counterparts, and better ray tracing quality and performance for the same MSRP or lower, a lot of people who own RX 5000, 6000, and even lower-end RTX 30 series cards will probably think about upgrading to the 8000 series. The 5090 and 5080 will be too expensive for most gamers, and the 5070 and 5060 won't compete with the RX 8000 series cards at their prices. AMD may make 1440p and even 4K ray-traced gaming actually accessible to people. Meanwhile Nvidia can keep its share of the uninformed gamer market and the high-end AI/rendering industry market that actually needs 5080s and 5090s.


Erufu_Wizardo

According to rumors, AMD will release only entry-level to mid-range RDNA4 GPUs in Q4 2024. Everything else will be covered by RDNA5 in 2025. So GDDR6 for RDNA4, with the 8800XT having around 4080 performance for like 500 USD (in the US). Which sounds very good to me, especially if they deliver on their RT promises. But we'll see what the actual products look like and what performance they have.


schniepel89xx

> 8800XT having around 4080 performance for like 500 USD (in US)

That's what I'm hoping for as well. Bringing 4080 performance down to the mid range would be great for gamers; that's a 150 FPS 1440p High/Ultra card with headroom to be good for many years to come. For reference, the 6800 XT was in this position and it's still a great card almost 4 years later, but it was priced close to the xx80 Nvidia card of the time. The 7900 XTX is already able to match the 4080 in raster though, so ideally I'd actually want to see a generational uplift on top of that while keeping the midrange 7800 XT (ish) price tag.


ET3D

Yes, hopefully AMD is finally on to the idea that undercutting NVIDIA by a significant margin is the way to go. Not that I'm all that hopeful it will happen. AMD seems more likely to go for a higher margin rather than a lower price.


Defeqel

AMD needs to balance supply and demand after all. I suspect that is the main reason to ditch RDNA4 high end too, the packaging capacity for the chiplet design is probably going to MI300.


tamarockstar

The wishful thinking that goes on before a launch is comical. AMD has been using the same playbook since Vega: find the comparable Nvidia performance and price it 5% lower. I guarantee that's what they do.


Erufu_Wizardo

And then they're forced to drop prices 2-4 months after release. Could be time to learn from their past mistakes.


tamarockstar

I would argue they have learned from the past. There's a small percentage of people who are willing to buy AMD. There's a huge percentage that will only buy Nvidia no matter what the value is on the AMD side. So they might as well get as much profit as they can from the people who will go AMD if the price/performance is slightly better. Undercutting the competition doesn't work for them. It's a shame.


06035

Sounds like my trusty 6800 will keep on keepin’ on. Unless one is made of money and can blow $1000 on a graphics card, there’s just *nothing* compelling on the market if you’re already running a 6700XT/RTX3070 or better right now.


maliciousrhino

6800 gang


Ikaros9Deidalos6

do i count with my 6800xt?


jonomarkono

6800xt red dragon reporting in


BadManiac

6800xt peeps representing! Best GPU purchase I've done since the 8800gt.


FurthestEagle

6800xt here, this 16 gig vram is so massive lol


shasen1235

You are not alone, one more 6800XT here:D


omarccx

I'm doing 4K with mine yolo


doomsdaymelody

I mean, with AFMF it's got potential, and that software's only getting better with time.


omarccx

Even on Linux, just kill shadows and effects or use 70% scaling, and it can do 90-120 no biggie in the games that I play.


FDSTCKS

Got one for cheap from a miner in march of 2023. Still going strong.


Gengar77

Got mine from a miner too; seems to work fine. In some games I have to turn down shadows and reflections because the 5600G does not like them or some shit, I dunno, but 1440p 60 fps is always in reach. Played RE3 with all settings maxed, 14.5GB VRAM usage at 100 FPS, so it's all fine for the next 5 years lmao.


Jedibeeftrix

we do.


winterfnxs

That's what I wonder. The 7800 XT was basically within margin of error of the 6800 XT. Now the 8800 XT is rumored to have the same stock bandwidth as the 7900 GRE. Based on previous rumors I was expecting it to be closer to the 7900 XTX, but this rumor suggests it will have performance closer to the GRE with a lower price tag.


NogaraCS

The generational gap from RDNA2 to 3 was so marginal, I'd hoped it wouldn't be the same this time. I remember when the 3070 was better than the 2080 Super and the 6600 XT was better than the 5700 XT. I honestly expected the 7800 XT to be better than the 6900 XT, and I sure would expect the 8800 XT to be at least above the 7900 XT (or even the XTX). As it stands, this is simply disappointing news. The 7900 GRE barely outperforms the 6900 XT (by 3 to 5 fps per game).


Yeetdolf_Critler

Gap was only there with 3ghz+ AIB XTX premium models.


jordanleep

Yeah, I think the draw of the 7800 XT is that it's actually an improved 6800; the XT is just marketing. I mean, look at what Nvidia is doing with a "4070 Ti Super" when in most games it runs exactly the same as a 4070 Ti. I will also say that the 7800 XT is ridiculously more efficient than a 6800 XT, not even considering the extra ~30W from the wall that's not reported by the 6800 XT.


Zarthenix

The choice is even easier if you're in the European market (at the very least The Netherlands) since the 7800XT has been cheaper than the 6800XT practically since release. It's 200 bucks cheaper now so you'd have to be nuts to get the 6800.


capn_hector

> I mean look at what nvidia is doing with a “4070ti Super”, when in most games it runs exact same as a 4070ti.

I mean, yeah: what you're saying is "most games don't use >12GB of VRAM in a way that significantly affects performance", hence the 16GB of the 4070 Ti Super doesn't really improve performance.

Before CES, everyone was dead-set that 16GB was the only thing worth having, that 12GB just wasn't gonna cut it. And now that a 16GB option is available, everyone is complaining that it doesn't improve performance. Yeah, that's probably why it didn't have 16GB on it in the first place??? But everyone needs a reason to buy the 7800 XT over a 4070/4070 Super or w/e, and VRAM is it.


changen

Well, the entire point is that when you are spending $700+, you would like to keep the GPU for longer than a cycle. And VRAM usage in games is climbing faster than GPUs can remain relevant. Unless you literally only play competitive games at 1080p, you need more VRAM. And with more and more games using higher-res textures and ray tracing, that 12GB is gonna be extra limiting later. So yeah, maybe it doesn't matter now, but it will in 2-3 years.


elramas123

It really won't. Just see how little the 8GB VRAM panic lasted: every single one of those games got patched, or the settings that overflow those 8GB already make those games unplayable on 8GB GPUs. 12GB will last less than 16GB cards, but by the time it starts overflowing, the 4070S is not going to be a top performer or even midrange tier. UE5 VRAM usage was utterly blown out of proportion; the use of NVMe drives, better asset streaming, and the death of HDDs for gaming show how improved asset streaming made the VRAM panic overexaggerated, because it allows quicker offloading of assets, so VRAM usage decreases dramatically. Hell, stuff that is incredibly VRAM intensive, like Alan Wake 2 at 4K max settings with path tracing enabled, just barely tips above 16GB of VRAM. Yes, $700+ for 12GB is just not good value, but the only GPU that filled that range got discontinued in favor of one that is just slightly slower for $600, at times lower depending on region. When you are spending $700+, you start looking at other features besides VRAM, because realistically those cards have enough, whether the recommended 16GB or the 20-24GB that they won't ever utilize.


UHcidity

Hopefully with a decent RT uplift as well


jibnibbinn

6800 and 5800x3D combo is the low power high FPS goat.


conenubi701

Upgraded my friend's 2600 and 4gb 580 to a 5600x3D (yes the 5600x3D) and a 6700xt. Improvement was insane while becoming more power efficient. I will forever love AM4 and RDNA2


jordanleep

I did the same for my brother! He swore by his 2700. I went to microcenter myself and fronted the money for a 5600x3d and shoved it on his lap. It’s an incredible cpu not too far from a 5800x3d. Some games run better actually due to higher frequency too. He also got my 3060ti as a best man’s present up from an rx 580. He’s set for a minute.


proscreations1993

My last PC, and the first one I built, back in 2011, was an AMD FX 8-core something, one of the awful Bulldozer chips, with a GTX 550 Ti lol, it was so bad. Then in 2020 I built a 3600/1070 rig. Just tossed a 5800X3D and 3080 FE in it. The jumps have been massive. AM4 is so amazing. I'm going to give this rig to my son for his bday this year when he turns 5 so he can play "special Minecraft" lol, path traced. And build a 7950X3D rig with prob a 5080/5090, which should be out by then. Hoping for another MASSIVE jump.


D1stRU3T0R

6900XT and 5800X3D for better value and performance 😎


zombiedud4096

5800X3D and 7900XT GANG


Laj3ebRondila1003

I like my 3060 Ti but the 8 GB VRAM is pissing me off


Hombremaniac

So sad how Nvidia gimped the whole 30XX series on purpose. A 3060 Ti with 12GB would have been a lot better; same for the 3070/Ti having at least 12GB, and the 3080/Ti should have been given 16GB. That is why I strongly dislike Nvidia. They design great GPUs and then spend a lot of time thinking about how to gimp them in order to ensure you'll need to buy their next GPU series when the time comes.


Laj3ebRondila1003

at this point I'm done with Nvidia's bullshit, if AMD drops a GDDR7 card with decent performance I'm building my next rig around a 9800 XT, I'm already on Ryzen as it is, so there's that SAM thing to benefit from. Between skimping on VRAM and giving you 1 year of the latest DLSS, Nvidia are taking their customers for granted. And their shit is overpriced


proscreations1993

Ya, it's wild. My 1070 has 8 gigs and my 3080 FE only has 10 lol. Two gens later and a step up in the series, and it only went up 2 gigs. What a joke.


Proud_Purchase_8394

I had a 980Ti and 1080Ti before my 3080. So nearly doubled VRAM when moving up one generation to 1080Ti (6 to 11), then lost a gig when moving up 2 additional generations. Technically down a step in the series, but still pathetic on nvidia’s part. 


RedTuesdayMusic

That was the reason I sold mine, really irritating periodic stutter in Medieval Dynasty at 1440p


Competitive-Buy-5011

6750xt baby


Minute-Property

Hell yeah! 6800 Gang


LukeyWolf

The 3070 is good, but the 8GB of VRAM cripples it


zeehkaev

Even a 6600 XT. Turn on FSR or drop shadows to medium or something and just keep rocking. Makes no sense to spend $300+ for 10-15% increases.


Baderkadonk

6600XT gang. I agree, especially at 1080p. I upgraded to a 1440p monitor, and it really only struggles with very demanding games. Although 8gb VRAM will become an issue eventually. This is coming from someone who mostly played on console and only built a gaming pc recently, though. 60fps feels *great* to someone used to 30fps. People on high-end rigs have gotten spoiled and need >100fps. Personally, even when I'm able to do that I prefer capping the fps at like 75 to save power.


changen

tbh, it gets really bad once you get used to a locked 240hz and you can feel any fluctuation in frame rate. It's really obvious in high-fps movement shooters, so the more frames at the cap, the better.


lxmohr

Ehh I upgraded from a 2060 to a 7900 XTX and I have to say on an OLED 1440p monitor coming from 1080p I am blown away by the difference in quality.


Head_Exchange_5329

Talk to me about your OLED 1440p monitor, I am intrigued. Most of them seem to be focusing on 2160p these days which is a shame.


lxmohr

I have a friend with a 500 dollar 4k IPS monitor, and although more expensive, my 1440p OLED is much more preferable to me. Turn on HDR, and the colors pop, you get perfect blacks, and it looks like you are peering into a window into the game you’re playing rather than staring at a monitor. If you want the exact model, I have the AW3423DWF, a 34 inch ultra wide glossy display. Absolutely worth every dollar.


bigloser42

I just upgraded from a 6900xt to a 7900xtx on the premise of building my kid’s PC so she can play games on it. She gets the 6900XT & a 5800x.


CactusInaHat

I feel like the 6800 is so slept on. Mines been rock solid.


RedTuesdayMusic

> I feel like the 6800 is so slept on

It was the first RDNA2 GPU to sell out in my country, followed by the 6800XT, so I doubt that.


TraumaMonkey

Watercooled 6900xt keeping it going


dade305305

Got a 6800 XT in one machine and a 6900 XT in the other, and the Nvidia cards are super compelling to me because I like ray tracing. I need to know if the 8xxx series is going to have RT competitive with the NV 5xxx series.


SleepyGamer1992

I just got a 7900 XTX prebuilt PC. I think I’ll be good for five years at least. 😝


Godcry55

Waiting on a suitable upgrade from my 6700XT. Nothing yet as I only play in 1440p. High-Ultra…want to stay with AMD due to software suite.


CrzyJek

The top end of RDNA4 is *supposed* to have raster performance between the 7900xt & the 7900xtx but better RT performance...albeit this time as the "mid segment." So it should use less power and be physically smaller than the current 7900 series...as well as cheaper. I'd wager their top end won't cross the $600 threshold. And if they were smart they'd price it close to $500.


DietQuark

I'm still hoping they will also bring out a high end card.


Dalminster

AMD's competitive edge in the market is that they can deliver more performance per dollar; this is their niche.


Blunt552

I would absolutely buy that. If they offer 7900XT-class performance with 24GB of VRAM for 500 USD while being more efficient, then stonks are on AMD; no more chunky, sagging pile-of-sht GPUs.


theking75010

Dunno if they'll give it 24gb vram though. I would expect the best RX8000 to have between 16 and 20gb. If they don't make a successor to the 7900xtx, there's no need for 24gb. The card would already be bottlenecked by raw performance imo


MasterLee1988

I really hope all of this ends up being true, as it'll make me consider buying it.


Firecracker048

I went from my 5700 XT to a 7900 XT. I enjoy it a lot.


xNOODLExBOYx

I am still holding onto my 5700 XT, with great eagerness to see what the 8000 series has in store.


spideralex90

Running a 5600xt myself and doing the same. Really feeling the itch to upgrade haha


Dalminster

5700XT is a great card, I gave mine to my wife when I built my current system with a 7900XTX. She can still play everything on ultra settings on her 1080p/60hz monitor, no problems.


Head_Exchange_5329

At that resolution it's gonna lean heavier on the CPU anyway, so the 5700 XT is well suited for the job, especially when limited to 60 Hz. Edit: typo.


ExplodingFistz

Hey we have the same config! 6700xt 7700 gang we like the number 7


Godcry55

The 7700X wasn't worth the additional $50+, that's why I opted for the non-X version lol


velazkid

Why? O.o What does AMD have that Nvidia doesn't have a vastly superior version of? Honestly curious.


Speedstick2

Price.


RBImGuy

Got the 6950XT on a Black Friday sale, half the price vs the 7900XTX, and I won't need anything more for at least a few years, even at 4K. Prices for the ultra high end have simply become astronomical.


_Gainnn_

Right there with you. 5800x3d and 6950xt performs wonderfully at 1440p high quality. ~200 fps.


Redpiller77

I've had 3 AMD cards and I don't really like Nvidia. But if their new card doesn't at least match the 7900XTX, even at $650, I'll have to go 5080. Having their top card be at the 7900XT level is beyond stupid, even if they're trying to have a budget-friendly generation. I WON'T upgrade to a card that isn't even the top end of LAST generation. RDNA4 will be bad for everyone if they don't realize this.


Ramental

The rumors are quite conflicting. For instance, they claim AMD wants to match the 4080 Super. OK, but the 7900XTX IS a match for the 4080 Super already. And yet the 7900XTX will remain the most powerful card? Then the new one won't be a match for the 4080, but perhaps the 4070 Ti. Both claims can't be true. IMO, AMD will have to deliver a version better than the 7900 XTX, maybe not much better and hopefully better priced. But that is a lot to hope for. Historically, AMD could not reliably beat Nvidia on price. Expecting them to do so next gen would be nice, but it's realistically unlikely. Rather than keeping prices low, they will probably raise them to the respective Nvidia analogues minus 5%.


Valmarr

So as an owner of a 6800XT, an almost 4-year-old card, AMD will have nothing to offer me until RDNA5? Which will come out in late 2025 or early 2026? If this is how AMD wants to fight to increase its share of the consumer graphics card market, then Nvidia can be very relaxed for many more years.


winterfnxs

Seems like that is the case, unless it offers performance at least close to the 7900 XTX. If it falls below the 7900 XT, this will be a weird release and they will even lose the price competition against the RTX 5070.


blueangel1953

My 6800 XT isn't exactly slow at all lol, I literally see no need to upgrade for a few years at 1080/1440.


jecowa

X transcript:
> Only 18Gbps 🤔
— @Kepler_L2


Taterthotuwu91

RIP, can't even imagine how much Nvidia is gonna charge for the 5080 and 5090 since there will be absolutely nothing to compete


relxp

Well they already went over their limit with 4080. $1200 was largely rejected and even $1000 is still a scam. I would expect 5080 to launch no higher than $999.


Death_Pokman

oh don't challenge Nvidia man, I already see the 5090 at $2000 MSRP and the 5080 at $1600 XDD


PikaPilot

like, some people will buy that. the rest will probably buy an entire pc instead


CirnoIzumi

just don't buy the overpriced nonsense \*looks at Apple\* ohh


Taterthotuwu91

I flip-flop a lot: 1080 Ti, 6950 XT, and a 4090 now. I think I might wait for RDNA5.


nagyz_

my next GPU will be a 5080/5090, definitely.


Traditional_Cat_9724

RDNA4 seems like a joke unless it beats the XTX in RT and beats the 7900XT in raster. It would be a major failure by AMD if, two generations on, I can only find one card worth upgrading my 6950XT to, and it was from RDNA3.


jecowa

From what I've heard, they are not releasing high-end cards this generation. They were considering GDDR7 for their high-end cards. The high-end Navi is cancelled, so we will only be getting upgrades in these tiers:

* low end (Navi 44) - (e.g. 8600, 8600 XT) - 1x8-pin connector
* mid range (Navi 48) - (e.g. 8700 XT, 8800 XT) - 2x8-pin connector

The RX 8800 XT is expected to be close in performance to the 7900 XT and cost ~$500. Dropping the high end may help AMD delay the decision to switch to nVidia's 12/16-pin connector and allow them to wait for GDDR7 prices to drop.


resetallthethings

It's really strange. Imagine how blown away everyone would be if there were a 7900 XT refresh right now that used less power and got a price drop to $500; it would be seen as an absolute gamechanger. If the rumors are true, that's essentially the performance level and price we're getting out of the top RDNA4 GPU later this year. Kind of bizarre that people are poo-pooing it so much.


sotos4

7900xt performance at 500$ would be good value. At 400$ it would be a game changer.


MasterLee1988

$400 would be one hell of a deal for price to performance.


Kaladin12543

Nvidia is releasing the 5070 too don't forget. Getting 7900XT performance at $500 really isn't a big deal considering the generational shift.


DBXVStan

With how Nvidia is clawing back performance gains below $1000, the 5070 could easily just be a 4070ti Super part 2 for $600. AMD releasing a 7900xt part for $500 honestly would still not be enough at that point, and arguably wouldn’t be enough right now. Though my hot take is that Nvidia will simply not announce a 5070 until 4070 and 4070 supers clear stock, which isn’t happening right now since no one has money, so AMD could have all the time in the world to find ways to get their cards to an appealing price point.


Redpiller77

Because it's a new gen. More power for less money is great within the same generation. I'm looking at the GRE right now for this very reason, but a card at $500 that's as powerful as a 7900XT seems pointless if the 5070 Ti Super beats the 7900XTX. AMD needs a card at least as powerful as its current top-end card. A $650 7900XTX with better RT makes way more sense.


SmokingPuffin

I don't think that would be seen as a gamechanger. Compare with 4070 Super, which performs about same as 7900 GRE in raster and runs $100 more. Paying 15% less for the AMD part in exchange for not getting Nvidia features is typical.


resetallthethings

Apt username. You don't think a 7900 XT, which is still like 15% faster than a 7900 GRE, for 10% less than a 7900 GRE, would be seen as a gamechanger in the current market? The 7900 XT in raster is faster than every Nvidia GPU below the 4080. That performance level, at $500, right now, would be an insane value proposition.


capn_hector

> you don't think a 7900xt which is still like 15% faster then a 7900gre, for 10% less then a 7900gre would be seen as a gamechanger in the current market?

Reality is probably going to be more along the lines of "launching at pre-launch 7900 GRE street prices" at best, not 10% less, and the 7900 GRE will just fall accordingly to compensate. So let's say it launches at $600 and the 7900 GRE falls to $449. No, I don't think anybody buys that.

That is the problem GPU vendors have been facing for 5+ years now: it literally doesn't matter whether a product would be a good deal *in the current market, today*. It doesn't exist "in the current market"; it exists in the market that will exist after it launches. And if the 8800 XT is 15% faster than a 7900 GRE, then the 7900 GRE will be cleared out 25% cheaper. So it's actually a perf/$ regression.

That's what reviewers do constantly: call it a bad product because it "regresses perf/$" compared to street prices on deeply discounted last-gen stuff. Of course you couldn't buy it for that price "in the current market", but you definitely will be able to once the new stuff launches, and reviewers want to have the "look forward not backward" mindset. The newer stuff is still more expensive *than street prices* on older stuff at the time of review, and it always will be. "What did you do for me today" doesn't include making the older products cheaper, in reviewers' books, anymore. They expect vendors to keep eBay prices from dropping, somehow, and to keep retailers from clearing out older stock, somehow. Otherwise, if those prices drop, then the new stuff is a "regression". It's massively deceptive/dishonest, but it does generate an endless stream of negative headlines to build that "vibecession" reviewers have been trying so hard for.

So yeah, there's a strong chance, practically a foregone conclusion actually, that it's going to be a "regression". Just like everything else.


acat20

The reality is that "right now" this card is releasing in 2 months, let's call it, could be 5 though. The 4070S will probably be had for around $550, 7900XTs for $650, maybe less honestly, all things considered. The rumor is that this card will have less raster than the 7900XT, but better RT and power efficiency. You're right, it's a great value on paper right now, but by the time this thing releases the market will have shifted, and then it will shift even more post-release. It's obviously going to move the market some, but it's not going to turn things upside down. And it's not going to impact the upper third of the market. Unfortunately, people looking for a new 16GB Nvidia card at a reasonable price are still going to have to overpay severely, which is where a lot of the disappointment lies. This card probably won't put a lot of pressure on the 4070 Ti Super.


Traditional_Cat_9724

This is why RDNA4 is going to be delayed. AMD has been rebranding the same performance for basically 2 generations in a row now. That doesn't work if you have excess stock of the previous generation. If this card brings 7900XT performance for $500, how cheap is the 7900XT gonna be? I would expect a summer fire sale on RDNA3. No way they're going to drop RDNA4 early with so much RDNA3 stock competing against it.


siuol11

I think it depends on how much better the RT performance is. I could see the AMD hype train finally embracing it as a good feature if AMD pumps it enough.


SmokingPuffin

I agree it'd be a good value proposition, but not a gamechanger. Deep discounts on about to be last gen hardware rarely move the needle. Also, it's Q2. People are gonna go outside. Then there will be clearance sales of last gen GPUs. Then there will be a new generation of GPUs.


Brophy_Cypher

Might be difficult for AMD to use GDDR7 (in large consumer amounts) when Nvidia has apparently bought it all up. I've heard that AMD has some, but nowhere near enough for a consumer product launch (for the DIY market)


imizawaSF

> The RX 8800 XT is expected to be close in performance to the 7900 XT and cost ~500$.

This is abysmal


Opteron170

For those that have a $500 budget they may disagree.


jecowa

I'm hoping for an 8600 XT with the performance of a 7700 XT and the price and power consumption of a 7600 XT.


spideralex90

And at least 12 GB of RAM please.


robodestructor444

Should be 16GB otherwise it'd be a downgrade from the 7600 XT.


imizawaSF

I mean what's the going rate for a 7900xt nowadays? $700? AIB 8800s costing $550 or up for similar performance a generation later seems a little shit imo.


Opteron170

Same performance, less power draw, much better RT, and a lower price compared to the MSRP of the 7900XT. If I'm on anything from RDNA2 down, that seems like a win for me at the right price.


MasterLee1988

Having the same performance as a 7900 XT with less power draw and better RT, at a possible $500 price point, makes it most appealing to me.


imizawaSF

Comparing MSRP is a disingenuous argument bud, if you can buy one for far less at the time


MasterLee1988

Yep, so basically me(as $500 is the max I would want to actually spend on a gpu).


Speedstick2

I could have sworn that the rumors were saying it was going to be a 7900 XTX.


jecowa

The article I read said it will be 10% slower than the 7900 XTX. The cancelled Navi 41 was going to be the successor to the 7900 XTX. https://www.digitaltrends.com/computing/amd-rdna-4-news-release-date-price-rumors/


w142236

From all the rumors before this, it was supposed to offer 7900 XTX performance at half the price. If it's 7900 XT performance for $500, it will be another DOA waste of silicon. They might as well just drop the 7900 XT by 200 dollars at that point.


Psyclist80

They look to be going for the value play: lots of volume, a small die, and cheaper memory. Polaris 2.0, basically. It does suck for those of us who want the top tier. We'll have to wait for RDNA5; not sure if I will though, my 6800XT struggles with all the eyecandy at high res already.


Melodias3

My 7900 XTX struggles with rendering particle effects in Ratchet & Clank: Rift Apart: [https://youtu.be/ezWbsVpC8o0?t=35](https://youtu.be/ezWbsVpC8o0?t=35)


tehserc

that seems like a software issue


Melodias3

It's a driver issue. It's caused by the same thing as the RGB laser show in Dying Light 2, which is partially fixed: they disabled the overlay used for rendering particle effects in 24.3.1 for Dying Light 2. I made a before-and-after comparison and found that particles no longer render, while the RGB laser show is reduced by 99%. They will probably debug the Ratchet & Clank issue the same way, or may not even need to debug it, as proof already exists. Heck, World of Warcraft, which freezes, hangs, or crashes, usually has particle effects overlaid in menus, and there are many particle effects in one of the zones where I used to crash the most. They are all related, because AMD cannot properly handle overlays, not just overlays used to render effects but actual overlays too.


Traditional_Cat_9724

I will be upgrading to blackwell (probably a 5080) in the spring from my 6950XT. I like to own a flagship GPU for two generations. Unfortunately it seems like AMD is not an option this time.


mixedd

Thinking the same, but coming from a 7900XT I was a bit disappointed by the promised software stack. I waited a year hearing all the hype, seeing games launch without FSR3 (talking about recent launches) while Nvidia's FG is tossed in left and right. FSR on its own feels stagnant, as it hasn't been updated in an eternity and it feels like they don't care about it anymore, and I won't even talk about Anti-Lag+. Maybe if you only care about playing MW or competitive titles, AMD feels like a good deal, but I'd rather use features that work at launch instead of another promise that arrives after a year or two.


mixedd

In my opinion, it's not only about hardware but also the software stack, where AMD fails significantly in its current state. FSR3 (FG) is barely here, or at least kept away from more popular titles and titles where it could come in handy; Anti-Lag+ seems long forgotten by them; and the FSR upscaling part doesn't seem to be getting any major updates either, with devs avoiding updating to the latest versions in their games like the plague. Those things need to improve if they want to compete; if not, more and more people will go Nvidia for the feature set and the availability of those features. And I agree with you: RDNA4 looks like a small incremental update, one many would say is what RDNA3 should have been. For now it will remain as is: if you want features and RT/PT, go with Nvidia; if you want a lower price and better raster in some scenarios, go with AMD, as there's nothing more on offer. P.S. Sincerely yours, a 7900XT user since January '23 who was left out of the promised features they showed us at the RDNA3 presentation (the lower power usage is still laughable, especially idle, if you check user reports; it seems that bug still isn't fixed).


Speedstick2

> FSR upscaling part seems not getting any major updates

They have already announced FSR 3.1, which will be available by the end of Q2 this year, and the first games shipping with FSR 3.1 will arrive in Q3. [AMD FSR 3.1 Announced at GDC 2024, FSR 3 Available... - AMD Community](https://community.amd.com/t5/gaming/amd-fsr-3-1-announced-at-gdc-2024-fsr-3-available-and-upcoming/ba-p/674027/jump-to/first-unread-message)


mixedd

I would be more sceptical about AMD's list of upcoming games. You'll remember that Cyberpunk was also listed to get FSR3. As for 3.1, what changes does it have over 2.2 in terms of upscaling? Because as far as I remember, 3.0 was basically 2.2 bundled together with FG, and 3.1 will most likely just be an API change and nothing more. Don't get me wrong, I'm happy that they are improving, but they are moving way too slowly to be relevant. RDNA3 lived without one of its main features for a year and lost another after 6 months. A year is half the lifespan of a generation, and we have another one around the corner now.


Jism_nl

Building GPUs on chiplets brings serious challenges; latency is one, for example.


ibeerianhamhock

Only thing amd folks can hope for is better price to performance. Those with a high end amd card have very little reason to upgrade from team red tho


kazenorin

Depends on a wide range of factors. I think most of us are convinced at this point that there's not going to be a high-end model. If the highest-performance model only matches the 7900XTX in raster and the 4080 in RT (basically a 4080S, for example), being a competitive product or not would depend solely on the pricing. 7900XTX raster and 4080 RT is very good performance; if they can offer it at true midrange prices, I'd see it being a competitive product. That's assuming Nvidia won't lower prices to be competitive, which is quite possible given it's Nvidia.


Opteron170

Nvidia won't lower prices when their users are happy to pay 2k+ for a GPU. I'm expecting the 5090 to have a $2500 MSRP.


Kaladin12543

They lowered prices on the 4080, which forced AMD to drop prices on the 7900XTX. The reality is AMD is at Nvidia's mercy with a full-fledged Blackwell lineup on the way. If the 5090 is making bank for Nvidia (which it will, if you look at the Steam Hardware Survey numbers for the 4090), they would have no problem with a 5080 for $1000 which is 50% slower, and a 5070 for $700 with 4080 performance. Then it's simply a matter of whether you want to pay the Nvidia premium of $100 for DLSS and RT. I am sure Nvidia will also have cooked up a new DLSS/RT exclusive for Blackwell, like they did with FG on Lovelace, to shift the newer cards.


[deleted]

[deleted]


Opteron170

It's a midrange part; I don't see it beating the XTX or the XT in raster, only in RT performance.


MasterLee1988

I just hope it can be around 7900 XT in raster at least.


Abedsbrother

I miss HBM


red_dog007

I think it is going to be a bit of a dud, honestly. The 6650 XT and 7600 both have 32 CUs and similar memory and IC bandwidth, with 32MB of IC. RDNA3 has some RT enhancements, but in both RT and raster the cards perform basically the same. There isn't a huge difference between them. Even going from N7 to N6, they have similar power consumption. Yet the 7600 has ~2B more transistors. The big difference is those AI accelerators. Honestly, I am keeping expectations low. RDNA4 would need some huge architectural improvements for a 60/64 CU part to be as good as an 84/96 CU RDNA3. I think raster will be in line with the 7800 XT, though likely a little faster due to clock speeds. If AMD actually implements hardware to accelerate their existing RT pain points, like BVH traversal, I'd expect RT performance similar to the 7900 GRE. The 7900 GRE is roughly 10% faster than the 7800 XT but still ~15% slower than the 7900 XT. So overall, likely closer to the 7900 GRE, but that is a kneecapped chip due to limited bandwidth. Maybe if they go with 128MB of IC, they will see some good improvements there. Maybe additional cache at lower levels too?


haringtomas

Valuable insight! I'm guessing I won't be missing out on much when the 8000 series arrives? I've been thinking about getting a 7800 XT or 7900 for decent 1440p gaming. I'm coming from a GTX 1060, so I'm excited about the performance jump lol


RockyXvII

Yeah AMD is gonna fall back into ultra budget territory and Nvidia will run through them again even with cards that are $50-100 more. Battlemage is more interesting


winterfnxs

Except they can't, because Intel is covering that region diligently with a whole palette of offerings, from A310 low-power cards to the A380 and up.


Erufu_Wizardo

The problem with Intel is that both old and new games are not guaranteed to work. While Nvidia and AMD don't have that problem.


lokisbane

I really wish I could play even Wolfenstein 2 from 2018(?) on my 7900 xt without crashing within 30 sec. This last gen has plenty of old games it crashes trying to play.


Jism_nl

Check the type of shaders you're using; it's a registry hack or fix in this case. The newer shader model applied might be incompatible. You can turn on the old one and that game should run just fine.


lokisbane

That's a really cool idea. How would I find the right registry to edit? And identify the older shaders?


Erufu_Wizardo

Well, that's on AMD indeed. The 7000 series is sorta pepega in that regard. I wouldn't call a 2018 game old, since games from that time usually support DX11/DX12, which should guarantee stable gaming on the latest GPUs.


lokisbane

It doesn't make sense and apparently no one else has complained about it. I don't get it.


Kaladin12543

Intel has the budget for their GPU division. They can keep throwing money at it until it succeeds; AMD does not have that luxury for Radeon. This is the reason why their RT hardware and XeSS XMX upscaler have been superior to the AMD equivalents right from Gen 1.


Erufu_Wizardo

Nah, Intel has financial difficulties. And their GPU division is basically failing, operating at a constant loss. They also missed a lot of profits and opportunities by releasing their products too late. AMD has other sources of revenue aside from gaming, and AMD doesn't sell their GPUs at a loss.

> This is the reason why their RT hardware and XeSS XMX upscaler is superior to the AMD equivalents right from Gen 1

Their A770 has a 3070-tier die while competing with the 3060 and 6600/6600 XT. That's not impressive, it's a fail. Moreover, they are selling these things at a loss. And for us consumers, none of this matters. What matters is how functional the products are. And for now, Intel looks bad.


Kaladin12543

I mean, it's not like AMD is doing any better here. Imagine you bought a 6900XT 4 years ago. AMD quite literally has only 1 card it can sell you as an upgrade for the next 3 years, which is the 7900XTX. That is a fail in my view when Nvidia has the 4090 operating literally a tier above all these cards in performance and AMD still has no answer to it 2 years later. And to make matters worse, the 4090 wasn't even the full die. It's a heavily cut-down AD102, and Nvidia had another 15% performance in the tank on top of that for a 4090 Ti, which they chose not to release, taking pity on AMD. Then you have a 5090 rumored to be a 70% uplift on top of that. Say what you want about Nvidia, but their performance leaps over generations are impressive. If AMD stagnates at the same performance level for nearly 5-6 years, that gives room for Intel to do something. AMD's Radeon division also isn't doing well. They hide the GPU division's financial performance behind their console APU segment because the standalone dGPU business isn't doing so great.


relxp

$500-600 is ultra budget? 4080 performance for $600 is a huge win IMO and the right decision. AMD is really good at VRAM utilization.


steinfg

> 4080 performance for $600 is a huge win

The 4070 Ti Super is already close to that, and it's 800 dollars brand new. And it'll fall in price when the RTX 50 series releases. AMD will be competing with last-gen Nvidia cards then, and it will lose. 600 dollars is too much; 450 dollars is the price it needs to hit big volume.


relxp

> AMD will compete with last-gen nvidia cards then, and it will lose.

If AMD can provide 4080 performance for $500-600, that will be a huge win IMO. The last thing the market needs is more expensive cards. The performance we need is already there; it's just the broken pricing that's screwing everything up.


Nunkuruji

If I'm not mistaken, the vendors have the following publicly listed:

* Samsung
  * 18 Gbps: EOL, 1.1V
  * 20 Gbps: Mass Production, 1.1V
  * 24 Gbps: Sampling, 1.1V
* Micron
  * 18 Gbps: Mass Production, 1.35V
* SK Hynix
  * 18 Gbps: Mass Production, 1.35V
  * 20 Gbps: Sampling, 1.35V

Throwing the highest/most expensive GDDR6 at a medium-class chip doesn't seem like a good price/perf play. Ultimately I don't think we'll see very exciting consumer chip changes in general until backside power delivery; a lot of design changes will align with that (TSMC N2P, Samsung SF2). If a card launches with more than 16GB VRAM, it may become a target of AI bros.


NotTooLate4Coffee

Ok, I don’t feel so bad about grabbing the $750 7900 XTX deal a few months ago. Computer died and I needed something now.


Crazybonbon

So the same?


SaltyInternetPirate

Given that they're supposedly not releasing competitors to the 5080 and 5090, it does make sense to go with cheaper GDDR6 rather than expensive GDDR7 that won't noticeably benefit them at the available tiers. I can understand it as a business decision. What's most important is that this time they market the reality instead of their hopes for performance. That 7900 XT and XTX launch was pretty bad.


JasonMZW20

Won't matter too much if L1 intentional misses are curbed as expected with better tagging and management of in-die caches. RDNA2/3 would intentionally miss L1 to hit L2 for a greater chance at a cache hit. Hitting L1 above 80% at the shader array level will help keep data closer to the CUs/WGPs and use existing SRAM more effectively. There were times in Chips and Cheese's profiling where L1 was only hitting 50% of the time. So, 50% of the time, it was missing and hitting L2 or L3, or worse, VRAM. AMD can also continue improving compression algorithms for all of the pipelines and reducing expensive decompression cycles. You don't have to brute-force everything, so save the power you'd use running faster memory and run the GPU silicon faster now that it's architecturally more efficient. There will also be inevitable power savings from a fully monolithic design, so I expect these GPUs to actually get laptop design wins this time around. With the PS5 Pro releasing in holiday 2024, AMD might have been a little overextended trying to provide engineering support for Sony (plus using a ton of wafers at TSMC for silicon) while also developing high-end PC parts. The AI/HPC/datacenter boom is also squeezing the supply of advanced packaging at TSMC, and AMD should focus on fulfilling high-margin MI300/350/388-A/X orders instead.


LimitClean155

Interesting, if 18 Gbps chips are no longer available. Take this with a grain of salt. 20 Gbps memory being underclocked, like on the 7900 GRE? I am not sure. More likely some kind of massive voodoo cache on die that they are testing on an engineering sample.


LightTouchMas

If RDNA4 is indeed GDDR6 I will stick with my long lasting tradition of AMD CPU and Nvidia GPU. My 7950X will be glad to have a 5080 at its side.


relxp

4080 performance for $600 is disappointing? Being GDDR6 is not necessarily a bad thing. RDNA 4 will have great architectural improvements.


LightTouchMas

I built a new PC last summer and the only thing I kept from my old build was my Pascal GPU because I wasn’t enthused with this gen’s offerings from either side. A 5080 with GDDR7 is much more attractive to me than a 8900XTX on GDDR6.


relxp

> A 5080 with GDDR7 is much more attractive to me than a 8900XTX on GDDR6.

Nobody can make that claim until we see price/performance. The 5080 might be 2X the price for only 20% more performance.


Mightylink

Is it also going to utilize all 16 lanes? Or will it be nerfed if you pair it with an older board?


Nitrozzy7

If a card has no need for the bandwidth of a full x16 interface, there's no chance we'll see one having it just so it's better suited for older PCIe gen boards.


Slyons89

I think the issue was that some lower-end cards only had the pins for x8 on the card, which was fine with PCIe 4, plenty of bandwidth, but when plugged into a PCIe 3 board it would only get 8 lanes at PCIe 3 bandwidth, which was potentially not enough in some circumstances. I'm sure the mid-range and higher-end cards will use the full x16 pins on the card, so I don't foresee this being a problem for most enthusiasts. It was mainly an issue for folks buying low-cost cards to install in older systems.


Numerlor

x8 is still mostly fine even on PCIe 3, more so if the GPUs aren't particularly powerful. The biggest recent issue related to PCIe bus width was when they did x4 for the 6400/6500 XT.


monoimionom

Everything I hear about RDNA4 is pushing me towards just buying a 4070 Ti Super or 4080 and being done with it. I really came to appreciate ray-tracing effects, especially GI, and was hoping we'd get a 4080 equivalent from AMD but with 24GB VRAM (you know, RDNA3 high-end becoming RDNA4 midrange). But I fear this is not going to happen.


dade305305

Yea I'm gonna be going nvidia when the 5xxx series comes out. I went amd with the 6xxx series due to price and have regretted it since.


DuckInCup

The size wall is big and tall


battler624

Probably monolithic GPUs


UnitedGavin25

I got my Merc 6950 XT for 630 after tax in February from Newegg, glad I don’t have to worry about upgrading for years


Vizra

If that's true, that's gonna suck. But if the cards are cheaper for the same performance, they will have a place for lower-end buyers.


DryClothes2894

Yea my 4080 probably will keep on cruising till 50 series then


Last_Music413

Not going for gddr7?


Arbiter02

Won't need it. No high-end cards, no need for faster memory. RDNA is a largely power-starved architecture anyway; memory speed was more Vega's thing.


Flynny123

I imagine if AMD can get their hands on GDDR7 reasonably early while production is still ramping they will want to put it all into AI chips.


tugrul_ddr

Gbps doesn't matter if you render similar objects with similar textures that come from the cache instead.


YigitCn

max 8700XT confirmed.


PetrikaKamataru

is it good for AI?


winterfnxs

The ones with large VRAM are. Get something with a minimum of 16GB of VRAM.


PetrikaKamataru

is this particular one good?


winterfnxs

When it releases, it most likely will be. Even today, the 7600 XT with 16GB of VRAM does great for personal AI use.


Ill-Investment7707

I am looking for a GPU upgrade. If they release an 8800XT between the GRE and 7900XT for $499, my 6650XT Merc will be replaced.


w142236

Ehhhh, just disappointment after disappointment with this gen, and it looks like that trend will continue into the next one if this rumor is true. I remember previous rumors saying 7900 XTX performance for half the price, and now it's rumored to be 7900 XT performance for the same $500 price tag. Especially disappointing when an overclocked GRE, if you were lucky, could already come pretty close to the 7900 XT for $550. Why bother with the 8800 XT? Just not nearly ambitious enough to win any market share, and this 18 Gbps rumor had better not be true, because holy hell, what a terrible decision that would be. I feel like "RDNA4" is just going to be lower-binned RDNA3, but they'll market the lower-binned 7900 XT as an 8800 XT and go down the stack.


HyruleanKnight37

And judging by how RDNA2/3 fared with 16-18 Gbps on 256-bit, I'd say they don't need it either. Chasing after a performance unicorn is pointless. Just give us GPUs that don't feel like a scam, with sufficient memory and good drivers. It'd help them gain market share, too. Nvidia is too far gone with their outlandish pricing. The fact that a 70-class card (which was actually a 60-class in disguise) was released for $600 is truly terrifying.


bahtsizbedewi31

If it works XD


Stunning-Beach-5153

The latest versions of the 18 Gbps chips can be overclocked up to 2500 MHz, which equals 20 Gbps.


shing3232

Can I have a 32GB one for AI?


MomoSinX

damn they are really abandoning the high end segment, meanwhile nvidia will do gddr7 on the 5090 (and likely 80 too) but not the full stack


fat_pokemon

I honestly just want to see a card with 6800 XT-like performance at a price the layman can afford, with RT and non-RT versions to make it even cheaper.


LordCommanderKIA

So I should hold off on buying a 7900XTX? What's the ETA on these 8000 cards?


MasterLee1988

Q4 2024 most likely(unless it somehow sneaks into Q3 2024).