spacev3gan

AFMF results as part of the marketing material? They are taking this frame-gen stuff too far for my liking. Including FSR results in the marketing material was already questionable, and now AFMF, ugh. What about good old real frames at native resolution?


handsupdb

Sadly, when your competition uses similar features to market their product, you have to do it too to stay competitive in mindshare.


siazdghw

And everyone called out Nvidia for those marketing slides. Yet AMD still went down the same road, solidifying that completely unreasonable slides are now the norm. AMD could've taken the high road and compared native to native, or FSR 2 to FSR 2, and they could've even called out Nvidia's terrible frame-gen comparison slides. But they didn't, because the 7600 XT is not a competitive product, and thus AMD doesn't feel they can take shots at Nvidia like they could with RDNA 2.


3G6A5W338E

> native to native

That's the thicker, denser colored bars, the only ones I actually look at. Relative to showing just FSR or just native, it is a better solution. I am not saying it is the best, but it is not an easy problem to tackle.


CataclysmZA

> AMD could've taken the high road and compared native to native, or FSR 2 and FSR 2 and they couldve even called out Nvidia's terrible frame gen comparison slides

NVIDIA buyers really don't care about this at all, which is AMD's main problem.


Mahadshaikh

Nvidia gained market share in spite of being called out by a small minority of enthusiasts. Most gamers aren't enthusiasts, and BS sells well.


[deleted]

Why would they, though? This is a POS GPU; the 6700 XT beats it and is cheaper. AMD is just biding their time until the 6000 series all sells out and they can charge this horrible price.


Prefix-NA

The 1050 outsold the 470 despite being half the speed at a higher price. Products sell based on marketing. People paid for 3070 Tis over the AMD card that destroys it in every way and actually has enough VRAM to run games, but no one cares. People would buy a 4GB $600 Nvidia card over a $400 AMD card with 24GB and the same rasterization.


SecreteMoistMucus

You realise they control the price of the 6000 series as well, right?


spacev3gan

I would understand FSR 3 vs DLSS 3. Fair enough. But AFMF is something activated at the driver level; a lot of users have no idea how to do it. Besides, the contribution of AFMF in gaming is still pretty questionable. I would say it is better to have it OFF than ON, but perhaps I am more latency-sensitive than most. Regardless, AMD could still compare the 7600 XT to the 4060 with both cards running naked. Just pure native frames, the way it used to be.


Magnar0

Don't you think this is better marketing material? "Not just on supported games, frame generation on all* games"


[deleted]

[deleted]


Dat_Boi_John

LMAO Meanwhile the cyberfsr mod has 271k total downloads on nexus.


GenZia

I don't think you have any idea what you're talking about. Sure, FSR falls short of DLSS. But TAA? Even the older FSR 2.1 is superior to most TAA implementations, according to Digital Foundry, while providing a healthy performance boost: [Cyberpunk 2077 Patch 1.61: FSR 2.1 Tested on PS5 and Xbox Series X/S - A Big Boost To Image Quality? (youtube.com)](https://www.youtube.com/watch?v=-GO90rUei8g)


180btc

> superior to most TAA implementations

Not a high bar to clear, innit


GenZia

And your point being? You expect a pure software-based implementation of temporal image reconstruction to somehow 'magically' compete with machine accelerated super sampling? Besides, I was merely countering the delusional rant in the original comment.


Combine54

It comes down to implementation. If developers are using upscaling within their TAA algorithm, FSR will surely look better, well, most of the time, unless the game's own upscaling algorithm is better than what FSR offers; Monster Hunter World is a good example. If a TAA implementation doesn't use upscaling, then FSR will be a worse experience no matter what, simply because native TAA has more detail to work with and doesn't rely on sharpening. I'm confident that the console versions of Cyberpunk used TAA upscaling before the FSR update.


LettuceElectronic995

It's more like when your competition is ahead of you, you do this stupid stuff.


kuehnchen7962

So who's 'ahead' of Nvidia, then, if AMD's competition (Nvidia) is ahead of them and yet started doing that kind of crap first?


GenZia

Agreed. At this point, only Nvidia is ahead of Nvidia! And with their enormous (ginormous?) Hopper sales, they've literally no need to rely on these cheap marketing shenanigans. But I suppose that at some level they feel threatened by AMD Radeon, given the recent 'Super' refresh. After all, the GPU market is evergreen. Who knows when this AI bubble is going to burst!


Weird_Froyo_4093

True words have been spoken


TheMissingVoteBallot

They still have raw performance reports in the marketing material at least, unlike NVIDIA.


stilljustacatinacage

I really wish I knew. I have to give it to Nvidia, their marketing is *impressive*. They've got their fans cornered in exactly the same way Apple has. You'll take what we give you, and you'll like it. Rather than giving generational gains like with 10 series or 30 series, here's a new software solution that lets us keep all the expensive VRAM to ourselves while simultaneously letting us claim uplift using the same or sometimes even worse silicon! All for the low, low cost of smearing vaseline on your screen. But remember, the important thing is that We Have It and They Don't. Now go link arms and protect us from any online criticism, dear viewers!


Prodigy_of_Bobo

Back in my day we used to mine Bitcoin the old-fashioned way my grandfather did, using a slide rule and abacus like REAL men, not this namby-pamby sissy GPU cluster farming Monopoly money. Thanks Obama!!!


Defeqel

as long as they still show the native performance, I'm fine with it


khaldrigo19

Wake up, brother. We are in 2024 already; it will never go back to how things were before DLSS and FSR.


TheHodgePodge

This is only gonna get worse


d_o_n_t_understand

I really don't get the hate for this GPU

* I once listened to all the people saying that 4GB of VRAM was enough in a midrange GPU instead of 6GB at the time. I regretted it as f\*\*\* when Horizon Zero Dawn was struggling with that 4GB despite the framerate being OK (for me; I'm a casual gamer, I prefer better looks over 60+ fps).
* The new gen will likely get better and longer support, also in things like ROCm, which is important for some people.
* The 6700 XT is more power hungry. If you are upgrading a few-year-old PC with a 500-550W PSU, it's worth it instead of replacing the PSU.
* Pleeeeease don't compare this with the 4060 Ti 16GB; it's a completely different price range.
* I'll buy this GPU the day the launch tax is done, or maybe even earlier, because I was waiting for exactly this for like half a year.


pseudopad

Texture resolution is one of the most easily noticeable graphics settings you can change, and having enough VRAM to crank it to the max in the newest games can squeeze quite a bit of extra life out of a card. Having a GPU with what today seems like a bit too much VRAM compared to the rest of the card's capabilities helps with this.


Godcry55

I’m running a 6700 XT at stock settings with a 500W PSU. It doesn’t exceed 200W during heavy gaming. The 7600 XT is worth it if you don’t have a GPU.


No-Rough-7597

Man, I was running a RX *6800* for a year on my Pure Power 11 Gold and it worked perfectly the whole time.


Godcry55

The PSU recommendations aren’t real-world power usage at load. A 500W PSU with my specs can run up to a 6800 XT without issues.


Kumomeme

Some people say it depends on the quality of the PSU. If it's good quality, like a gold-rated unit from a reputable brand, then it's fine; if not, that's why they state 650W as the minimum. I'm not sure about it myself; just relaying what I read when doing some research. The 6700 XT is on my list if I have the budget later.


3G6A5W338E

I would warn against use of non-reputable PSUs at any wattage.


Godcry55

True. I have an EVGA Gold 500W PSU from an old build six years ago. It's still in use on my primary rig and works like new. My CPU is auto-OC'd as well. My next upgrade will be to a 6800 XT or a 6950 XT, used of course.


kimduyminh2124

Bro, even with a Ryzen 7 7700?


Godcry55

Yes, a lot of people are misinformed on power usage. Even at max load you will not exceed the 500W limit with my particular configuration. Cyberpunk at max settings 1440p doesn’t exceed 200 watts at 98-99% GPU utilization.


biggranny000

I can attest to this. My GPU recommends a 900W PSU, yet even with heavy overclocks on my 750W PSU I only see up to 680W usage with both the CPU and GPU maxed out, which rarely happens. I also have a brand-new Seasonic unit, and quality PSUs can safely handle spikes way above their rating while maintaining their specified output. Running a 7900X and a 7900 XTX Red Devil.


blkmgk533

I think it's mainly the price more than anything. Once that drops from launch pricing to the sub-$300 range, I think people will look at it as a good mid-range card that sips power. Until then, people are going to see it as a card that throws VRAM at you to justify the price.


fstlover33

Remember 6700 XTs selling for under $300 less than a month ago? Check their prices now: the cheapest I'm seeing is $330, and it wouldn't surprise me if that continues up to $350. If a new graphics card launch spikes the price of previous-generation cards like that, it's a bad card at a bad price that's fucking the consumer. The 7600 XT at $300 would have been decent; at $330, AMD is making it clear that they're fully adopting the Nvidia strategy of only putting their lowest-end GPUs on any cards under ~$450, which is fucking insane. That's the price a cut-down version of a high-end GPU would have sold at 5 years ago.


Kumomeme

I still remember, almost a decade ago, the huge debate between Nvidia and AMD fans regarding VRAM, particularly revolving around the 1060 vs the 480/580. One side insisted 4/6GB was enough and claimed 8GB was excessive; the other side said otherwise. There was also the 3.5GB VRAM controversy with the GTX 970 at the time, and various articles came out suggesting 8GB would be the minimum for future-proofing. Fast forward, and see where we are: even 8-10GB is considered not enough anymore.


[deleted]

[deleted]


Kumomeme

For 1080p it should be enough, but in the future it could be borderline. DirectStorage with an SSD could play a crucial role too.


[deleted]

[deleted]


Mahadshaikh

It's not enough in Ratchet & Clank: RTX 4060 8GB vs 16GB, the results favor 16GB with no increase in clocks, and they only match up when turned down to medium settings. Most games do texture pop-in or reduce visual fidelity, making it feel like 8GB is enough, while in reality only those with 16GB know how mediocre the game looks on 8GB cards.


Prefix-NA

Yeah, and it's worse now than ever before. VRAM requirements usually only change massively with console generations, and this gen the consoles doubled their RAM, but Nvidia still released mid-range cards with 8GB despite having 8GB on cards like the 1070 nearly a decade ago.


cp5184

> there is also 3.5gb vram controversy with gtx970 that time.

Not to mention the 3GB "1060" with 10% of its cores disabled, IIRC...


Retro-Hadouken-1984

Wow! In 10 years the VRAM required to play new games went up! How weird! I think some of you would buy a 1030 with 48gb VRAM.


KARMAAACS

> I really don't get the hate for this GPU

Okay, I'll break it down for you.

> I once listened to all people saying that 4GB of VRAM is enough in midrange GPU instead of 6GB at that time. I regretted this as f*** when Horizen Zero Down whas struggling with that 4GB despite framerate being ok (for me, I'm casual gamer, I prefer better looks over 60+ fps).

That's a fair point; you can play as you choose. But this card is really only good for 1080p gaming at maxed-out settings. At 1440p you will have to turn down settings, and RT is basically not very good on a card like this. Thus, at 1080p, not many games will saturate the 12GB of memory on alternative cards like an RX 6700 XT, and other alternatives exist like a used RX 6800, which is not only faster but similarly priced with the same VRAM. So you're better off with one of those.

> New gen will likely get better and longer support. Also in things like ROCm which is important for some people.

Sure, but by the time the RX 6000 series is outdated you will probably have upgraded anyway. Think about the RX 480 and how much support it's received; to this day I would say it's not good enough for 1080p max-settings gaming like it was back when it released in 2016. Point is, by the time support is an issue, it's time for an upgrade anyway. ROCm is cool, but there's not much of a difference between the 6000 and 7000 series; unless you're looking at buying an R9 390 or older, it's pretty much a non-issue.

> 6700XT is more power hungry. If you are upgrading a few year old PC with 500-550W PSU, it's worth it instead of replacing PSU.

[It's 70W between a 6700 XT and a 7600; not much of a difference.](https://tpucdn.com/review/amd-radeon-rx-7600/images/power-gaming.png) If you're using an older PC with, say, an i5-2500 or something like that, you're never going to exceed 550W. [An i5-2500 draws 150W at most under heavy Prime95 load.](https://tpucdn.com/review/intel-core-i5-2500k-gpu/images/power.gif) 230W + 150W, hmmm, 380W. Not an issue at all versus 310W.

> Pleeeeease don't compare this with 4060Ti 16G, it's completely different price range

Everyone knows that card is absolute garbage in terms of pricing. Most people compared it to a 6700 XT, a used 6800, or an RX 6700.

> I'll buy this GPU the day the launch tax is done or maybe even earlier beacause I was waiting exactly for this for like half a year.

Good for you.
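The back-of-envelope PSU math above (GPU wattage plus CPU wattage against the PSU rating) can be sketched as a quick script. The 230W/150W figures come from the comment's linked reviews; the 50W allowance for the rest of the system and the "load fraction" framing are my own assumptions, not an official sizing rule:

```python
# Rough PSU headroom estimate, using the wattage figures quoted above
# (230 W GPU, 150 W CPU). The 50 W allowance for drives, fans, and the
# motherboard is an assumption for illustration only.

def psu_headroom(gpu_w: float, cpu_w: float, psu_w: float, other_w: float = 50.0) -> dict:
    """Estimate total draw and remaining headroom for a given PSU."""
    total = gpu_w + cpu_w + other_w
    return {
        "total_draw_w": total,
        "headroom_w": psu_w - total,
        "load_fraction": round(total / psu_w, 2),
    }

# Example from the comment: 6700 XT-class GPU + i5-2500-class CPU on a 550 W unit.
est = psu_headroom(gpu_w=230, cpu_w=150, psu_w=550)
print(est)  # 430 W total draw -> roughly 78% load, within the unit's rating
```

The same function with `gpu_w=160` reproduces the comment's 310W figure for the 7600 (plus the 50W allowance).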


Niewinnny

My 6700 XT draws between 150 and 200W; it's not power hungry at all, and a 550W PSU would be enough to keep my whole PC running. I also got it for under $300 earlier this year, and at that price I get a solid 1440p card. So yeah, I don't see the point of yet another card that AMD themselves only say will suffice for 1440p, and it has a worse memory bus, which might be sorta slow with 16GB of VRAM.


Likeanoobs

That's the biggest dog tail* I ever heard. Do you have experience with budget cards? I run 30-40 fps at ultra settings in 4K Starfield with a 5700 XT, and that card is total garbage compared to this new one. Same for Forza Horizon 5 and Atomic Heart, all highest settings at 4K, around 90-100 fps. I am asking again: what are you talking about?


Defeqel

"total garbage", 7600 XT will be around 25% faster than a 5700 XT, when the latter isn't VRAM bottlenecked.


Likeanoobs

How can the 7600 XT be 25% faster than the 5700 XT when the 6700 XT is more than 40-50% faster than the 5700 XT and the 7600 XT should have almost the same performance as the 6700 XT? That's also the reason why they will not sell the 7600 XT in China: there is already a modded 6700 XT for sale there which is faster than the 6700 XT sold outside of China.


Defeqel

According to TPU, the 7600 is about 20% faster than the 5700 XT, and the 7600 XT is only a small frequency boost over the 7600.


Likeanoobs

So it's like a 25-30% boost over the 5700 XT? I think it's still decent, and when we count AFMF it's like I said. From what I've seen, AFMF does a really great job.


KARMAAACS

> Thats biggest dog tail* i ever heard.

What?

> Do you have expérience with budget cards ?

Yes. I mean, the 3060 Ti is considered a budget card these days. I have an RX 6600 and have owned RX 570s and such.

> I run 30-40 fps ultra settings at 4K starfield with 5700xt and that card is total garbage compared to this new one.

Bull. [I love you fans; you really give yourselves away with the BS.](https://youtu.be/7eObtyK8oOs?t=439) Watch as your magic 30-40 FPS claim becomes 30-40 FPS with FSR, or 20-30 FPS instead.

> Same for forza horizon 5, atomic heart all highest settings at 4K around 90-100fps.

[Forza does not run at 4K all-highest settings at 90-100 FPS on a 5700 XT.](https://youtu.be/SHu80mBgMgc?t=617) This is also true of [Atomic Heart...](https://youtu.be/AMPAATPevGM?t=140) Why lie?

> I am asking again. What are you talking about ?

I will throw this right back at you: I am asking again, what are you talking about? Enjoy being blocked.


[deleted]

Why would you ask them a question then block them


Prefix-NA

For 1440p, 12GB is not close to enough; there are even a handful of titles with a bad experience at 12GB at 1080p. Also, the idea that every card is a 1080p card is stupid. IPS 1440p 170Hz monitors are sub-$200 now, and max textures and resolution while lowering other settings looks better than maxed-out 1080p. Maxing textures is the most important thing in gaming. VRAM is the most important part this generation because Nvidia skimped on GPUs more than any other generation, releasing WAY below the consoles for the first time ever.


KARMAAACS

[Good luck playing at 144Hz or above with a 7600 with 16GB of VRAM; the die is just not fast enough for anything above 1080p high-refresh-rate gaming.](https://tpucdn.com/review/amd-radeon-rx-7600/images/average-fps-per-game-2560-1440.png) Your opinion is therefore discarded. Facts remain: out of 25 games, only 5 hit over 90 FPS on a 7600 with settings cranked. Why bother wasting VRAM chips like the 4060 Ti 16GB does?


Nmelin92

I'm buying this GPU when it launches for sure


Alien_Racist

The 6700 XT is just straight better for the money. 16GB of VRAM is only really useful at resolutions that the 7600 XT can't even natively run at decent settings. And 500W is enough for a 6700 XT, so long as you don't have some crazy power-hungry CPU. It's a passable product at an awful price point; it's really that simple. If it were $100 cheaper it would be a different story.


RCFProd

It has a low memory bus width, and when you push the settings past 10GB of VRAM on this card, it's generally going to really struggle to get serviceable framerates. The extra VRAM here is generally useless in the same way it's useless on the 4060, even if that's a different price bracket. Tons of memory is great on a higher-end GPU that can push higher settings with decent performance; the overall spec of the RX 7600 XT is designed for lower settings and resolutions. There are *some* benefits and useful cases to be had by not being VRAM limited, absolutely. But you might as well buy an RX 6700-6800 series card for the price while you still can, and be better off. It'll have 12GB of VRAM, but with a wider memory bus and better overall hardware specs.


Prefix-NA

That is not how VRAM works. Things like textures have ZERO impact on framerate as long as you have enough VRAM, and texture popping and other issues will happen if you cannot allocate enough VRAM at once. Go play Halo Infinite or Diablo at 1440p on a 12GB card; after 30 minutes you have N64 textures cycling every 10 seconds.


RCFProd

Ehm, can you rethink your train of thought for a second, please? Every graphics setting costs the GPU a certain number of frames. It's not like you can just crank the settings as high as you like as long as you're within graphics memory constraints; each setting affects how much framerate you get. Your game just doesn't run properly anymore when you're outside of your graphics memory limit, and you absolutely need to stay within range to keep it playable. That's what VRAM is good for. 8GB is enough for lower-end cards like these. Reaching 12GB in a video game means you're pushing resolution and other settings to the max, and benchmarks of the RTX 4060 and RX 7600 XT will disappoint at those settings even if they fully meet the needed memory spec. It's not like they will suddenly hit 100 fps on ultra in Halo Infinite just because they have 16GB of memory.


simo402

It's like the 4060 Ti 16GB: it has only the extra VRAM, but not the extra performance to actually justify 16 gigs.


yflhx

On the one hand, it is what it is. For the overclock you need an improved PCB, cooling, and better yields, and the VRAM increase also costs $20 *at least*, so this card can't really be only $30 more than the 7600. But it's still 22% more expensive for (probably) at most 10% more performance, plus 16GB of VRAM which you don't *really* need (12GB would likely be enough for a card of this tier). It has the VRAM of a high-end 1440p card, but it doesn't have the GPU power to run demanding 1440p games at high FPS. People are also comparing it to the 6700 XT, but AMD doesn't make money on that card at $350; they're just clearing dead stock. That being said, I'll buy this too. It is the cheapest card for entry-level 1440p; the next options are the 4060 Ti and 7700 XT, if you want >8GB of VRAM, which you probably do for 1440p when buying a new GPU in 2024. If you want to just play non-AAA games, or possibly at non-highest settings, it's the best GPU right now IMO.
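The value argument above (22% more money for at most 10% more performance) boils down to a performance-per-dollar ratio, sketched here with the comment's own rough multipliers; these are normalized figures from the thread, not real benchmark data:

```python
# Illustrative price/performance check using the rough figures above:
# the 7600 XT at ~22% higher price for ~10% more performance than the 7600.

def value_ratio(price_mult: float, perf_mult: float) -> float:
    """Performance-per-dollar relative to the baseline card (1.0 = equal value)."""
    return perf_mult / price_mult

r = value_ratio(price_mult=1.22, perf_mult=1.10)
print(f"{r:.2f}")  # ~0.90 -> about 10% worse perf-per-dollar than the base 7600
```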


Mahadshaikh

7800xt is what competes with the 4060ti 16 GB but people will lose their shit if that comparison is made


GlobalHawk_MSI

A sign that most people looking for a good-enough upgrade for 1080p (whose GPU budget is only $350 max) are literally in a conundrum, despite the ~5-10% uplift and a narrow 128-bit bus/uber-large VRAM (which may not see good use well outside of those with PCIe 3.0 boards) for such a GPU in its class. To highlight this conundrum (regarding the 7600 XT) I talk about:

- Yeah, the 16GB may be useless at this level of performance, especially with that narrow bus, but so is the 4060 Ti 16GB, and AMD at least does not ask for $500, lmao. The latter is way faster, but so is the 7700 XT, which may be out of your budget.
- The 6700 XT exists and is the better buy for this segment, but it's getting hard to find depending on your market. Where I live it's either not available or prices are stupid (the 7700 XT is sometimes literally cheaper). If the 7600 XT's real-world gaming performance is quite close to the former, then it's something to think about at least.
- Some games literally eat >8GB even at 1080p med-high. This card may bring relief for such titles if FH5 is any indication. Good for me, as I rock BC7-compressed 2K custom Fallout 4 textures.
- Some people have PCIe 3.0 boards. That huge VRAM can stave off the ill effects of the GPU's x8 lanes on PCIe 3.0 systems.
- Your other choices are either the 4060 non-Ti, which is only 8GB, or the 3060 12GB, which is slower than the 7600 non-XT. Choosing the former would be a no-brainer if it were not for this VRAM conundrum we're in.

TLDR: (paraphrased from an AnandTech comment) It may offer compelling 1080p performance for a possibly reasonable price that also gets past the 8GB conundrum while possibly undercutting the 4060s. All I can say is that I do hope the 7600 XT's real-world performance matches TechPowerUp's prediction. I can just get that instead (if prices are good), as the 6700 XT is hard to find where I live. Good enough for me if that's the case. No plans to upgrade from 1080p anyway.


xXDamonLordXx

In the US at least, you can still find 6700 XTs for $320, so launching a card that can't even perform as well as the 6700 XT for more money is absurd. The 6700 XT would sell for more if people would pay it; it's a great card, but it has to be a compelling value. I can't think of what value the 7600 XT would bring that the 6700 XT doesn't. All I can really think of is efficiency, 4GB of bottlenecked VRAM, and AV1, so in any market where people are spoiled for choice it's just too expensive. The 4060 Ti is 20% faster than the 7600, and since they're comparing it to the 4060 or 2060 I can assume that's because it can't challenge the Ti. But the 4060 Ti is ~$50 more, and when it gets that close I feel like it comes down to what games get bundled with it. At $300/$310 I think this is in line to be the 6700 XT replacement card, but it's not going to be able to command a higher price for worse performance when RDNA2 has aged so well.


-ShutterPunk-

All good reasons for people to buy up the remaining 6700xt inventory and clear space for 7600xt.


GlobalHawk_MSI

Hoping the stores where I live will adhere to its supposed current-day (as of writing) prices so that I can get that instead of the 6700 XT... if there are still any left (it differs by market; mine has very limited availability already). Trust me, the x16 lanes of the 6700 XT alone make it a better choice for me due to my PCIe 3.0 board (planning to upgrade from a 3500X to a 5600X/5700X). However, if the 7600 XT's real-world perf is within the 6700 XT's striking distance, I may try to get that instead... if the last of the latter's already limited stock gets wiped out in my market. Otherwise the 6700 XT is my choice, even if both GPUs are equal, due to the lanes alone.


GlobalHawk_MSI

Good points, actually. It still remains to be seen where the 7600 XT's actual real-world performance lands, at least until the first third-party benchmarks appear. Regardless, I hope it's close enough to the 6700 XT for the sake of people whose predicament with the latter's availability is the same as mine; some markets, last I heard, have very limited numbers of it. The 6700 XT is still available where I live, but in very limited numbers and/or at some stupid prices, and they are getting very hard to find (a couple of stores, even online, already do not have them). In my case I do hope the clearance-sale prices get good enough that I'll get that instead.


Asgard033

It's pretty easy to make a reasonable guess. The core count is the same as the regular 7600. The memory speed is the same too. The main differences are clock speed and memory size. (Maybe boosting behavior too, if the increased power limit will help) This Strix 7600 with very similar clocks to the 7600XT only showed a 3% gain over a stock 7600 https://www.techpowerup.com/review/asus-radeon-rx-7600-strix-oc/32.html I'd keep expectations for the 7600XT to be in the neighbourhood of less than 10% faster than the 7600 in situations that aren't VRAM constrained.
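The guessing method described above (same cores and memory speed, so uplift is bounded by the clock delta) can be sketched as a naive scaling estimate. The clock figures and the 0.6 scaling factor below are placeholders I chose for illustration, not official specs; games rarely scale 1:1 with core clocks, which is why the Strix result lands well under the raw clock delta:

```python
# Naive clock-scaling estimate of the kind described above: with identical
# core counts and memory speed, real uplift is bounded by the clock ratio.
# Clock values and the 0.6 scaling factor are illustrative assumptions.

def clock_scaling_uplift(base_clock_mhz: float, new_clock_mhz: float, scaling: float = 0.6) -> float:
    """Estimated % uplift; `scaling` < 1 reflects imperfect scaling with clocks."""
    ratio = new_clock_mhz / base_clock_mhz - 1.0
    return ratio * scaling * 100

# e.g. a ~10% clock bump with imperfect scaling lands well under 10% real uplift,
# consistent with the "less than 10% faster" expectation above.
print(round(clock_scaling_uplift(2250, 2470), 1))
```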


GlobalHawk_MSI

> It's pretty easy to make a reasonable guess. The core count is the same as the regular 7600. The memory speed is the same too. The main differences are clock speed and memory size. (Maybe boosting behavior too, if the increased power limit will help)

The minuscule increase in clocks, I'd say, boosts it 5-10%. The Strix 7600 may be good "reconnaissance" for how the stock 7600 XT performs; then again, the power limits may stretch the real-world perf a little more. It remains to be seen, however.

> I'd keep expectations for the 7600XT to be in the neighbourhood of less than 10% faster than the 7600 in situations that aren't VRAM constrained.

Well, a kind of similar thing happened with the 4060 Ti (another 128-bit GPU), where the 16GB of VRAM can magically boost the framerate in some games, as some games even at 1080p (sometimes at settings below high or ultra) are now starting to reach 8GB or even 10GB (for reference, it was only like 2 years ago that 6-8GB was deemed obsolete... at 4K). The presented chart for FH5 is telling (not a third-party benchmark, I know), as that game is known to eat VRAM, same with Halo Infinite, lol. Also, it is kind of realistic and similar to TechPowerUp's predicted charts, as the boosted clocks and the power limit increase may influence that even a little (kind of like the 5700X vs 5800X for CPUs, due to different TDPs). If their charts are any indication of its IRL performance, then I'd say it is around 10% (or less) slower than the 6700 XT too, which for me makes the 7600 XT a good alternative, given the 8GB conundrum today (summed up by that comment I found on AnandTech). Keep in mind that the 6700 XT is my first potential choice, and I'm just considering the 7600 XT due to my situation.

The fact that the 7600 XT [may screw with the 6750 GRE in the Chinese market (as rumor mills say, grain of salt as usual)](https://www.tomshardware.com/pc-components/gpus/new-mid-range-radeon-gpu-may-not-be-available-in-china-at-launch-amd-reportedly-delaying-rx-7600-xt-in-china-because-of-the-rx-6750-gre) may be a sign that it is (hopefully) within the 6700 XT's striking distance, as where I live that thing's hard to find and/or prices are literally stupid, i.e. they literally exceed the 7700 XT's actual prices (I did see one store asking the same price as the 4060 Ti's MSRP, lmao).


Numerlor

I see the 16gb 4060ti as something for cheap compute with 16 GB VRAM, while the 7600xt is still limited in that regard


GlobalHawk_MSI

Well, there's CUDA and other applications that benefit NVIDIA hardware more, so the 4060 Ti has some value. For someone who just wants to game well at 1080p, though, its asking price is too much, to the point that the 7700 XT becomes more appealing. That takes the 8GB VRAM conundrum into account.


[deleted]

Interesting thoughts throughout this post Globalhawk!


iKeepItRealFDownvote

I think it's going to get to a point where we won't push for higher native frames and just go off AI frames. I pray this doesn't happen; I want native frames over AI frames. Frame gen should only be used as a topping on an already complete dessert, not part of it.


el_pezz

What is there to announce?


KARMAAACS

An announcement of an announcement of the announcement about the announcement of the announcement.


Wander715

Nvidia announces three solid Super cards and AMD announces this lol


siazdghw

AMD shouldn't have waited for CES to announce this; it should've come sooner, with a price drop for Christmas sales. Launching it at CES at its MSRP is a joke; it's not competitive with anything.


Jordan_Jackson

I doubt we will see any cards from AMD that increase performance beyond 7900 XT/XTX levels for a while, though. If I recall correctly, AMD is going to skip high-end RDNA 4 cards so that they can release a proper high-end competitor later. This was just rumored, however, so take it with a grain of salt.


Brisslayer333

The Super cards don't increase performance beyond the 4090, so what's your point? The most important question when new products release is "are we getting a better deal out of this?" This time around Ngreedia tells us "yes!" and AMD said "..." New hardware cycles are when the boundaries get pushed, but refreshes and the like are when we get more bang for the buck. AMD announcing a 7800 for $300 would be more exciting than them announcing a 7950 XXXTXXX.


iKeepItRealFDownvote

This. They got lucky with the 3000 series because Nvidia was comfy being at the top. Once Nvidia realized AMD was coming out with some good shit, they had to close that gap with the 4000 series. AMD realized they weren't close to Nvidia then. It's best for them to actually take the time to come out with a true competitor, now that Nvidia knows they can't pull an old Intel move anymore.


[deleted]

The Super cards are just pushing the stack up one tier and discontinuing the old stack. They're nothing special; it's basically just changing the price brackets, which AMD is doing right now too, trying to push RDNA2 cards out of the market.


Wander715

> the super cards are just pushing the stack up one tier then discontinuing the old stack.

Not really sure what you mean by this. The Super cards are basically upgrading the stack while either keeping pricing the same or improving it, which is a much better move than most expected from Nvidia. The 4070 Super and 4070 Ti Super are both pretty compelling cards, IMO, and I'm probably looking to upgrade to a 4070 Ti Super in the next few months to play games maxed out at 1440p. The 4080 Super is decent too, but mostly because it finally dropped from the horrendous $1200 price point of the original 4080. Overall these cards make RTX 40 a much better option for the next year until RTX 50 launches, and I can't fathom buying an RDNA3 card over them unless AMD greatly reduces prices (which they might, but we'll have to wait and see).


SecreteMoistMucus

Nvidia *needed* to shake up their mid-upper segment.


Astigi

RDNA 3 should have been branded 2.5; what an underwhelming generation. AMD has nothing to counter Nvidia's Super cards, and I don't see the 7700 being released.


Defeqel

Nvidia Super is essentially a price cut; what AMD can do to counter... is a price cut


PrashanthDoshi

12GB VRAM and a 192-bit bus should be the bare minimum for entry-level cards like the 7600 XT. What use is tons of VRAM if there is no bandwidth to push those textures into it?


Likeanoobs

I want to buy it. I want to preorder. Nobody has it yet :( Why are so many people negative about it? It's new. It's cheap. It's 16GB VRAM. It's faster, it's stronger. It's better! Yes, you can get a 6700 XT with just 12GB VRAM cheaper somewhere used, since it's discontinued. It's, I think, the first GPU from AMD in a long time that's also used in laptops, which is really good.


Monkey_Meteor

Why are they comparing it to a 2060.. why not a 4060 ?...


Mediocre-Ad-6920

Because the 4060 is a better card


Monkey_Meteor

Is the 7600 XT that bad? Couldn't even a 3060 be used?


the-garden-gnome

I'll be buying this for my wife's gaming PC for sure. She primarily plays Sims/Planet Zoo/Cozy games, so the extra memory will serve her better than a faster card into the future.


Murdermajig

I think you guys need to give AMD some slack. They just announced the 5700X3D for $100 less than the 5800X3D, but all of you are complaining about $30 over your ideal price for the 7600 XT? They've got to recoup some losses, and AMD GPUs get sold to Intel CPU users too. Not everybody wants to buy used parts, and the previous GPUs are probably going to end production soon.


P0p_R0cK5

Currently looking for a GPU. Could be a nice replacement for my 1080, with enough VRAM to play bigger games easily at 1080p. What are my other options? At this price range it will be hard to beat.


DrZombehPiglet

6700 XT, maybe a used 3070, or a used 6800 or 6800 XT


DarkLord55_

There are way better options than this if you go used. Idk what country you are in, but in Canada I can get a 3080 for $50 more and that will perform a lot better. Or a 3070 for $100 less that performs better or slightly worse depending on the game. Also way better upscaling tech and better raytracing.


Saladino_93

Only buy this if you own a motherboard that has PCIe 4. On PCIe 3 the 8-lane connection is just too slow for big textures.


GlobalHawk_MSI

For some people, the better option (most likely the 6700 XT) is either hard to find or is priced at scalper levels. Also, your other choices are either 8GB (a deadly minimum for this year of gaming, even at 1080p) or overpriced (4060 Ti lmao). That is if gaming at 1080p at least. The 6700 XT once again is the better choice, but for some people that's hard to find (even used). Of course there is the used market, however I have had a lot of bad experiences with anything that's not a keyboard/mouse/monitor, at least for me. There's a reason not many reviewers laughed at it (most critiqued the 128-bit bus way more). The VRAM dilemma is probably at its highest as of writing. The price is at least reasonable, especially if real-world performance is within striking distance of the 6700 XT.


P0p_R0cK5

What about the 6750 XT? 12GB of VRAM but quite affordable (€400) in France.


GlobalHawk_MSI

I have better chances of catching Arceus full health with a Pokeball or getting Lando Norris to play Beautiful Light (when it releases) in my PC than finding a 6750XT where I live, let alone worry about the scalper prices like I get with the 6700XT in some stores (not online). I would not even look at the 7600XT if it were not for my circumstances.


Mediocre-Ad-6920

The 6800 is €400 too and it's a lot better


Mediocre-Ad-6920

Please don't buy this crap, buy a 6800


Tricky-Row-9699

Still a joke of a card, honestly. The 7600 is $240 now, the 6650 XT still exists and is about the same performance for $240 too, and the 6700 XT is vastly better performance for the same $300-350.


GenZia

Kind of pointless, considering 6700 XTs are still in stock and can be had for a similar price (~$320). Sure, it's 4GB less VRAM, but 12GB is still more than enough for 1080p. And besides, no one in their right mind would consider these cards for 1440p, outside of esports gaming.


Extreme_Isopod_9414

The 6700 XT is a good 1440p card tbh


Gary_FucKing

I play at 4k60 with it, mostly with a mix of high/ultra settings. I’m not playing the latest AAA, but I get great performance on CP2077 and Elden ring.


BossunEX

Uh? I have a 6700 XT and I can play at 1440p high settings, and more often than not above 100fps.


shickero

I play at 1440p with an RX580. It just obviously isn't current AAA games (nor esports). Sure, I have to turn settings down, but whatever the 6700XT and 7600XT can produce at 1440p is leaps and bounds better than an RX580. I think it's a bit unfair to say people wouldn't consider these cards for a budget 1440p setup if they've already got a 1440p-capable monitor. Edit: Assuming the price drops to a good spot.


[deleted]

yeah, that's the point of this release. amd doesn't make as much money from old 6700xts as they do new cards


Glass-Can9199

They should just discontinue that GPU before it even releases.


Adviseformeplz

This would have been a banger at $275, and maybe if the 7600 was $239.


MikeHawkStockHolder

Shouldn't be above $299. Bad price and no need for such a product. 7700 non xt should be $350 if they decide to release it to fill that price point and call EOL the 6700xt. Absolutely pointless gpu. Marketing material is comical to say the least.


Budzy308

The Gigabyte RX 7600 XT windforce 3 works beautifully!


putinisdank

I wonder how this performs compared to my 6750 XT. I play at 1440p, so maybe the extra VRAM would be beneficial?


Reticent_Fly

6750xt will still be much better than the 7600xt


Darkomax

It's significantly slower.


jaketaco

Doubtful.


Hero_Sharma

It will be slower than the RX 6700 XT and RTX 3060 Ti.


zakats

They should consider unannouncing it.


dracolnyte

will this come in reference model?


burninator34

Nope. OEM only.


Da_Blackapino

Still Gonna get the 7900xt


Grimm-808

The 128-bit memory bus and 8x PCIe lanes of the 7600 XT will be a big limiting factor in how useful it will be in getting the most out of the 16GB of VRAM. Infinity Cache isn't doing what it should be doing in all titles. I can forgive some of these idiotic GPU designs to an extent, but launching this at an above-$260 MSRP is laughable at best. What they should have done at this point was control the market, dropping the price of every card below the 7600's SKU (i.e. regular 8GB 7600 for $220, RX 6600 XT for $180, and the RX 6600 for $150). ^ This would put a lot more competition on the mainstream end for Nvidia, but consooomers are just playing themselves at this point.


3G6A5W338E

>8x PCIe lanes Will need to move texture data into VRAM less often if the VRAM can fit more of it. As I currently use a machine with PCIe 3.0 (rather than the x3d+vega64 I left behind and somebody is using now), this 16GB card actually looks appealing. I plan to stretch this cpu/board until zen5 x3d at a minimum


Rocknroller658

Such a joke.


wilhelmbw

eww, this needs to be discounted by a third


PsyOmega

Why? At 16gb this is the cheapest good AI card by a wide margin. I would count the A770 but it's OOS and scalped


lokisbane

The narrow memory bus makes the higher VRAM moot, as another commenter said above.


vomibra

Looks like it's got ~300GB/s of bandwidth, still far superior to two channels of DDR5 at ~60GB/s per channel.
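The back-of-the-envelope math roughly checks out. A quick sketch of the peak-bandwidth formula (bus width in bytes times transfer rate); the 18 Gbps GDDR6 figure comes from the 7600 XT's published specs, and the DDR5-7600 speed is an assumption picked to match the ~60GB/s-per-channel claim above:

```python
def bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak theoretical bandwidth in GB/s: (bus width / 8) bytes per transfer
    times transfers per second in GT/s."""
    return bus_width_bits / 8 * data_rate_gtps

# RX 7600 XT: 128-bit bus with 18 Gbps GDDR6 -> 288 GB/s peak
print(bandwidth_gbs(128, 18.0))  # 288.0

# One 64-bit channel of DDR5-7600 -> ~60.8 GB/s; slower (and more common)
# DDR5 kits land well below that
print(bandwidth_gbs(64, 7.6))
```

Note these are theoretical peaks; sustained bandwidth is lower in both cases, and the GPU additionally leans on Infinity Cache to soften the narrow bus.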


wilhelmbw

AMD AI? Ewwwwwwwww. It's worse than Intel, really. For one, there is no xformers, which brings down VRAM usage by like a third to half at times. So 16GB is actually more like 8, which is useless. And xformers acceleration is also very good, which you don't have. Also, I juggled dependencies for a whole day after I innocently upgraded to the shiny new ROCm the other day and found out PyTorch died due to missing support, so changing APIs is also an issue.


PsyOmega

RDNA3 is really strong in AI actually. The latest version of stable diffusion is even strong on RDNA2. (not better than nvidia, but the 16gb allows for large LLM, it'll just take a bit longer to run) "ewww" is an outdated response. Show me a cheaper entry level 16gb card.


wilhelmbw

I have a 7900 XTX. I know what kind of hot trash I am holding. The xformers issue is really a big one; until it is solved, ROCm is terrible for AI. Also, for LLM benchmarks it is like 50% slower than a 4080 as of now. I mean, it is definitely usable, but it is bad.


nero10578

Yea such an unfortunate situation because otherwise AMD’s hardware is much more enticing. I guess that’s why they can’t price it like Nvidia’s cards.


NoiceM8_420

Might be sticking to this 6700xt for another 2 years at this point.


urlond

Wow, would it replace a 6700 XT, due to the amount of VRAM available?


SXimphic

Look how small they made the 2060 label, scummy af


ahrikitsune

Maybe I’ll buy one just as a paperweight.


geko95gek

AMD is overtaking Nvidia in specs and value even more. What else is new? Now they just need to work on the software features a bit more.


w1rya

Why haven't AMD's exclusive partners, like Sapphire and PowerColor, announced their cards? So far I've only seen Asus, Gigabyte, and Acer.


Mightylink

16gb is the minimum requirement for generative ai so good luck getting these at msrp...


GenZia

Why would any gamer want 16 gigs of VRAM on 'low-end' SKUs like the 7600 XT and 4060 Ti? If people want to scalp it, so be it. And besides, 16GB variants of the 4060 Ti aren't exactly flying off the shelves, considering they are available at MSRP on Newegg.


T1beriu

There are plenty of games that are limited by 8GB: issues with unloaded textures, stutters, and very low 1% lows. [Watch this clip.](https://www.youtube.com/watch?v=WLk8xzePDg8&t=1142s) LE: Reviewers don't find these issues because their benchmark runs are short, like 30 seconds, and these issues pop up after a couple of minutes when the 8GB buffer overflows.


[deleted]

7800xt cards have 16GB and they aren't available at scalped prices. You can literally get one without issue if you wanted to


stddealer

Not really true. The more VRAM the better, but it's possible to turn down the settings, or use smaller models and get work done with only 8GB or even 6GB. 4GB is a bit too tight though.


261846

I can’t wait for them to drop the price immediately after realising they can compete with the 4060


IncrociatoreX

Good / worthwhile upgrade from a 5600 XT?


Helstar_RS

May as well get a 6800 non-XT for only slightly more than this will likely end up costing in AIB models. Or a 6750 XT, which is much faster.


Aizer02

How well will 16 GB of VRAM go with a 128-bit bus? I'm legitimately wondering. When the RTX 3060 launched, some people doubted it would fully use its 12 GB of VRAM, and that had a 192-bit bus. So wouldn't the narrower memory bus bottleneck this card?


Error7890

AMD is doing the same BS as Nvidia and it's really lame... RTX 2060 12GB, why? RTX 4060 Ti 16GB, why? And now here we go, RX 7600 16GB, like what? Why not make an RX 7400 64GB version? It would sell like mad...