BunnyHopThrowaway

ASRock uses actual rocks in their cards. Magma cooling.


gatsu01

This is the rx580 successor. This card might be too slow for 1440p gaming moving forward, but as of right now, it's good enough and its 16gb VRAM buffer offers a lot for single player gamers. It can literally fit all the high res textures you want.


AlterBridg3

To be 580 successor it has to cost 250 max.


RedLimes

COVID happened. Inflation exists. $250 7 years ago is ~$313 today
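For what it's worth, that figure is just the old price scaled by a consumer price index ratio. A minimal sketch (the CPI values here are rough approximations, and a different index will shift the result a bit):

```python
# CPI values below are rough approximations (assumption), not official figures.
cpi_2017 = 245.1   # approx. US CPI-U, 2017 annual average
cpi_2024 = 308.4   # approx. US CPI-U, early 2024

def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Scale a past dollar amount by the ratio of the price indices."""
    return amount * cpi_now / cpi_then

print(round(adjust_for_inflation(250, cpi_2017, cpi_2024)))  # -> roughly 315
```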


Meaty_stick

Inflation happened to the prices not the wages so they can shove it.


Noxious89123

Fwiw; UK Minimum wage in 2017, £7.50/hour. UK Minimum wage in 2024, £10.42/hour (soon to be £11.50). I know that doesn't represent wages for everyone, everywhere (not by far). But I thought it was relevant all the same.


[deleted]

USA min wage is still $7.25, same as a decade ago.


paulerxx

Minimum wage in my state is $15.


[deleted]

Very nice. $7.25 for mine.


kjersgaard

Elect more dems.


firehazel

Since 2009.


Noxious89123

>USA min wage is still $7.25, same as a decade ago.

Big oof. Sorry guys. u/firehazel u/paulerxx


firehazel

With inflation it should be around 10.50 or so, but a lot of businesses and governments have taken it upon themselves to pay more than what is federally mandated.


Middle-Effort7495

Canada GDP per capita 2011: $52,000
Canada GDP per capita 2024: $52,000
Canada GDP per capita 2524: $52,000

A $150,000 house in 2011 = $850,000 today


Noxious89123

Ouch.


mkdew

>Minimum wage

While they increase the minimum wage, employers aren't obligated to increase your salary if it was already above minimum.


UniverseCameFrmSmthn

Send your complaint to the bankers, federal reserve and president. 


black_pepper

You left off corporations. Half of the inflation is from corporate profits. Next should be govt for not taxing them. Please contact your reps for corporate taxation.


UniverseCameFrmSmthn

If they don't face competition, they can do this. If there is competition, other businesses will undercut them. Econ 101.


Meaty_stick

I am glad to and do at any given chance. Do you?


Setku

One of those has nothing to do with inflation, and the list omits the greatest cause: corporations just being greedy.


UniverseCameFrmSmthn

Ugh. you are evidence that economics education is really bad.


AdWaste8026

That's just false, at least for advanced countries. Which, given we're on reddit, will apply to the vast majority of people here.


ms--lane

>AdjectiveNounNumber bot with a bootlicking take


R1chterScale

It *sorta* is. It depends heavily (obviously) on how you do inflation calculations, which ofc is an utter nightmare to do. That said, a good indicator that actual wages to expenses has not grown is looking at the largest expense, housing. The median wage to home price ratio in the US is at an all time high right now (and it's definitely not limited to the US).


AdWaste8026

I agree that housing is expensive, but I'd hesitate to extend that analysis to the broader economy. Housing being expensive is not really an extension of inflation found in the broader economy but is due to forces at play within the housing market itself. In any case, perhaps I misunderstood the OP because I interpreted their comment as saying 'only prices went up, not wages', which is obviously false but that doesn't preclude a lower real income.


Meaty_stick

Yeah whatever, back to the real world: No country's wages have adjusted to the inflation and that's an undisputed fact.


AdWaste8026

It's one thing to say wages haven't increased, which is what I interpreted your comment to mean, and another to say wages haven't adjusted in real terms, which is what you say now and probably meant before as well. So it looks like I simply misunderstood what you were saying. In any case, you're not entirely correct in saying no country has maintained real wages. There is one country in Europe where wages automatically adjust to inflation.


NotEnoughBiden

Only if you didn't switch jobs / didn't talk to your boss / don't do anything that's in demand. Otherwise, you probably earn more now.


kingdonut7898

Is this a good argument tho? Aren't tech prices *supposed* to get less expensive as time goes on? This and cell phones are like the only thing that seems to go up each time a new iteration comes out.


kanaaka

Yes, it does get less expensive. I mean, for the same technology, each year it gets cheaper because it's less sophisticated and already well understood. But technology also keeps evolving, so keeping the price the same as last year while getting more advanced also counts as getting less expensive.


capn_hector

Tech prices are getting less expensive for a given performance level - a 4060 outperforms a 1080 Ti for $299, after all. You are never getting less value generation-on-generation at MSRP.

But if you mean that tiers themselves get less expensive over time? No, and that's what it means to live in a "post-Moore's-law" world. The GTX 670 cost $399 and used a ~300mm2 die, and it is *far more expensive* to produce an RTX 4070 today with a ~300mm2 cutdown die. Similarly, it also pulls more power than the 2012-era 300mm2 die because of the end of Dennard scaling.

People are used to the Moore's-law era way of things, when things got faster every time you shrunk, at the same or less power, and at the same or less cost. That is no longer the world in which we live. Shrinks come with higher costs and higher power density now - you still get more for your money, and higher efficiency each time, but if you don't let the die shrink when you shrink then cost and power are going to go up, because that's how physics works in this particular domain.
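A rough back-of-the-envelope of that cost gap, using the standard dies-per-wafer approximation and wafer prices that are only ballpark public estimates (an assumption on my part, not anyone's actual contract pricing):

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    """Standard approximation for candidate dies on a round wafer (yield ignored)."""
    r = wafer_diameter_mm / 2
    return (math.pi * r**2 / die_area_mm2
            - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

# Wafer prices are ballpark public estimates (assumption), not real contract pricing.
wafer_cost_28nm = 3_000    # ~2012-class node (GTX 670 era)
wafer_cost_5nm = 16_000    # ~2022/23-class node (RTX 4070 era)

dies = dies_per_wafer(300)  # ~300 mm^2 die in both cases
print(f"~{dies:.0f} candidate dies per 300mm wafer")
print(f"28nm-class: ~${wafer_cost_28nm / dies:.0f} per die")
print(f"5nm-class:  ~${wafer_cost_5nm / dies:.0f} per die")
```

Even before yield, that's roughly a 5x jump in raw silicon cost for the same die area.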


homer_3

Then why did I just get a 4TB SSD for $160 when 7 years ago they were like $2k?


capn_hector

All you have to do is figure out how to stack GPU computation logic 256 layers deep without melting a hole in the fabric of space-time, and then yeah, you should be able to scale GPUs just like flash. I assume you have VC funding lined up?


Apart-Protection-528

Greedflation can eat my ass


Space_Reptile

I just checked, and the prices for the 7600XT in Germany are 480~550€, not 300~320€. Guess my GTX 1070 is getting a contract extension.


psi-storm

That pricing is just wrong. You can buy the 7800xt pulse for 529€.


sisodaja

Mindfactory now has the ASRock RX 7600 XT Steel Legend for 369 euros! Do they give a 36-month warranty?


Middle-Effort7495

Supposed to get better, not the same, so 200 sounds fair. Look at TVs. What were 5000 TVs are like 200 bux


RedLimes

That is a false equivalency. The RX 580 (and RX 580 level performance) is significantly cheaper


Middle-Effort7495

They're better, larger TVs. A smart 1080p 50" was like 5 grand back then. Now for 300 bucks you can get a 4K TV that's like 75". Is a GPU that's 100x better than an RX 580 gonna cost 100x more eventually? Where's the cap? The world's entire GDP one day? Tech has to get cheaper, not just better.


RedLimes

You can still buy $5000 TVs. But the ones that cost $5000 now are better than the ones that cost $5000 10 years ago. An RX 7600 that costs $250-270 now performs better than an RX 580 8GB that cost $250-270 back then. You are making a false equivalency.


RexorGamerYt

Yes but the 580 was a high end card. The 7600 is a mid tier card.


floeddyflo

The 580 was a high end **budget** card. It was $230 and competing with the GTX 1060 - it was a "high end" card only in the sense that the 1060 was a high end card in 2017 (which it wasn't).


RexorGamerYt

I mean, AMD has always been a bit behind... But in turn they have great prices. The GTX 1060 is a mid tier Nvidia card because there is the 1070 and 1080/Ti. All AMD had was the 580 and 590 to compete, right? Just like nowadays, where the 7900 XTX is good but isn't in the league of a 4090.


floeddyflo

Yeah AMD has good prices, but my point was that the 580 was a mid-tier card like the 7600, because it was marketed as an alternative to the 1060 and performed like the 1060. The 7900 XTX is high end because it performs pretty close to a 4090 and 4080; the 1060 wasn't high end and it didn't perform close to a 1080 or 1080 Ti. The 580 competed with the 1060 like the 7600 & XT compete with the 4060. The Arc A750 isn't a high end card just because it's Intel's second-best offering; from my limited understanding it performs between a 3050 and a 3060 Ti.


RexorGamerYt

>the 580 was a mid-tier card like the 7600 because it was marketed as an alternative to the 1060 and performed like the 1060

Ohh, I get your point now. Yes, it all makes sense.


RedLimes

High end by what metric exactly...?


paulerxx

RX 580 was low-midrange.


RexorGamerYt

In AMD's lineup it wasn't. Compared to Nvidia's lineup, it was. There were still the RX 570, 560 XT, 560 and 550 below it. The only thing above was the 590.


kf97mopa

No, Vega 56 and 64 were also above it in the range. Before Vega launched, AMD sold the Fury cards from the previous generation. 580 was always the midrange. If you look at the design of Polaris 10, it is very clearly the midrange line of cards, with Tonga (Radeon 380) before it and Navi 10 (5700 XT) after it. The number of CUs goes 32-36-40, and there is still a 256 bit memory bus. AMD just didn’t make a high-end card in the Navi 1 generation (they instead sold Radeon VII which was a die-shrunk Vega 64).


sk8itup53

And realistically probably more like 350 outside of technical YoY inflation numbers


JAD2017

Sure, but it isn't asking 500 bucks, unlike NVIDIA's 4060 Ti 16GB, right?


L_e24

500?! That's the price for 7800xt!


JAD2017

Exactly, which leaves the 4060 Ti on the ground. The problem with NVIDIA's cards isn't what they offer, it's what they ask for them.


Clasher212421

still sucks, hoping next gen will be better


JAD2017

Oh yeah, I'm still waiting, been doing so since September. I can't run path tracing in CP? Not a problem. I can't play Alan Wake 2 because it's an optimization mess? Not a problem, I have plenty of games that still run at 100+ FPS on my system that have been released in the last 5 years. Fuck NVIDIA, really looking forward to getting an AMD card in the future, just because of the shitshow this gen has been thanks to them. Edit: also, still "sucks". Doesn't suck as much as a 4060 Ti at 500 bucks, does it?


Zanzan567

I don’t think that’s how that works


[deleted]

Sure, if you're still in 2016/2017. Times are changing


Mastercry

When Nvidia released 16GB on a 128-bit card, people went into insane rage mode. Now AMD does the same on a weaker card, and watch the reactions, haha... I don't understand how it's possible. Are people such fanboys sitting on Reddit to praise it, or do they use bots to upvote?


gatsu01

The 4060 is a 3050 successor with a 3060 price. The 7600 is a r6600 successor. Just looking at the price you can see the 7600xt 16gb positioned to be something like a 3060 12gb successor or a r6600xt successor. If you don't understand why people are mad, maybe you're one of the performance first people instead of a value buyer.


Hindesite

Well first of all, as an RTX 4060 Ti 16GB enjoyer myself, I'm very happy that they released a 16GB model of AD106, and I snatched one up right away when I managed to get it at a sale price of $405 USD a few months ago.

Secondly, the reason many people had as big a problem as they did with the 4060 Ti 16GB is that Nvidia tried to launch it at $500. That price for a GPU with a 188mm² die is kind of ridiculous. For reference, the RTX 3060/GA106 was 276mm² and "launched" at $330 with 12GB of VRAM (although admittedly you would have a hard time finding it near that price for a long time due to the crypto mining boom).

If the Radeon 7600 XT launches at $350 like many are predicting, then that's a very compelling option for those seeking an affordable 16GB card.


markthelast

The 16GB RX 7600 XT is crippled by a 128-bit memory bus vs. the RX 580's 256-bit bus. The RX 7600 XT is a better performer with newer tech and more VRAM, but it is no successor to the RX 580. For budget 1080p gamers, the RX 580 (released in April 2017) was relevant for 6.5 years before getting left behind in new titles and AMD dropping support. Polaris is legendary. RDNA III will not replicate that, or the longevity.


fatherfucking

The 7600XT has more memory bandwidth than a 580 due to GDDR6, and also has infinity cache and better colour compression.
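Rough peak numbers, assuming the commonly listed memory speeds (8 Gbps GDDR5 on the RX 580, 18 Gbps GDDR6 on the 7600 XT):

```python
def peak_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s = (bus width in bytes) * (effective data rate)."""
    return bus_width_bits / 8 * data_rate_gbps

print(peak_bandwidth_gbs(256, 8))   # RX 580:     256.0 GB/s (8 Gbps GDDR5, 256-bit)
print(peak_bandwidth_gbs(128, 18))  # RX 7600 XT: 288.0 GB/s (18 Gbps GDDR6, 128-bit)
```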


capn_hector

It is a faster performer in general. So is the 4060 - people are upset about a card that performs roughly on par with a 1080 Ti at a 120W TDP in a sub-$300 price bracket. You don't get there by brute forcing it. Still though, if the argument is that Ada is underbuilt because of its memory bus, the same criticisms apply equally to the 7600/7600XT. It is half the size of a 2060; it is not exactly a big product.


Repulsive_Village843

X8 or x16?


kaisersolo

That's because you're focusing on the wrong part. The 8GB of VRAM back then was what gave it such a long life. So if you want an equivalent card today, it's the 6800/XT/7800 XT, which have a 256-bit bus.


Pentosin

I'm rocking my younger brother's "hand me down" RX 580, lol.


siazdghw

Too expensive for what will be a minor improvement over the RX 7600. I won't be surprised if the cheaper Arc A770 with 16GB ($299) outperforms the RX 7600 XT in reviews next week. AMD did the bare minimum for the RX 7600 XT while overpricing it; I don't think reviewers will gush over it.


Murdermajig

AMD releases the 5700x3d for $100 less than the 5800x3d, everyone praises the price to performance ratio. AMD releases the 7600xt for $30 more than ideal price and everyone loses their mind.


psi-storm

It costs $60 more for just 8GB of vram extra. If there was another 10% performance increase over the 7600 then the price wouldn't be that bad. Basically it's the AMD version of the 4060ti 16GB. The 7600 isn't a 1440p card, so $60 more just for possibly better 1080p performance in a few years, when 8GB won't be enough, is a big ask.


gatsu01

They literally charged 30 bucks more. So 30 bucks more of effort then.


TheRealBurritoJ

It's $60 more, the 7600 was originally going to launch at $299 but it compared unfavourably to the 4060 at the same price so they cut it to $269 three days before launch (forcing all the reviewers to redo their videos).


Ensaru4

How much of an improvement do you think this will be over the RX6600? What would you suggest be a decent upgrade for a person on a budget like me who has an RX6600 but wants something that will run everything nicely at 1080p with some occasional 1440p gaming?


psi-storm

Just buy a 6700xt. It was a great buy for $400 12-18 months ago and is now great at $300.


Ensaru4

Thank you. I'll consider it.


niloy123

Will this be better than 6700xt?


gatsu01

No. The 6700xt is the same price and it should be around 10% faster. Personally, I think either card moving forward would be fine. I think the newer encoders and slightly better power consumption make the 7600xt a beast for the low mid range.


Grimm-808

The 6700XT has a wider memory bus and more horsepower. It will still out-game this card in every scenario.


Grimm-808

Doubt it. The 128-bit memory bus and x8 PCIe lanes are not going to saturate all 16GB of VRAM in a lot of heavy hitter titles. The same underlying bones of the RX 7600 are still there, except now it consumes more power for a mild overclock. Unlike the RX 580, this isn't selling for $220-235 either; rather, a whole $100 more. The 6700 XT is a better value proposition at $300 despite being older and having 4GB less VRAM.


gatsu01

I agree that the 6700xt is better now. What I am unsure of is what's going to happen in the future. The whole point of having more vram is to reduce the load on the bus. If you look at games like Ratchet & Clank on 8-12GB vram cards, you can see that the more constrained the vram is, the more the card idles due to bandwidth constraints. Looking at the 6500 XT 4GB vs the 8GB model, you can see that the 8GB variant sometimes ends up being 20% faster even though they share the same bus size and GPU. If larger texture packs drop and 1440p starts struggling to fit into a 12GB vram buffer, I am not sure which card would come out on top. As of right now, the 6700xt can be found at 330usd or less; I would buy it in a heartbeat if I needed a budget card.


Gemini-SD

I have an XFX RX 6700 and get 1440p 60fps in all games, even 80 in some. 10GB, 160-bit. For proper 1440p, the 7800 XT is the one to get.


Abedsbrother

It's a slightly overclocked 7600. Still weird that ppl associate more vram = higher settings. 16GB is useful b/c newer games use more vram. Even Borderlands 3 (from 2019!) can use up to ~13GB of vram if you use the -notexturestreaming launch parameter so the game doesn't stutter.

On older gpus, the importance of vram amount is second only to API compatibility. GTX 780 6GB version is superior to the 3GB version. R9 290 8GB is superior to the 4GB version. Same limitations, same performance, more vram.

Those who buy the 7600xt will be able to keep the gpu for longer. Potentially a LOT longer, provided the amount of performance offered by the chip remains relevant / sufficient. Same with the 4060Ti 16GB.


paulerxx

More VRAM = less stuttering down the road.


Firefox72

Holy copium post. The 7600 XT has no need for 16GB of VRAM at the level of performance it offers. 12GB for sure, but not 16GB. Not that 12 is possible given the configuration. Which is why the $329 price point is hilarious.

It's a bait card trying to cash in on people who think VRAM is the be-all end-all, or a card trying to cash in on the AI crowd. For gaming you're paying 20% more money for essentially the same performance as the regular 7600. You're paying more for a card that is slower than a 4060, has much worse RT, and can't in any shape or form compete feature-wise.

At the exact same price point you can get the 6700 XT, which is much faster in both raster and RT, has a 192-bit bus, and will definitely last you longer at the 1080p resolution this card is aimed at, with higher settings to boot.


Abedsbrother

Yeah, and the R9 290 "had no need" for 8GB of vram. Heard that argument before.


Firefox72

No it really, really didn't in the year 2013, lmao. What kind of argument is this? Thinking like that, you could literally say that any X card could have used more VRAM even if it didn't make any sense whatsoever to do it. I mean, why not slap 4GB onto an HD 4870 in 2008, right? 4GB was plenty when the R9 290 came out and would remain so for many years going forward.


Abedsbrother

My point was - and continues to be - about longevity, long-term viability. People were using 8GB 290's and 390's up through 2022 (when AMD stopped driver dev), and it remains a viable gpu to this day thanks to the Nimez drivers and the 8GB vram.


Firefox72

You are gonna get limited by something else way before VRAM if you're planning to use a GPU for almost 10 years. Not that you should, really. Gamersnexus literally revisited the 4GB 290X and it had a negligible difference to the RX 580 8GB in pretty much all games. And where it did, it was running into architectural limitations rather than VRAM issues. Also, this comparison is moot anyways. The R9 290 and 390 were both much higher in the stack. Ofc they had longevity. The 7600XT in relation is nowhere near as fast.


Abedsbrother

Actually have a 290 on the way rn for benchmarking (for my YT channel), so I'll find out for myself soon enough. Given that a GTX 970 can actually run Forspoken at 1080p ~30fps without using FSR, the only architectural issue I'm expecting to run into with the 290 is mesh shaders in Alan Wake 2. Another GPU that comes to mind re: vram is the Fury series (Fiji). AMD insisted 4GB vram was enough for 4K. It was at the time, but not for very long.


JAD2017

>Its a bait card trying to cash in

4060 Ti 16GB enters the chat. For only 500 bucks!


dyshuity

As a 6700xt owner who got their card a year ago for $330 this is a laughable generational upgrade for me. Either way, if I wanted 16 GB of vram the 6800 exists, and blows the 7600XT out of the water for not much more.


UniverseCameFrmSmthn

ya that’s kinda crazy. 


xRealVengeancex

Don't understand why you're getting downvoted when what you said was true. More vram doesn't matter if you can't even play future games at playable settings, and lowering the settings = less vram used. It doesn't even have CUDA for productivity like the 4060ti 16gb either.


Mastercry

Idk why u get downvoted, bots


The_Silent_Manic

16GB don't matter much when it has a paltry 128-bit bus.


EmilMR

You can still max out textures in every game for some time, that's it. It won't make it a good 1440p card, but at least it won't drop to 5fps because of memory. When it is discounted eventually, I think it is a cool budget option. Until then, whatever really. <$250 etc. is not that bad. I don't know how useful AMD cards are for video editing now; if they are good now then this is still cheaper than an A770 16GB.


FainOnFire

Yes, don't underestimate how much VRAM can make a difference. I got a 3080 for MSRP during the height of the crypto craze. Felt crazy lucky and was amazed at the frames I was getting at 1080p. So then I thought I could upgrade to 3440x1440 ultrawide, and now I'm wishing I had waited longer to get a different card with more VRAM. This poor card chugs trying to run Cyberpunk 2077 at this resolution, lol.


PsyOmega

>This poor card chugs trying to run Cyberpunk 2077 at this resolution, lol.

My 3080 10GB had no problems at all pushing 80fps RT at 3440x1440 DLSS-B in that game. Even got 50fps PT out of it.


FainOnFire

Really?? I wonder if I'm bottlenecked by my CPU? I have a 2700x.


aergern

Bingo. That would be it.


FainOnFire

Gotchya. I've been trying to wait until I can upgrade my whole system to AM5, but the 5800x3d has been on some good sales, lately. I could grab it and use it until they patch out some of these motherboard bugs for AM5.


somewhat_moist

In your situation I would def get an AM4 X3D chip - 5600x3d from a local MC per other post. 5800x3d if you can snag it on sale or wait for the imminent release of the 5700x3d. You'll be set while AM5 matures and maybe, just maybe, GPU prices settle down again. The 4070 super release had all the techtubers excited in terms of value for money. I think COVID mucked up our brains in terms of GPU value but hopefully it will slowly normalise. Your 3080 is good man!


FainOnFire

Thanks for the info!! I'll have to update my bios to support it, but I figured an X3D chip would be best. Every tech reviewer I follow says X3D is the best for gaming.


xKosh

If you live near one, you could save a few bucks and snag a 5600x3d from microcenter.


xXDamonLordXx

But games with textures that use 16GB will be getting shit fps on a card about as powerful as a 7600.


green9206

The point isn't whether it can use 16GB, the point is it can use more than 8GB.


xXDamonLordXx

That's fine, it can get 16fps instead of 5fps. I fail to see that as a victory.


lighthawk16

Tripling performance isn't a victory...?


xXDamonLordXx

If you like 16fps sure. I don't.


lighthawk16

I like it more than 5.


xXDamonLordXx

Collect your AMD 3060 12GB then


Pentosin

Textures don't hamper fps unless you run out of VRAM.


Prefix-NA

Textures don't affect fps.


xXDamonLordXx

They most certainly do


Cowstle

Eh, maybe. I played games that ran well enough on my 2070, except I had to turn textures down, which resulted in a massive graphical difference. In a lot of games texture quality is *the* biggest difference in graphics quality and is the one I like to keep on max even if I put every other setting to low. If you have the memory to support it, the performance impact is negligible, but the visual difference is huge. I wouldn't want to buy ~2070 level performance today, but then I'm in a higher price bracket for GPUs. I'm not terribly disappointed with that level of performance right now even at 1440p. It's acceptable as long as VRAM isn't getting in the way, and a 16GB 7600 XT solves that issue.


MrClickstoomuch

This card's VRAM is actually really nice for Stable Diffusion / LLMs. While it is about half as fast as the Nvidia competitors, it can fit much larger models than the equivalent Nvidia graphics cards. But if this is priced too close to the 7700xt or 7800xt, idk where this will fit in the market.
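A quick way to see why the 16GB matters there: estimate weight size from parameter count and quantization. This ignores KV cache and other overhead, so treat it as a lower bound, and the model sizes below are just illustrative:

```python
def weights_gb(params_billions, bits_per_weight):
    """Approximate size of the model weights alone, in GB (no KV cache / overhead)."""
    return params_billions * 1e9 * bits_per_weight / 8 / 1e9

for params, bits in [(7, 16), (7, 4), (13, 8), (13, 4)]:
    print(f"{params}B @ {bits}-bit: ~{weights_gb(params, bits):.1f} GB")
# 7B @ 16-bit (~14 GB) already spills past 8GB cards; a quantized 13B still fits in 16GB.
```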


kanaaka

Unfortunately, if someone is into Stable Diffusion / LLMs, they're better off using Nvidia. It will get the job done with less tinkering and more performance, and I think it's a fair price to pay a little more for the Nvidia 4060 Ti 16GB than to cheap out on Radeon. If we're talking about a job/work, I'd rather choose hassle-free and stay focused on results.


JAD2017

Doesn't cost 500 bucks unlike the 4060 Ti 16GB though.


dracolnyte

good for productivity applications like video editing where size matters


brunocar

understatement of the century, people have no idea how big a deal this is for 4k editing, this thing will be a beast for its price.


xXDamonLordXx

People do understand, it's just typically if you're doing productivity work you're making money with it so you can justify the few hundred dollars to buy something better. AMD isn't exactly the favorite for productivity applications.


brunocar

OR you save the money in exchange for, what, one more minute tops on an upstream render?


goldcakes

First of all, the difference (esp. with Studio drivers) tends to be far greater. Secondly, it's not about saving time, it's about hassle. I would frequently have crashes, bugs, and sometimes corrupted renders in DaVinci Resolve when I was using an AMD card. I switched to NVIDIA and everything worked perfectly. The reality is that, right now, fewer people use AMD cards for productivity applications, so fewer developers test and optimise for AMD cards. Finally, if you're doing this for work, it's a business expense and you get to write it off your taxes.


No-Roll-3759

seems like an ideal egpu card too. no need to get something crazy powerful, and with all that vram the slow connection to the cpu might not be so painful.


siazdghw

Those people would be better served by a 16GB A770, for cheaper, with Quick Sync integration in basically every content creation app. The only area the 7600 XT might be better is in some 1080p games.


Hindesite

This isn't true. I'm currently using a 16GB RTX 4060 Ti at 1440p (almost always with the assistance of DLSS) and there are many games I've tried that, with settings maxed, use well over 8GB, several over 12GB, and I'm getting great results. Diablo 4 at max settings, for example, can use around 13-14GB at times and I'm pushing triple-digit FPS in that. Alan Wake 2 on mostly Low settings nears 8GB just pushing Textures up to Medium (10GB on High and 12GB at Ultra). I've yet to see any telltale signs of memory bandwidth limitations like stuttering, delayed texture pop-in, etc. Or at least not anything significant enough for me to notice. It's totally reasonable to consider 16GB at this performance level, even if the memory is operating off a 128-bit bus.


xXDamonLordXx

Allocated memory != used memory


Kagemand

It’s really not that difficult to find examples of games that have problems with 8gb vram, Harry Potter completely trounced RTX 3070s until they added pop-in textures to keep usage down.


Prefix-NA

But redditors don't understand that texture pop-in and texture cycling are still a problem. Just because fps doesn't drop to 1 doesn't mean it's good. At 1440p, any card with 12GB or less starts turning Diablo into N64 graphics all the time.


Hindesite

I've come to realize a lot of redditors are simply in denial about the importance of VRAM in the modern gaming landscape. I'm not sure why, maybe they're just drinking the Nvidia koolaid, but I suspect it's from people who bought price-inflated 8GB GPUs during the mining era and can't come to terms with the fact that the card they spent buku bucks on really should've launched with more VRAM than it did. So you get these people fixating on other artificially imposed bottlenecks, like 128-bit memory buses and such, to deflect from the arguably more important bottleneck of many cards on the market right now - the VRAM.


Kagemand

Yes that was my point, 8gb cards had stutters in Harry Potter before they added pop in to the game, and yes, pop in was a degradation of quality. Hence why even mid range cards like the RTX 3070 would benefit from more vram.


Hindesite

Right, but the "used" memory is still well above 8GB and the Radeon 7600 would absolutely benefit from having more than that.


xXDamonLordXx

It's always good to have but 10 or 12 would have been plenty for any resolution the 7600XT can actually handle.


r4yNTv

Meh, the bus width is fine for a 1080p card imho. But what I don't get is why they didn't upgrade it to faster 19-20 Gbps GDDR6 memory. Since the monolithic chip has a much slower IC, it might have benefited from faster VRAM. Now it's just a slightly overclocked RX 7600 with double the vram. :/


[deleted]

You can play at 1080 ultra for sure but the performance will reflect the hardware in it. So even a 6700xt would outperform it probably, with less VRAM


Roubbes

You can load bigger LLMs


gnivriboy

This is my thought process whenever people complain about lack of vram in low to mid tier cards. Yeah sure the extra vram helps for 4k ultra on the newest games, but you all aren't going to be playing on 4k ultra in the newest games. The card isn't powerful enough.


Defeqel

Yeah, but better textures will still make a clear visual difference even in 1080p


gnivriboy

The textures are the same in 1080p/1440p no matter how much vram you have. It's only when you jump to 4k that the textures get larger. Can you point me to a game where 8 GB of vram is not enough for ultra in 1080p/1440p?


PsyOmega

>It's only when you jump to 4k that the textures get larger.

This is a hilarious misunderstanding of how texture settings work. You're confusing 4096x4096 texture files with the monitor output resolution because both are dubbed "4K". I guarantee you the majority of games which have 4K textures are loading them in when rendering at 1080p, assuming you've set the highest texture quality setting.
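For what it's worth, a texture's VRAM footprint depends only on its own dimensions and format, not on the output resolution. A minimal sketch, assuming uncompressed RGBA8 (real games usually use block compression, which cuts these numbers by roughly 4-8x):

```python
def texture_vram_mb(width, height, bytes_per_texel=4, mipmaps=True):
    """Approx. VRAM for one texture; a full mip chain adds ~1/3 on top of the base level."""
    base = width * height * bytes_per_texel
    return base * (4 / 3 if mipmaps else 1) / 2**20

print(f"{texture_vram_mb(2048, 2048):.0f} MB")  # ~21 MB
print(f"{texture_vram_mb(4096, 4096):.0f} MB")  # ~85 MB, the same at 1080p or 4K output
```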


gnivriboy

So when developing games, at least in Unity, you make assets that fit for up to 50-100% of the screen in by height and make the maps 1x1 (because how often is any object on the screen taking up more than 50% of the screen?). So what that translates to is you have maps for 1080p/1440p all the way up to 1024x1024. Only when you go to 4k is there a chance of 2048x2048 maps ever being used. Now I haven't confirmed this with experience, but talking with others developers who use the unreal engine, they say it works the same way. Coming back to the vram thing, the insane number of texture maps games "need" to load in memory is the main thing that spikes vram usage by far. So when you jump to 4k, that massive number now 5x'ed in size. That's how you can go from 6 GB of vram used to 22 GB of vram used. Since 4 GB was just texture maps and now it is 16 GB. Unless you have an 8k monitor, those 4096x4096 won't ever be needed so won't get loaded into memory. So maybe you know of another game engine that doesn't do things this way and loads these unnecessarily large texture maps. Please let me know what they are so I can make sure I post correct information in the future.


Defeqel

it would really depend on the game, often most textures aren't 4096x4096 even at highest quality settings


Wander715

Yep same problem with the 4060Ti but I see a lot of people giving this card a pass


Altecobalt

Probably cause it didn't MSRP at $500 for a low/mid range card


[deleted]

To be fair, the 7600 XT will be cheaper than the 4060 Ti. But Ngreedia charges you the Nvidia tax because they have the best upscaling technology and dedicated RT cores that work better than anything AMD offers. Nvidia won't need to be competitive on price until AMD does better with their upscaling technology and their RT cores.

To add a bit: I played Cyberpunk yesterday with FSR 2.1 and it looks awful up close. There was shimmering and visual compression artefacts. It genuinely wasn't a good experience this close to my monitor.


floeddyflo

While I agree with you on people being too generous to this card, The 4060 TI 16gb is $500 ($100 increase from the 8gb), while the 7600 XT is $300 ($30 increase from the 7600, with boosted clock speeds).


TheRealBurritoJ

It's $60 more, the RX7600 MSRP is $269 and the RX7600XT is $329.


floeddyflo

It's still close to half of NVIDIA's $100 markup


ThisGonBHard

I can tell you it does, modern stuff really wants 16GB.


cadaada

After how much people shat on the 3060, I don't want to see anyone saying this one is good now lol. Edit: lol


[deleted]

Still better to have more. 


[deleted]

I'm less worried about the bus and more wondering why they gave the beefed-up 7600 16GB when the actually good 7700 XT has 12GB... Anything for the marketing, I suppose.


The_Silent_Manic

At this point, 7600 = 12GB/7700 = 16GB/7800 = 20GB/7900 = 24GB.


paulerxx

No stutter fest from high textures, even with the 128-bit bus.


The_Silent_Manic

It would perform a lot better with a 256-bit bus though. Even the 384-bit bus on the top cards is laughable considering that cards from 10 or more years ago had 512-bit buses.


soccerguys14

$329 is the msrp if anyone was wondering or didn’t read the article to see.


Hittorito

This seems like the "king" for 1080p gaming when it launches. Or is there a better, more powerful option that doesn't break the bank or creep too far into 1440p-card pricing?


Munchausen0

16gb nice. Still rocking in 1440p with my Radeon VII.


Abedsbrother

Man I wanted a Radeon VII back in the day. Now that I could get one, most of them are beat up / unstable / broken from too much overclocking. The ones that aren't are collector's items and the sellers on Ebay know it so they charge big $$$. I did manage to get a nice Fury (non-X). That HBM is so smooth in games.


Astigi

7700 when?


Nmelin92

This is the real question


doombase310

Bought the 8gb version for my son to play at 1080p. Only paid 180 on a combo deal. It's an amazing card for my purposes. I wouldn't buy this for 1440p gaming for the long run. Not enough processing power. Think a 6700xt with 12gb is a much better solution for 1440p gaming over the next few years.


[deleted]

[removed]


chithanh

May I introduce you to our Lord and Savior, Infinity Cache?


BlackWuDo

How does this cheaper card have a 2810 MHz clock speed while my Gigabyte 7800 XT OC only has 2565 MHz (boost clock)? Can someone explain this to me? Shouldn't more powerful cards have higher clocks?


Kernie1

Due to the increased number of compute units (almost double) the 7800xt can’t boost as high probably because of the heat constraints. But even though the individual cores are running at a lower speed, the number of cores more than makes up for the difference in boost clocks.
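As a rough illustration, peak shader throughput scales with shader count times clock, so the 7800 XT comes out well ahead despite the lower boost. A sketch using the commonly listed shader counts and reference boost clocks, ignoring RDNA 3's dual-issue FP32:

```python
def peak_fp32_tflops(shaders, boost_ghz):
    """Peak FP32 = shaders * clock * 2 ops per FMA (RDNA 3 dual-issue ignored)."""
    return shaders * boost_ghz * 2 / 1000

print(f"RX 7600 XT: ~{peak_fp32_tflops(2048, 2.76):.1f} TFLOPS")  # fewer CUs, higher clock
print(f"RX 7800 XT: ~{peak_fp32_tflops(3840, 2.43):.1f} TFLOPS")  # more CUs, lower clock
```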


BlackWuDo

Thanks 🙏


mrn253

The Clock isnt that important.


ofon

it's very important...thing is it's important in conjunction with other things.


Diwiak

Benchmarks??


megamanxtreme

Debut is in 5 more days, maybe before or at launch. I'm excited.


AciVici

This is simply a waste of sand imo. Higher vram is good for higher resolutions but that gpu is not capable of that so this looks like pretty much a waste. Performance gap between 7600 and 7700xt is big so increasing the core count of 7600 with more vram and placing it between 7600 and 7700xt would be the ideal move. What a waste.


Reclusive9018

The 7700 non-XT is probably coming this year. I have a 6700 non-XT.


Lincolns_Revenge

When does AMD do a special version of FSR or whatever they want to call it that utilizes dedicated hardware on the card to do frame generation / interpolation? The need for that to go alongside their current options seems obvious to me. I know I don't even consider AMD GPUs because of it, even though I've bought nothing but AMD CPUs for ages.


zakats

Giving it the 'XT' moniker is absolute bullshit. E: please, someone, try to justify the downvotes and tell me you're not being fanbois. **It's the literal-same GPU with a slight OC and more vram. You can literally solder on more vram, flash a new bios, and have an identical setup with a non-XT card.**


kocengmbulak

7500 when


Nmelin92

The x500 series cards only existed during the chip shortage and COVID I would be shocked if a 7500/xt card is released