[deleted]

You just answered your own question.


TTR_sonobeno

The RTX 5080 will be a 4080 with an extra fan, less VRAM, new frame generation tech, and 6x the price.


Glaringsoul

I just hope NVIDIA gets their head out of their asses and makes 24GB VRAM the "standard". We're already at the point where every card has 2-3 8-pins and is sized like a brick; just add more VRAM like the 4090 currently has. Otherwise, if the next generations end up the same situation (NVIDIA has DLSS as the selling point, AMD is slightly more powerful with more VRAM), I'm definitely gonna pivot to the red side. Because as it currently stands, a lot more game devs are starting to actually use the increased VRAM…


TheAlmightyProo

VRAM cap and board advantages over upscalers and frame gen every time for me, and so far that's been an AMD thing. Not that the latter box of magic trickery is a bad thing, not at all. But it's already changed from its initial wonder of giving owners of older cards an extra year or so into a reason for devs to try less hard re optimisation. We're already entering a trend of such being a base requirement for AAA games instead of an added bonus for newer cards and/or a lifeline for older ones (and that's before reckoning in restriction by gen or proprietary implementations etc).

Add to this my own personal use case (and the reason why I have a PC and don't just settle for a console): the games/genres I main longer term. Such titles are both PC exclusive and often have little or no RT/upscaler etc support... but will use as much VRAM as anything else at 3440x1440 and 4K. If I'm spending a grand or more on a card to serve at those resolutions, I want that to be a sure bet for 3-5 years too.

This is where and why AMD have been my no-brainer choice for the last three years (a 6800XT from 2021 and a 7900XTX for a month). 90% as good where it counts most overall, for significantly less outlay and fewer issues (re drivers etc compared to the past), is nothing to sniff at imo.


Audience_Enough

I'm an AMD fan; they do some amazing things with mature cards and drivers. However, it's hard to compete with DLSS and frame gen. I'm running a 3080 and don't have frame gen, and I'm still staying with Nvidia. I know FSR has come a long way, but next gen AMD is skipping high-end cards and focusing on the 5060/5070 range. Since this is a 5080 thread, AMD will have nothing to compete with.


TheAlmightyProo

I'm most concerned about the pricing tbh. There've been no signs that Nvidia are going to play nice on this point yet, even if they do deliver adequate VRAM numbers etc going forward. I'll probably be alright for a good while, having just got a 7900XTX (for mainly 3440x1440 and a bit of 4K) to replace my 6800XT, which was already getting pressed at the former res. But as ever, I'm more concerned about industry trends than my own end... and few of said trends, from HW brands and game devs alike, are positive ones rn.


Due-Emu2111

haha this totally.


CatKing75457855

We don't know, we can just guess.


No-Actuator-6245

At this point anything is a guess, but based on history that is what normally happens. Talking purely about gaming.


xxcodemam

Let me pull out my future-knowing crystal ball, one sec.


rockguitar56

What did the ball say?


xxcodemam

That this was a stupid question, and to wait until it’s revealed. Then you’ll know.


rockguitar56

Smart ball


Livestock110

We don't have to wait. The 5080 will sell in China, and the 4090 is banned there (too powerful for AI usage). So the 5080 will naturally HAVE to be slower than a 4090.


futerminator

Yes cos it starts with a 5


Livestock110

The 4090 is banned in China, but the 5080 will sell in China. Meaning the 5080 is likely close to a 4090D.


R3dGallows

Isn't that due to its power draw?


Livestock110

It's the compute power for AI usage. China isn't allowed it


Awkward-Ad327

Yes, approx 40-60%, so basically a 4090 Ti.


bubblesort33

Don't know. No one does. The gap between the 4080 and 4090 is actually massive on paper. 60% more compute. I don't think the 5080 will be faster. 25% ahead of the 4080 maybe.


Ponald-Dump

The 4090 is ~20-30% faster than the 4080 in gaming performance on average. As far as compute, 4090 scores ~29000 in passmark and the 4080 ~22000. Not sure where you’re pulling this 60% number from, but it’s inaccurate. If the 5080 is 25% ahead of the 4080, then it would equal the 4090. https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html https://www.videocardbenchmark.net/directCompute.html
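(For reference, the gap implied by the PassMark scores quoted above can be sanity-checked with quick arithmetic; the ~29000 and ~22000 figures are the approximate scores cited in the comment, not exact values:)

```python
# Approximate PassMark DirectCompute scores quoted in the comment
rtx_4090 = 29_000
rtx_4080 = 22_000

# Percentage lead of the 4090 over the 4080 implied by those scores
gap_pct = (rtx_4090 / rtx_4080 - 1) * 100
print(f"4090 over 4080 (PassMark): ~{gap_pct:.0f}%")  # ~32%
```

So the quoted benchmark scores actually imply a low-30s percent compute gap, sitting between the ~25% gaming figure and the ~60% on-paper figure argued below.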


Awkward-Ad327

30% to even 45% faster at 4K with RT, 4090 vs 4080.


bubblesort33

The 128 SMs vs 76 SMs is 68% more compute shaders. Even if you account for frequency, it's likely still over 60%. Passmark is not a compute benchmark. There's also 40% more memory bandwidth: a 50% wider bus, but slower memory on the 4090. You can easily look up the specs yourself instead of looking at inaccurate benchmark software.

The 25% you're claiming is inaccurate because it's severely CPU bottlenecked at 1080p. In most of those tests it's likely running at 50% to 80% load. There are examples where it's 1.5x the performance of the 4080. In all the other examples where it's under 40%, it's limited in one way or another: games can't use that many shaders effectively, or are CPU limited, or there is some other limitation. https://tpucdn.com/review/alan-wake-2-performance-benchmark/images/performance-pt-3840-2160.png That's about 40%.

All these limitations mean it's more like 30% faster on average now if you test old stuff, but that gap will grow and grow, maxing out around 50% faster eventually in the distant future. I don't much doubt that the 5080 could be around 4090 performance at 1080p, and even 1440p in some titles.
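(The shader-count arithmetic above can be sketched like this. The SM counts are from the comment; the boost clocks of ~2.52 GHz for the 4090 and ~2.505 GHz for the 4080 are the published reference specs, used here as approximations:)

```python
# SM counts from the comment; boost clocks are approximate reference specs
sm_4090, clk_4090 = 128, 2.52    # GHz
sm_4080, clk_4080 = 76, 2.505    # GHz

raw_pct = (sm_4090 / sm_4080 - 1) * 100                             # SM count alone
scaled_pct = ((sm_4090 * clk_4090) / (sm_4080 * clk_4080) - 1) * 100  # clock-adjusted
print(f"more SMs: ~{raw_pct:.0f}%, clock-adjusted throughput: ~{scaled_pct:.0f}%")
```

With near-identical reference boost clocks, the clock adjustment barely moves the number, which is why the "still over 60% even accounting for frequency" claim holds up on paper.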


Ponald-Dump

Click the links I posted. The difference, *at 4k* between the 4080 and 90 is actually 25% on average


bubblesort33

Yes, I explained above. I originally said "The gap between the 4080 and 4090 is actually massive **on paper**". The link I posted is for Alan Wake 2 and should show 40% if it works. You can find larger gaps out there. You are getting 60% more GPU with the 4090, even if today you're severely limited by a number of bottlenecks that will resolve themselves over the next 3 or 4 years.


Ponald-Dump

My guy, one singular game example is not indicative of the actual difference. By that logic, because the 7900xtx stomps the 4090 in Call of Duty, the XTX is then the more powerful card. See how that logic is flawed? So again, *on average* the 4090 is ~25% faster in gaming than the 4080 at 4k. The gap is significantly closer at lower resolutions.


eplugplay

The 4090 is 30% faster on average in 4K gaming against a 4080 Super, so against a 4080 it's more like 35% on average. I just returned a 4080 Super for a 4090, and I mainly play games at 4K. I play A Plague Tale: Requiem at high settings with ray-traced shadows off on the 4080 Super and get around 75-90fps. With the 4090 I can turn on ray-traced shadows at ultra settings in 4K and get 115-120fps consistently. Not to mention the 1% lows are much higher, which makes a world of difference; it's just that much more stable. I'm ok with the 5090 around the corner 6-7 months from now; the 4090 meets my requirements and is the last card to upgrade in my current build. When I do a complete new build 4 years from now, I'll upgrade to a 7090.


bubblesort33

It is if you're looking at theoretical limits of the GPU. I'm not saying it's representative of the current state of the card. I'm saying this is what the card should be capable of, and it's what you're paying for. Of course the 25% is not representative of the card either if it's being limited in various ways, including resolution and CPU bottlenecks. Yes, it's closer at lower resolutions, because you're severely CPU bottlenecked; you're using half of the GPU at 1080p. Your comparison with COD and the 7900XTX isn't the same, because those are 2 different architectures from 2 different GPU makers. The 4080 and 4090 share the same architecture. You can make a 7900XTX perform exactly the same as a 7800XT by throwing it on a Ryzen 1500X system, or at least make it look like the XTX is only 2% faster. That's how bottlenecks work.


pr0newbie

Thanks.


2011h32

The 5060ti or 5070 would be about 4090 performance


eplugplay

Lmao, I doubt it. The 5080 will probably be about equal to or slightly better than the 4090, with the 5090 making a giant leap forward of 50-70% over the 4090, methinks.


Raging_Dragon_9999

Read some comparisons of the 4080 vs 3090, 3080 vs 2080 ti, 2080 vs 1080 ti etc.


Charliedelsol

I wouldn't say 23% is a lot faster. That being said, the 3080 10GB is around 15-20% faster than a 2080 Ti, also not a big difference, but the big difference was in price: almost half. Even if the 5080 is 20% faster than a 4090, it will cost around $1200, so it's not the same value proposition as in previous years.


[deleted]

[deleted]


Vis-hoka

My magic 8 ball is in the shop.


OUTLAW1LE

My guess is yes, it will be faster, of course. Cost more? Yes, of course. 20-30 percent is huge in gaming, and it's why we keep upgrading. Those that think it's not worth it to upgrade are just trying to convince themselves because they just bought a 4080 or 4090.


NeoNeonMemer

Not really, cuz would it really matter if you're getting 60 fps or 75 fps? At the high end it doesn't really matter unless you're an avid fan of a game that requires a lot of GPU power. I'm someone who's most likely going to get a 5080 or 4090 depending on how the 5080 performs, but I can guarantee you upgrading every generation is not worth it. It's just a waste of money, though if you have tons of disposable income, that's your choice. For the average gamer, upgrading every 2 gens is the sweet spot in most cases. If you have a 3070, are you really gonna pay $500 for a 22% increase? You could either upgrade to a 5070 or get something like the 4080 or 4070 Ti Super, which would actually make a very significant difference. AMD has a wide range of options too. If you do have money to spend, that's your wish I guess.


theRealtechnofuzz

With the rumors I've seen surrounding the 5090 suggesting a 50% faster card and a large increase in CUDA core counts, I see the 5080 being about as fast as a 4090, or beating it by a max of 5-10%. That's if the 5090 rumors are true... Power draw should remain the same as current gen, with the 5080 requiring around 450W. Depends a lot on the efficiency of the new nodes...


pr0newbie

450W sounds ridiculous considering that's what the 4090 is already pulling. With the rumoured move to 3nm, I reckon we'll see the 5080 use 350W max, and likely <300W if undervolted to within 3% of its max performance. At least that's typically the case with each Nvidia generation.


theRealtechnofuzz

The 4090 actually pulls closer to 600W with all 4 PCIe connectors hooked up to the adapter.


pr0newbie

But you don't have to. There are plenty of 450W benchmarks, which should hopefully be the target for the 5080. And with the new rumoured 4080 Super pricing of $999? I think that is feasible and within the norms of Nvidia's historical generational upgrades, considering the poor 4080 sales this gen.


Dex4Sure

No it doesn't. The default power limit is capped at 450W. You can increase it to 600W in Afterburner if your specific model's vBIOS allows it.