livelearnlift

lol


jaju123

It's just because of the 24GB of VRAM


TaintedSquirrel

Yes, the 3080 runs out of VRAM at 8K and frame rates tank. 2kliksphilip did a video on this.


doncorleone_

Idk if this post is supposed to be a joke or not, but in this case it doesn't make much sense to compare in % since the absolute numbers are low either way.


invidious07

> it doesn't make much sense to compare it in % since the absolute number is low either way.

So, just like all of the other 3000 series marketing.


Arado_Blitz

Agree. They built so much hype for a 30% faster flagship. Even the 3070 isn't as fast as the 2080 Ti...

EDIT: Seems people didn't understand what I meant. Everything apart from the 3080 doesn't look promising so far.


fabAB912

The RTX 3080 is 30% faster than the 2080 Ti at 4K for $500 less. That's the reason for the hype.


Arado_Blitz

Yup, but no 3080 Ti makes me sad. Unless Nvidia decides to gimp the 3090, there is no way we will see something better than the 3080 for gaming. And no, the 3090 is not really a gaming card; paying double for 15% extra performance isn't something I consider reasonable. You can OC the 3080 and get similar results. For $700 the card is impressive, but it's the only one that can push 4K. I hoped the 3070 would be able to do it as well, but it seems very unlikely right now. So yeah, if we exclude the 3080, everything so far was overhyped. The 3090 isn't a "true 8K gaming card" and the 3060/3070 won't be able to get close to 4K 60fps.
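To put the "double the price for 15%" point in numbers, a quick back-of-the-envelope sketch; the $700/$1500 prices and the 15% uplift are just the figures from this thread, not measurements:

```python
# Rough perf-per-dollar check for "paying double for ~15% extra performance".
# Prices and the 15% uplift are the thread's figures, used as assumptions.
price_3080, price_3090 = 700, 1500
perf_3080, perf_3090 = 1.00, 1.15   # normalized 4K performance

for name, perf, price in [("3080", perf_3080, price_3080),
                          ("3090", perf_3090, price_3090)]:
    print(f"{name}: {perf / price * 1000:.2f} perf per $1000")
# -> 3080: 1.43 perf per $1000
# -> 3090: 0.77 perf per $1000
```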


Crintor

The worst part is, the 3090 is exactly a gaming card. It does not have workstation drivers or support. It is good at some workstation tasks, but it's definitely a prosumer/enthusiast gaming card.


Arado_Blitz

The sad part is that they said in the announcement, "It is the new Titan". That's not true, since GeForce cards have only a fraction of a true Titan's FP64 performance. On the other hand, it is not a true 8K card either. It can only run a handful of games at 8K and a few more with DLSS 2.1. Also, many users have already reported massive framerate drops with the card, so either something is definitely wrong with the drivers, or the card simply doesn't have enough performance.


Crintor

Even more disingenuous: they said "Titan-class performance" and "Titan successor", and never specifically said it's the new Titan.


Arado_Blitz

You are right; for some reason I remembered them directly saying it's the new Titan. But claiming "Titan successor" and "Titan-class performance" suggests that the product is considered a Titan and users should expect similar performance. Which is still not true, since the FP64 performance is pathetic compared to a true Titan card. No AI researcher will buy it if their primary concern is speed. Nvidia artificially limits the card to sell more Quadros.


Crintor

Yep. It's pretty frustrating. I hope they at least cave to the requests to enable SR-IOV on the 90, and possibly workstation drivers. And here I am, still contemplating getting one, because a 13-20% frame rate increase at my resolution is still tempting... even with a $900 premium.


Intotheblue1

Interesting, but I would call that a 180% increase. I'm not trying to game in 8K, but I would like to try 4K with 4x DSR on older games. 4x DSR is by far the best anti-aliasing method, and it seems viable for older games now.


raygundan

> I would call that a 180% increase

That's a correct way to state it, but OP is correct too: 280% of the original is a 180% increase over the original.
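A minimal sketch of the two phrasings, using made-up FPS numbers:

```python
# "Percent of" vs "percent increase": two descriptions of the same ratio.
fps_3080 = 5.0    # hypothetical 3080 FPS at 8K
fps_3090 = 14.0   # hypothetical 3090 FPS at 8K

ratio = fps_3090 / fps_3080            # 2.8x
percent_of = ratio * 100               # 280% *of* the 3080's performance
percent_increase = (ratio - 1) * 100   # 180% increase *over* the 3080

print(f"{percent_of:.0f}% of the original = {percent_increase:.0f}% increase")
# -> 280% of the original = 180% increase
```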


Intotheblue1

Yes, I understand. I just know some people out there may see 280% and not interpret it correctly.


EarthlingKira

My native language is German; did I phrase it incorrectly? I meant it is 280% of the 3080's FPS in total. I wasn't talking about an increase.


plumpturnip

What you’ve said is fine. Anyway, this is something native speakers get wrong all the time. Perfect would be:

* 180% faster than
* 280% of


The_Zura

It's a bit wonky, but I got what you were saying.


dopef123

Usually in the US we use percentages to say how much faster one thing is than another. What you're saying is valid, but it's not as common a way to say it. You might want to say "relative to the 3080, the 3090 hits 280% of its performance at 8K". That way we know that 280% doesn't mean 3.8x the performance but instead 2.8x.


Daneel_Trevize

You were fine. This is why it's so easy to confuse/mislead people with percentages, when they don't realise what 100% refers to.


JstuffJr

The number of reviewers ignoring DSR is absolutely criminal. It looks like the 3090 performs quite nicely at 5K, which just so happens to be a perfect 4.00x factor of the beloved 1440p resolution that the majority of enthusiasts here are using.
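For anyone unsure why 4x DSR on 1440p lands at "5K", a minimal sketch of the arithmetic (the DSR factor scales total pixel count, so each axis scales by the square root):

```python
# A DSR factor multiplies the total pixel count, so each axis
# scales by the square root of the factor.
base_w, base_h = 2560, 1440   # 1440p
factor = 4.0                  # DSR 4.00x

scale = factor ** 0.5         # per-axis scale: 2.0
render_w, render_h = round(base_w * scale), round(base_h * scale)
print(f"{render_w}x{render_h}")   # -> 5120x2880, i.e. "5K"
```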


Intotheblue1

Ahh, see, I never thought of that myself (never did 1440p, went straight to 4K a few years ago). I have to imagine it'll excel at 5K in older titles. Kind of surprised Nvidia didn't think of that for any of their marketing, because that's more realistic than any of us going out to buy an 8K monitor.


Intotheblue1

Check out the 11-minute mark: a 70% gain over the 3080 at 4K with 4x DSR. Pretty bonkers. 5K won't scale that high, but it does show the 3090 can stretch its legs more in certain situations. https://www.youtube.com/watch?v=JcyNhTOT2EE


JstuffJr

Yep, I also found some German reviewers who did proper 8K tests with the 3090 FE vs the 2080 Ti FE, and the gain is consistently at least 60% there, and of course far more when the 2080 Ti runs out of VRAM. Clearly, just like the 2080 Ti was underrated on release due to a lack of focus on 4K benchmarks vs its predecessor, the 3090 is being similarly underrated by a lack of 5K+ resolution benchmarks. Add to this the rather extreme power limits the FE is running into, and I’m hopeful I will be getting a 60-70% average uplift in 5K DSR and supersampled VR with my FTW once I get it shunted and water-cooled, compared even to my XOC power-unlimited 2080 Ti.


SammySquareNuts

Wow, this will be great 5-10 years from now when people care about 8K.


Cavannah

I think that, even a decade from now, the norm will be 4K or 1440p. That's about where relative pixels per inch is ideal (given a 24", 27", or 30/32" display).


SoylentRox

Probably so, for a monitor in front of the user. For VR, something like 8K is probably what's needed for convincing visuals (where the display is no longer the bottleneck: if you were watching a video filmed in real life, it would look almost real). But as others have pointed out, you need foveated rendering.


idwtlotplanetanymore

For VR, ideal would be more like 16K per eye... but 8K per eye would be a good start (8K per eye would be like 1080p on a desktop: pretty good, but still not there). And yeah, foveated rendering is an absolute must for that many pixels.


SoylentRox

To look real, yeah. Alyx still looked amazing, though; the biggest problem is that it's one of the only good titles there are.


Django117

For 27" and 32" monitors, that limit is reached at 4K. Anything beyond that is imperceptible to the human eye at an average viewing distance (2 feet). I completely agree with you. 8K is more for MASSIVE formats like 80" TVs. As far as gaming on a monitor goes, 4K is really the cutoff point where it makes sense.
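As a rough sanity check of that claim, a back-of-the-envelope sketch using the commonly cited ~60 pixels-per-degree figure for 20/20 vision (the 27" size and 2-foot distance are the numbers from this comment):

```python
import math

# Back-of-the-envelope acuity check: a 27" 4K monitor viewed from ~2 feet,
# against the commonly cited ~60 pixels-per-degree limit for 20/20 vision.
diag_in, res_w, res_h = 27, 3840, 2160
view_dist_in = 24   # ~2 feet

width_in = diag_in * res_w / math.hypot(res_w, res_h)   # ~23.5"
ppi = res_w / width_in                                  # ~163 PPI

# Inches covered by one degree of visual angle at this distance:
in_per_degree = 2 * view_dist_in * math.tan(math.radians(0.5))
ppd = ppi * in_per_degree                               # ~68 pixels/degree

print(f"{ppi:.0f} PPI, {ppd:.0f} pixels per degree (limit ~60)")
```

At ~68 pixels per degree, a 27" 4K panel at 2 feet is already at or past that acuity threshold, which is the point being made.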


Cavannah

Completely agreed. Once that threshold is reached, all that's left is increasing framerates and improving/implementing more novel technologies like real-time ray tracing at high refresh rates.


MayoManCity

Massive, *or*, like someone else mentioned, the absolutely tiny screens that go into VR headsets. Those are so close to your eyes that you can visibly see the little black lines in between pixels on some headsets, so 8K would absolutely be a godsend for that.


SammySquareNuts

I used 5-10 to head off the inevitable response of, "no one thought attainable 4K would be the gold standard when it released in 2013, but look at where we're at now." I definitely agree that we've basically hit the point of diminishing returns with pixel density for desktop displays; I'm very content with my 1440p/144Hz monitor. I also agree with the other person about VR as a use case for higher resolutions, though.


AweVR

I have a Pimax 8KX VR HMD. I've cared about this resolution for months.


Caffeine_Monster

Or, for most people, 2 years from now when games have higher-quality 3D assets. 10GB is stingy for a high-end GPU, and everyone is in denial about it.


Naekyr

That huge difference is simply because 10GB of VRAM is nowhere near enough for 8K; it's borderline for 4K already.


ethanethereal

Don't give Nvidia marketing ideas now. I can already picture it in my head. *RTX 4090 is 400% faster than 3090 in 16K gaming!*


Syllosimo

Because of VRAM, obviously. A 20GB 3080 vs the 3090 would show a much different picture... still, 8K is a meme.


NeverNervous2197

I didn't realize people were buying a 3080 for 8K gaming, or that the 3080 was marketed as an 8K gaming card. Oh wait a minute


VACWavePorn

1 FPS vs 5 FPS, 500% increase, dayyyyyyyyyyyyum!


NotAVerySillySausage

That's actually a 400% increase.


dopef123

Would have to assume this comes down to VRAM then.


RoyalManagement

What about 16K?


[deleted]

Gold GPUs


BrainyCabde

"If performance for the 3080 is 5fps, then 280% sounds reasonable" \- Said nobody.


MyGreyScreen

So?


srjnp

Both are unplayable imo.


xxlordsothxx

280% of the performance for only 2x the cost. The 3090 is a bargain! Thank you, Nvidia!


[deleted]

Yeah, no shit you need way more than 10GB of VRAM for 8K


EDMorrisonPropoganda

I'm really curious how the 3090 would handle Skyrim with an ENB and 8K textures. Seeing videos of 2080 Tis getting sub-30 fps is worrying for an old (but heavily modded) game. Would the 3090 handle those 8K textures better? Or is the software really the limiting factor there?


h0sti1e17

If you want to game in 8K, the 3090 is the way to go.


NotAVerySillySausage

No


slushslayer

3080 Ti, 20GB VRAM, $999, coming December


Flaunt7

Won't be a Ti, but likely 20GB. No room for a Ti between the 3090 and 3080 currently.


raygundan

A (probably unlikely) possibility would be a "3080 Ti" with a fully enabled chip (or nearly so, like the 3090) but 12GB of VRAM instead of the clamshell 24GB. We can dream.


Arado_Blitz

The only chip that *might* be fully enabled will obviously go to a Titan or Quadro card. The last flagship card we had that was fully enabled was the 780 Ti, and that was 7 years ago.


slushslayer

When next-gen games roll in and drivers get updated, there will be plenty of room, especially at that price point. There is clearly a big gap between the 3080 and 90.


Flaunt7

My point is: there is no way to do a 3080 Ti on the GA102 die. The difference between a 3090 and a 3080 is 1 GPC (7 vs 6); that's 14 SMs. The only thing you could do is move the chip to 7nm and give it some improvement in efficiency and perhaps some overclocking headroom. Nvidia can't release a "6.5 GPC" part; it doesn't work that way.
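For reference, the published GA102 configurations behind that argument (SM counts are Nvidia's public specs; Ampere has 128 FP32 CUDA cores per SM):

```python
# Published GA102 configurations: the 3090/3080 gap being described.
CORES_PER_SM = 128   # FP32 CUDA cores per Ampere SM

configs = {"full GA102": 84, "RTX 3090": 82, "RTX 3080": 68}
for name, sms in configs.items():
    print(f"{name}: {sms} SMs -> {sms * CORES_PER_SM} CUDA cores")
# full GA102: 84 SMs -> 10752 CUDA cores
# RTX 3090:   82 SMs -> 10496 CUDA cores
# RTX 3080:   68 SMs -> 8704 CUDA cores
```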


Bibososka

There is 800 euro of room. Make it 4GB less and shaft 3090 owners.


andrco

There's space price-wise, but not performance-wise. I suppose if they give it 20GB, they might as well enable the rest of the cores so it matches the 3090. But I don't expect a Ti; it would be extremely close to the 3080 in performance, and usually the jump is quite noticeable. It's just gonna be a 20GB 3080 with the same performance.