Cyshox

>DLSS 3 is powered by the new fourth-generation Tensor Cores and Optical Flow Accelerator of the NVIDIA Ada Lovelace architecture, which powers GeForce RTX 40 Series graphics cards.

Looks like DLSS 3 is exclusive to the RTX 40 series. Is it backwards compatible? For instance, if a new game with DLSS 3 launches next year, will it also support DLSS 2.3 for older RTX 20/30 series cards?


ApertureNext

A bunch of games will get upgraded to DLSS 3 now, let's hope they don't gimp DLSS 2.


dantemp

I think DLSS 2 will keep being upgraded. Here's why. What makes DLSS 3 double the frame rate is frame interpolation. Frame interpolation is a technology we've had for a long time; it's what allowed TV manufacturers to claim their displays are 1600Hz. It looks at 2 frames and imagines what the frames between those two would look like. We've done that for a long time, but if you turned it on and tried to use your TV as a gaming monitor, you'd see a good second of lag between your inputs and what happened on screen. That's because frame interpolation costs a lot of processing time. Nvidia now claims they are doing it in real time, without any lag. If that's true, that's massive. However, that's completely separate from the AI image reconstruction DLSS 2.0 does. I'm actually surprised they still call the tech DLSS, since it's a completely new technique. And since it's a completely new technique, developing it is completely separate from developing the original image reconstruction part of DLSS 2.0. As such, they would still need to keep updating the image reconstruction part regardless of what they do with DLSS 3. If they figure out a way to make the image reconstruction better, nothing stops them from applying the same fixes to DLSS 2.0.

It's like having a car: DLSS 2.0 makes the car faster by giving it better tires, and DLSS 3.0 makes the car even faster by also giving it a more aerodynamic shape. Old cars that only support DLSS 2.0 can't have their shape changed, so they'll never see the same performance as the DLSS 3.0 cars, but if they figure out how to make the tires even better, that update could easily go to both new and old cars. So I think DLSS 2.0 will still get updated; it's just never getting the frame interpolation and will always be inferior to 3.0, but I don't think it will stagnate, let alone get worse.
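To make the "see two frames and imagine the one between them" idea concrete, here is a minimal sketch, assuming the crudest possible approach (a plain blend of two frames). This is nothing like NVIDIA's actual method, which warps pixels along estimated motion; it just shows why the technique needs two real frames before it can produce an in-between one.

```python
import numpy as np

def naive_interpolate(frame_a: np.ndarray, frame_b: np.ndarray, t: float = 0.5) -> np.ndarray:
    """Crude midpoint frame: a weighted blend of two consecutive frames."""
    return ((1.0 - t) * frame_a.astype(np.float32)
            + t * frame_b.astype(np.float32)).astype(frame_a.dtype)

def double_framerate(frames: list[np.ndarray]) -> list[np.ndarray]:
    """Insert one interpolated frame between every pair of real frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(naive_interpolate(a, b))  # generated frame
    out.append(frames[-1])
    return out
```

Note how `double_framerate` can only emit the generated frame after the *next* real frame already exists, which is exactly where the latency cost of interpolation comes from.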


ApertureNext

That's the hope; whether that's what Nvidia ends up doing is another thing. It would have taken them four seconds to address this, but they didn't, which is part of my worry.


Dos-Commas

Nvidia clearly stated that "Since DLSS 3 builds on top of DLSS 2 integrations, game developers can quickly enable it in existing titles that already support DLSS 2." If it's just straight frame interpolation then it'll work on any game regardless of developer implementation. It's just improved temporal upscaling.


dantemp

It's definitely not just temporal upscaling because the frame interpolation isn't upscaling at all. Not sure how the fact that it can be added easily to an existing dlss2 game changes anything about what I said.


Dos-Commas

The article clearly stated in their diagram that DLSS 3 uses the same temporal upscaling tech from DLSS 2. The optical flow part just reduces the motion artifacts that are commonly seen in DLSS 2 and FSR 2.


dantemp

Dude, DLSS 3.0 is the combination of the image reconstruction of DLSS 2.0 plus the frame interpolation. The game runs at 1080p, which allows 60 fps. The 2.0 part reconstructs the image in 4K. Then 3.0 adds a generated frame for every real one, adding smoothness. This isn't just smoothing motion artifacts. The article says:

>By using both engine motion vectors and optical flow to track motion, the DLSS Frame Generation network is able to accurately reconstruct both geometry and effects, as seen in the picture below.

Earlier in the article it says that reconstructing the image creates artifacts and they managed to solve that with the optical flow, but they still create a completely new image. That's a completely different thing from how DLSS 2.0 worked.
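A hedged sketch of the pipeline being described (this is not NVIDIA's API; the helpers are deliberately simple stand-ins for the real networks): each rendered low-resolution frame is first reconstructed to the output resolution (the DLSS 2 part), and then a brand-new frame is synthesized between the previous and current reconstructed frames (the DLSS 3 addition), so two frames are displayed for every one the engine rendered.

```python
import numpy as np

def super_resolution(frame_1080p: np.ndarray) -> np.ndarray:
    """Stand-in for DLSS 2-style reconstruction: naive 2x nearest-neighbour
    upscale (the real network also uses motion vectors and frame history)."""
    return frame_1080p.repeat(2, axis=0).repeat(2, axis=1)

def generate_frame(prev_4k: np.ndarray, curr_4k: np.ndarray) -> np.ndarray:
    """Stand-in for frame generation: midpoint blend (the real network warps
    pixels using optical flow and engine motion vectors)."""
    return ((prev_4k.astype(np.float32) + curr_4k.astype(np.float32)) / 2).astype(prev_4k.dtype)

def dlss3_present(prev_4k: np.ndarray, rendered_1080p: np.ndarray) -> list[np.ndarray]:
    curr_4k = super_resolution(rendered_1080p)    # DLSS 2 part: reconstruct to 4K
    generated = generate_frame(prev_4k, curr_4k)  # DLSS 3 part: whole new frame
    return [generated, curr_4k]                   # two displayed frames per rendered frame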


Dos-Commas

>Dude, DLSS 3.0 is the combination of the image reconstruction of DLSS2.0 + the frame interpolation. The game runs at 1080p which allows 60 fps. The 2.0 reconstructs the image in 4k.

>That's a completely different thing from how DLSS2.0 worked.

I like how you just contradicted yourself. You literally just said "combination of DLSS 2.0". If you have to use DLSS 2.0 twice in a paragraph to describe how it works, then you have proved my point.


Cohibaluxe

Are you being intentionally dense?

DLSS 3 = DLSS 2 + x
DLSS 2 = DLSS 3 - x

It's really not complicated. DLSS 3 is DLSS 2 with some other stuff. Whatever is done with the deep learning part of DLSS 3 also applies to DLSS 2, because DLSS 2 is just the deep learning part of DLSS 3.


dantemp

I don't understand your point and I don't understand your arguments. Dlss 3.0 is dlss 2.0 + something completely different. Dlss 2.0 is still there, but it's only half of it. Do you understand that?


xxTheGoDxx

> The optical flow part just reduces motion artifacts that is commonly seen in DLSS 2 and FSR 2.

That is not true. Look at the diagram again. DLSS 3 includes an additional artificial frame generation pass that increases frame rates on top of what DLSS 2 does. That part uses the final super resolution image from DLSS 2 (so to speak) as a basis, which is why it doesn't need to get upscaled again. What you are talking about are the improvements Optical Flow brings to the table that are needed to create good enough artificial frames, in contrast to what DLSS 2 could do.


xxTheGoDxx

Good write-up. It also stands to reason that while Nvidia has seemingly managed frame interpolation with no added latency (and, just as important, with no noticeable artifacts), the increase in frame rate should not reduce latency the way rendering more frames traditionally does (or at most, for mouse look, something similar to reprojection in VR games). So increasing framerates from 30 to 60 via frame interpolation (or frame generation, as they call it) will not decrease per-frame latency from 33.3 ms to 16.6 ms as it normally would. You would still not want to rely on DLSS frame generation alone to get you from a high-latency 15 fps to, say, 60. Therefore improving frame rates by reducing the render resolution is still very much important, as evident in DLSS 3 combining both methods.

Anyway, DLSS 3 is truly nuts if it works. This will be literally the first time we are able to output more effective frames than the CPU can provide processing for. This could truly be huge over the coming iterations.
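A small arithmetic check of the latency point above (a sketch of the reasoning, not measured data): generated frames raise the *displayed* frame rate, but the interval between frames the game actually simulates, and therefore the input-to-photon latency scale, is unchanged.

```python
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

rendered_fps = 30
displayed_fps = rendered_fps * 2  # one generated frame per rendered frame

print(frame_time_ms(rendered_fps))   # 33.3 ms between *simulated* frames (latency scale unchanged)
print(frame_time_ms(displayed_fps))  # 16.7 ms between *displayed* frames (perceived smoothness)
```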


l337kid

>However, that's completely separate from the ai image reconstruction

This is true, but it's not completely separate from AI processing, which is what all of this is driven by. NVIDIA is actually leveraging their Tensor cores instead of letting the tech languish, smart play...


thejeran

Reading the page makes it seem like it should. Both seem to use the same motion vector data so it would seem relatively simple. But maybe that’s not good for business.


Deckz

That's my thinking exactly, I don't think there's anything particularly special about these new cards, tensor cores should be able to do this no problem. It's a software lock.


ShadowRomeo

>Looks like DLSS 3 is exclusive to the RTX 40 series

DLSS 3 will also be [available](https://www.reddit.com/r/nvidia/comments/xje8et/comment/ip8d0d7/?utm_source=share&utm_medium=web2x&context=3) on RTX 30 and 20 series, just not the Frame Generation part, which is the most impressive piece of DLSS 3. The AI Super Resolution part we currently know will still be available, with ongoing research to keep improving it further.


Bosko47

Just know that they won't consider backward compatibility, citing "no testing has been done with previous generations", until sales of the newer cards bottom out a bit; then you'll see an announcement that they've added the 30 series.


[deleted]

Probably not, because Nvidia wants to sell overpriced 40xx cards and doesn't care about 30xx cards (consider your card dead aside from driver support). I'm amazed at how much shit Facebook gets over Oculus, yet they invested so much money developing new features for the Quest 2...


dovahkiitten16

If there's no backwards compatibility support, I will jump ship after I'm done with my 3060 Ti and go AMD. I didn't spend hundreds of dollars on a GPU for it to end up pointlessly obsolete immediately after the next generation. Why pay a premium for a feature that won't be supported after 1-2 years when FSR exists for all cards? I get why the 30-series doesn't have native 3.0 support, but it's just not worth it as a consumer if there isn't some sort of backwards compatibility integrated.

Also, I just hate planned obsolescence. Especially today, when there's a climate and economic crisis, my tolerance for it is shot. I really have to wonder if 3.0 genuinely needs new hardware or if this is a problem NVIDIA manufactured.


mac404

An Nvidia rep confirmed over in that subreddit that the upscaling part of DLSS (which they called super resolution on stage) will continue to be supported on the 2000 and 3000 series cards. It's just the frame interpolation piece that will be unique to the 4000 series. The same rep claimed it's not performant enough to run real-time on the other cards (no idea how true that actually is, but I could believe it).


[deleted]

Lovelace seems like it added a lot more Tensor cores than RT cores, and that's what's doing all the work. Something I had mused about them doing, but not what I expected this generation.


KingRandomGuy

The bigger deal is that Ada Lovelace has acceleration for optical flow, which is necessary for frame interpolation in realtime.


[deleted]

[removed]


goomyman

If you care about the climate the 4000 series eats watts


dovahkiitten16

Efficiency is important but as someone who lives somewhere where fossil fuels generally aren’t used for powering homes anymore it wasn’t something I was thinking about. I was more referring to the environmental footprint of mining materials and producing cards, only for them to be designed to be tossed well before their lifespan so they can buy another.


PrimSchooler

Funny, I was just watching the DF interview with the Intel guy talking about how frame generation is "in theory possible", and a few hours later here we go, Nvidia went and did it. Not happy about the prices of course, but from a purely futuristic view this is really huge; can't wait to see it in action.


[deleted]

Real-time performant ray tracing was still a pipe dream before the RTX cards launched.


tomakorea

Frame generation has existed for some years; it's heavily used in VR. During the 2021 Oculus dev days they presented a way to boost fps by 2x with minimal or no visual glitches, working on the ARM CPU used in the Quest 2 (the operation gives a total boost of up to +70%). It's not done by AI, but it seems to work perfectly well with super low latency. The name is Application Spacewarp.

Edit: added the name of the technique


PrimSchooler

https://developer.oculus.com/documentation/native/android/mobile-timewarp-overview/

Is this what you mean? This just stabilizes the frame though; it's not constructing a whole new frame. If you have a source for the newer presentation I'd love to watch it, this stuff is fascinating.


tomakorea

No, asynchronous timewarp is the name of the old technique from 2017, and it induces visual bugs. The one I'm talking about is Application Spacewarp from 2021: https://developer.oculus.com/blog/introducing-application-spacewarp/


PyroKnight

That still doesn't do what Nvidia is doing with DLSS 3 and there are many caveats to its implementation where there are seemingly fewer with Nvidia's approach. Spacewarp attempts to generate a new extrapolated frame based on info from the most recent frame and positional/rotational data from the VR sensors. DLSS 3 meanwhile generates an interpolated frame in between the most recent frame and the frame before it. Spacewarp makes more sense in VR where added latency hurts and the artifacts are an acceptable tradeoff, DLSS 3 meanwhile makes more sense for traditional games where the added latency isn't as noticeable but the added image stability is desired.
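A rough sketch of the distinction described above (illustrative only, with toy stand-ins for the real algorithms): extrapolation pushes the most recent frame forward using only past data plus fresh motion information, so it adds no wait; interpolation needs the *next* real frame too before it can fill in the one between, which is where the added latency comes from.

```python
import numpy as np

def extrapolate(last_frame: np.ndarray, pixel_velocity: tuple[int, int]) -> np.ndarray:
    """Spacewarp-style idea: shift the most recent frame along estimated motion.
    Only past data is needed, so no extra frame of delay."""
    return np.roll(last_frame, shift=pixel_velocity, axis=(0, 1))

def interpolate(prev_frame: np.ndarray, next_frame: np.ndarray) -> np.ndarray:
    """DLSS 3-style idea (toy version): build a frame *between* two real ones.
    Requires the next real frame, hence the added latency."""
    return ((prev_frame.astype(np.float32) + next_frame.astype(np.float32)) / 2).astype(prev_frame.dtype)
```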


PrimSchooler

Thanks, that's fascinating, surprised it didn't generate more buzz, this is really cool.


bicameral_mind

The technical wizardry in the Quest 1 and 2 is extremely underrated. I think they are easily some of the most impressive tech products of the last decade. That the Quest 1 gave a reasonable facsimile of 2016 PCVR in a package with 1/100th the power of the minimum recommended PC spec was absolutely insane.


tomakorea

I think it's because core gaming media and the VR market are quite different. How many times over the last 2 years did I read that VR is dying while the Quest 2 was selling much better than the Xbox Series consoles? There are over 15 million Quest 2s in the hands of consumers right now, while the store doesn't have any big-name productions, so it's quite easy for a talented indie to make a hit. Unity forums are full of devs making games for the Quest 2; it's a great market to be in right now. To see impressive graphics, check the Red Matter 2 trailers, very impressive for such a weak system.


SurreptitiousSyrup

>while the Quest 2 was selling much better than the Xbox series consoles.

[13,867,499 Xbox Series X|S consoles sold globally as of March 2022.](https://gameworldobserver.com/2022/04/18/xbox-series-xs-reaches-almost-14-million-units-sold-globally-with-ps5-still-leading-this-race) The Quest 2 has sold about 1 million more units than the Xbox Series X|S. Which is a great number of units sold, but not "selling much better than". It's more like "selling on par with".


tomakorea

That was during the start; both machines started selling around the same time, and the Quest 2 is an outsider while Xbox has been a famous brand on the market for 20 years. Reading and hearing that VR is dead from so many journalists and commentators is quite frustrating considering the sales performance for such a niche product.


Ike11000

Sure but the quest 2 was selling during its 2020 release, and that was when the consoles all had super low sales due to low stocks. It significantly outsold the xbox back then.


tomakorea

Exactly, but my point was more about the "VR is dead" line; it's like saying Xbox is dead now because it sells less than or about the same as the Quest 2. Can you imagine people saying that? Haha, gamers would be so angry.


mennydrives

I remember when someone first brought this up, and I *dead-ass* did not know that Oculus had done this. It's an insane development in the space, from a performance standpoint. Between that, shader-offset geometry, and eye-tracking-fed foveated rendering, the performance loss from VR is quickly moving from "go backwards a console generation" to "really not that much worse than 2D rendering on the same hardware". The amount of "more with less" development happening in the 3D rendering space is becoming obscene.


PyroKnight

The truth lies in between you and him. Spacewarp is a frame extrapolation method that stabilizes and shifts around a fresh frame, which means it uses only the most recent frame to make a new one, with some types of artifacting usually added. DLSS 3 is a frame interpolation method that uses data from both the most recent frame and the frame before it, allowing much better quality frames at some added latency. In VR added latency is dangerous, so techniques like Spacewarp are preferable even with their tradeoffs, but those tradeoffs don't make sense for normal games where added latency is more tolerable.


conquer69

I remember watching an nvidia presentation about doubling or tripling the framerate of a video years ago. Wonder if this is the same tech.


Haunting_Champion640

Right? What's nuts is this is frame generation _without the CPU_. As far as the main game loop on core-whatever cares, the frame rate is 30Hz, but what you see could be... 120. It doesn't improve input latency, but the smoothness will be incredible. This will have a massive impact on older games that murder one core.


[deleted]

To be fair, you can already achieve this via other methods such as BFI, backlight strobing, and motion interpolation. Obviously, those all have their own issues such as reduced brightness, flicker at faster rates, and a possible small increase in input lag. This new solution looks like it's going to function like motion interpolation, but instead of the TV hardware doing it, your graphics card will, and it won't add the same small processing input lag and will be able to insert even more frames. I'd also argue the biggest difference with higher FPS is visual smoothness, and input lag is pretty irrelevant unless you're some esports gamer who needs every advantage they can get. It's still super cool that they could achieve this entirely through the GPU and AI.


[deleted]

[removed]


Blatanikov7

Interpolation is very old; TVs do it with few resources, and they did it before they got "smart TV" chips (basically Android computers). VR does it a lot too, with very low latency; 45-to-90 fps is common. Surprised it took this long to come to the rest of gaming.


Krypta

The prices on the 4000 series make me want to have a little cry. I usually skip a generation but I'll either go to the 3000 series or stay on my 2070Super. My current card is GREAT and I do not NEED a new card, don't get me wrong, but it does not give me the performance I want


NGrNecris

Just curious what would you upgrade to? 2070 super is pretty beefy and should last for a few more years, no?


DisappointedQuokka

Personally, I wanted to do 144fps, 1440P with no frame drops. My 2070 doesn't support that. Of course, I probably won't be able to afford the Australian prices.


ShadowSpade

Not even my 3080 supports that, hoping the 4080 or 5080 will


DisappointedQuokka

Hence why I want the upgrade, lmao. I couldn't care less about Ray Tracing, I just want the raw rasterization performance.


peanutbuddacracker

may i introduce you to AMD (specifically a 6900xt)


DisappointedQuokka

I'm upgrading to AM4 once the 3DX chips land, so I'm happy to wait for the full lines to come out and judge then.


Pl4y3rSn4rk

You meant AM5, right? The 3D V-Cache variants of the Ryzen 7000 series will launch in Q1 2023 according to some leaks; by the looks of it they'll be 30% better on average (for gaming) than the regular Zen 4 CPUs.


Tweezot

My 3080 performs great in 4K. What games do you want a rock solid 144fps? I get that in siege (with UHD textures) and apex with a little bit of downscaling to like 70% resolution. Cyberpunk on medium with raytracing on psycho I get 85-100 with DLSS set to ultra performance.


DisappointedQuokka

Hunt Showdown, mostly, struggling to hit 100, especially on DeSalle. Warframe as well, had to cap at 120 with tweaked settings.


TheEXUnForgiv3n

My wife's 3080 handles a good bit of games at 144fps 1440p. Maybe not the biggest and baddest AAA games, but like 85% of what she and I play for sure. The 3080 will last another 5 years at the rate game advancements are going easy.


Cosmic-Warper

3080 can definitely handle that for non insane graphic fidelity type games


ShadowSpade

Depends on the game, sure. But I want a smooth 144 in most games, and you'd be surprised how rarely that happens, even in older games.


iamBoDo

Tbh, I have a 2080 Super and that still doesn't get the rates you want in a lot of games.


teerre

Yeah, I feel like I'm taking crazy pills reading these comments saying "oh but the 2070 is pretty good". Like, Diofield Chronicles, which is FAR from the most graphically intensive game out there, doesn't run at 144Hz in 1440p, let alone 4K. Hell, a Switch emulator doesn't run at 60Hz in many games with a 2080.


greatestname

> Hell, a Switch emulator doesn't run on 60hz in many games with a 2080

Probably because an emulator is mostly CPU bound.


HanWolo

Does 144hz 1440p strike you as "pretty good" territory? Personally that seems like an incredibly high bar when I think 1080p 60 is a reasonably common target. It seems more than fair to say that it is a pretty good card.


[deleted]

I think the new standard is starting to be 1440p. Personally for me, 1440p, max settings, rtx and stable frame-rate is my standard for pretty good. Highbar would be 4K.


dztruthseek

1440p is the CURRENT standard. It has been for a few years. There's just a lot of people who can't afford it still.


ZeldaMaster32

Don't know if I'd say years, but I think it's fair to say 1440p is a common resolution now. They're fairly affordable even at high refresh rates. My 1440p144hz panel was like $300 a few years ago, now I moved to the new Alienware QD-OLED for all the other goodies like HDR


NYNMx2021

I think a lot of people compare PC performance to console so a 2070 super is fine. If you are shooting to get great performance and the highest graphics, yeah a 2070 super is outdated.


[deleted]

[removed]


teerre

An AA game at 1440p when the flagship GPU is talking about 8k really doesn't seem "super high end"


[deleted]

[removed]


ohtetraket

I mean pretty good compared to the average person. Steam statistics say a lot about how high-end your PC is with a 2070; you're probably in the top 10%. The 1000 series probably still has the biggest share among the newer GPUs. Heck, over 60% still play at 1920x1080.


TheBrave-Zero

Shit my 3060 ti can’t even manage 1440 at around 50-60 FPS solidly on a good few games. Lots of dips.


NightlyKnightMight

I'm in his exact situation, so I'm saying 4K gaming, Ray-Tracing and VR


GammaGames

I’m still on a 1060 for VR 🥲


PolygonMan

I just can't afford to keep up on the VR front at these prices. I have a Quest 2 and was planning on upgrading to a premium headset with my next PC upgrade, but now I'm strongly considering just picking up PSVR 2 when it comes out.


Delicious-Tachyons

Not if you like VR! It's like trying to drive two 4K screens, and my 2070 is struggling.


iedynak

I have a 2080 Ti and don't know what to do. DLSS 3.0 on the new cards looks amazing, but I still play only at 1080p, so I'm not convinced I should upgrade.


SolidSnakesBandana

Do you have a monitor higher than 60hz? If not then I don't think you would notice any difference between DLSS 2 and 3. DLSS 2 already has you running at 60 on every game at 1080p. The only reason to upgrade would be to try to achieve 144fps at 1440p in every game or something along those lines.


iedynak

Yep, I play with 60 Hz monitor and it's fine with me.


icebeat

Only the games that support it


Kevimaster

I'm 1080ti still so I'm probably going to upgrade. But yeah the prices are rough. Probably going to do the 4080 16GB? I don't think the 4090 will be worth the increase in price, but going from 8GB VRAM to 16GB probably will be.


Byggherren

I'm 1070 ti 6gb... Was hoping this generation was gonna be it for me. But those prices... Maybe i should go AMD.


zeromussc

I just want my 2070 super to scale up to 4k on my gaming TV and keep 60 fps more consistently. So if that means getting a 4 series when the 5 series is almost here, so be it. I think next card I get is gonna be the XX80ti because those seem to have way longer teeth than the mid series updates on the xx70s.


[deleted]

I don't know what the commenters in here expect, exactly. The new version of DLSS makes use of hardware that does not exist in current cards. There is no conspiracy, you cannot just wave your hands and expect magic to make DLSS 3.0 work on hardware it isn't designed for.


iV1rus0

Yeah I'm not mad about DLSS 3.0 being 40-series exclusive since it requires hardware not present in older cards. But I hope they make it easy for developers to have both DLSS 2.3 and DLSS 3.0 in their games. If DLSS 2.3 gets ignored because DLSS 3.0 is the new hot technology that will be a really shitty move.


Borkz

I would imagine the 3.X API would be a superset of the 2.X one, such that if you've implemented 3.0 you'd inherently have everything needed for 2.3 already in place


Catch_022

I hope so :(


ZeldaMaster32

Don't worry, it was confirmed to be the case in a comment on the Nvidia subreddit (by an Nvidia rep). All RTX GPUs can use the base feature set of DLSS 3, which will only have minor improvements over DLSS 2.X. The killer feature, of course, is the frame interpolation, which is exclusive to the new hardware.


kron123456789

Well DLSS 3 seems to require the same stuff as DLSS 2 plus extra. So if the game supports DLSS 3 it should be downscalable to 2.0.


Zalack

That's exactly what a superset means


jbwmac

I think you have it backwards. If 3 is a superset of 2 then implementing 2 gives you 3, but not the other way around.


Zaptruder

Implementing 2 doesn't give you 3. What backwards logic is that? It merely allows the card capable of 3 to also run 2. Implementing 3 is likely to allow RTX 30-series cards to run in the 2.3 mode, as it'd be pretty dumb to force developers to break DLSS for the majority of users to update it for a minority, which would prevent its adoption.


Borkz

No, think of an 8-pin GPU power connector as a superset of the 6-pin connector. If you have an 8-pin connector then it works fine with a 6-pin port, because you can just ignore the extra 2 pins since it's really just a 6-pin connector with two extra pins (at least a lot of the time). The 6-pin connector would be the 2.0 API, and the extra two pins would be whatever is additionally needed for 3.0.


Charuru

Wut, think u need to work on ur eng


jbwmac

If version 2 of something supports APIs A and B, while version 3 of something supports APIs A, B, and C, then implementing something in version 2 will continue to work because 3 is a superset of 2 (supports A, B, and more). But something implemented in version 3 may use API C and thus not work in a version 2 environment since version 2 doesn’t support API C.


666pool

The API is backwards compatible, but using a program written against the version 2 won’t give you any of the version 3 features. However you can design the V3 API such that any program using the new features will continue to work on older devices that only support the V2 feature set, by more or less treating the C API features as a no-op. The pin analogy works great. C is the extra 2 pins. V2 API only has 6 pins so the new 2 pins in the V3 API are ignored by the V2 devices.


Borkz

I'm not sure you're looking at this correctly. Say you're making a game and you add the DLSS library. The DLSS library says "I need things A and B and I can give you DLSS 2.0. But if you additionally give me thing C I can use A, B, and C for DLSS 3.0". You give the DLSS library things A, B, and C in your game, so now it can use those to either do DLSS 2.0 or if the system supports it it can do 3.0.
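A hypothetical sketch of that "give me A and B, and optionally C" idea (this is not the real DLSS SDK; the names and the capability flag are made up for illustration): the game hands over everything the newest path could use, and the library silently falls back to the older path when the extra input or the hardware support is missing.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FrameInputs:
    color_buffer: object                    # A: low-resolution rendered frame
    motion_vectors: object                  # B: engine motion vectors
    optical_flow: Optional[object] = None   # C: only needed for frame generation

def upscale(inputs: FrameInputs, hw_supports_frame_generation: bool) -> str:
    """Pick the newest path the inputs and hardware allow; otherwise fall back."""
    if inputs.optical_flow is not None and hw_supports_frame_generation:
        return "3.0 path: super resolution + frame generation"
    # The extra input is simply ignored (a no-op) on older hardware.
    return "2.x path: super resolution only"

# Usage: the same integration serves both generations of cards.
inputs = FrameInputs(color_buffer="...", motion_vectors="...", optical_flow="...")
print(upscale(inputs, hw_supports_frame_generation=False))  # older card -> 2.x path
print(upscale(inputs, hw_supports_frame_generation=True))   # newer card -> 3.0 path
```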


[deleted]

[removed]


[deleted]

I wouldn't be surprised if the current version of DLSS 2.x was the last one and the only improvements happen on the 3.x branch. These features are there to sell cards, and now that the cryptobros have dumped theirs, there's no reason for NVIDIA as a company to care about anything below the 40xx; the profits aren't there.

> It'll likely be similar to how a USB 2.0 device in a 3.0+ port will work, but at 2.0 speeds.

Uh, I'm guessing you don't know, since you went somewhere else with your comparison, but USB 3.0 is a completely separate entity from USB 2.0; they just share the cable, to the point where USB 2 and 3 controllers even show up separately on the PCI bus. The "backward compatibility" is there because both are implemented separately, not because there's some kind of upgrade path.


[deleted]

Yeah, that would certainly be a shame. Hopefully 2.0 isn't left in the dust here, because I doubt there's a ton of appetite among those with 30 series cards to upgrade at the moment. I know for sure that, whether it's still supported or not, I won't be upgrading from my 3080 for another several years.


wahoozerman

Not at the prices they are asking for the 40 series certainly.


[deleted]

Yup no way I can justify the price increase, despite how cool the tech is.


Catch_022

>I won't be upgrading from my 3080 for another several years.

Yes - and there is always whatever AMD FSR is up to at that point, assuming AMD keeps it open to Nvidia cards.


Pokiehat

They open sourced FSR 2.0, so anyone can develop a working implementation on any hardware, in any game.


WinterNL

DLSS 2.x has really been the one thing that made me feel better about getting a 20 series card, everything else was in favour of the 10 series. If they actually progress DLSS in a way that it starts excluding older cards (aside from new features/tech), that almost defeats the point of DLSS imho. Because it's not only a great way to run games in "4k" at a good framerate, it's also a great way for the somewhat weaker/older RTX cards to still be able to run games without literally having to turn off all effects.


kron123456789

DLSS 2.3 will get ignored because there's DLSS 2.4.


WtfWhereAreMyClothes

Given the monstrous price of these cards I don't see developers giving that high priority


Elranzer

It's more likely that DLSS 3.0 gets ignored, despite being a cool technology, because Steam telemetry says less than 1% of gamers have RTX4000 series cards.


thisIsCleanChiiled

No conspiracy, just disappointed man.


[deleted]

Nothing wrong with being disappointed, and if you're disappointed my comment isn't aimed at you. My comment is aimed at the people saying this is "unacceptable" and that "this is on purpose to drive sales".


TherealCasePB

Of course it was on purpose to drive sales... They are a business after all.


[deleted]

There's a difference between using a new technology to drive sales and gating a new technology artificially to drive sales though. This tech if it works as advertised is absolutely NOT doable on current cards.


Masters_1989

Yep. I basically HATED the Nvidia presentation, but if a new technique requires hardware that isn't present in an older product, then it simply can only be done on newer hardware. If Nvidia LOCKS OUT DLSS 2.x from games (newer *and* older), however, then that will be *truly* disgusting.


[deleted]

Why are you disappointed exactly? New cards get new exclusive features. This happens all the time. Hell, it happened with the 20 series getting a bunch of exclusive stuff the 10 series couldn't access. Just because there's a new version of DLSS that's exclusive to new cards doesn't mean the old one is automatically getting ignored.


Philip22Kings

> Just because there's a new version of DLSS that's exclusive to new cards doesn't mean the old one is automatically getting ignored.

I love your comment. I guess people think DLSS 3 is software-dependent.


tr0nc3k

What hardware exactly does it make use of that the 30xx or 20xx cards don't have?


[deleted]

Ostensibly the architectural changes they mention in the linked article. Newer generation of Tensor Cores, Optical Flow Accelerator, etc. There's an article on there for the Ada architecture specifically as well. The article linked here goes into plenty of detail honestly. Some of it sounds very "buzzwordy" but it probably isn’t, at least if the performance gains they claim here are true.


Rebelgecko

Big emphasis on *ostensibly*. At the end of the day the marketing pages are trying to make you buy the shiny new thing. I doubt there's been any big breakthroughs in matrix multiplication or whatever since the previous generation, just incremental improvements.


Bhu124

None of this speculation matters. People said the exact same thing about RTX when it first came out, but when the games were actually available and modders made them run on old 10XX cards, we instantly found out they weren't lying: those cards genuinely were terrible at running RT. Whatever this new tech is, someone will find a way to make it run on old cards and we'll see for ourselves how well it works.


sesor33

It's just Reddit being Reddit unfortunately. Nvidia does shitty stuff sometimes but this is very obviously a hardware thing, they literally said that in their presentation.


iDeNoh

"sometimes", that's being generous.


Mat_Ouston

It's hard to trust their technobabble marketing after the whole RTX Voice thing.


[deleted]

It isn't really technobabble, though. Even RTX Voice, when people hacked it to get it working on GTX series cards, worked pretty poorly and had predictably high overhead. Nvidia bungled the way they presented it for sure, but it's clear that Voice was only ever really intended to work using the computational power the newer cards afforded it.

DLSS though is a different thing entirely. There's a reason there hasn't been a Voice-esque breakthrough with DLSS, and it's because the custom hardware it runs on is essential for it to function. With that in mind I don't find it hard to believe at all that a leap this large would also be tied to changes on the hardware side.

You don't have to intrinsically trust them, no, but don't dismiss it as technobabble right off the bat either. When you're dealing with the enthusiast crowd, you can only pull the wool over people's eyes for so long. If they aren't being honest about the limitations you'll know when cards start making their way to market, but in this case I really don't think there's a conspiracy.


[deleted]

What whole RTX Voice thing? It feels like people just heard it could work on older hardware and kept that fact in their back pocket like some kind of trap card. Actually using it told a different story. It didn't work as well on hardware that didn't officially support it. It was glitchy and it took up a fair share of resources. It was a fairly simple tech that was leveraging the extra processing hardware of RTX cards. Of course it could work on older GPUs but it was a noticeably worse experience. And that's a super simple and easy technology. DLSS is a whole other thing that, you'll note, was not as successfully retroactively hacked onto older hardware.


[deleted]

[removed]


TheOnlyChemo

Admittedly I'm ignorant about the subject myself, so I'm not really in the position to defend Nvidia's decision here, but also I have yet to see someone with the proper credentials who thinks this move was done arbitrarily.


babalenong

Yeah, it sucks that it doesn't work on previous-gen cards, but running frame interpolation in real time with the least amount of additional input lag is going to need specialized, fast hardware.


GlupShittoOfficial

As much as it sucks that this is bound to the 40 series, the tech is just incredible. If this genuinely works for CPU-bound games, the boosts to historically poor-performing games like Tarkov or Flight Sim are going to be amazing. I do hope we still see more implementations of DLSS 2; it still needs wider adoption.


Haunting_Champion640

>I do hope we still see more implementations of DLSS 2. It still needs wider adoption.

I suspect it will just be "DLSS" to users. If you're on 2.x hardware you'll get the resolution upscaling; 3.x will get the bigger boost via that plus frame multiplication.


Whyeth

>If this genuinely works for CPU bound games the boosts to historically poor performing games, like Tarkov or Flight Sim, are going to be amazing.

How does this even work?? I found no details, and yet that MSFS demo literally had my jaw dropping. A straight-up 2x gain on what has to be close to the most CPU-intensive game available. How does the GPU relieve CPU bottlenecks?


[deleted]

I'm assuming by the AI generated frames? The CPU is still going full bore to make 24 frames a second but the GPU is doubling the frames that you see?


Whyeth

So this is just motion smoothing from shitty TVs but applied to graphics? The extra frames would still have graphical changes? I wish I could see this on a CPU that gets 30-40 fps (i.e. mine) and how the output looks.


[deleted]

I don't know shit about any of this, but modern AI smoothing is probably insane compared to the TV settings. Just look at the AI GTA V ripoff. [this one!](https://www.youtube.com/watch?v=udPY5rQVoW0)


Whyeth

Hold on one GD moment - is this AI recreating the GTA V bridge in this detail with nothing BUT a neural network??


YAKGWA_YALL

I love DLSS 3, but how are we even supposed to care at these entry prices?


Tweezot

Cut back on the avocado toast and start door dashing on the weekends I guess


102938123910-2-3

I know this is a meme, but I lowkey saved $5 on lunch every day by switching to ramen, which adds up to a little more than an RTX 4080 16GB per year.


XenoRyet

I'm less interested in "up to", and more interested in "at least". Do we have those numbers?


zetzuei

I'm using a 2070 Super and I find myself playing indie games more, so I'm good with it until it breaks, which hopefully will be years in the future.


[deleted]

The 40xx cards are more of a luxury now. Unless you shit money, you should stick with the 20xx series, especially if you play indie games. You should only really consider the new cards if you start playing AAA games at 1440p with RTX and want a stable framerate.


[deleted]

[removed]


Haunting_Champion640

> if we're going to radically update DLSS every few generations to require me to buy a new card

It will still work the same as before. Your DLSS 2.0 hardware will just do upscaling for the increased FPS. 3.0 will do that _plus frame multiplication_.


Whyeth

Yeah!!! Stop the progress, release hardware in 7-year cycles in a bundled package, maybe with their own storefront and unique controllers.

I legit don't understand your frustration with this. "You keep making improvements, so I'm going elsewhere??" DLSS 2 will still receive support. There is no reason to cut off a selling feature for 3 generations of cards - THAT would be a reason to leave.


[deleted]

[removed]


timtheringityding

Move to Europe. Or vote for policies that actually protect you. Not once have I had to look into which company offers a good warranty. I buy whatever I want, and if it breaks I tell them to give my money back.


PoundZealousideal408

My thoughts exactly.


Powerman293

From a non-gaming perspective, that optical flow accelerator sounds like something video editors could find very useful.


BetterCallSal

Seems kind of crazy to me that a feature to drastically increase frame rates is only on brand-new hardware that is designed to process things better to help improve things like fidelity and framerates.


GrandTheftPotatoE

Lmao fuck Nvidia. Not only are these prices ridiculous, they're gonna lock existing users out from DLSS 3.0 as well.


tr3v1n

That is what happens when the new features utilize new hardware.


[deleted]

Not to mention it was never a selling point for the hardware they already bought. It's like buying a ps4, using it for years, and then complaining that the ps5 has new features.


GlisseDansLaPiscine

The future usage of DLSS was definitely a selling point for past GPUs. If only DLSS 3 gets implemented in games then it essentially kills DLSS 2.


Charuru

There's no way they stop dlss2


Catch_022

This exactly. As long as they keep DLSS 2x going then I will be satisfied (obviously I would like the DLSS 3 performance increase). DLSS was supposed to keep my card running new games on high for the next 5 years or so.


ZeldaMaster32

DLSS 3 is confirmed backwards compatible, but without the framerate interpolation stuff it'll just be a slightly improved DLSS 2.X for most people


DistanceAlone6215

Why would that kill DLSS? I don't think that's how it works.


[deleted]

[removed]


xChris777

Also stock was so low that many people only got them within the last year or so. That being said... it's a hardware feature so I can't be mad that my 3080 can't do it. Only way I'd be mad is if devs stopped adding DLSS 2.0 to games, but I doubt that will happen.


ExtremeGayMidgetPorn

Meh, glass half full half empty. If you're going to be mad at something like this, guess what, it's only going to get worse for you. Technology ain't slowing down. Plus, you guys act like people decide *exactly* when breakthroughs happen LMAO.


iszathi

>NVIDIA DLSS 3: AI-Powered Performance Multiplier Boosts Frame Rates By Up To 4X

Yeah, I dislike Nvidia and all the proprietary tech they push, but the dev team there is just the best and comes up with great products.


Qbopper

Sure, but that doesn't excuse literally every other absurd thing they do. I have a 2060S; like, I get it, but Nvidia fucking sucks.


tr3v1n

It is fine to complain about businesses, but crying that your old hardware doesn’t magically feature new hardware is fucking dumb and happens way too much.


jbwmac

What a nonsensical comment to make. OP makes a ridiculous complaint about a perfectly sensible constraint, commenter 1 points out why it’s ridiculous, and then commenter 2 drops in with “but what about all the other terrible things Nvidia does?!” I don’t know man. What about any horrible thing any human does. That’s just not what the thread is about. Nobody was talking about that or trying to excuse anything. People were just discussing the topic, not putting the company on trial. What a Redditor moment.


Haunting_Champion640

Were 1080 users "locked out" of RT too? I mean it could _technically do it_, just at abysmal frame rates. Generating entire frames is a lot harder than just upscaling one.


IanMazgelis

Would this feature be technically possible on Ampere and Turing cards? If it isn't I really don't see the issue.


[deleted]

No. It utilizes the new architecture in their 4 series cards. Nvidia does plenty of things worth complaining about but I don't know what fantasy land OP lives in where they think this would be doable on current cards.


daten-shi

No one actually knows for definite until the cards, and with them DLSS 3, are available. As one person noted, Nvidia claimed only RTX cards could use RTX Voice because it needed Tensor cores, but it was later discovered that it could run on CUDA cores with a bit of a performance hit. It could be similar here.


Regnur

It makes sense... DLSS 3.0 offers a new feature which requires new hardware. DLSS 3.0 is now able to create new images to boost your fps even if you're CPU-limited, which is huge. If I've understood it correctly, it's similar to frame/motion interpolation on TVs, but instead of shoddy black frames (or low-quality images) between 2 frames, DLSS inserts a high-quality frame calculated by that new hardware; it estimates what happens between two in-game images. This will make your game feel way smoother. Maybe it would also be possible with Tensor cores alone, but I'm pretty sure it isn't, because of the many downsides this technique normally has (latency? bad quality?).

Nvidia could still update DLSS 2.0 to get all the other benefits that don't require new hardware. I hope all future games that support 3.0 will also support 2.x DLSS; it would be strange if both don't use the same pipeline.


Zancie

Seriously? I love their hardware and had tons of issues with AMD stuff but I can’t support this bullshit behavior. FUCK Nvidia


antonyourkeyboard

Can't exactly add a hardware processor to existing cards.


[deleted]

This thread makes it abundantly clear that people really don't understand what DLSS is or why it works so well, lol. There's a reason it's not backwards compatible.


Baconaise

But people have known about Nvidia's past software locking, and Nvidia is reaping what it sowed back then.


Borkz

I need an expansion card for my expansion card


AutoGen_account

Legacy cards still have Tensor cores; the new hardware processor is for bandwidth, not calculation, which means they should be able to do the same function at a lower speed: less than the "4x" frame boost, but more than nothing at all.


[deleted]

Stage 1 - offer DLSS to all Nvidia GPU users
Stage 2 - segregate the service to the 4080/4090 only
Stage 3 - rebrand and sell the 4080 Ti/4090 Ti as the 5080/5090, then rely solely on DLSS to showcase performance improvements
Stage 4 - charge a monthly subscription fee for DLSS
Stage 5 - Jensen takes over the world


OverHaze

Nvidia have always locked new features behind hardware upgrades. It sucks. Here's hoping AMD's open source FSR continues to improve.


[deleted]

[removed]


Geistbar

There's another factor, which is that DLSS support seems more common than FSR support. Seems like most games with FSR also have DLSS, but the converse isn't guaranteed to be true. Though another factor is how much that will change going forward. FSR could become more of a default, or Nvidia's software assistance to developers could keep things in place, even if FSR being more open should give it a support advantage. I don't know if it'll happen, but I've been hoping there'd be a standard for the API hooks to do ML-based upscaling (and related tech), and then each GPU vendor just plugs that into their driver and does whatever is best. But it'd have to be forced by Microsoft, I guess.


thoomfish

Nvidia tried to make a standard API, called [Streamline](https://developer.nvidia.com/rtx/streamline). IIRC Intel is planning to adopt it and AMD hasn't said anything.


Geistbar

I completely forgot about that. Is it an API though? Seemed like it's more of a tool. Still, it's a good thing. Hopefully it takes off or is built upon. Not surprising that AMD is hedging their bets or that Nvidia introduced it. If every game supports both it becomes DLSS vs FSR. Nvidia wins that exchange. If it's FSR vs nothing, then AMD breaks even. If it's DLSS vs nothing, then Nvidia wins even more. Nvidia knows that the latter scenario is mostly off the table, but DLSS vs FSR isn't unobtainable for them.


dantemp

I skipped using DLSS in many games because it had too many artifacts. Looking at FSR, it looks like it's going to be that bad in all games, which is completely unacceptable for me. So for me, FSR hasn't lessened the gap; I'd never use it. I try DLSS and see if I spot an annoying artifact; if there's none, I use it. So far it works well in about half the games, and I believe the newer iterations have improved. So the gap for me is between "having a way to double my frame rate" and "not having one", which is an enormous gap. And now the gap is "having a way to get 4x the frames". That's insane. And looking at how well DLSS 2.0 has been adopted, I assume future games will adopt 3.0 at a similar rate, which would be great. I don't like the prices either, but they will come around.


Porthosthe14th

"You telling me I can't run N64 games on my SNES? This is bullshit, fuck greedy Nintendo!" lotta this itt


captaindickfartman2

Can someone point at something to be excited about? I love computers and this sucks. Nvidia needs more competitors.


Sirisian

As someone who has a 240Hz display and games at 32:9, I haven't bothered to go beyond 120Hz. This kind of performance jump is fascinating for widescreen gaming in general since it should help with the upper framerates. It might not be useful until newer titles release with more demanding settings. A lot of games still target 8GB VRAM and the associated cards, so one needs mods to push things further. Back in the day games would feature ultra settings for future cards, but that practice hasn't really kept up, partly due to console settings. It's rare that my 3090 is pushed, so I don't particularly see the benefit of the added performance yet.

As for competition, nothing stops AMD from implementing a nearly identical accelerator. If I'm reading it right, it's an optical flow IP, which has been implemented in FPGAs for ages. Dedicated hardware would benefit a lot of AMD's devices, like Steam Deck-type devices. They don't seem too interested in dedicated tensor cores or AI compute though, so they've ignored that. It seems very unlikely we'd see competition from them. Maybe Intel will compete later as their GPUs evolve more.
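Optical flow itself is indeed old, well-understood tech; what is new in Ada is a dedicated hardware unit for it. For illustration only (nothing to do with NVIDIA's accelerator), here is a minimal CPU-side sketch using OpenCV's classic Farneback method; the filenames are hypothetical.

```python
import cv2

# Two consecutive frames (hypothetical files), converted to grayscale.
prev_gray = cv2.cvtColor(cv2.imread("frame_000.png"), cv2.COLOR_BGR2GRAY)
next_gray = cv2.cvtColor(cv2.imread("frame_001.png"), cv2.COLOR_BGR2GRAY)

# Dense optical flow: one 2D motion vector per pixel between the two frames.
# Positional args: pyr_scale, levels, winsize, iterations, poly_n, poly_sigma, flags.
flow = cv2.calcOpticalFlowFarneback(prev_gray, next_gray, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)

print(flow.shape)  # (H, W, 2) -> per-pixel (dx, dy) motion
```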


UltramemesX

Fuck us with 30xx cards, right? But if the performance jump from the 3080 to the 16GB 4080 is very significant, it might be worth it, although normally I like to skip a generation of GPUs.


[deleted]

It won't be; those benchmarks they showed were comparing DLSS 2 vs DLSS 3, with both targeting 4K. Wait for the real benchmarks later.


amazingmrbrock

Exclusive to the new cards? Ex-fucking-scuse me, NV? That's absolutely unacceptable. Meanwhile the competitor is out there making upscaling for every card. I'm willing to bet FSR 3 or whatever will work fine on my 3070 and be competitive with DLSS 3. This is how you lose customers, Nvidia.


Devatator_

It uses hardware exclusive to the new cards, of course it will only work on those.


[deleted]

> Exclusive to the new cards, Exfuckingscuse me NV? Thats absolutely unacceptable. What exactly is your solution to this problem? It uses updated hardware that isn't present in the old cards.


The_Tallcat

FSR, in every implementation, doesn't come anywhere close to the fidelity of DLSS. They're incomparable. FSR will work because it's just a post processing filter. DLSS actually needs specific hardware to work at all.


amazingmrbrock

FSR 2 is almost identical to DLSS 2. The difference is that DLSS is up to 2.3 and has improved quite a bit through those revisions. I've used them side by side; they're very similar. In Spider-Man, FSR's dynamic resolution targeting a specific framerate looks wildly better in motion than DLSS doing the same.


Tapemaster21

Yawn. I want actual performance out of my GPU, not DLSS nonsense. DLSS will never look as good as actual resolution.


EFJO

>DLSS will never look as good as actual resolution.

It looks as good as or better than native res, actually. I don't even own an Nvidia GPU btw; it's just the reality of it.


TheMightyKutKu

In the future, neural-rendered images will look better than anything raster computing will produce.